iOS and Android


Vicious

It was bad wording on my part. What I meant was, iOS is really the only modern OS that uses Objective-C. On iOS you are allowed to use more or less any compiled language that produces the same object format Xcode emits. As far as I know, you still cannot use a just-in-time-compiled language unless you pre-compile all binaries before submitting the app for review. This may have changed, since I have not done any iOS development in a few months. Either way, it discourages developers from using most other languages.

I classified computer technicians in the same category because most (good) computer technicians have a more thorough understanding of hardware than any PM, BA, or web designer in an IT org. Could they explain to me the difference between inheritance and polymorphism? Probably not. I am also speaking about professional computer techs, not Billy Bob's computer repair shop. My company is actually one of the few I have seen that requires a Bachelor's degree for a technician. Perhaps a new trend?

Not really sure what you mean by "Also I think we should start seeing IT and CS as two separate things". Any job family that could be classified as computer science (i.e., software/hardware development) is only one part of what makes up a proper IT organization at any major corporation. It doesn't make sense to separate them, as each is only one part of a whole.

If you mean Apple uses it extensively, then yes, but I still do not see how that is a negative other than the syntax being completely different from other C-based languages. It is still not proprietary, so someone could write a Windows application in Objective-C; it would be more a question of why you would want to. C# and .NET would be a better example of proprietary.
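As an aside, the syntax gap is easy to see side by side. Here is a minimal sketch (the `Rect` class and its method are made up for illustration): where Java or C# would write something like `rect.setSize(10.0, 20.0)`, Objective-C uses bracketed message sends with named parameters baked into the selector:

```objc
#import <Foundation/Foundation.h>

// A toy class, purely for illustration (assumes ARC for memory management).
@interface Rect : NSObject
// The parameter labels are part of the method name: the selector is setWidth:height:
- (void)setWidth:(double)w height:(double)h;
@end

@implementation Rect
- (void)setWidth:(double)w height:(double)h {
    NSLog(@"width=%f height=%f", w, h);
}
@end

int main(void) {
    @autoreleasepool {
        Rect *rect = [[Rect alloc] init];  // message sends use square brackets
        [rect setWidth:10.0 height:20.0];  // vs. rect.setSize(10.0, 20.0) in C-style languages
    }
    return 0;
}
```

Nothing here is hard once it clicks, but the bracket/selector style is what people mean when they say the syntax looks nothing like the rest of the C family.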

That is my point. Software Development/Computer Science is very different from Information Technology. Companies group them because it is easier for them, but I think we should respect both and keep them separate rather than treating one as a subset of the other.

Ummm, the syntax is a huge negative. The learning curve adds overhead that frankly just doesn't exist with other C-family languages.

I still fail to see your logic in separating Information Technology and CS. I am a developer, and I depend on many other areas of my company's IT organization. I depend on server admins to build, maintain, and patch my application servers. I depend on DBAs to optimize and back up my applications' databases. I depend on project managers to define the scope of work and set deadlines for projects. You see where I am going with this. By definition, "Information Technology" is the application of computer systems to maintain and distribute data. That requires much more than just software developers.

But that is what I have been saying this whole time. That is the only negative, and it is a huge one. It is not because the language is proprietary (which it is not) or because iOS is the only platform that uses it, which were the two points that you brought up.

Everyone relies on IT. As you said, it is about maintaining and distributing data, but CS is not that. Obviously CS works very closely with IT, like any other department, but both have grown to the point where they deserve to be recognized as independent.

My point was never that IT is just developers. My point was that developers are a part of computer science, and DBAs, sysadmins, etc. are a part of IT, and that CS and IT are distinctly different things.

But this is extremely off topic, so I will drop the subject and let people get back to the Apple bashing.
