Objective-C Is Kicking Butt in the Programming World
Objective-C has all the benefits of Java (no memory-management hassles for the programmer) plus the deterministic resource release of C++. It accomplishes this feat through Automatic Reference Counting (ARC). Because there is no garbage collector, there are no "garbage collection stalls" or program stuttering; resources are released the moment they are no longer needed. ARC is built into the compiler and IDE (integrated development environment) tooling, so the programmer doesn't have to keep track of things.
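A minimal sketch of what that looks like in practice (the variable names here are illustrative): objects are simply created and assigned, and everything is released deterministically when the enclosing autorelease pool drains, with no collector pause.

```objective-c
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // ARC: just create and assign; no malloc, no sizeof, no free.
        NSMutableArray *names = [NSMutableArray array];
        [names addObject:@"Alice"];
        [names addObject:@"Bob"];
        NSLog(@"%lu names", (unsigned long)names.count);
    } // everything allocated above is released right here, deterministically
    return 0;
}
```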
Objective-C builds are very fast and allow very quick testing/iteration.
Simplicity of Source Code, Artifacts
With Objective-C, developers still typically have header files (.h) and implementation files (.m), but the IDEs manage these nicely, and the separation preserves the ability to publish an interface to proprietary code without giving away the crown jewels. With Java, you have to obfuscate to get the same effect.
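A sketch of that split, with an illustrative `Counter` class: the header declares only the published interface, while the implementation (including instance variables) stays private. The two sections would normally live in separate .h and .m files.

```objective-c
// Counter.h -- the published interface; this is all a client ever sees.
#import <Foundation/Foundation.h>

@interface Counter : NSObject
- (void)increment;
- (NSInteger)value;
@end

// Counter.m -- the implementation; stays proprietary.
@implementation Counter {
    NSInteger _count;   // instance variable, hidden from the header
}
- (void)increment { _count += 1; }
- (NSInteger)value { return _count; }
@end

// Client code, compiled against Counter.h alone:
int main(void) {
    @autoreleasepool {
        Counter *c = [[Counter alloc] init];
        [c increment];
        [c increment];
        NSLog(@"value = %ld", (long)[c value]);
    }
    return 0;
}
```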
Objective-C is compiled for a specific architecture, so it's fast; it doesn't run in a virtual machine (VM). That's hardly an issue, though. The same code compiles for Mac (Intel) and for iOS (ARM). The binaries differ, but that's just a matter of having a compiler. Native code will always win the speed/size race over VMs. The important point here is that the programmer doesn't care: they just select the desired hardware target and go. So the code can be written once and used elsewhere. That's admittedly not the same as running the identical binary anywhere, but if performance counts, you get a better result with native code.
Objective-C does dynamic linking.
Objective-C is one of the GNU languages and runs on many platforms (including Linux) although it is clearly dominant on the Mac and iOS platforms. One thing that is really nice is that the same tooling/knowledge gives developers access to both the Mac (Intel) and iOS (ARM) worlds. Developers get mobile and desktop with one investment in language. Many of the classes you write for one are completely portable to the other because, as a developer, you're programming both with the same language.
Standard Type System
Objective-C has standard types.
Runtime introspection is handled a bit differently in Objective-C, but developers have the ability to query objects at runtime about their capabilities. This allows for the same kind of generalization and framework creation.
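A small sketch of that querying, using two standard runtime checks (`respondsToSelector:` and `isKindOfClass:`) on a value typed only as `id`:

```objective-c
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        id obj = @"hello";  // typed as id: capabilities unknown at compile time

        // Ask the object, at runtime, what it can do.
        if ([obj respondsToSelector:@selector(uppercaseString)]) {
            NSLog(@"%@", [obj uppercaseString]);
        }
        if ([obj isKindOfClass:[NSString class]]) {
            NSLog(@"It's a string of length %lu", (unsigned long)[obj length]);
        }
    }
    return 0;
}
```

Frameworks use exactly this mechanism to handle objects generically without knowing their concrete classes in advance.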
Objective-C typically will blow the doors off C++/Java because it ends up being bad-fast C code under the covers, said Babcock. "So from my perspective, I get the performance of bad-fast C code with the convenience/efficiency of programming in a sane OO [object-oriented] language."
Objective-C takes most, if not all, memory-management considerations off the programmer's back. Developers can do "C" things if needed, but they are rarely, if ever, needed. ARC lets the programmer just assign and create objects without worrying about allocation, "sizeof," free, etc. Developers don't have to remember to code a destructor. This is all done for you, so developers can focus on the objects and object-oriented behavior rather than managing memory.
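To illustrate the "no destructor" point, here is a sketch with a made-up `Document` class that owns other objects but defines no cleanup code at all; ARC releases the strong properties, and the object itself, when the owner lets go.

```objective-c
#import <Foundation/Foundation.h>

// No dealloc is written anywhere in this class.
@interface Document : NSObject
@property (strong) NSString *title;
@property (strong) NSMutableArray *pages;
@end

@implementation Document
@end

int main(void) {
    @autoreleasepool {
        Document *doc = [[Document alloc] init];
        doc.title = @"Report";
        doc.pages = [NSMutableArray arrayWithObject:@"page one"];
        NSLog(@"%@ has %lu page(s)", doc.title,
              (unsigned long)doc.pages.count);
    } // doc, its title, and its pages are all freed here; we wrote no cleanup
    return 0;
}
```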
Objective-C startup time is comparable to that of fast C code, because it fundamentally is fast C code under the covers.
Objective-C has all the advantages of C (it uses up to two times less memory than Java) without the warts.
Not only is destruction deterministic, it is automatic unless you specifically intervene for your own reasons. You can do that if it is required for some reason to keep an object around, but that's seldom the case. Deterministic destruction is useful for managing resources.
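A sketch of that determinism, with an illustrative resource-holding class: `dealloc` runs at the moment the last strong reference is dropped (here forced by assigning `nil`), not at some future collection cycle, which is what makes it safe for scarce resources like files and sockets.

```objective-c
#import <Foundation/Foundation.h>

@interface ResourceHolder : NSObject
@end

@implementation ResourceHolder
- (void)dealloc {
    // Under ARC this runs the instant the last strong reference goes away.
    NSLog(@"resource closed");
}
@end

int main(void) {
    @autoreleasepool {
        ResourceHolder *h = [[ResourceHolder alloc] init];
        NSLog(@"resource open");
        h = nil;    // drop the last strong reference: dealloc fires right here
        NSLog(@"after release");
    }
    return 0;
}
```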
Objective-C plays very nicely with C (because it fundamentally is C) and with OS callbacks.
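That interoperability needs no bridging layer; plain C calls and Objective-C messages mix freely in the same function, as in this small sketch:

```objective-c
#import <Foundation/Foundation.h>
#include <math.h>
#include <string.h>

int main(void) {
    @autoreleasepool {
        // Plain C in the middle of Objective-C: same compiler, same file.
        double hyp = sqrt(3.0 * 3.0 + 4.0 * 4.0);   // C math library
        char buffer[32];
        strncpy(buffer, "from C", sizeof(buffer));  // C string handling

        // And straight back into Objective-C objects.
        NSString *message =
            [NSString stringWithFormat:@"%s: hypotenuse = %.1f", buffer, hyp];
        NSLog(@"%@", message);
    }
    return 0;
}
```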
General Fundamental Support for Key Components
Objective-C features support for networking, XML, databases, etc.
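As one small example of that built-in support, Foundation ships JSON handling out of the box (the payload below is made up for illustration):

```objective-c
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        NSData *payload =
            [@"{\"name\": \"widget\", \"count\": 3}"
                dataUsingEncoding:NSUTF8StringEncoding];

        NSError *error = nil;
        NSDictionary *parsed =
            [NSJSONSerialization JSONObjectWithData:payload
                                            options:0
                                              error:&error];
        if (parsed) {
            NSLog(@"%@ x %@", parsed[@"name"], parsed[@"count"]);
        } else {
            NSLog(@"parse failed: %@", error);
        }
    }
    return 0;
}
```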
Objective-C does threading very nicely with minimal burden on the programmer. But an even better solution is available in the Apple world since they built Grand Central Dispatch into the OS. That's the real answer, said Babcock. "You don't want to be bothered as a programmer with setting up threads," he said. "You just want to say: 'This piece of code can be run in parallel,' and perhaps provide additional information about other parallel pieces that are supposed to come before, after or wait upon one another. Ultimately, it's about optimal resource use. The OS by definition controls the resources, memory, CPU cores, etc. So by definition, the OS is the place where the 'dispatching' of these resources should occur, not the individual program. At the moment, Objective-C on Apple platforms is the only game in town that allows the programmer to take full advantage of whatever resources the machine has without a priori knowledge 'baked into' their code."
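A minimal Grand Central Dispatch sketch of exactly that: the program marks pieces of work as parallelizable and expresses a "wait upon" relationship with a dispatch group, while the OS decides how many threads and cores to actually use.

```objective-c
#import <Foundation/Foundation.h>
#include <dispatch/dispatch.h>

int main(void) {
    @autoreleasepool {
        // "This piece of code can be run in parallel" -- no thread setup.
        dispatch_queue_t queue =
            dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
        dispatch_group_t group = dispatch_group_create();

        for (int i = 0; i < 4; i++) {
            dispatch_group_async(group, queue, ^{
                NSLog(@"chunk %d running on a pool thread", i);
            });
        }

        // "Wait upon": block until every chunk has finished.
        dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
        NSLog(@"all chunks done");
    }
    return 0;
}
```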