Apple's pile of awesome: Programmed with crap

posted by Jeff | Monday, June 2, 2014, 11:43 PM | comments: 0

There is no question that Apple has changed the world, and its most important influence has been in just the last eight years. My first Mac landed on my desk at work in 1998 or so, for the purpose of editing video. Then in 2003 or so, Stephanie scored an iBook with a G3. OS X was pretty cool. When they switched to Intel in 2006 and I could run OS X and Windows, I switched. I'm on my 4th Mac laptop now, and each one has been my favorite computer. Then there were the iPhones that I loved (I had the original and the 3GS), and the iPods before that. I never got into the iPad as much, but I do have one, and I get how important they are.

But with all of these awesome shiny objects, the one thing that has been remarkable is how completely awful Apple's development tools have been. The underlying APIs and class libraries are a little convoluted, but not terrible. It's the Objective-C language that is something right out of a previous generation. It's too low-level at times, hard to understand, hard to read, and it lacks language features that other languages take for granted... it's just awful. If that weren't bad enough, Xcode, Apple's IDE (integrated development environment), is probably the worst programming tool I've ever tried to use. The interface designer in particular is strikingly bad.

To be fair, sure, my experience is not incredibly broad. I started with Atari Basic, then Apple Basic, then MS Basic, a little Pascal, a little Perl, and eventually JavaScript, VBScript, Visual Basic (the .NET version, not "classic"), a little Java, and the language that has more or less sponsored my career, C#. I've dabbled and done some "hello world" stuff with Ruby and Python, but never anything serious. The IDE I know best is obviously Visual Studio, and particularly with the ReSharper plug-in, it's like delicious coding and debugging magic. I've also used Eclipse a little, and recently Xamarin Studio.

Still, I don't think I have a bias problem. I mean, I totally get why people love Ruby, and I mostly used Notepad in my experimentation with it! I don't go deeper with it because the work (and admittedly, the money) is where I already have better skills. If I had to concentrate on Ruby, that would be OK, because I don't think it sucks. Objective-C is entirely different, in a bad way. Look around the Interwebs for questions asking whether Objective-C has generics or async tasks or anything else we take for granted, and you'll get horrible workarounds that get way too deep in the weeds for things a compiler should take care of for you.
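To make that concrete, here's roughly what a generic function looks like in a language that has them. This is just my own toy sketch, written in Swift (more on that in a minute) going off the guide Apple just posted, so take the exact syntax with a grain of salt; C# or Java would read about the same. In Objective-C you'd be passing untyped id pointers around and casting, hoping you got it right at runtime.

    // Toy example: a generic "largest element" function. The compiler
    // enforces that T is comparable, so there are no casts and no id
    // pointers involved.
    func largest<T: Comparable>(values: [T]) -> T? {
        if values.isEmpty {
            return nil   // nothing to compare
        }
        var best = values[0]
        for value in values {
            if best < value {
                best = value
            }
        }
        return best
    }

    largest([5, 3, 9])            // Optional(9)
    largest(["pear", "apple"])    // Optional("pear")

The compiler does the type checking for you, which is exactly the point.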

"But Jeff," you say, "Look at how many people code with this, and how many millions of apps there are!" Well of course people are doing it... because it's where the consumers are! They frankly don't have much of a choice if they want to sell to people using this platform. The people I've talked to who live in this world daily aren't crazy about it at all, especially if they have experience with Java or C#. That's an anecdote, sure, but who cares? My opinion is still that it's a terrible language.

Not only that, but the open source ecosystems that have evolved around other languages are mind-blowing. Throw in package management, and you can get to a real shipping product pretty quickly.

And that's where a glimmer of hope surfaced for a new direction. Apple today announced a new language called Swift. My first impression is that it's far better than Objective-C, but certainly not as evolved as other languages. I guess the best I can do is interpret it as a step in the right direction. It doesn't look like Xcode is getting any better aside from a "sandbox" mode that allows for real-time execution, though that's the kind of thing that, to me, is better vetted with unit testing.
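For a taste of why it feels like a step forward, here's a tiny sketch of the kind of code Swift makes easy, based on a skim of the guide Apple posted today. Purely illustrative, and the language is hours old, so details will surely shift:

    // Type inference, closures and string interpolation, with none of the
    // [[NSMutableArray alloc] init] and stringWithFormat: ceremony.
    let names = ["Diana", "Simon", "Joe"]

    // Keep the names that start with "S", then build greetings from them.
    let sNames = names.filter { $0.hasPrefix("S") }
    let greetings = sNames.map { "Hello, \($0)!" }

    println(greetings)   // the one greeting here is "Hello, Simon!"

Even with my gripes about Xcode, that at least reads like something from this decade.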

But hey, you can write "iCode" now in C# if you're willing to pay for Xamarin, so that's an option.

