My first desktop PC didn’t much resemble the PCs of today. It was a TRS-80 Color Computer II, with 16K of RAM, a single cartridge slot, and two joystick ports. If you’re like me, you also had a computer like this one — maybe a TI-99/4A, a Commodore 64, or a PET. Chances are, it booted straight into an interpreted BASIC command-line prompt. For many of us writing software today, our first experiments in software development came from looking at that prompt and wondering, “What can I do here?”
Last week I saw Google proclaiming the impending death of the desktop computer, in favor of ubiquitous mobile computing with computing power provided by the cloud. I see more people replacing their gaming PCs with consoles, their family desktops with notebooks, and their notebooks with iPads and iPhones.
As a geek who loves gadgets, I’m not opposed to this. I love progress, and I love the shiny new technology we have access to now. But I can’t help but look at what happens when a young kid first boots up a device like this.
Imagine a world where the iPad is the ubiquitous platform of choice. Where do you get applications for your iPad? From one vendor: Apple. How do you write an application for your iPad? You ask Apple for permission. You apply to be a developer, ponying up $99 a year to do so. You buy the specific hardware they support, so you can develop and test. You learn their proprietary SDK, write something, and then want to share it with your friends. How do you do that? You ask permission again, before they’ll put your software on their store.
These are not huge barriers to people who seriously want to develop iPhone or iPad software, as demonstrated by the 100,000+ applications currently available for download. But it’s a moderate barrier to the hobbyist, an insurmountable one to the middle-school kid with a dream and some spare time, and an equally insurmountable one to the person who just wants to experiment and share with friends, with no desire to publish to the world.
I don’t want to make Apple look like the villain here. Consumers are demanding easier-to-use devices that “just work,” and companies like Apple, Sony, and Microsoft are stepping up. But I have to wonder what impact this is going to have on the future generation of software engineers who are being born today (or who were born three to five years ago). Will we have a generation of people who are expert users but have no inclination to build? Or will the definition of “build” change in some way?
(Thanks to Michael Surran for the completely awesome photo on Flickr that I’m using in this post. His use of the Creative Commons license has made it possible for me to show you exactly the image I wanted with a clean conscience.)