Back in November 2012, I read a book by Neal Stephenson called “In the Beginning… was the Command Line.” He begins by chronicling parts of his own history with computing devices starting back in the 1970s and his own journey through life as a geek and tech-savvy user. What has always struck me about technology, and culture as a whole, is that everything we know and do is built upon the blocks of something that was there before us. Computing devices borrow analogies and ideas from telegraph technology, which early mainframes and teletype devices imitated; that concept was later extended to a “video” teletype, what we know today as a monitor. Stephenson hits on this here and there throughout the book.
The same is true for many industries, innovations, products, and ideas that would never exist were it not for the work of someone before us.
The book was written back in 1999. It covered hot issues of the time: Mac vs. Windows, whether Apple would still be around in the 2000s, the relative newcomer and open source OS Linux, along with a nod to BeOS (I miss you still!). These were huge areas of discussion in the tech world during my late high school and then college days. Reading about them and remembering when they were current reminded me that I am not as young as I appear. One thing that caught my attention is how little the analogies and challenges within the computing world have changed, if at all. From the beginning, Apple was seen as the closed-box system, from its operating system software to its desktop and laptop hardware. Perhaps not so ironically, that mindset in the world of Apple is even stronger today. On the flip side, Microsoft was seen as the “free market” option because consumers had a choice of what hardware they wanted Windows to run on. Inexpensive components were readily available, and you could build a PC to your own needs.
Fourteen years later, that core concept still holds in those two camps. However, Microsoft is showing signs that it wants to control the world its OS runs on, perhaps thinking that because this works so well for Apple, it could work for Microsoft as well. Now Android is the most open, “free market” OS, a place for mobile hardware and software vendors to explore their creativity and show off. And after all these years, Apple has somehow become even more closed in its culture of the hardware/software ecosystem, going down a path where general computing devices like laptops and desktops can no longer be upgraded (RAM, storage) or maintained (non-removable laptop batteries) by the consumer after purchase.
The reflections on culture within the realm of technology are vast and concerning to me. I find this black-box mindset, that the end user should never know or need to know what happens inside something they use, troubling. On one hand, the goal may be a simple, easy-to-use experience for the user, which by itself isn’t a terrible thing. However, it can assume a lack of user intelligence, and catering to the lowest common denominator doesn’t push a culture or society forward. Perhaps worse, though, it conveys the mindset that users shouldn’t need to learn anything beyond their current knowledge base, and it perpetuates the idea that everything should be simple and easy and that working hard is best left to others. I’m looking at you, Roku radio commercials (listen here). And that is by far a larger cultural shift that should worry us all.
We’ve come to a place in technology and culture that assumes computing devices shouldn’t require any training or time to figure out. We should simply be able to pick up a device, turn it on, and instantly know what to do. I have heard similar arguments at my job in the past, where the expectation is that employees know how to do their work, not how to use their computers. Chalk it up to strong marketing or unfounded assumptions, but I fear where this will take us in the long run.
There is already a push for this in cars that now parallel park automatically or protect drivers who fall asleep at the wheel (thanks, Mercedes, for subtly telling us your owners are old and prone to dozing off while driving). How many more years will it be until driver’s education simply skips over driving tasks that not so long ago were considered essential skills for getting behind the wheel?
Where is all this going? TL;DR: the average person is becoming lazier, and we’re becoming more OK with others making our decisions and choosing what’s best for us. We’re fine with not knowing the “how” behind much of what drives our daily lives, and perhaps more importantly, we’re not asking “why” these things are there in the first place.
So what do we do now? That, my dear reader, is for a future post.