The key concept is versatility. Given a computer - any computer - it can be programmed to solve problems in a specific class. To the programmer, this seems entirely natural. We usually don't put much thought into just how amazing a feat it is. To us, the real question is how to achieve the output of a given program. We train on smaller problems by writing Tic-tac-toe, chess games, hello world, Twitter clones and so forth. We exercise mentally by writing 3D engines, raytracers, MP3 encoders and parallel k-means clustering in a quest to figure out how a program can be written. This is also a very software-specific view of what a computer is.
But this is in the eye of the beholder. I am writing this blog post because of a hypothesis which has been lurking in my head for a couple of days:
To most non-programmers, the computer is non-universal. It is a specific machine for a specific purpose, and you have to buy a new one once in a while to stay up to date.
In other words, the computer is simply an advanced hammer. If you need a screwdriver, you need to go buy a screwdriver, that is, another computer built for that purpose.
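To make the contrast concrete, here is a tiny, purely illustrative Python sketch of my own (the function names are made up): the same general-purpose machine becomes a different "tool" solely by loading different software, with no change to the hardware at all.

```python
def hammer(task: str) -> str:
    return f"hammering {task}"

def screwdriver(task: str) -> str:
    return f"turning {task}"

def run(program, task: str) -> str:
    # The "computer": it simply executes whatever program it is handed.
    return program(task)

print(run(hammer, "a nail"))        # the machine acts as a hammer
print(run(screwdriver, "a screw"))  # the very same machine acts as a screwdriver
```

The hardware in this picture never changes; only the program does. That is the versatility most programmers take for granted.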
Now, I know that hardware changes and that newer computers are more powerful, have different input methods, better sensors and so on than what the older piece of hardware had. But this just muddles the picture. The point is that there may be a sizable portion of the human population who do not grasp the sheer versatility of the modern computer.
If the software were right, I could take a smartphone, plug it into a monitor and I would have a state-of-the-art computer from around 2006-2007. Nor would I be limited to running iOS on Apple hardware only; I would be free to run whatever software I wanted on any phone out there. Again, commercial interests confuse people here.
Another thought experiment: "How many people believe that to run Facebook, you need a new computer?". How about getting the equivalent of Apple's Siri on an Android phone? I ask because there are some out there who seem to believe that there is something magic to certain devices which enables them, as the only devices in the world, to carry out a mundane software-centric task.
There is also a nice example from the CPU world. Intel, at least, planned to enable software upgrades of their CPU hardware. That is, you download a program which in turn unlocks your hardware so it can do what it was originally built to do. Another incident was with Creative Labs back in the day, where a driver was used to artificially limit certain old hardware, in turn forcing customers to upgrade even though the old hardware still worked.
All these incidents cement the idea that the computer is a concrete, single-purpose machine. So you have to buy a new one.
But here is the problem: decision makers, politicians in particular, who don't understand this cannot make the right decisions. If you think the future world needs manufacturing as we know it, you are thinking wrongly. Micro-manufacture becomes a possibility with the 3D printer. The computer's breadth of capability means that prototyping is possible for virtually anyone. You just need to rent a garage and get to work.
Except that if you don't understand what a computer is, you will have a very hard time grasping this. In the new world, the post-industrial age, it is cheap to design a new product or provide a new service. And these newcomers can displace an older product or service very quickly.
Daniel Lemire makes this point better than I do: A post-industrial point of view
And with that, I will end my musings by posing a few questions:
- Am I too pessimistic when I view fellow humans as unable to grasp what a computer is?
- Am I too software-oriented? The hardware also plays a crucial role: is the hardware the primary driver, or is the software?
- Do I worry too much about the fact that politicians, at least in Denmark, are old, with virtually no one having any technical skill or merit whatsoever?