Charles Babbage invented ‘the computer’ in the early 19th century. His machine fills a massive room at the Science Museum and performs one task: calculating tables by the method of differences (hence the name, the Difference Engine). Useful. A hundred years later, Alan Turing, the Second World War code-breaker, came up with his new, improved version. Which had more similarity to a tractor than to an iPhone, but he was on the right track.
Their machines were just that: mechanical devices geared in such a way that they could produce results by converting every stage of a calculation to a ‘binary’ process, one that is either ‘yes’ or ‘no’ and then passes on to the next gate for the next yes/no decision. It’s how computers work. I know because…
In the early 1970s I took ‘computer science’ as an optional in the sixth form at school. Probably the only ‘optional’ academic thing I ever volunteered for. And we would take a formula, a very basic formula, and create a problem ‘for the computer’ using that formula. A problem for which any kid could work out the answer on the back of a fag packet in about 6 seconds. And every schoolkid had a fag packet in those days. But we would translate that formula into a ‘computer language’, cos they didn’t speak English in those days, and then came the laborious task of ‘punching’ the translated formula onto cards, making little holes in the appropriate places. Essentially this was converting the formula to binary, but they never told us that. Took fucking hours. Then we’d drive half an hour to North East London Polytechnic at our allotted time-slot, to use their ‘mainframe’, inserting our little stacks of pre-punched cards in the correct place, marked ‘here it is!’ on the side of an immense electro-mechanical structure. The next week we’d return and pick up the printout. “Yes!!!!” we’d cry, 4 plus 5 equals 9. Fucking brilliant machine. I love computing.
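For the curious: that whole chain of yes/no gates can be sketched in a few lines of modern Python. This is a toy version, nothing like the punched-card rig, but it is the same idea, one binary decision passed on to the next gate:

```python
# A toy full adder built from nothing but yes/no gates (XOR, AND, OR),
# the same kind of binary decisions described above.
def full_adder(a, b, carry_in):
    # Each gate answers one yes/no question, then passes the result on.
    sum_bit = a ^ b ^ carry_in                   # XOR gates
    carry_out = (a & b) | (carry_in & (a ^ b))   # AND and OR gates
    return sum_bit, carry_out

def add(x, y, bits=8):
    """Add two small numbers one bit at a time, like the hardware does."""
    carry, result = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(add(4, 5))  # 9 -- the punched-card result, without the week's wait
```

Same answer as the mainframe, in rather less than a fortnight.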
Then someone invented the ‘silicon chip’. The same stuff (well, silicone, but close enough) responsible for Pamela Anderson’s amazing chest enabled all those binary gates to be compressed into a thing no bigger than… no bigger than… a silicon chip. Which was very small.
40 years later and I’m sitting at my iPad, Bluetoothed to its keyboard, Wi-Fi’d up by magic in such a way that every piece of information in the entire world is 3 clicks away, and I can compute as I walk along the street and annoy everyone by getting in their way, bashing into doorways and dawdling at a snail’s pace.
The progress of computing is not linear. It is exponential, because each new generation of computers is designed with the help of the last, not by people working alone.
And now no lesser mortal (probably an inappropriate phrase in the circumstances) than Stephen bloody Hawking has said that Artificial Intelligence could result in the end of humans. We will become extinct because of the robots we’ve created. And this coming from a man who is one part the greatest living brain and six parts computer/robot. Bit ironic really. Biting the hand that feeds him, but literally so.
It’s all a bit Terminator, a little 2001: A Space Odyssey, with the robots taking over. And he may indeed have a point.
They just need to make computers and/or robots a bit more shaggable. Work to do there, then.
Happy Wednesday
A xxxx