Sunday, January 25, 2009

Dark Roasted M.Christian

Here we go again: another article for the always-great Dark Roasted Blend. This time it's on supercomputers. Enjoy!


In Isaac Asimov's classic story, "The Last Question," a supercomputer is asked, again and again, how to deal with the eventual heat death of the universe. After upgrade upon upgrade, it finally has an answer -- but, alas, no one is left to hear it, because the universe has ended. So it simply states its answer out loud: "Let there be light."

Put another way, imagine that sometime in the future someone asks the smart-machine-to-end-all-smart-machines, "Is there a God?" -- and said machine answers, "There is now."

What exactly qualifies as the earliest example of a "computer" is a matter of debate: some say the abacus, others point to the Antikythera mechanism, and still others push the date forward to the 1800s and Charles Babbage's difference engine. Whatever their origins, though, with the advent of the digital revolution, computers have truly become super.

Among the first early supercomputers has to be Konrad Zuse's series of machines. Created in the 1930s and '40s, they were among the very first computers that were both programmable and general-purpose. Soon after, the Brits, needing some serious number-crunching during the war, built the aptly named Colossus -- which was smashed to bits in the name of secrecy when its job was done.


Not that America wasn't up to the task: the U.S. had its own long line of increasingly sophisticated, and powerful, devices. First there was the Model K, then the ABC, followed by the Automatic Sequence Controlled Calculator, and then came ENIAC.

ENIAC was considered state of the art, a true electronic brain capable of astounding feats of calculation. Now, alas, a cheap throwaway calculator can do everything ENIAC could. But in the mid-1940s, ENIAC was the tops.

After ENIAC came EDVAC, a change of much more than a few letters. Built on the stored-program design laid out by the brilliant John von Neumann, it was a monumental leap forward in computational ability, flexibility, and speed.

On a side note, as early as the mid-1940s, computers gave us the term "bug" for a problem with a machine -- popularized by Grace Hopper after a moth, quite literally, got caught in the circuitry.

The '60s, and the age of the transistor, gave us bigger and smarter machines. Led by master builders like IBM, these machines became behemoths of blinking lights and whirling tape reels, able to handle the chaos of weather prediction as well as tax records with the greatest -- for the most part -- of ease.

But supercomputers seriously came into their own when they challenged ... well, okay, when their "handlers" allowed them to challenge ... man at his own game: namely chess.

The first human vs. machine challenge is also up for debate, as more than likely a few early programmers tried their hands at defeating their own creations, and even pitted computers against computers. Transistors, though, quickly became superior to squishy human brains. In 1981 Cray Blitz took the crown from Joe Sentef, and then in 1988 Deep Thought managed to share the glory with Tony Miles -- though some suspect the machine felt a tiny bit sorry for Tony and so allowed him to join it in the winner's circle. That suspicion is probably misplaced, however, as Garry Kasparov, who felt no such sympathy, beat the machine in two straight games. But in 1997, Deep Blue avenged its mechanical sibling and took down Kasparov in a six-game match. Ouch!

What really hurts is that humans now regularly lose to their computational betters. The question today is whether they'll even let us fleshy beings sit at the same table with them, let alone deem us worthy to play with them.

What's really interesting about the new generation of super machines is not that they're smart -- which they most definitely are -- but how, well, sexy they've gotten.


Just take a look at MareNostrum, a perfect combination of beauty and brains. Sure, the monster machine that lives in a deconsecrated chapel in Barcelona might be only (ahem) the 8th most powerful of its super-smart digital kin, but it's certainly a star in the looks department: a series of imposing monoliths set inside a climate-controlled glass room, a perfect juxtaposition of its 21st-century mind and the ancient architecture of the chapel. It's been used for everything from climate modeling to helping decipher the human genome -- all the while looking fantastic as it works.

Even the most optimistic of futurists knows that it's just a matter of decades, or even just a few years, before we see our creations surpass us. All we can hope is that they look down on us poor, flesh-and-blood humans with affection -- or simply with benign indifference.

Either way, making something that eventually could say "Let there be light" is pretty damned amazing.
