Ray Richards is the founder of Mindspan Consultants and a technology journalist based in Ottawa, Canada.


Distributed Computing

OK, you heard it here first: the "supercomputer" is dead.

"What?" you say, "didn't the US government just spend $110,000,000 on a 106 ton, 2 basketball court sized behemoth to model nuclear explosions?" Yep. IBM's RS/6000 SP based system dubbed ASCI White was switched on in August, much to the certain delight  of atoll dwellers and the dismay of cold war adversaries everywhere. The company has even commercialized variants of this system for delivery to several verticals, including biotechnology, aerospace and energy, all being well received within their respective markets. IBM knows what they're doing.

So am I off my head? Well, maybe – but I stand by my statement... with a slight revision: supercomputing as we know it is dead. "So," you say, "what is to take its place, smart guy – the abacus?!" Nope, but a billion people using a billion abaci – or at least the computerized equivalent – could give ASCI White a run for its money. The future of supercomputing is distributed.

Distributed computing has been around for a very long time in technology terms. In the late 1970s, researchers at Xerox's Palo Alto Research Center (PARC) created what they dubbed a worm – a term which then carried a far less malign connotation than it does today. The worm roamed the facility's network, replicating itself in the memory of various machines and utilising their unused CPU cycles to perform complex graphics computations.
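The same trick is easy to demonstrate on modern hardware. The sketch below is a loose analogue in Python – it assumes nothing about PARC's actual implementation – in which local worker processes stand in for the idle machines the worm recruited across the network:

    from multiprocessing import Pool

    def render_row(row: int) -> list[float]:
        # Toy stand-in for one slice of a complex graphics computation.
        return [(row * col) ** 0.5 for col in range(1000)]

    if __name__ == "__main__":
        # Each worker process plays the role of an otherwise idle machine.
        with Pool() as spare_cycles:
            image = spare_cycles.map(render_row, range(480))
        print(f"rendered {len(image)} rows on spare cycles")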

So, while that seems very efficient – why didn't this technology take off? To answer that, you must examine the computing model of the time. This was the era of mainframes and dumb terminals, with very few institutions owning more than one (very expensive) true computer. Computers were usually purchased to perform one or a very few specific tasks, be it point of sale, inventory management or fluid dynamics equations. The more computational power required, the bigger the box; hence the birth of the supercomputer. So, while this was interesting technology for those lucky few at PARC to play with, it was beyond the reach and mindset of most.

Then, in 1975, the computing world was turned on its head as the Altair 8800 became the first personal computer on the market. At first, the microcomputer's debut made barely a ripple in the sea of mainframe-dominated bits, but that was soon to change in a big way. When the Apple II and the TRS-80 Model I entered the marketplace in 1977 and 1978 respectively, people began to take notice. By the end of the year, Radio Shack owned 50 percent of the PC market, having sold just under 100,000 units. While that number doesn't sound overly impressive by today's standards, bear in mind that in those days you purchased a microcomputer in order to program it. If you lacked the prerequisite skills (as most did), the machine was little more than an expensive doorstop.

Slowly, IBM began to wake up to a potential missed opportunity. When, in 1980, the company then known as Satellite Software announced the first version of WordPerfect, a universally required application finally met the new IBM PC's 8088 processor. By year's end, in a field of over 3,000 competitors, Big Blue held 25 percent of the market, and the rest, as they say, is history.

So what does this short history lesson have to do with my opening statement? The PC revolution and the advent of the Internet have made massive distributed processing not only possible but, ultimately, more practical than the current model in terms of both performance and cost.

The first project to utilise this pairing, distributed.net, is devoted to cryptography and has gained thousands of volunteer processors since its inception in 1997. The project has won numerous code-cracking competitions and amply displayed the power of this model. In 1999, however, a new distributed computing project captured the imagination of the planet: the search for extra-terrestrial intelligence. The SETI@home client downloads approximately 300K of data collected by the Arecibo radio telescope in Puerto Rico and analyses it for patterns of interest while a volunteer's PC sits idle. SETI@home has since gained over 3.5 million users of its program, resulting in approximately 9.5 × 10^20 calculations... or the equivalent of over 700,000 years of single-CPU time. Truly, Sun's old adage "The network is the computer" has been realised.
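What does such a client actually look like? Here is a minimal sketch in Python of the general fetch-analyse-report cycle. To be clear, this is not SETI@home's actual code: the server addresses, the idle test and the toy "analysis" below are all invented for illustration.

    import time
    import urllib.request

    # Hypothetical endpoints -- the real SETI@home protocol differs.
    WORK_SERVER = "http://example.org/workunit"
    RESULT_SERVER = "http://example.org/result"

    def fetch_work_unit() -> bytes:
        # Download roughly 300K of raw telescope data to analyse.
        with urllib.request.urlopen(WORK_SERVER) as response:
            return response.read()

    def machine_is_idle() -> bool:
        # Stand-in for a real idle check (screensaver on, low CPU load).
        return True

    def analyse(data: bytes) -> int:
        # Toy "analysis": count samples above a threshold. The real
        # client runs Fourier transforms hunting for narrow-band signals.
        return sum(1 for sample in data if sample > 200)

    def report_result(score: int) -> None:
        # Send the finished result back to the project's server.
        request = urllib.request.Request(RESULT_SERVER, data=str(score).encode())
        urllib.request.urlopen(request)

    while True:
        unit = fetch_work_unit()
        while not machine_is_idle():
            time.sleep(60)  # wait for spare cycles
        report_result(analyse(unit))

The whole volunteer-computing model fits in those few lines: small units of work, crunched whenever a machine would otherwise waste its cycles, and reported back to a central collector.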

Now this technology is being turned inward, in hopes of finding a cure for cancer. Parabon has teamed with researchers to create a program which not only uses your computer in a similar fashion to SETI@home, running various gene-research calculations, but also provides users with incentives, including draws ranging between $100 and $1,000. You may even opt to donate these winnings to a selection of charities.

Next month, we will further explore this model and its ramifications for business, government and society in general. Until then, why not visit www.parabon.com or setiathome.ssl.berkeley.edu to download one of these applications and give it a try!

Originally published in Ottawa Computes! magazine, September 2001, by technology columnist Ray Richards.
