Distributed Computing
Ok, you heard it here first: the "supercomputer" is dead.
"What?" you say, "didn't the US government just spend $110,000,000 on a 106 ton, 2 basketball court sized behemoth to model nuclear explosions?" Yep. IBM's RS/6000 SP based system dubbed ASCI White was switched on in August, much to the certain delight of atoll dwellers and the dismay of cold war adversaries everywhere. The company has even commercialized variants of this system for delivery to several verticals, including biotechnology, aerospace and energy, all being well received within their respective markets. IBM knows what they're doing.
So am I off my head? Well, maybe – but I stand by my statement... with a slight revision: supercomputing as we know it is dead. "So," you say, "what is to take its place, smart guy – the abacus?!" Nope: a billion people using a billion abaci... or at least their computerized equivalent, compared to ASCI White. The future of supercomputing is distributed.
Distributed computing has been around for a very long time in technology terms. In 1973, researchers at Xerox's Palo Alto Research Center (PARC) created what they dubbed a worm, a term that carried a significantly less malign connotation then than it does today. The worm roamed the facility's network, replicating itself in memory on various machines and utilising their unused CPU cycles to perform complex graphics computations.
So, if that seems so efficient, why didn't the technology take off? To answer that, you have to examine the computing model of the time. This was the era of mainframes and dumb terminals, with very few institutions owning more than one (very expensive) true computer. Computers were usually purchased to perform one or a very few specific tasks, be it point of sale, inventory management or solving fluid dynamics equations. The more computational power required, the bigger the box; hence the birth of the supercomputer. So, while this was interesting technology for those lucky few at PARC to play with, it was beyond the reach and mindset of most.
Then, in 1975, the computing world was turned on its head as the Altair 8800 became the first personal computer on the market. At first, the microcomputer's debut made barely a ripple in the sea of mainframe-dominated bits, but that was soon to change in a big way. When the Apple II and the TRS-80 Model 1 entered the marketplace in 1977 and 1978 respectively, people began to take notice. By the end of 1978, Radio Shack owned 50 percent of the PC market, having sold just under 100,000 units. While this number doesn't sound overly impressive by today's standards, bear in mind that in those days you purchased a microcomputer in order to program it. If you lacked the prerequisite skills (as most did), the machine was little more than an expensive doorstop.
Slowly, IBM began to wake up to a potential missed opportunity. When, in 1980, the company then known as Satellite Software announced the first version of WordPerfect, a universally required application had finally met the new IBM PC's 8088 processor. By year's end, in a field of over 3,000 competitors, Big Blue held 25 percent of the market, and the rest, as they say, is history.
So what does this short history lesson have to do with my opening statement? The PC revolution and the advent of the Internet have not only made massive distributed processing possible but will ultimately render it more practical than the current model in terms of both performance and cost.
The first project to utilise this pairing, distributed.net, is devoted to cryptography and has gained thousands of volunteer processors since its inception in 1997. The project has won numerous code-cracking competitions and amply displayed the power of this model. In 1999, however, a new distributed computing project captured the imagination of the planet: the search for extra-terrestrial intelligence. The application downloads approximately 300K of data collected by the Arecibo Radio Telescope in Puerto Rico and analyses it for patterns of interest on a volunteer's PC while that machine sits idle. SETI@home has since gained over 3.5 million users of its program, resulting in approximately 9.5 × 10²⁰ calculations... or the equivalent of over 700,000 years of single-CPU time. Truly, Sun's old adage "The network is the computer" has been realised.
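For the curious, here is a minimal sketch, in Python, of the fetch-analyse-report cycle such a volunteer client follows. Everything in it (the simulated 300K work unit, the idle check and the toy pattern count) is a simplified stand-in of my own, not SETI@home's actual code or protocol.

```python
# A minimal sketch of the volunteer-computing cycle, not SETI@home's real client.
# fetch_work_unit() fakes the roughly 300K download with random bytes; a real
# client would pull data from the project's server and upload results afterwards.
import os
import time

def machine_is_idle() -> bool:
    """Stand-in for real idle detection (screensaver active, low load, etc.)."""
    return True

def fetch_work_unit(size: int = 300 * 1024) -> bytes:
    """Pretend to download a ~300K work unit of recorded telescope data."""
    return os.urandom(size)

def analyse(data: bytes) -> int:
    """Toy analysis: count 64-byte chunks dominated by a single byte value.
    The real client performs far more elaborate signal processing."""
    hits = 0
    for offset in range(0, len(data) - 64, 64):
        chunk = data[offset:offset + 64]
        if max(chunk.count(value) for value in set(chunk)) > 32:
            hits += 1
    return hits

def report(hits: int) -> None:
    """Send results back to the project server; here we simply print them."""
    print(f"work unit analysed, {hits} candidate patterns found")

if __name__ == "__main__":
    for _ in range(3):      # a real client loops indefinitely
        if machine_is_idle():
            report(analyse(fetch_work_unit()))
        time.sleep(1)       # a real client waits much longer between work units
```

Multiply that little loop by millions of idle PCs and you have the aggregate computing power the project relies on.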
Now this technology is being turned inward in hopes of finding a cure for cancer. Parabon has teamed with researchers to create a program which not only uses your computer in a similar fashion to SETI@home, running various gene-research calculations, but also provides users with incentives, including prize draws ranging from 100 to 1,000 dollars. You may even opt to donate these funds to a selection of charities.
Next month, we will further explore this model and its ramifications for business, government and society in general. Until then, why not visit www.parabon.com or setiathome.ssl.berkeley.edu to download either of these applications and give them a try!
Originally Published in Ottawa Computes! magazine, September, 2001, by technology columnist, Ray Richards.