
The (Supercomputer) Power of Outsourcing

When you watch old science fiction movies, you can see the clichés of the 21st century. Of course, before they became clichés they were first innovative, sometimes frightening, ideas. One of the constant themes of sci-fi is imagining how the future will unfold. From the 1930s onward, 20th-century science fiction produced plenty of stories about what it would be like to live in the 21st century and beyond. The usual cast of characters in these stories included computers that take over the world, flying cars, rockets to your home, robots that become self-aware (usually not a good thing), rockets into space, and bases on the moon. We’re now in the second decade of the 21st century, and it’s not quite what those sci-fi movies told us it would be. The US space program is on hold, and we retired our last US-built working spacecraft in 2011. Moon bases (and missions to Mars) are off the budget until the economy is much better. Don’t get me started on flying cars! I’d just like to go to an airport where I don’t have to take my shoes off. And yet, there is one thing that we were promised that we’re just now getting: supercomputers!

Even before the 21st century, there were hundreds of millions of computers in use around the world. However, we’ve learned that as amazing as these machines are, they have their limits. Some tasks take a massive amount of processing power: predicting the weather, which has huge implications for agriculture and for avoiding weather-related disasters like Hurricane Katrina, or accurately predicting the path of a tsunami after an earthquake. The devastating Indian Ocean tsunami of 2004 killed perhaps as many as 300,000 people. While the wave swept across the ocean, flooding coastlines wherever it touched land, one scientist was running a prototype application to predict where lives would be lost. Unfortunately, the only computer at his disposal was a laptop. As amazing as laptops are, it lacked the processing power to predict where the wave was headed. Falling further and further behind the data, it could only belatedly “predict” where lives had already been lost. If supercomputers were as available as laptops, things might have turned out differently.

Over at Amazon Web Services, they have been hard at work developing a solution to the growing need for supercomputers. Amazon, the online marketplace, launched Amazon Web Services (AWS) in 2002 as a “cloud service” for software developers. Rather than buying and building their own server rooms, developers can buy a slice of capacity from AWS. Because of Amazon’s massive purchasing power, that capacity costs a fraction of what it would cost to build the equivalent yourself. Furthermore, by tapping into the expertise AWS has in building CPU and storage capacity (arguably the greatest in the world, based on the amount it manages), your servers will be more efficiently set up and managed than anything you could do yourself.

At some point around 2010-2011, AWS worked with its partners to develop virtual supercomputers. By bundling together thousands of CPUs, a virtual cluster can be created that simulates the computing power of one of the most powerful computers in the world. Before AWS, if you wanted this kind of computing power, you needed at least one of the lesser of these ultra-fast machines. That meant spending millions of dollars to purchase one of these processing behemoths, and millions more to maintain it. Of course, in just a few years your massively powerful supercomputer would be obsolete, and you would need to make yet another purchase. That’s why there are so few of these machines, and why most of them are in well-financed research labs, the military, weather services, and the like. Universities, smaller hospitals, and start-up biotech firms, all of which would derive enormous benefits from access to supercomputer capacity, just can’t afford to become users. However, AWS has completely changed the game.
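To give a concrete sense of what “bundling together CPUs” looks like in practice, here is a minimal sketch using the boto3 Python SDK for AWS (a tool not mentioned in the article itself). It requests a batch of compute instances inside a “cluster” placement group, which is AWS’s mechanism for keeping nodes on low-latency networking. The AMI ID, instance type, and node count are purely illustrative assumptions, not details from the article.

```python
# Hypothetical sketch: provisioning a small "virtual cluster" on AWS EC2.
# The AMI ID, instance type, and node count are illustrative placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# A "cluster" placement group asks AWS to pack the instances onto
# low-latency, high-bandwidth networking -- the building block of an
# on-demand HPC cluster.
ec2.create_placement_group(GroupName="demo-hpc", Strategy="cluster")

# Request a batch of identical compute nodes inside that group.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI with your HPC software stack
    InstanceType="c5n.18xlarge",       # a compute/network-optimized instance type
    MinCount=8,
    MaxCount=8,
    Placement={"GroupName": "demo-hpc"},
)

node_ids = [i["InstanceId"] for i in response["Instances"]]
print(f"Launched {len(node_ids)} nodes:", node_ids)

# When the job is finished, the whole cluster can be torn down,
# so you only pay for the hours you actually used.
ec2.terminate_instances(InstanceIds=node_ids)
```

The key point of the sketch is the last step: unlike a purchased supercomputer, the whole cluster disappears when the job ends, which is what makes the economics described above possible.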

AWS is changing the game because it is, in effect, a cheap supercomputer. A typical supercomputer project might take just a few hours to process, which means just a few thousand dollars. That can fit just about any budget. For comparison, I was at an AWS conference earlier this year where one of the virtual supercomputer providers described a project for a client. The client had a fairly large number of computers at work processing data on proteins, looking for the best candidates for anti-cancer drugs. That process would have taken about three years to complete. The virtual supercomputer processed the data overnight. Not only was this cheaper than running the desktop-level computers for three years, it moved an important cancer study ahead by three years. If supercomputers were available to all the health-related studies under way today, decades of waiting for new treatments and new drugs could be eliminated.
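A little back-of-envelope arithmetic shows why “three years, overnight” is plausible once you can rent thousands of cores by the hour. The figures below are illustrative assumptions (the article does not give the client’s machine counts or hardware), used only to show the scale involved.

```python
# Back-of-envelope estimate (all figures are illustrative assumptions,
# not data from the article).
hours_per_year = 365 * 24
fleet_runtime_hours = 3 * hours_per_year   # "about three years" on the client's machines
overnight_hours = 12                       # "processed the data overnight"

speedup_needed = fleet_runtime_hours / overnight_hours
print(f"Required speedup: ~{speedup_needed:,.0f}x")      # roughly 2,200x

# Suppose each rented HPC node is ~100x faster than one of the client's
# desktops for this workload (more cores, faster memory, GPUs), and the
# client's fleet had ~50 desktops.  An overnight run then needs on the
# order of:
node_multiplier = 100
fleet_size = 50
nodes_needed = speedup_needed * fleet_size / node_multiplier
print(f"Cluster nodes for an overnight run: ~{nodes_needed:,.0f}")   # ~1,100
```

Under those assumptions the job needs on the order of a thousand nodes for a single night, which is exactly the kind of burst capacity that is trivial to rent from a cloud provider and absurd to buy outright.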

While this virtual supercomputer is already fast and cheap, the good news is that it keeps getting faster and cheaper as AWS expands. As of 2012, it ranks among the 50 fastest computers in the world, and it is expected to climb the list as AWS’s services continue to grow. That means that just about everyone who needs supercomputer power will be able to get it. This may not be quite the 21st century you were planning on, but the arrival of the “personal” supercomputer just might be a sign that our flying cars will arrive any day now!
