calit2

Worlds Past, Present, and Future at Supercomputing 2002

SC2002 3.17.03 -- In Charles Dickens' "A Christmas Carol," Scrooge is visited by the ghosts of Christmas Past, Present, and Future. Those who attended Supercomputing 2002 were able to see or imagine supercomputing in these different time frames simultaneously and get a privileged overview of some of the strongest forces in science and technology.

Supercomputing Past is about Big Iron, stand-alone machines that get faster, perhaps as fast as a petaflop in the next few years. Supercomputing Present is about the growth of Grid computing, tying together hundreds, thousands, even millions of computing nodes via the wireline Internet to solve problems in a more parallel and distributed fashion. Supercomputing Future, while still fogged in the mists of time, seems to be tending toward billions of intelligent devices pumped full of data by billions of sensors using wireless technologies, carried by people and vehicles everywhere.

In his novel Ubik (one of the inspirations for The Matrix trilogy), Philip K. Dick's protagonists, who started in the 21st century, experienced the technology around them regressing in time. Starting with doorknobs and appliances that demanded nickels and dimes to operate, and with rocket-powered spacecraft, the heroes found themselves riding on metal aircraft, then wooden ones, and then in Model Ts - all of which looked new and modern. Attending Supercomputing 2002 had that sort of Ubik feeling.

Supercomputing Past: Name, Rank, and Stockpile Stewardship

The Supercomputing 2002 conference was held in November 2002 and featured the best academic, government, and commercial high-performance computing put to work in a gleaming grid of difference engines to move us "from Terabytes to Insights," the spot-on conference theme. The conference filled most of the Baltimore Convention Center, which stood stoic, grey, and dripping amidst the constant cold drizzle of early winter.

The conference complex is a short walk along linked second-story pedestrian tubes from the neon-encrusted harbor. The harbor is home to the National Aquarium where, at a gala buffet on the final night, attendees talked teraflops amidst penguins and otters (and compared the sharks in the massive tank to people and machines they knew). The harbor featured the archetypal example of the transition from the industrial age to the information age: a massive power plant with smokestacks turned into a Barnes & Noble superstore (from gigawatts into Harry Potters) flanked to leeward by the ESPN café (from scores to digital billboards) and to starboard by the Hard Rock Café (from alimentation to aggrandizement).

In a world marinating in lies and hyperbole, the Supercomputing conference is a celebration of truth and authenticity, the kind that is achieved by throwing trillions of floating-point operations per second (teraflops) at simulating and visualizing atomic, biological, chemical, aerodynamic, and other phenomena that in reality may last mere fractions of a second.

Supercomputing 2002 is a temporary municipality where practically everyone could qualify for MENSA (if there were a point, but these people are too focused on modeling, simulating, and problem solving to care about static measures), and things make so much sense that you don't want to go back to a world that suddenly seems much less sensible than it could be - and will be, thanks to the power of Moore's Law. The optimism of Supercomputing 2002 comes from knowing that all this power, within a few years, will be available to many and, within a few decades, will be available to most.

Supercomputing keeps getting smarter on many levels because, perhaps more than any other field, it combines nearly complete competition (among individual designers and users, departments, laboratories, universities, states or provinces, even nations!) with nearly complete cooperation - but that's Supercomputing Present, not Past. One arena for competition is the Top 500 supercomputer sites (note the URL - http://www.top500.org - which puts supercomputing above all possible top-500 lists).

At No. 1 on the list during SC2002 was Japan's Earth Simulator (ES), made by NEC, which has a maximum sustained rate of 35 teraflops and a peak rate of 40 teraflops. (The ranking is based on sustained rate.) At No. 500 was Hewlett-Packard's SuperDome/HyperPlex/128 with a sustained rate of 195 gigaflops and a peak rate of 286 gigaflops. In fact, this same architecture took up the last 38 spots on the Top 500, allowing HP to put up signs around its booth saying that there were more HP machines on the Top 500 list than from any other company. That's one example among many of what I like about these people: they can do amazing things with numbers, and many people can be "No. 1 in Supercomputing" at the same time.
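To make the sustained-versus-peak distinction concrete, here is a minimal sketch in Python (my own illustration, not part of the Top 500 methodology) that turns the figures quoted above into efficiency ratios; the ranking itself uses only the sustained number.

# Hypothetical illustration using the figures quoted in the text.
# Efficiency = sustained (measured) rate divided by peak (theoretical) rate.
systems = {
    "Earth Simulator (No. 1)": (35.0, 40.0),            # teraflops: sustained, peak
    "HP SuperDome/HyperPlex (No. 500)": (0.195, 0.286),  # 195 and 286 gigaflops
}

for name, (sustained, peak) in systems.items():
    print(f"{name}: {sustained} TF sustained of {peak} TF peak "
          f"({sustained / peak:.0%} efficiency)")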

The Earth Simulator has brought much prestige, praise, and mainstream attention to the Japanese government, which spent $5 million to host the Kyoto meeting for the emissions protocols that bear its name, and $350 million to back up that brand identity among scientists. Time magazine called the Earth Simulator "Cool Invention of the Year" in its issue dated Nov. 11, 2002 and answered its own question "why?" by writing, "In the hope that by digitally cloning our home planet, we might just be able to save it."
The Earth Simulator is a big friend of big science because it is that most rare and valuable thing: a Sputnik, as in politicians saying, "our competitors have done X, so we must do 3X, or 10X by spending more money. Find me more reasons!"

And I'm not kidding: I stood only five feet away as U.S. Secretary of Energy Spencer Abraham announced two new contracts with IBM for supercomputers. The secretary said three things in his short speech, the first ever given by an Energy Secretary on the show floor:

1. The machines would be used to model what is known by the alliterative term stockpile stewardship. I got a translation from a friend in the crowd, "The U.S. built 30,000 nukes with an expected life of 12 years. Some of them are over 40 years old, and we have no good idea whether they work since we aren't supposed to do live testing. The simulations are estimates of whether they will work or not."

2. Secretary Abraham proudly proclaimed that the first computer to be delivered would be called Deep Purple (perhaps he, too, had been at the neighboring Hard Rock Café) and would be "three times as fast as the current fastest supercomputer." Of course, everyone knew he was referring to Japan's Earth Simulator, especially the Japanese researchers on the ES who were standing there with me, smiling nervously. The other machine would be called Blue Gene-L and would be - you guessed it - 10 times faster. Like I said: Earth Simulator = Sputnik for Supercomputers.

3. Secretary Abraham said, since he was spending hundreds of millions of dollars, he wished he had thought to call it Blue Gene-A (for his first initial), a name that Donofrio, IBM's Senior VP for Technology and Manufacturing, used thereafter. Donofrio then presented Secretary Abraham with a plaque that included actual processing units but asked that the gift be kept safe because IBM might need the processors back.

I mention the above to offer the reader a chance to understand a few things about supercomputing.

Supercomputing is where the real money for science resides. Few areas of pure or applied science relate as directly to the world as maintaining a single huge security assumption: that America possesses a credible, functioning nuclear deterrent. The days of the amateur scientist puttering around his or her home like Isaac Newton are over. The low-hanging fruit is all picked. Solving really complex problems requires big budgets that can be spent on something that is both clearly useful and roughly comparable between scientists using quantitative rather than qualitative measures.

In sum, Supercomputing Past was (and is) about whose supercomputer is bigger, better, faster, and its applications were typically about either politics in disguise (the Earth Simulator) or just the opposite (stockpile stewardship). Supercomputing Past is about the competitive advantage of nations in a dangerous world - a world that some suspect, via global warming, is becoming more hellish - and about seeking to be prepared and armored with Big Science, reminiscent of your grandparents' supercomputers, memorialized in movies like The Forbin Project with their monochrome screens and synthesized voices.

When I say Supercomputing Past, I do not mean to imply that there isn't tremendous progress and improvement in the field, as a number of Masterworks lectures that featured the very top supercomputing designers and users proved. For example, Steve Scott, the chief researcher for Cray on their new X1, gave an impressive talk to a packed room. The buzz was that this would be the first architecture to exceed a petaflop.

However, there is some degree of hubris and a healthy dose of Not Invented Here attitude among the Supercomputing Past people, especially when they look to the future. This attitude was on display when I went to the Birds of a Feather (BOF) session on predicting performance for supercomputing in 2010. The speakers claimed that there were "no new ideas in supercomputing." No one asked questions, but I just couldn't let this go. So I asked, "Are you saying that there will be no new ideas between now and 2010?" The speaker affirmed that this is what he thought. I asked further, "Is anyone planning to design a future supercomputer so that this can help to simulate the performance of future supercomputers?" He said that this was an area that needed new ideas, and two other people asked follow-up questions.

Supercomputing, as a field, is still open for people to make breakthroughs, though, as we are about to see, the big changes are not in computer architecture, but in the user community as well as the manner - and mobility - with which computing resources are applied.

Supercomputing Present

While the big computers dominated the biggest booths, and competition and ranking were the substance of the biggest announcements, Supercomputing Present was and is about cooperation, collaboration, and Grid computing, which holds forth the bright promise of supercomputing as a public utility, available to many organizations. If IBM's Donofrio and Energy Secretary Abraham are the archetypal figures of Supercomputing Past, Dr. Ian Foster and Dr. Roscoe Giles are those of Supercomputing Present. Foster is the co-editor (with Carl Kesselman) of The Grid: Blueprint for a New Computing Infrastructure and is to the Globus Grid computing software boomlet what Linus Torvalds was to the boom in Linux computing.

I took in Ian Foster's presentation on the Grid at the Sun Microsystems booth. He remembered me because I had written the first review of his book, The Grid, posted on Amazon, and he later invited me to his conference on Grid computing. I'm not an authority on supercomputing, and Ian's invitation is an example of the inclusive culture of Supercomputing Present vs. the "show-me-your-badge" secure military facility culture of Supercomputing Past.

The nicest researcher in a field filled with very nice researchers is Prof. Roscoe Giles, the conference chair, who, when I interviewed him, was wearing a lobster cap and pushing a cart stuffed with every alcohol imaginable (the kind you drink, not experiment with) to spread good will among vendors. The collegiality of years in supercomputing is apparent in Giles, who thinks deeply about the social implications of his field, and in his keynote, which emphasized the progress of the team, with no rankings for individual machines. Instead, Giles emphasized communications over computing, remarking that 175 miles of fiber, laid during the proverbial trade show miracle at the rate of 5 miles an hour, connected hundreds of high-end computers on the trade show floor to one another and, in effect, to thousands of other computers.

Collaboration is not just among researchers, government agencies, and inventors. The BOF sessions were much more formal at SC2002 than at VRML, SIGGRAPH, ISWC, and other academic conferences, as if people made a profession of even informal collaboration. The most useful BOF was "Federal funding for HPC," which included representatives from DOE, DOD, NASA, NSA, NIST, and NSF. Wow! No wonder there are so many people here! There are huge amounts of money out there, much of it unallocated. The NSF representative, for instance, said they wanted even more proposals.

Communication in Supercomputing Present also happens through great graphics. The face of Supercomputing Present is the big screen, running visualizations good enough to make the Discovery Channel's top producers turn to universities in Illinois rather than to George Lucas or Disney Studios.

Think I'm exaggerating? The colliding black holes shown in the keynote presentations by Dr. Rita Colwell (and, later, by Ray Orbach, Director of the Office of Science at the Department of Energy) were so exquisitely rendered that they made the black hole images in the recently released Treasure Planet (Disney) look like a kindergartner's cartoon doodle of Spider-Man next to the Spider-Man of Sam Raimi's box-office-topping movie. The simulations were so good that you would be excused if you cried at this exemplar of progress. Art and science blend until they are simply indistinguishable.

Among the more interesting demonstrations were those presented by the San Diego Supercomputer Center. I talked with Abel Lin about the Telescience workbench, which he's developing with Prof. Mark Ellisman, to make an increasing variety and quantity of scientific/analytic tools (such as microscopes, centrifuges, etc.) available in a distributed computing fashion. They've even Internet-enabled a microscope in Japan so that researchers in the U.S. can use it, and, in the process, have become pioneers in the use of Internet Protocol version 6, thus advancing communications further. (See my article on iGrid for more on the growth of distributed computing and broadband.)

Supercomputing Future

Albert Einstein said, "The unleashed power of the atom has changed everything save our modes of thinking, and thus we drift toward unparalleled catastrophe." After attending this conference centered on supercomputing - itself a technology that has been driven in large part by the need to understand the power of the atom in both leashed and unleashed form - I no longer agree with Einstein's claim. If he'd been with me, I think he would have retracted his statement. In any case, the experience was too good not to want to share it.

Supercomputing has changed everything about our mode of thinking, and, with the emergence of Grid computing, we take big, bold steps towards an ever smarter world. Supercomputing gives humanity superpowers: to hypothesize, test, model, simulate, verify, collaborate, visualize, and, in general, develop collective computational intelligence.

My archetypes for Supercomputing Future are Dan Reed and Donna Cox. Dr. Dan Reed, Director of the National Center for Supercomputing Applications (NCSA), is the right man for the job. Reed is far from satisfied with having a hundred times more computing power than the average academic research lab. He wants/needs billions of times more processing power for his colleagues. Reed makes a very compelling case that this Big Bang boost is necessary to more fully and accurately model the complexity of, for example, organic chemistry involved in cells, cells in organs, organs in organisms, and organisms in populations and changing environments. As a scholar of both exponential thinking and science fiction, I'm not easily awed by big thinking, but I was by an hour with Dan Reed.

Donna Cox, mentioned above as visualizing the physical world of black holes, was hired as the first artist at NCSA and, if you've seen stars in motion on the Discovery Channel, they were probably brought into being at least in part by Cox. She is now moving toward visualizing the developing wireless cyberspace itself. I was a trial user of the Intellibadge system, an NCSA project run by Cox, whose badges were given to 900 of the Technical Program attendees. These badges (more like portable devices) consisted of radio-frequency units a little bigger and heavier than a matchbox stuck into a plastic badge holder. Attendees' movements and interests - on the floor and in the conference sessions - were tracked and brought vividly to their attention via high-resolution graphics of anthills and of a flower whose petals changed based on the numbers, shown on seven-foot-square displays around the conference.

Crowds of people would stare at these boards, like John Nash in the movie A Beautiful Mind during the time he thought he was breaking codes for military intelligence, looking at columns of numbers and words and trying to find patterns embedded within. This image of geniuses staring at super-processed information about themselves symbolized for me the future of supercomputing as it moves from a focus on fluids and ballistics to a focus on crowds and power.

Supercomputing Future seems to be to Supercomputing Past what 4G is to analog wireless (1G). 4G is the fourth generation of wireless communication, in which anyone will be able to get any data anywhere. Just as the transition from Supercomputing Past to Supercomputing Present increased the number of "supercomputer equivalents" from hundreds to thousands and expanded supercomputing from national laboratories to universities and companies, Supercomputing Future will increase the supercomputer equivalents from thousands to millions and expand supercomputing to include individuals, whose choices, votes, and even movements will be the input for brilliant measurements, models, and movies that will surprise and delight us for decades to come.

If Dan Reed's vision for funding and technology is fulfilled, I think there is every possibility of a massive increase in intelligence, especially if we define intelligence as the ability for humans and machines to collectively solve problems. As we leave SC2002, it's worth reflecting on a few choice words of Albert Einstein that perfectly summarize what I felt after a week at the conference:

"All religions, arts and sciences are branches of the same tree. All these aspirations are directed toward ennobling man's life, lifting it from the sphere of mere physical existence and leading the individual towards freedom."

Knowledge is power, and supercomputing brings super-knowledge, and perhaps superpower, to solve super-problems.

 

----------------------------------------------------------------------
1. http://www.sc-conference.org/SC2002/ for multiple links, including to 2003 in Phoenix, AZ.
2. This phrase is inspired partly by Winston Churchill who said anecdotes were "the gleaming toys of history." The same could be said of supercomputers.
3. With human-sized tunnels through the base of the 20-story thick, brick tubes that soared upward, inducing vertigo. The industrial age never felt so dead and gone as amidst those brightly papered tables.
4. Starboard is the right side of a ship looking forward.
5. The black hole collision was computed by Ed Seidel's group and rendered by NCSA's visualization group led by Donna Cox using more than a terabyte of computed data.
6. See the book Computational Collective Intelligence