
Monday, November 21, 2005

"SUPERCOMPUTERS: Big computers, little apps"

From realistic predictive modeling of natural and man-made disasters to atomic-level explorations of photosynthetic bacteria, supercomputers are enabling next-generation applications in science and technology. Increasingly, the machines' modeling muscle is even being applied to the design of consumer products. Supercomputing 2005 (SC|05), held last week in Seattle, reported on achievements and trends in the field.

Simulations remain the most important supercomputer applications. At SC|05, researchers discussed using supercomputer simulations to forecast the course of disasters, such as modeling wave heights or predicting the drift of a plume from a so-called dirty bomb. But the era of big-application supercomputer simulations is giving way to smaller ones, said Thomas Lange, director of modeling and simulation for Procter & Gamble.

The conference also continued a common practice in the supercomputing world: the smashing of records. Bragging rights to the world's largest full-electron calculation were claimed by a team that said it had successfully simulated every atom and every electron in the photosynthetic reaction center of the Rhodopseudomonas viridis bacterium. NASA, meanwhile, announced it had crafted the first complete simulation of a space shuttle flight from liftoff to re-entry, a feat achieved with its year-old Columbia supercomputer.

While simulating a shuttle flight or a bacterial structure proceeds largely from known, fixed inputs, simulating the airborne drift of a contaminant plume depends on random inputs of incomplete and unreliable data. Simulating a plume from a toxic contaminant therefore requires a dynamic data-driven (DDD) approach to make sense of first responders' reports, which are subject to error, according to a team working under the auspices of Sandia National Laboratories.
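The article does not describe Sandia's method in detail, but the core idea of a dynamic data-driven simulation — steering a running model with noisy field observations — can be illustrated with a minimal sketch. The `fuse` function below is a hypothetical example, not the team's actual code: it blends a model's contaminant prediction with error-prone responder reports using a precision-weighted (Kalman-style) scalar update, so more reliable reports pull the estimate harder.

```python
# Hypothetical sketch of a dynamic data-driven update (not Sandia's code).
# A plume model's predicted contaminant level at one location is fused with
# noisy first-responder reports, each carrying its own error variance.

def fuse(prediction, pred_var, reports):
    """Fuse a model prediction with error-prone observations.

    prediction, pred_var: the model's estimate and its variance
    reports: list of (value, variance) pairs from first responders
    Returns the updated estimate and its reduced variance.
    """
    est, var = prediction, pred_var
    for value, r_var in reports:
        gain = var / (var + r_var)          # trust is proportional to relative precision
        est = est + gain * (value - est)    # pull the estimate toward the report
        var = (1.0 - gain) * var            # uncertainty shrinks with each report
    return est, var

# The model predicts 5.0 units with high uncertainty (variance 4.0);
# two noisy but mutually consistent reports suggest a higher level.
est, var = fuse(5.0, 4.0, [(8.0, 2.0), (7.0, 2.0)])
# The fused estimate moves toward the reports, and its variance drops.
```

In a real DDD system this update would run continuously as reports arrive, re-steering the plume simulation rather than a single scalar, but the weighting logic — discounting observations by their error — is the same in spirit.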
Text: http://www.eetimes.com/showArticle.jhtml?articleID=174400146