#complexity scientists' mouths are already watering for #exascale computing. This is a fabulous demonstration of what they can already do at petascale levels.
One reason computational uncertainty quantification is a relatively new science is that, until recently, the necessary computer resources simply didn't exist.
"Some of our latest calculations run on 163,000 processors simultaneously," Moin said. "I think they're some of the largest calculations ever undertaken."
Thanks to its close relationship with the Department of Energy, however, the Stanford PSAAP team enjoys access to the massive computer facilities at the Lawrence Livermore, Los Alamos and Sandia national laboratories, where their largest and most complex simulations can be run.
But it takes specialized knowledge to get computers of this scale to perform effectively.
"And that's not something scientists and engineers should be worrying about," said Alonso, which is why the collaboration between departments is critical.
"Mechanical engineers and those of us in aeronautics and astronautics understand the flow and combustion physics of scramjet engines and the predictive tools. We need the computer scientists to help us figure out how to run these tests on these large computers," he said.
That need will only increase over the next decade as supercomputers move toward the exascale – computers with a million or more processors able to execute a quintillion calculations in a single second.
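The scale implied by those figures can be sanity-checked with a line of arithmetic. The numbers below come straight from the article (a quintillion operations per second across a million processors); the per-processor figure is just the quotient, an illustration rather than a benchmark of any real machine.

```python
# Illustrative arithmetic only, using the article's round numbers.
EXA_OPS_PER_SEC = 10**18   # one quintillion calculations per second (exascale)
PROCESSORS = 10**6         # a million or more processors

# Average throughput each processor must sustain for the machine
# as a whole to reach exascale.
ops_per_processor = EXA_OPS_PER_SEC // PROCESSORS
print(ops_per_processor)   # 1000000000000 -> about a trillion operations per second each
```

Even split evenly, each of the million processors would need to sustain on the order of a trillion operations per second, which is why coordinating them efficiently is a computer-science problem in its own right.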
Modeling the Complexities of Hypersonic Flight