Categories
Learning & Education, Researcher news, Technology

The power of ManeFrame: SMU’s new supercomputer boosts research capacity


The enormous capacity of SMU’s new supercomputer ranks it among the largest academic supercomputers in the nation.

ManeFrame, previously known as MANA, was relocated to Dallas from its former location in Maui, Hawaii. (Courtesy of mauinow.com)

SMU now has a powerful new tool for research – one of the fastest academic supercomputers in the nation – and a new facility to house it.

With a cluster of more than 1,000 Dell servers, the system’s capacity is on par with high-performance computing (HPC) power at much larger universities and at government-owned laboratories. The U.S. Department of Defense awarded the system to SMU in August 2013.

SMU’s Office of Information Technology added the system to the University’s existing – but much smaller – supercomputer. The system is housed in a new facility built at the corner of Mockingbird and Central Expressway. In a contest sponsored by Provost and Vice President for Academic Affairs Paul W. Ludden, faculty and students chose the name “ManeFrame” to honor the Mustang mascot.

The enormous capacity and speed of HPC expands scientific access to new knowledge around key questions about the universe, disease, human behavior, health, food, water, environment, climate, democracy, poverty, war and peace.

“World-changing discoveries rely on vast computing resources,” says President R. Gerald Turner. “ManeFrame quintuples the University’s supercomputing capacity. Our scientists and students will keep pace with the increasing demand for the ever-expanding computing power that is required to participate in global scientific collaborations. This accelerates our research capabilities exponentially.”

ManeFrame potential
With nearly 11,000 central processing unit cores, ManeFrame boasts 40 terabytes (one terabyte equals a trillion bytes) of memory and more than 1.5 petabytes of storage (a petabyte equals a quadrillion bytes), says Joe Gargiulo, SMU’s chief information officer, who led the installation team.

Supercomputers are used primarily in the sciences and engineering, but their use is expanding into the humanities and the arts. So far, SMU’s heaviest users are researchers in physics, math, biology, chemistry and economics.

“This technologically advanced machine will have an impact on shaping our world,” says Thomas M. Hagstrom, chair of the Department of Mathematics in Dedman College and director of SMU’s Center for Scientific Computing. “This makes research that solves problems on a large scale much more accessible. ManeFrame’s theoretical peak would be on the order of 120 Teraflops, which is 120 trillion mathematical operations a second.”
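
For a rough sense of where that 120-teraflop figure comes from, a back-of-the-envelope estimate is sketched below. The core count is the one reported in this article; the clock speed and operations per cycle are illustrative assumptions, not published specifications of ManeFrame.

```latex
% Illustrative peak-performance estimate (assumed clock speed and ops/cycle)
R_{\text{peak}} \approx N_{\text{cores}} \times f_{\text{clock}} \times \text{ops/cycle}
               \approx 11{,}000 \times \left(2.8 \times 10^{9}\ \mathrm{s}^{-1}\right) \times 4
               \approx 1.2 \times 10^{14}\ \text{operations per second}
               \approx 120\ \text{teraflops}
```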

Supercomputers can use sophisticated software and step-by-step procedures for calculations, called algorithms, to solve complex problems that can’t be managed in a researcher’s lab, Hagstrom explains.

“We can’t put the Earth’s climate system in a physical lab or study the evolution of the universe there,” he says. “You can only study these and other systems in a comprehensive way using high-performance computing.”

Making SMU competitive
Supercomputing gave University physicists a role in the Higgs boson research at the Large Hadron Collider in Geneva, Switzerland. SMU’s team, led by Physics Professor Ryszard Stroynowski, joined a collaboration of thousands of scientists around the world. SMU’s physicists tapped the existing HPC on campus to quickly analyze massive amounts of data and deliver results to their international colleagues.

SMU’s team will use ManeFrame to keep pace with an even larger flood of data expected from the Large Hadron Collider.

“ManeFrame makes SMU – which is small by comparison with many of its peer institutions at CERN – nimble and competitive, and that lets us be visible in a big experiment like CERN,” says Stephen Sekula, assistant professor of physics. “So we have to have ideas, motivation and creativity – but having a technical resource like ManeFrame lets us act on those things.”

SMU physicist Pavel Nadolsky has conducted “big data” analyses of subatomic particles on the supercomputer as part of an international physics collaboration. In this context, big data refers to probability distributions that depend on many variables. As users ranging from retailers to the health industry collect multitudes of transactional data every day, the need for big-data analysis is growing rapidly.

“To keep up in our field, we need resources like ManeFrame,” says Nadolsky, associate professor of physics.

“The world is moving into big-data analysis, whether it’s Google, Facebook or the National Security Agency,” Nadolsky says. “We learn a lot about the world by studying multidimensional distributions: It tells us about the origins of the universe; it can win elections by using data mining to analyze voting probabilities over time in specific geographical areas and targeting campaign efforts accordingly; and it can predict what people are doing. To make students competitive, they must be trained to use these tools efficiently and ethically.”
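
To make the idea of a multidimensional distribution concrete, the sketch below builds a three-variable distribution from simulated events and marginalizes it down to two variables. The data and variable names are invented for illustration only; they are not taken from the SMU analyses.

```python
# Illustrative sketch: estimating and marginalizing a multidimensional
# probability distribution. All data here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(seed=0)

# Fake "events", each described by three correlated variables.
n_events = 100_000
x = rng.normal(0.0, 1.0, n_events)
y = 0.6 * x + rng.normal(0.0, 0.8, n_events)
z = rng.uniform(-1.0, 1.0, n_events)
events = np.column_stack([x, y, z])

# Estimate the joint distribution as a normalized 3-D histogram.
hist, edges = np.histogramdd(events, bins=(40, 40, 20), density=True)

# Marginalize over z: integrate along the z axis using the bin widths.
dz = np.diff(edges[2])
p_xy = np.tensordot(hist, dz, axes=([2], [0]))

print("joint shape:", hist.shape, "-> marginal shape:", p_xy.shape)
```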

ManeFrame will have a high-profile role in the U.S. Department of Energy experiment called NOvA, which studies neutrinos, little-understood and elusive fundamental particles that may help explain why matter, and not just light, exists in the universe today. SMU will contribute four million processing hours each year to the experiment, says Thomas E. Coan, associate professor of physics and a member of the international team.

“We’re in good company with others providing computing, including California Institute of Technology and Harvard,” Coan says. “It’s one way for SMU to play a prominent role in the experiment. We get a lot of visibility among all the institutions participating in NOvA, which are spread out across five countries.”

Advancing discovery
One of the heaviest users of SMU’s HPC is John Wise, associate professor of biological sciences, who models a key human protein to improve chemotherapy’s ability to kill cancer cells. Wise works with the SMU Center for Drug Discovery, Design and Delivery in Dedman College, an interdisciplinary research initiative of the Biology and Chemistry departments led by Professor of Biological Sciences Pia Vogel.

Within the Mathematics Department, Assistant Professor Daniel R. Reynolds and his team use high-performance computing to run simulations with applications in cosmology and fusion reactors.

Looking to the future, high-performance computing will play an increasing role in research, business and the arts, according to James Quick, associate vice president for research and dean of graduate studies.

“High-performance computing has emerged as a revolutionary tool that dramatically increases the rates of scientific discovery and product development, enables wise investment decisions and opens new dimensions in artistic creativity,” says Quick, professor of earth sciences. “SMU will use the computational power of ManeFrame to expand research and creativity and develop educational opportunities for students interested in the application of high-performance computing in their fields – be it science, engineering, business or the arts.” – Margaret Allen

Follow SMUResearch.com on Twitter at @smuresearch.

SMU is a nationally ranked private university in Dallas founded 100 years ago. Today, SMU enrolls nearly 11,000 students who benefit from the academic opportunities and international reach of seven degree-granting schools. For more information see www.smu.edu.

SMU has an uplink facility located on campus for live TV, radio, or online interviews. To speak with an SMU expert or book an SMU guest in the studio, call SMU News & Communications at 214-768-7650.

Categories
Energy & Matter

New mathematical model aids simulations of early universe


Scientists have made many discoveries about the origins of our 13.7 billion-year-old universe. But many scientific mysteries remain. What exactly happened during the Big Bang, when rapidly evolving physical processes set the stage for gases to form stars, planets and galaxies?

Now astrophysicists using supercomputers to simulate the Big Bang have a new mathematical tool to unravel those mysteries, says Daniel R. Reynolds, assistant professor of mathematics at SMU.

Reynolds collaborated with astrophysicists at the University of California at San Diego as part of a National Science Foundation project to simulate cosmic reionization, the time from 380,000 years to 400 million years after the universe was born.

Together the scientists built a computer model of events during the “Dark Ages” when the first stars emitted radiation that altered the surrounding matter, enabling light to pass through. The team tested its model on two of the largest existing NSF supercomputers, “Ranger” at the University of Texas at Austin and “Kraken” at the University of Tennessee.

Evolution of our universe. (Source: NASA)

The new mathematical model tightly couples a myriad of physical processes present during cosmic reionization, such as gas motion, radiation transport, chemical kinetics and gravitational acceleration due to star clustering and dark matter dynamics, Reynolds says.

The key characteristic that differentiates the model from competing work is the researchers’ focus on enforcing very tight coupling between the different physical processes.

“By forcing the computational methods to tightly bind these processes together, our new model allows us to generate simulations that are highly accurate, numerically stable and computationally scalable to the largest supercomputers available,” Reynolds says.
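
As a rough illustration of what such a tightly coupled system looks like, the schematic equations below combine gas motion, radiation transport, chemical ionization and gravity in a single system. This is a generic sketch of cosmological radiation-hydrodynamics, not the exact formulation published by Reynolds and his collaborators.

```latex
% Schematic, tightly coupled radiation-hydrodynamics-chemistry system
% (generic illustration; not the exact equations of the published model)
\begin{align}
  \partial_t \rho + \nabla\cdot(\rho\,\mathbf{v}) &= 0
    && \text{mass conservation} \\
  \partial_t(\rho\,\mathbf{v}) + \nabla\cdot(\rho\,\mathbf{v}\otimes\mathbf{v}) + \nabla p
    &= -\rho\,\nabla\phi
    && \text{gas motion under gravity} \\
  \partial_t E_r + \nabla\cdot\mathbf{F}_r &= -c\,\kappa\,E_r + \eta
    && \text{radiation transport} \\
  \partial_t n_{\mathrm{HII}} &= k(T)\,n_e\,n_{\mathrm{HI}}
    + \Gamma_{\gamma}(E_r)\,n_{\mathrm{HI}} - \alpha(T)\,n_e\,n_{\mathrm{HII}}
    && \text{chemical ionization} \\
  \nabla^2 \phi &= 4\pi G\,(\rho + \rho_{\mathrm{dm}})
    && \text{gravity from gas and dark matter}
\end{align}
```

Each equation contains unknowns from the others (density, velocity, radiation energy, ionization fractions, gravitational potential), which is what "tight coupling" means in practice: none of the processes can be solved in isolation.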

They presented their research at a Texas Cosmology Network Meeting at UT in late October. Reynolds’ mathematical research also was published as “Self-Consistent Solution of Cosmological Radiation-Hydrodynamics and Chemical Ionization” in the October issue of the “Journal of Computational Physics.”


Simulation models typically consist of a complex bundle of mathematical equations representing physical processes. The equations are integrated to reflect the interaction of those processes, and only supercomputers can solve them all simultaneously.

Scientific intuition and creativity come into play in developing the base model, choosing the equations and the best parameters, Reynolds says. Variables can be altered to describe different scenarios that might have occurred. The objective is a simulation whose results most closely resemble telescope observations and that predicts a universe resembling the one we observe. If that happens, scientists have identified the set of physical processes that existed at the birth of the universe as it evolved from one instant to the next.
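
A minimal sketch of that workflow is shown below, with an invented toy model standing in for a real simulation: run the model for several parameter settings and keep the one whose output best matches the observations. The model, parameters and data here are placeholders, not part of any SMU code.

```python
# Toy version of the simulate-and-compare workflow described above.
import numpy as np

def toy_model(times, params):
    """Stand-in for a simulation: returns a predicted quantity over time."""
    amplitude, rate = params
    return amplitude * np.exp(-rate * times)

times = np.linspace(0.0, 10.0, 50)
# Synthetic "observations": the true model plus a little noise.
observed = toy_model(times, (1.0, 0.3)) + np.random.default_rng(1).normal(0, 0.02, times.size)

best_params, best_misfit = None, np.inf
for amplitude in np.linspace(0.5, 1.5, 11):
    for rate in np.linspace(0.1, 0.5, 9):
        predicted = toy_model(times, (amplitude, rate))
        misfit = np.sum((predicted - observed) ** 2)   # least-squares comparison
        if misfit < best_misfit:
            best_params, best_misfit = (amplitude, rate), misfit

print("best-fitting parameters:", best_params)
```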

Physical processes include the heating of various gases, gravity, the conservation of mass, the conservation of momentum, the conservation of energy, the expansion of the universe, the transport of radiation, and the chemical ionization of different species such as hydrogen and helium, the primary elements present at the beginning of the universe. An additional equation running in the background describes the dynamics of dark matter — the majority of the matter in the universe — which gives rise to gravity and is credited with helping the universe form stars, planets and galaxies.

“Supercomputers are so big, they hold so much data, you can build models that work with many processes at one time,” Reynolds says. “A lot of these processes behave nonlinearly. When they are put together, they inhibit each other and feed off each other, so you end up with many different processes interacting at once.”

A direct consequence of the tight coupling that the researchers enforce in their model is that the resulting system of equations is much more complex than those that must be solved by other models, Reynolds says.

“This paper describes both how we form the coupled model, as well as the mathematical methods that enable us to solve the systems of equations that result. These include methods that accurately track the different time scales of each process, which often occur at rates that vary by orders of magnitude,” he says. “However, perhaps the most important contribution of this paper is our description of how we pose the complex interaction of different models as a nonlinear problem with potentially billions of equations and unknowns, and solve that problem using new algorithms designed for next-generation supercomputers. We conclude by demonstrating that the new model lives up to the ideal, providing an approach that allows high accuracy, stability and scalability on a suite of difficult test problems.”
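
The sketch below illustrates the underlying idea on a deliberately tiny scale: two mutually dependent equations are advanced together with Newton's method rather than being solved one process at a time. This is a generic textbook technique shown for illustration, not the solver described in the paper.

```python
# Toy coupled nonlinear system solved all-at-once with Newton's method.
import numpy as np

def residual(u):
    """Two artificial, mutually coupled equations F(u) = 0."""
    x, y = u
    return np.array([
        x + 0.5 * np.sin(y) - 1.0,   # "process 1" depends on "process 2"
        y + 0.5 * np.cos(x) - 2.0,   # and vice versa
    ])

def jacobian(u):
    """Analytic Jacobian of the residual above."""
    x, y = u
    return np.array([
        [1.0, 0.5 * np.cos(y)],
        [-0.5 * np.sin(x), 1.0],
    ])

u = np.zeros(2)                      # initial guess
for _ in range(20):                  # Newton iteration
    F = residual(u)
    if np.linalg.norm(F) < 1e-12:
        break
    u = u - np.linalg.solve(jacobian(u), F)

print("solution:", u, "residual norm:", np.linalg.norm(residual(u)))
```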

Only recently have mathematical algorithms been invented to solve basic problems — like the diffusion of heat — using resources as large as those available on modern supercomputers, Reynolds says. Analytical solutions to many problems in mathematical physics have existed for hundreds of years, but they only work when scientists simplify the problem in some way. For example, he says, they may approximate the shape of a planet as a sphere instead of an ellipsoid, assume that ocean water is incompressible, which only works for very shallow water, or assume the Earth is homogeneous instead of being formed of widely differing layers of rock.
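
As a concrete example of the kind of basic problem mentioned above, the sketch below solves the 1-D heat-diffusion equation with a simple explicit finite-difference scheme. The grid size, time step and boundary conditions are illustrative choices.

```python
# Minimal explicit finite-difference solver for the 1-D heat equation u_t = alpha * u_xx.
import numpy as np

nx, nt = 101, 2000
dx = 1.0 / (nx - 1)
alpha = 1.0                       # thermal diffusivity
dt = 0.4 * dx * dx / alpha        # satisfies the explicit stability limit dt <= dx^2 / (2*alpha)

u = np.zeros(nx)
u[nx // 2] = 1.0 / dx             # initial "hot spot" in the middle

for _ in range(nt):
    # Centered second difference in space, forward step in time.
    u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    u[0] = u[-1] = 0.0            # fixed (cold) boundaries

print("total heat remaining:", u.sum() * dx)
```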

“Scientists have been able to approximate a great many physical processes in such idealized situations. But the true frontier nowadays is to let go of these simplifying approximations and treat the problems as they really are, by modeling all of the geometric structure and the inhomogeneity,” Reynolds says. “To do that, you need to solve harder equations with lots of data, which is ideally suited to using supercomputers. The numerical methods that can allow us to use larger and larger computers have only just come out. The problems are getting more challenging and harder to solve, but the numerical methods are reaching greater capability, so you can really start moving them forward. These new computers make everything a new frontier.”

Besides Reynolds, other researchers were John C. Hayes, Lawrence Livermore National Laboratory, Livermore, Calif.; Pascal Paschos, Center for Astrophysics and Space Sciences, University of California at San Diego, La Jolla, Calif.; and Michael L. Norman, Center for Astrophysics and Space Sciences, and physics department, the University of California at San Diego, La Jolla. — Margaret Allen

Related information:
Daniel R. Reynolds
SMU Department of Mathematics
Reynolds: Cosmic Reionization of the Early Universe
Video: “In the Beginning: Modern Cosmology and The Origin of Our Universe”
Video: Michael Norman, “Computational Astrophysics”
SMU Dedman College