The festive event coincided with the kick-off of SMU’s Fall Semester and included Solar Eclipse Cookies served while viewing the rare astronomical phenomenon.
The eclipse peaked in Dallas at 1:09 p.m., with the moon covering more than 75 percent of the sun.
“What a great first day of the semester and terrific event to bring everyone together with the help of Dedman College scientists,” said Dedman Dean Thomas DiPiero. “And the eclipse cookies weren’t bad, either.”
Physics faculty provided indirect methods for observing the eclipse, including a telescope with a viewing cone on the steps of historic Dallas Hall, a projection of the eclipse onto a screen inside Dallas Hall, and a variety of homemade hand-held devices.
Outside on the steps of Dallas Hall, Associate Professor Stephen Sekula manned his home-built viewing tunnel attached to a telescope for people to indirectly view the eclipse.
“I was overwhelmed by the incredible response of the students, faculty and community,” Sekula said. “The people who flocked to Dallas Hall were energized and engaged. It moved me that they were so interested in — and, in some cases, had their perspective on the universe altered by — a partial eclipse of the sun by the moon.”
A team of Physics Department faculty assembled components to use a mirror to project the eclipse from a telescope on the steps of Dallas Hall into the rotunda onto a screen hanging from the second-floor balcony.
Adjunct Professor John Cotton built the mount for the mirror — with a spare, just in case — and Professor and Department Chairman Ryszard Stroynowski and Sekula arranged the tripod setup and tested the equipment.
Stroynowski also projected an illustration of the Earth, the moon and the sun onto the wall of the rotunda to help people visualize movement and location of those cosmic bodies during the solar eclipse.
Professor Fred Olness handed out cardboard projectors and showed people how to use them to indirectly view the eclipse.
“The turn-out was fantastic,” Olness said. “Many families with children participated, and we distributed cardboard with pinholes so they could project the eclipse onto the sidewalk. It was rewarding that they were enthused by the science.”
Stroynowski, Sekula and others at the viewing event were interviewed by CBS 11 TV journalist Robert Flagg.
Physics Professor Thomas Coan and Guillermo Vasquez, SMU Linux and research computing support specialist, put together a sequence of photos they took during the day from Fondren Science Building.
“The experience of bringing faculty, students and even some off-campus community members together by sharing goggles, cameras, and now pictures of one of the great natural events was extremely gratifying,” Vasquez said.
Sekula said the enthusiastic response from the public is driving plans to prepare for the next event of this kind.
“I’m really excited to share a total eclipse of the sun with SMU and Dallas on April 8, 2024,” he said.
CERN’s Large Hadron Collider (LHC) and its experiments are back in action, now taking physics data for 2016 to get an improved understanding of fundamental physics.
Following its annual winter break, the most powerful collider in the world has been switched back on.
Geneva-based CERN’s Large Hadron Collider (LHC) — an accelerator complex and its experiments — has been fine-tuned using low-intensity beams and pilot proton collisions, and now the LHC and the experiments are ready to take an abundance of data.
The goal is to improve our understanding of fundamental physics, which, in decades to come, can ultimately drive innovation and invention by researchers in other fields.
Scientists from SMU’s Department of Physics are among the several thousand physicists worldwide who contribute to the LHC research.
“All of us here hope that some of the early hints will be confirmed and an unexpected physics phenomenon will show up,” said Ryszard Stroynowski, SMU professor and a principal investigator on the LHC. “If something new does appear, we will try to contribute to the understanding of what it may be.”
SMU physicists work on the LHC’s ATLAS experiment. Run 1 of the Large Hadron Collider made headlines in 2012 when scientists observed in the data a new fundamental particle, the Higgs boson. The collider was then paused for an extensive upgrade and came back much more powerful than before. As part of Run 2, physicists on the Large Hadron Collider’s experiments are analyzing new proton collision data to unravel the structure of the Higgs.
The Higgs was the last piece of the puzzle for the Standard Model — a theory that offers the best description of the known fundamental particles and the forces that govern them. In 2016 the ATLAS and CMS collaborations of the LHC will study this boson in depth.
Over the next three to four months, physicists need to verify the measurements of Higgs properties made in 2015 at lower energies and with less data, Stroynowski said.
“We also must check all hints of possible deviations from the Standard Model seen in the earlier data — whether they were real effects or just statistical fluctuations,” he said. “In the long term, over the next one to two years, we’ll pursue studies of the Higgs decays to heavy b quarks leading to the understanding of how one Higgs particle interacts with other Higgs particles.”
In addition, the connection between the Higgs boson and the bottom quark is an important relationship that is well-described in the Standard Model but poorly understood by experiments, said Stephen Sekula, SMU associate professor. The SMU ATLAS group will continue work started last year to study the connection, Sekula said.
“We will be focused on measuring this relationship in both Standard Model and Beyond-the-Standard Model contexts,” he said.
SMU physicists also study Higgs-boson interactions with the most massive known particle, the top quark, said Robert Kehoe, SMU associate professor.
“This interaction is also not well-understood,” Kehoe said. “Our group continues to focus on the first direct measurement of the strength of this interaction, which may reveal whether the Higgs mechanism of the Standard Model is truly fundamental.”
All those measurements are key goals of the ATLAS physics program in Run 2 and beyond, Sekula said. In addition, none of the ultimate physics goals can be achieved without faultless operation of the complex ATLAS detector, its software and its data acquisition system.
“The SMU group maintains work on operations, improvements and maintenance of two components of ATLAS — the Liquid Argon Calorimeter and data acquisition trigger,” Stroynowski said.
Intensity of the beam to increase, supplying six times more proton collisions
Following a short commissioning period, the LHC operators will now increase the intensity of the beams so that the machine produces a larger number of collisions.
“The LHC is running extremely well,” said CERN Director for Accelerators and Technology, Frédérick Bordry. “We now have an ambitious goal for 2016, as we plan to deliver around six times more data than in 2015.”
The LHC’s collisions produce subatomic fireballs of energy, which morph into the fundamental building blocks of matter. The four particle detectors located on the LHC’s ring allow scientists to record and study the properties of these building blocks and look for new fundamental particles and forces.
This is the second year the LHC will run at a collision energy of 13 TeV. During the first phase of Run 2 in 2015, operators mastered steering the accelerator at this new higher energy by gradually increasing the intensity of the beams.
“The restart of the LHC always brings with it great emotion,” said Fabiola Gianotti, CERN Director General. “With the 2016 data the experiments will be able to perform improved measurements of the Higgs boson and other known particles and phenomena, and look for new physics with an increased discovery potential.”
New exploration can begin at higher energy, with much more data
Beams are made of “trains” of bunches, each containing around 100 billion protons, moving at almost the speed of light around the 27-kilometre ring of the LHC. These bunch trains circulate in opposite directions and cross each other at the center of experiments. Last year, operators increased the number of proton bunches up to 2,244 per beam, spaced at intervals of 25 nanoseconds. These enabled the ATLAS and CMS collaborations to study data from about 400 million million proton–proton collisions. In 2016 operators will increase the number of particles circulating in the machine and the squeezing of the beams in the collision regions. The LHC will generate up to 1 billion collisions per second in the experiments.
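These figures hang together in a quick back-of-the-envelope check (a sketch in Python; the average number of proton-proton interactions per bunch crossing is an assumed typical value, not a figure from the article):

```python
# Rough consistency check of the LHC collision rates quoted above.
BUNCH_SPACING_S = 25e-9                      # 25 nanoseconds between bunches
MAX_CROSSING_RATE_HZ = 1 / BUNCH_SPACING_S   # ~40 million crossings per second
PILEUP = 25                                  # assumed interactions per crossing

collisions_per_second = MAX_CROSSING_RATE_HZ * PILEUP
print(f"Bunch crossings per second: {MAX_CROSSING_RATE_HZ:.0e}")             # ~4e7
print(f"Proton-proton collisions per second: {collisions_per_second:.0e}")   # ~1e9
```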
“In 2015 we opened the doors to a completely new landscape with unprecedented energy. Now we can begin to explore this landscape in depth,” said CERN Director for Research and Computing Eckhard Elsen.
Between 2010 and 2013 the LHC produced proton-proton collisions with up to 8 tera-electronvolts (TeV) of energy. In the spring of 2015, after a two-year shutdown, LHC operators ramped up the collision energy to 13 TeV. This increase in energy enables scientists to explore a new realm of physics that was previously inaccessible. Run 2 collisions also produce Higgs bosons — the groundbreaking particle discovered in Run 1 — 25 percent faster than Run 1 collisions and increase the chances of finding new massive particles by more than 40 percent.
But several questions remain unanswered by the Standard Model, such as why nature prefers matter to antimatter, and what dark matter, which may make up a quarter of our universe, consists of.
The huge amounts of data from the 2016 LHC run will enable physicists to tackle these and many other questions, to probe the Standard Model further and possibly to find clues about the physics that lies beyond it.
The physics run with protons will last six months. The machine will then be set up for a four-week run colliding protons with lead ions.
“We’re proud to support more than a thousand U.S. scientists and engineers who play integral parts in operating the detectors, analyzing the data, and developing tools and technologies to upgrade the LHC’s performance in this international endeavor,” said Jim Siegrist, Associate Director of Science for High Energy Physics in the U.S. Department of Energy’s Office of Science. “The LHC is the only place in the world where this kind of research can be performed, and we are a fully committed partner on the LHC experiments and the future development of the collider itself.”
The four largest LHC experimental collaborations, ALICE, ATLAS, CMS and LHCb, now start to collect and analyze the 2016 data. Their broad physics program will be complemented by the measurements of three smaller experiments — TOTEM, LHCf and MoEDAL — which focus with enhanced sensitivity on specific features of proton collisions. — SMU, CERN and Fermilab
New launch of the world’s most powerful particle accelerator is the most stringent test yet of our accepted theories of how subatomic particles work and interact.
Start-up of the world’s largest science experiment is underway — with protons traveling in opposite directions at almost the speed of light in the Large Hadron Collider, deep underground near Geneva.
As protons collide, physicists will peer into the resulting particle showers for new discoveries about the universe, said Ryszard Stroynowski, a collaborator on one of the collider’s key experiments and a professor in the Department of Physics at Southern Methodist University, Dallas.
“The hoopla and enthusiastic articles generated by discovery of the Higgs boson two years ago left an impression among many people that we have succeeded, we are done, we understand everything,” said Stroynowski, who is the senior member of SMU’s Large Hadron Collider team. “The reality is far from this. The only thing that we have found is that the Higgs exists, and therefore the Higgs mechanism of generating the mass of fundamental particles is possible.”
There is much more to be learned during Run 2 of the world’s most powerful particle accelerator.
“In a way we kicked a can down the road because we still do not have sufficient precision to know where to look for the really, really new physics that is suggested by astronomical observations,” he said. “The observed facts that are not explained by current theory are many.”
Operators in the LHC’s control room near Geneva restarted the Large Hadron Collider on April 5. A project of CERN, the European Organization for Nuclear Research, the 17-mile LHC tunnel — big enough to ride a bicycle through — straddles the border between France and Switzerland.
Two years ago it made headlines worldwide when its global collaboration of thousands of scientists discovered the Higgs boson fundamental particle.
The Large Hadron Collider’s first run began in 2009. In early 2013 it was paused for an extensive upgrade.
The upgraded and supercharged LHC restarts at almost twice the energy and at higher intensity than it operated at previously, so it will deliver much more data.
“I think that in the LHC Run 2 we will sieve through more data than in all particle physics experiments in the world together for the past 50 years,” Stroynowski said. “Nature would be really strange if we do not find something new.”
SMU is active on the LHC’s ATLAS detector experiment
Within the big LHC tunnel, gigantic particle detectors at four interaction points along the ring record the proton collisions that are generated when the beams collide.
In routine operation, protons make 11,245 laps of the LHC per second — producing up to 1 billion collisions per second. With that many collisions, each detector captures collision events 40 million times each second.
That’s a lot of collision data, says SMU physicist Robert Kehoe, a member of the ATLAS particle detector experiment with Stroynowski and other SMU physicists.
Evaluating that much data isn’t humanly possible, so a computerized ATLAS hardware “trigger system” grabs the data, makes a fast evaluation, decides if it might hold something of interest to physicists, then quickly discards or saves it.
“That gets rid of 99.999 percent of the data,” Kehoe said. “This trigger hardware system makes measurements — but they are very crude, fast and primitive.”
To further pare down the data, a custom-designed software program culls even more, reducing the 40 million events captured each second to about 200.
Two groups from SMU, one led by Kehoe, helped develop software to monitor the performance of the trigger systems’ thousands of computer processors.
“The software program has to be accurate in deciding which 200 to keep,” Kehoe said. “We must be very careful that it’s the right 200 — the 200 that might tell us more about the Higgs boson, for example. If it’s not the right 200, then we can’t achieve our scientific goals.”
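The flow Kehoe describes, a crude fast hardware cut followed by a more careful software selection, can be illustrated with a toy two-stage filter (a minimal Python sketch; the event fields, thresholds and rates here are invented for illustration and are not ATLAS values):

```python
import random

def hardware_trigger(event):
    """Fast, crude cut: keep only events with a large energy deposit."""
    return event["energy"] > 90.0            # discards the vast majority

def software_trigger(event):
    """Slower, more precise selection applied to the survivors."""
    return event["energy"] > 99.5 and event["n_tracks"] >= 2

# Stand-in for one second of detector data (the real rate is ~40 million events).
events = ({"energy": random.uniform(0.0, 100.0),
           "n_tracks": random.randint(0, 10)}
          for _ in range(1_000_000))

kept = [e for e in events if hardware_trigger(e) and software_trigger(e)]
print(f"Kept {len(kept)} of 1,000,000 events")   # only a small fraction survives
```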
The ATLAS computers are part of CERN’s computing center, which stores more than 30 petabytes of data from the LHC experiments every year, the equivalent of 1.2 million Blu-ray discs.
Flood of data from ATLAS transmitted via tiny electronics designed at SMU to withstand harsh conditions
An SMU physics team also collaborates on the design, construction and delivery of the ATLAS “readout” system — an electronic system within the ATLAS trigger system that sends collision data from ATLAS to its data processing farm.
Data from the ATLAS particle detector’s Liquid Argon Calorimeter is transmitted via 1,524 small fiber-optic transmitters. A powerful and reliable workhorse, the link is one of thousands of critical components on the LHC that contributed to discovery and precision measurement of the Higgs boson.
The custom-made high-speed data transmitters were designed to withstand extremely harsh conditions — low temperature and high radiation.
“It’s not always a smooth ride operating electronics in such a harsh environment,” said Jingbo Ye, the physics professor who leads the SMU data-link team. “Failure of any transmitter results in the loss of a chunk of valuable data. We’re working to improve the design for future detectors because by 2017 and 2018, the existing optical data-link design won’t be able to carry all the data.”
Each electrical-to-optical and optical-to-electrical signal converter transmits 1.6 gigabytes of data per second. Lighter and smaller than their widely used commercial counterparts, the tiny, wickedly fast transmitters have been transmitting from the Liquid Argon Calorimeter for about 10 years.
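Taking the article's figures at face value, the aggregate throughput of the calorimeter's link array is easy to estimate (a quick sketch in Python):

```python
# Aggregate throughput of the Liquid Argon Calorimeter links,
# using the per-link rate and transmitter count quoted above.
N_TRANSMITTERS = 1_524
RATE_PER_LINK_GB_S = 1.6

total_gb_s = N_TRANSMITTERS * RATE_PER_LINK_GB_S
print(f"Total: {total_gb_s:,.1f} GB/s (~{total_gb_s / 1_000:.1f} TB/s)")  # ~2.4 TB/s
```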
Upgraded optical data link is now in the works to accommodate beefed-up data flow
A more powerful data link — much smaller and faster than the current one — is in research and development now. Slated for installation in 2017, it has the capacity to deliver 5.2 gigabytes of data per second.
The new link’s design has been even more challenging than the first, Ye said. It has a smaller footprint than the first but handles more data, while working within the existing power supply and heat exchanger in the ATLAS detector.
The link will have the highest data density in the world of any data link based on the transmitter optical subassembly (TOSA), a standard industry package, Ye said.
Fine-tuning the new, upgraded machine will take several weeks
The world’s most powerful machine for smashing protons together will require some “tuning” before physicists from around the world are ready to take data, said Stephen Sekula, a researcher on ATLAS and assistant professor of physics at SMU.
The trick is to get reliable, stable beams that can remain in a collision state for 8 to 24 hours at a time, so that the particle physicists working on the experiments, who prize stability, will be satisfied with the quality of the beam conditions delivered to them, Sekula said.
“The LHC isn’t a toaster,” he said. “We’re not stamping thousands of them out of a factory every day. There’s only one of them on the planet, and when you upgrade it, it’s a new piece of equipment with new idiosyncrasies, so there’s no guarantee it will behave as it did before.”
Machine physicists at CERN must learn the nuances of the upgraded machine, he said. The beam must be stable, so physicists on shifts in the control room can take high-quality data under stable operating conditions.
The process will take weeks, Sekula said.
10 times as many Higgs particles means a flood of data to sift for gems
LHC Run 2 will collide particles at a staggering 13 teraelectronvolts (TeV), which is 60 percent higher than any accelerator has achieved before.
“On paper, Run 2 will give us four times more data than we took in Run 1,” Sekula said. “But each of those multiples of data is actually worth more, because not only are we going to take more collisions, we’re going to do it at a higher energy. When you do more collisions and you do them at a higher energy, the rate at which you make Higgs bosons goes way up. We’re going to get 10 times more Higgs than we did in Run 1 — at least.”
SMU’s ManeFrame supercomputer plays a key role in helping physicists from the Large Hadron Collider experiments. One of the fastest academic supercomputers in the nation, it allows physicists at SMU and around the world to sift through the flood of data, quickly analyze massive amounts of information, and deliver results to the collaborating scientists.
During Run 1, the LHC delivered about 8,500 Higgs particles a week to the scientists, but also delivered a huge number of other kinds of particles that have to be sifted away to find the Higgs particles. Run 2 will make 10 times that, Sekula said. “So they’ll rain from the sky. And with more Higgs, we’ll have an easier time sifting the gems out of the gravel.”
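Sekula's factor of 10 can be sanity-checked with rough numbers (a sketch; the cross sections below are approximate published values for the dominant gluon-fusion Higgs production process, not figures quoted in the article):

```python
# Rough check of the "10 times more Higgs" estimate.
SIGMA_8TEV_PB = 19.0    # approx. gluon-fusion Higgs cross section at 8 TeV, picobarns
SIGMA_13TEV_PB = 44.0   # approx. gluon-fusion Higgs cross section at 13 TeV, picobarns
DATA_MULTIPLE = 4       # "four times more data" than Run 1 (from the article)

yield_multiple = DATA_MULTIPLE * (SIGMA_13TEV_PB / SIGMA_8TEV_PB)
print(f"Expected Higgs yield multiple: ~{yield_multiple:.0f}x")   # roughly 9-10x
print(f"Run 1's ~8,500/week would become ~{int(8_500 * yield_multiple):,}/week")
```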
Run 2 will operate at the energy originally intended for Run 1, which was initially stalled by a faulty electrical connection on some superconducting magnets in a sector of the tunnel. Machine physicists were able to get the machine running — just never at full power. And still the Higgs was discovered, notes SMU physics professor Fredrick Olness.
“The 2008 magnet accident at the LHC underscores just how complex a machine this is,” Olness said. “We are pushing the technology to the cutting-edge.”
Huge possibilities for new discoveries, but some will be more important than others
There are a handful of major new discoveries that could emerge from Run 2 data, Stroynowski said.
New physics laws related to the Higgs — Physicists so far know only the global properties of the Higgs, many of them poorly measured. They will measure Higgs properties with much greater precision, and any deviation from the present picture will indicate new physics laws. “Improved precision is the only guaranteed outcome of the coming run,” Stroynowski said. “But of course we hope that not everything will be as expected. Any deviation may be due to supersymmetry or something completely new.”
Why basic particles have such a huge range of masses — Clarity achieved by precision measurements of Higgs properties may help to shed light on the exact reason for the pattern of masses found in the known fundamental particles. If new particles are discovered at the LHC during Run 2, the mathematical theories that could explain them might also shed light on the puzzle of why masses have such diversity among the building blocks of nature.
Dark matter — Astronomical observations require a new form of matter that acts only via gravity; otherwise all galaxies would have fallen apart a long time ago. One candidate theory is supersymmetry, which predicts a host of new particles. Some of those particles, if they exist, would fit the characteristics of dark matter. LHC scientists will look for them in the coming run, both directly and through indirect effects.
Quark-gluon plasma — In collisions of lead nuclei with each other, LHC scientists have observed a new form of matter called quark-gluon plasma, thought to have been present in the cosmos near the very beginning of time. Making and studying this state of matter could teach us more about the early, hot, dense universe.
Mini black-holes — Some scientists are looking for “mini black-holes” predicted by innovative physicist Stephen Hawking, but that is considered “a v-e-e-e-e-r-y long shot,” Stroynowski said.
Matter-antimatter — A cosmic imbalance in the amounts of matter and its opposite, antimatter, must be explained by particle physics. The LHC is home to several experiments and teams that aim to search for answers.
Rios was a graduate student in the SMU Department of Physics and, as part of a team led by SMU Physics Professor Ryszard Stroynowski, worked from 2007 to 2012 as a member of the ATLAS experiment at Switzerland-based CERN’s Large Hadron Collider, the largest high-energy physics experiment in the world. Rios and the SMU team were part of the successful search for the Higgs boson fundamental particle.
Rios is now a senior research engineer for Lockheed Martin at NASA’s Johnson Space Center.
By Glenn Roberts Jr.
Symmetry Magazine
As a member of the ATLAS experiment at the Large Hadron Collider, Ryan Rios spent 2007 to 2012 surrounded by fellow physicists.
Now, as a senior research engineer for Lockheed Martin at NASA’s Johnson Space Center, he still sees his fair share.
He’s not the only scientist to have made the leap from experimenting on Earth to keeping astronauts safe in space. Rios works on a small team that includes colleagues with backgrounds in physics, biology, radiation health, engineering, information technology and statistics.
“I didn’t really leave particle physics, I just kind of changed venues,” Rios says. “A lot of the skillsets I developed on ATLAS I was able to transfer over pretty easily.”
The group at Johnson Space Center supports current and planned crewed space missions by designing, testing and monitoring particle detectors that measure radiation levels in space.
Massive solar flares and other solar events that accelerate particles, other sources of cosmic radiation, and weak spots in Earth’s magnetic field can all pose radiation threats to astronauts. Members of the radiation group provide advisories on such sources. This makes it possible to warn astronauts, who can then seek shelter in heavier-shielded areas of the spacecraft.
Johnson Space Center has a focus on training and supporting astronauts and planning for future crewed missions. Rios has done work for the International Space Station and the robotic Orion mission that launched in December as a test for future crewed missions. His group recently developed a new radiation detector for the space station crew.
Rios worked at CERN for four years as a graduate student and postdoc at Southern Methodist University in Dallas. At CERN he was introduced to a physics analysis platform called ROOT, which is also used at NASA. Some of the particle detectors he works with now were developed by a CERN-based collaboration.
Fellow Johnson Space Center worker Kerry Lee wound up as a group lead for radiation operations after using ROOT during his three years as a summer student on the Collider Detector at Fermilab (CDF) experiment.
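ROOT is scriptable from C++ or Python. For a flavor of the kind of everyday analysis task it handles, here is a minimal sketch (assuming a working ROOT installation with its Python bindings; the "readings" are toy data, not NASA measurements):

```python
import ROOT

# Fill a histogram with toy Gaussian "readings" and fit it: the
# bread-and-butter ROOT workflow used in particle physics and, as
# described above, transferable to radiation-monitoring work.
h = ROOT.TH1F("readings", "Toy dose readings;value;counts", 50, 0.0, 10.0)

rng = ROOT.TRandom3(42)
for _ in range(10_000):
    h.Fill(rng.Gaus(5.0, 1.5))

h.Fit("gaus", "Q")                 # quiet Gaussian fit
fit = h.GetFunction("gaus")
print(f"Fitted mean:  {fit.GetParameter(1):.2f}")
print(f"Fitted sigma: {fit.GetParameter(2):.2f}")
```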
The enormous capacity of SMU’s new supercomputer ranks it among the largest academic supercomputers in the nation.
SMU now has a powerful new tool for research – one of the fastest academic supercomputers in the nation – and a new facility to house it.
With a cluster of more than 1,000 Dell servers, the system’s capacity is on par with high-performance computing (HPC) power at much larger universities and at government-owned laboratories. The U.S. Department of Defense awarded the system to SMU in August 2013.
SMU’s Office of Information Technology added the system to the University’s existing – but much smaller – supercomputer. The system is housed in a new facility built at the corner of Mockingbird and Central Expressway. In a contest sponsored by Provost and Vice President for Academic Affairs Paul W. Ludden, faculty and students chose the name “ManeFrame” to honor the Mustang mascot.
The enormous capacity and speed of HPC expands scientific access to new knowledge around key questions about the universe, disease, human behavior, health, food, water, environment, climate, democracy, poverty, war and peace.
“World-changing discoveries rely on vast computing resources,” says President R. Gerald Turner. “ManeFrame quintuples the University’s supercomputing capacity. Our scientists and students will keep pace with the increasing demand for the ever-expanding computing power that is required to participate in global scientific collaborations. This accelerates our research capabilities exponentially.”
ManeFrame potential
With nearly 11,000 central processing unit cores, ManeFrame boasts 40 terabytes (one terabyte equals a trillion bytes) of memory and more than 1.5 petabytes of storage (a petabyte equals a quadrillion bytes), says Joe Gargiulo, SMU’s chief information officer, who led the installation team.
The sciences and engineering primarily use supercomputers, but that is expanding to include the humanities and the arts. So far, SMU’s heavy users are researchers in physics, math, biology, chemistry and economics.
“This technologically advanced machine will have an impact on shaping our world,” says Thomas M. Hagstrom, chair of the Department of Mathematics in Dedman College and director of SMU’s Center for Scientific Computing. “This makes research that solves problems on a large scale much more accessible. ManeFrame’s theoretical peak would be on the order of 120 Teraflops, which is 120 trillion mathematical operations a second.”
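Dividing the quoted figures gives a feel for that scale (a quick sketch; this is the theoretical peak, not sustained performance):

```python
# What a 120-teraflop peak implies per core, using the figures above.
PEAK_FLOPS = 120e12     # 120 trillion mathematical operations per second
CORES = 11_000          # "nearly 11,000" central processing unit cores

print(f"~{PEAK_FLOPS / CORES / 1e9:.0f} GFLOPS per core at theoretical peak")
```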
Supercomputers use sophisticated software and step-by-step calculation procedures, called algorithms, to solve complex problems that can’t be managed in a researcher’s lab, Hagstrom explains.
“We can’t put the Earth’s climate system in a physical lab, or study the evolution of the universe there,” he says. “You can only study these and other systems in a comprehensive way using high-performance computing.”
Making SMU competitive
Supercomputing gave University physicists a role in the Higgs boson research at the Large Hadron Collider in Geneva, Switzerland. Joining the collaboration with thousands of scientists around the world, SMU’s team was led by Physics Professor Ryszard Stroynowski. SMU’s physicists tapped the existing HPC on campus to quickly analyze massive amounts of data and deliver results to their international colleagues.
SMU’s team will use ManeFrame to keep pace with an even larger flood of data expected from the Large Hadron Collider.
“ManeFrame makes SMU – which is small by comparison with many of its peer institutions at CERN – nimble and competitive, and that lets us be visible in a big experiment like CERN,” says Stephen Sekula, assistant professor of physics. “So we have to have ideas, motivation and creativity – but having a technical resource like ManeFrame lets us act on those things.”
SMU physicist Pavel Nadolsky has conducted “big data” analyses of subatomic particles on the supercomputer as part of an international physics collaboration. In his work, big data means probability distributions that depend on many variables. As users ranging from retailers to the health industry collect multitudes of transactional data every day, requirements for big data analysis are rapidly emerging.
“To keep up in our field, we need resources like ManeFrame,” says Nadolsky, associate professor of physics.
“The world is moving into big-data analysis, whether it’s Google, Facebook or the National Security Agency,” Nadolsky says. “We learn a lot about the world by studying multidimensional distributions: It tells us about the origins of the universe; it can win elections by using data mining to analyze voting probabilities over time in specific geographical areas and targeting campaign efforts accordingly; and it can predict what people are doing. To make students competitive they must be trained to use these tools efficiently and ethically.”
ManeFrame will have a high-profile role in the U.S. Department of Energy experiment called NOvA, which studies neutrinos, little-understood and elusive fundamental particles that may help explain why matter, and not just light, exists in the universe today. SMU will contribute four million processing hours each year to the experiment, says Thomas E. Coan, associate professor of physics and a member of the international team.
“We’re in good company with others providing computing, including California Institute of Technology and Harvard,” Coan says. “It’s one way for SMU to play a prominent role in the experiment. We get a lot of visibility among all the institutions participating in NOvA, which are spread out across five countries.”
Advancing discovery
One of the heaviest users of SMU’s HPC is John Wise, associate professor of biological sciences, who models a key human protein to improve chemotherapy to kill cancer cells. Wise works with the SMU Center for Drug Discovery, Design and Delivery in Dedman College, an interdisciplinary research initiative of the Biology and Chemistry departments, led by Professor of Biological Sciences Pia Vogel.
Within the Mathematics Department, Assistant Professor Daniel R. Reynolds and his team use high-performance computing to run simulations with applications in cosmology and fusion reactors.
Looking to the future, the use of high-performance computing will keep increasing in research, business and the arts, according to James Quick, associate vice president for research and dean of graduate studies.
“High-performance computing has emerged as a revolutionary tool that dramatically increases the rates of scientific discovery and product development, enables wise investment decisions and opens new dimensions in artistic creativity,” says Quick, professor of earth sciences. “SMU will use the computational power of ManeFrame to expand research and creativity and develop educational opportunities for students interested in the application of high-performance computing in their fields – be it science, engineering, business or the arts.” – Margaret Allen