<img height="1" width="1" style="display:none" src="https://www.facebook.com/tr?id=1875728222659097&amp;ev=PageView&amp;noscript=1">

High Performance Computing (HPC): The Tree and the Fruit


The Networking and Information Technology Research and Development (NITRD) program, the primary federally funded effort on advanced Information Technology (IT) and networking in the USA, held a symposium in Washington DC. At this meeting, two decades of game-changing breakthroughs and the emergence of IT in everyday life were presented and discussed. Prof. David Keyes, from Columbia University, spoke at this event about the emergence and influence of High Performance Computing (HPC) in science and engineering. His talk (Keyes 2012) revealed the maturity of IT and its invaluable contribution to scientific endeavor and technological progress. Drawing on that presentation, this article addresses the impact of HPC and how it is utilized in various fields across science and engineering.

The HPC Impact

The IT infrastructure around us today has been built by scientists and engineers, starting with Alan Turing, from all around the world and from diverse research establishments. The world is not the same as it was twenty years ago. The Internet is part of every aspect of our social and personal lives. Art has been deeply influenced and has evolved along with IT. Inevitably, science and engineering have been transformed by IT and present-day computational capabilities. Space missions to Mars and current satellite systems have become affordable, and even feasible, due to the birth and evolution of the computational sciences. The economic impact is unprecedented, as commerce and trade have overcome past physical limitations and restrictions. A complex world has become simpler: it is now much easier for many people on the planet to travel, communicate, and enjoy products from every part of the world.

These social and economic transformations and achievements have, to a great extent, become feasible as the technological restrictions of the past have been overcome by HPC systems. Computer simulation is now a de facto part of every scientific and engineering process. The cost of computing has dropped dramatically, leading to the adoption of techniques and methods that provide invaluable insight into physical phenomena and powerful prediction capabilities for product development.

The scientists, engineers, and mathematicians who have designed the IT infrastructure around us today have used a slice of just one-tenth of one percent of the federal budget, channeled through the NITRD agencies. These experts, notably the computer scientists, have transformed this investment into improvements in the daily lives of citizens.

Corporate and US government organizations have transformed this knowledge and world-leading IT capability into products and services that have increased US productivity over the past twenty years. These IT capabilities have also contributed heavily to US exports.

The key purpose of David Keyes’ presentation was to take an ‘inward’ look at how these same scientists and engineers used these NITRD-funded capabilities to transform the world of science and engineering too, ultimately creating a new field – computational science and engineering.

The Third Pillar of Scientific Discovery?

Since the early 1980s, computer simulation has been acknowledged as the third pillar of scientific discovery. In his presentation, however, Prof. Keyes does not fully support this perspective. He presents computer simulation as a merger of theory and experiment, a hybrid of the two that greatly assists and enhances scientific and, eventually, engineering progress as well. Our technological civilization and knowledge of the universe are due mainly to the experimental method, but our knowledge of the physical world is constantly enriched by current IT and computing capabilities. The formation of stars and planets and the study of black holes are just two examples of where computer simulation has proved its potential as a pillar of scientific discovery.


Despite Prof. Keyes' very respectable opinion and perspective on this, the current practice of computer simulation - especially in the bio-engineering and physics fields - supports its standing as a third pillar of science. The engineering design of complex systems relies heavily on IT, and the dramatic cost reduction achieved in the development of new products strengthens the case for high-fidelity prediction of physical processes.

Simulation Driven by Price & Capability

The Gordon Bell Prize is an award that recognizes achievements in HPC and can be used as a gauge of progress in the field. Measured by this award, the 'cost per performance' of simulations has improved by about a million times across two decades. The performance of real applications over a wide and diverse range of fields has also improved by more than a million times, as depicted in the following figure.

[Figure: million-fold performance improvement of real applications, as measured by Gordon Bell Prizes]
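As a rough sanity check on these figures (a back-of-the-envelope sketch, not taken from the presentation), the snippet below converts a million-fold improvement over twenty years into an implied annual rate and compares it with what transistor scaling alone would deliver.

```python
# Back-of-the-envelope check: what annual rate of improvement does a
# million-fold price/performance gain over two decades imply?
total_improvement = 1_000_000   # ~10^6x, the order of magnitude cited for two decades
years = 20

annual_factor = total_improvement ** (1 / years)
print(f"Implied improvement per year: ~{annual_factor:.2f}x")   # ~2x, a doubling every year

# For comparison, transistor-count doubling every ~2 years (classic Moore's Law)
hardware_only = 2 ** (years / 2)
print(f"Hardware scaling alone over {years} years: ~{hardware_only:,.0f}x")  # ~1,000x
```

The gap between the two numbers is the point made throughout the presentation: hardware alone does not account for a million-fold gain; algorithmic and mathematical advances supply the rest.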

To put this progress into perspective, Prof. Keyes makes a number of whimsical comparisons based on the improvements measured by Bell Prizes since 1988. If similar improvements in storage had been realized in the publishing industry, a regular office bookcase could hold the book portion of the collection of the U.S. Library of Congress (22 million volumes). In the education industry, a similar reduction in cost would mean that tuition, room, and board at a U.S. college would cost just $0.20 per year.


A thought experiment (using peanut butter) shows how any quantity whose price is predictably improving could be put to new uses. At $1,150/ton, it is simply a luxury, yet if we were sure it would fall to just $0.115/ton in 2024, we would find a way to use it for road paving! Computing has been on this curve for twenty years and promises - despite some major technical challenges such as exascale computing - to remain on this same course. For these reasons, computer scientists will remain committed to increasing its widespread use.

Moreover, the spread of HPC hardware into everyday life is now so common that HPC can be considered a commodity available to almost anyone able to buy a desktop PC or a laptop. The cost reduction in processor manufacturing, together with the global IT infrastructure itself, has given every scientist and engineer the capacity to carry out complex simulations using multi-core processors, and even graphics processing units for very demanding applications, on a budget of around $2,000.
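To illustrate how much parallelism such a commodity machine actually exposes, here is a minimal Python sketch (an illustration only; the GPU check assumes the optional CuPy package and a CUDA-capable card, and is skipped otherwise):

```python
# Report the parallel resources a commodity workstation offers a simulation code.
import multiprocessing

cpu_cores = multiprocessing.cpu_count()
print(f"CPU cores available: {cpu_cores}")

try:
    import cupy as cp  # optional GPU-accelerated array library (assumed installed)
    n_gpus = cp.cuda.runtime.getDeviceCount()
    print(f"CUDA-capable GPUs available: {n_gpus}")
except ImportError:
    print("No GPU Python stack detected; running on CPU cores only.")
```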

CFD Penetration in Practice

Computational Fluid Dynamics (CFD) is a field that has greatly enhanced the development of products and processes, especially over the last two decades. Its penetration in the aerospace industry has expanded greatly (see EXN/Aero NASA), and Boeing has used it extensively in the design and development of its aircraft since the 1980s. The following figures present the utilization of this engineering discipline in the design of the B767 in 1979 and the B787 in 2005, demonstrating the aerospace industry's embrace of CFD (see CFD in aerospace).

Diagrams comparing CFD penetration in the B767 in 1979 and the B787 in 2005 show how companies such as Boeing have embraced the opportunities offered by software, reducing wind tunnel testing. In the latter diagram, the potential for further development is clear, highlighted by the red and blue areas.

[Figures: CFD penetration in the Boeing 767 (1979) and the Boeing 787 (2005)]

The penetration is evident from the corresponding performance in real-world applications and the cost and time reductions in the development process. However, there is still room for improvement, as wind tunnel testing remains necessary.

High-Performance CFD 

Never before in the field of CFD has state-of-the-art computer hardware been exploited in such a manner that the simulation of complex flow settings can be done in a fraction of the time required only a few years ago. A Canadian engineering simulations company (www.envenio.ca) from Fredericton, New Brunswick, has created and commercialized a general-purpose CFD solver capable of simulating complex flows around three-dimensional geometries at a fraction of the turnaround time of existing CFD codes. Their creation marks the beginning of a new era in CFD, that of real High-Performance CFD (HPCFD). A novel algorithm that permits the parallel solution of flow problems not only in space but in time as well, along with the exploitation of cutting-edge computer hardware, has provided a design tool capable of revolutionizing the design and optimization of complex engineering systems.
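The article does not describe the solver's actual algorithm, but the general idea of parallelizing in time as well as in space can be illustrated with the well-known Parareal method. The sketch below is a toy example on the scalar ODE du/dt = -u, not Envenio's code: it shows how the expensive fine solves over each time slice become independent of one another and could therefore be distributed across processors.

```python
import numpy as np

def fine(u, t0, t1, steps=100):
    """Accurate propagator over one time slice (these solves are independent
    across slices, so a real solver would run them in parallel)."""
    dt = (t1 - t0) / steps
    for _ in range(steps):
        u = u + dt * (-u)          # explicit Euler on du/dt = -u
    return u

def coarse(u, t0, t1):
    """Cheap serial propagator used to couple the slices together."""
    return u + (t1 - t0) * (-u)    # one big Euler step

T, n_slices, n_iters = 2.0, 8, 4
t = np.linspace(0.0, T, n_slices + 1)
U = np.empty(n_slices + 1)
U[0] = 1.0

# Initial serial sweep with the coarse propagator
for n in range(n_slices):
    U[n + 1] = coarse(U[n], t[n], t[n + 1])

# Parareal iterations: the fine solves use the previous iterate and are parallelizable;
# only the cheap coarse correction sweep stays serial.
for _ in range(n_iters):
    F = [fine(U[n], t[n], t[n + 1]) for n in range(n_slices)]
    G_old = [coarse(U[n], t[n], t[n + 1]) for n in range(n_slices)]
    for n in range(n_slices):
        U[n + 1] = coarse(U[n], t[n], t[n + 1]) + F[n] - G_old[n]

print(f"Parareal solution at t={T}: {U[-1]:.4f} (exact: {np.exp(-T):.4f})")
```

After a few iterations the parallel-in-time result matches the serial fine integration, which is the property that lets a time-parallel solver trade extra processors for shorter wall-clock time.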

Buyer Driving Factors in HPC

As developments in computer science occur, more scientists and engineers want to make use of them, and the ‘ability to do new and better science’ is named as the top reason for acquiring HPC systems (in an IDC survey). The top five reasons are given below:

  1. Ability to do new/better science.
  2. Ability to run larger or higher resolution problems.
  3. Performance on applications.
  4. Throughput.
  5. Price/Performance.

Furthermore, the emergence of cloud computing gives scientists and engineers all around the world the capacity to use immense HPC power more cheaply and with higher availability, avoiding the need to spend thousands of dollars on personal HPC infrastructure.

Simulation Complements Experiments

There are a number of instances where simulation plays a vital role, even proving cheaper, safer, or more feasible than experimentation. Prof. Keyes points out that simulation should complement experimentation, and highlights those instances where experimentation alone is not ideal or even possible:

  1. Environmental Disaster: Experiments would be dangerous.
  2. Applied Physics (e.g. nuclear weapons): Experiments would be prohibited or impossible.
  3. Biology (e.g. drug design): Experiments would be controversial.
  4. Engineering (aerodynamics): Experiments difficult to instrument.
  5. Physics & Energy (LHC, ITER): Experiments are expensive.

The study of the response of a facility under blast loading conditions is an example of where computer simulation is vital, both for the insight it provides and, above all, because it avoids the danger inherent in experiments with explosions (TMS 2017). Another example is the simulation of the entry of a spacecraft into the Martian atmosphere, where physical experimentation on Earth is almost impossible to perform.

The Scientific Discovery – Balance Shift

The following diagram shows how a century of scientific discovery has transformed the way research organizations work, and where the ratio of simulation vs. experiment is heading. The advancements in cost-per-flop and heightened productivity expectations will lead to a greater ratio of simulations to experiments. 

[Figure: a century of scientific discovery and the shifting ratio of simulation to experiment]

Crash tests for vehicles, mold designs, metal forming techniques, nuclear engineering processes, explosion studies, and many other applications have made significant progress due to HPC simulations. When digital computing was first invented, it was primarily used to try to understand things that had already been discovered experimentally and are described by nonlinear mathematical equations. At present, organizations large and small increasingly use computer simulations to predict, while using expensive experiments only to confirm.

A Progress Law in Real Applications

Prof. Keyes views the progress of real-life applications through the lens of Moore’s Law. This progress has followed very closely the advancement of computer hardware in several demanding cases, such as clean combustion and fusion energy simulations. In these cases Prof. Keyes sees a "Moore's Law" of applications, a trend of progress tightly connected to the number of transistors packed on a chip.


Similar occurrences have been documented in physics and chemistry, and the example below shows Moore's Law for clean combustion simulations: improved hardware results in more detailed resolution, higher-fidelity physics, and more multi-physics, and occasionally a mathematical or data-structure breakthrough leads to shooting above the Moore’s Law curve.

[Figure: Moore's Law for clean combustion simulations]
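A simple way to picture "shooting above the curve" (with assumed, purely illustrative numbers rather than data from the presentation): application performance rides the smooth hardware trend, and an occasional algorithmic breakthrough multiplies on top of it.

```python
# Toy model: smooth hardware-driven growth plus occasional algorithmic jumps.
hardware_doubling_years = 2.0                 # classic Moore's Law pace (assumed)
breakthroughs = {8: 10.0, 15: 5.0}            # hypothetical: 10x algorithm gain in year 8, 5x in year 15

perf = 1.0
for year in range(1, 21):
    perf *= 2 ** (1 / hardware_doubling_years)   # hardware contribution this year
    perf *= breakthroughs.get(year, 1.0)         # algorithmic jump, if any
    if year in breakthroughs or year == 20:
        print(f"Year {year:2d}: ~{perf:,.0f}x the baseline")
```

Because the gains multiply, the application curve ends up well above what transistor scaling alone would predict, which is exactly the pattern Prof. Keyes highlights.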

In effect, Moore’s Law has influenced the ‘bottom line’ of many companies, as increased computational capability and accuracy, together with progress in algorithmic solutions, result in less testing, lower cost, and better products. This is highlighted by Boeing's example: it has been able to exploit the benefits of CFD to reduce wind tunnel, structural, and flight testing.

[Figure: reduction in Boeing's wind tunnel, structural, and flight testing enabled by CFD]

Moore’s Law has also played a key role in the stockpile stewardship program at the Department of Energy, allowing devices whose quality cannot be ascertained by experimentation to be simulated. The design of new devices can then take place through computation.

Note: While Moore's Law has no doubt had a hugely influential impact on computer science, experts now predict that computers will reach the physical limits of Moore's Law sometime in the next decade. While a software or hardware breakthrough could keep the Moore's Law dream alive, computer scientists and software companies are preparing for new courses.

Historical References

History has played a key role in the advances we see today, and the timeline shows a number of the key influences to which this success is attributable.

[Figure: timeline of key influences on computational science]

The current trend is perhaps the incorporation of best practices from commercial software engineering into the primarily freely downloadable DOE, NSF, and other agency-generated software libraries, allowing a much wider audience to access and use them effectively. Prof. Keyes emphasizes the need to "build on all four of these histories in promoting computational science through this ‘phase transition’ to a fully predictive and increasingly quantified science". He also encourages the use of an interchangeable system, reflecting on the success of the U.S. system and highlighting how idea-oriented and mission-oriented organizations make strong partners. "In some countries, the barrier between the 'basic' and the 'applied' can be vast, even at university level", he says, adding that such a divide could be "critical to innovation".

Treasury of Ideas

By placing basic research deposits into the so-called 'treasury of ideas', scientists have the option and ability to come back to ideas when they are required, even at a time when they may prove vital. "When the algorithmic advances are driven by applications, we get out of our academic sandboxes and try things that are often difficult", Keyes says. The diagram below shows a few examples that were invented as theoretical constructs with very little practical interest at the time, but have since been reused when they were most needed.

[Figure: theoretical constructs later reused in computational science]

Prof. Keyes recognizes how we are running out of "performance" all the time in science and engineering. Whether it is the need for higher resolution for multi-scale phenomena, higher-fidelity simulations, or the desire to run optimization loops around the fundamental forward problems to solve inverse problems for parameter estimation, these demands take petascale applications and make us want to throw exascale resources at them in order to use them in a truly scientific mode. The need for exascale computing therefore remains, as portrayed in the report of the scientific committee on exascale computing from the Office of Science of the U.S. Department of Energy (ASCAC report).

The Future of Simulation

Prof. Keyes' presentation also identified another pillar of scientific discovery: data-enabled science. The importance of data-enabled science, in response to the needs of scientific computation, has given impetus to the creation of a new tool for scientific discovery and engineering progress, one that relies on data representations of models and processes to confront the mathematical and statistical challenges faced by scientists and engineers. Prof. Keyes mentions his excitement at bringing the third and fourth paradigms together, and acknowledges that while the economic, health care, and social effects of computing are important, the scientific aspects are just as important.

The Tree and The Fruit

HPC is a phenomenally productive tree in the pursuit of scientific knowledge and technological advance. It is clear to see why Prof. Keyes describes HPC also as a fruit - an exciting fusion of computer science, mathematics, and a great deal of inspiration and imagination that grows extraordinarily well all around the world. There is no doubt that NITRD has envisioned and provisioned HPC for over two decades and looks forward to an ever-changing and exciting future.

About Professor David Keyes

Click here to read the full presentation.


Prof. David E. Keyes was the Fu Foundation Professor of Applied Mathematics in the Department of Applied Physics and Applied Mathematics at Columbia University and is a faculty affiliate at several national laboratories of the U.S. Department of Energy. He became the inaugural Chair of the Division of Mathematical and Computer Sciences and Engineering at KAUST, the King Abdullah University of Science & Technology in Saudi Arabia, in 2009. He is the author of over 100 publications in computer science and engineering. 

 
2018-10-15 | Categories: CFD, simulations, HPC
