1 Why parallel computing?

This chapter covers

  • What parallel computing is and why it’s growing in importance
  • Where parallelism exists in modern hardware
  • Why the amount of application parallelism is important
  • Software approaches to exploit parallelism

In today’s world, you’ll find many challenges requiring extensive and efficient use of computing resources. Traditionally, most applications demanding high performance have been in the scientific domain, but artificial intelligence (AI) and machine learning applications are projected to become the predominant users of large-scale computing. Examples of such applications include

  • Modeling megafires to assist fire crews and to help the public
  • Modeling tsunamis and storm surges from hurricanes (see chapter 13 for a simple tsunami model)
  • Voice recognition for computer interfaces
  • Modeling virus spread and vaccine development
  • Modeling climatic conditions over decades and centuries
  • Image recognition for driverless car technology
  • Equipping emergency crews with real-time simulations of hazards such as flooding
  • Reducing power consumption for mobile devices

1.1 Why should you learn about parallel computing?

1.1.1 What are the potential benefits of parallel computing?

1.1.2 Parallel computing cautions

1.2 The fundamental laws of parallel computing

1.2.1 The limit to parallel computing: Amdahl’s Law
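As a preview of this section, Amdahl's Law is commonly stated as follows, where P is the fraction of the runtime that can be parallelized and N is the number of processors:

```latex
% Amdahl's Law: speedup for a fixed problem size.
% P = parallelizable fraction of the runtime, N = number of processors.
S(N) = \frac{1}{(1 - P) + \dfrac{P}{N}}
```

As N grows without bound, the speedup approaches 1/(1 − P): the serial fraction of the application, however small, sets a hard ceiling on the achievable speedup.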

1.2.2 Breaking through the parallel limit: Gustafson-Barsis’s Law
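Using the same notation, Gustafson-Barsis's Law is commonly stated as follows; it gives the scaled speedup when the problem size is allowed to grow with the number of processors:

```latex
% Gustafson-Barsis's Law: scaled speedup when the problem size
% grows with the number of processors N.
% P = parallel fraction of the scaled workload.
S(N) = N - (1 - P)\,(N - 1)
```

Equivalently, S(N) = (1 − P) + P·N, so for a fixed serial fraction the scaled speedup grows linearly with N rather than saturating as in Amdahl's fixed-size view.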

1.3 How does parallel computing work?

1.3.1 Walking through a sample application

1.3.2 A hardware model for today’s heterogeneous parallel systems

1.3.3 The application/software model for today’s heterogeneous parallel systems

1.4 Categorizing parallel approaches

1.5 Parallel strategies