4 Multithreading basics


This chapter covers

  • The basics of threads
  • Starting threads
  • Waiting for threads
  • Accessing shared data and using locks
  • The basics of deadlocks

Chapter 1 discussed how a system can run many pieces of code simultaneously, far more than it has CPU cores, by quickly switching between them. This is made possible by a hardware timer inside the CPU. Each time the timer ticks, the operating system can pause the currently running code and switch to another piece of code. If the switching is fast enough, it creates the illusion that all the threads are running at the same time.
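
To make this concrete, here is a minimal sketch (an illustrative snippet, not one of the book's listings; the class name is made up) that starts several times more threads than the machine has logical cores. Every thread still gets to run, because the operating system keeps switching between them:

using System;
using System.Threading;

class OversubscriptionDemo
{
    static void Main()
    {
        int cores = Environment.ProcessorCount;
        Console.WriteLine($"This machine has {cores} logical cores");

        // Deliberately start four times as many threads as there are cores.
        var threads = new Thread[cores * 4];
        for (int i = 0; i < threads.Length; i++)
        {
            int id = i; // capture a stable copy of the loop variable
            threads[i] = new Thread(() =>
            {
                // All of these make progress thanks to time slicing,
                // even though there are far more threads than cores.
                Console.WriteLine($"Thread {id} is running");
            });
            threads[i].Start();
        }

        // Wait for every thread to finish before exiting.
        foreach (var thread in threads)
        {
            thread.Join();
        }
    }
}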

This chapter explores how to use threads for parallel execution and discusses key aspects of concurrent programming. In the next chapter, we will connect these topics to the async/await feature.

When a process starts, it begins with a single thread that runs the Main method (along with a few other system-controlled threads, which we will set aside for now). This initial thread is called the main thread. We will now look at how to use additional threads so that multiple pieces of code can run at the same time.
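
As a quick illustration (a hypothetical snippet, not one of the book's listings), you can inspect the thread that runs Main through Thread.CurrentThread:

using System;
using System.Threading;

class MainThreadDemo
{
    static void Main()
    {
        // Main runs on the process's initial thread: the main thread.
        Thread mainThread = Thread.CurrentThread;
        mainThread.Name = "Main thread"; // a thread's name can be assigned once

        Console.WriteLine($"Name: {mainThread.Name}");
        Console.WriteLine($"Managed thread id: {mainThread.ManagedThreadId}");
        Console.WriteLine($"Thread pool thread: {mainThread.IsThreadPoolThread}");
    }
}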

4.1 Different ways to run in another thread

Now that we’ve decided we want to run code in parallel, we need to talk about how to do it. This section covers the three most common ways to run code in another thread in C#. We will start with the oldest and most flexible option—creating your own thread.
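
As a preview (a minimal sketch in a console app, not one of the book's listings; each option is covered in detail in the subsections that follow), the three approaches look like this:

using System;
using System.Threading;
using System.Threading.Tasks;

class RunInAnotherThread
{
    static void Main()
    {
        // Option 1: create and start a thread of your own.
        var thread = new Thread(() => Console.WriteLine("Running on a dedicated thread"));
        thread.Start();
        thread.Join();

        // Option 2: queue the work directly to the thread pool.
        using var done = new ManualResetEventSlim();
        ThreadPool.QueueUserWorkItem(_ =>
        {
            Console.WriteLine("Running on a thread pool thread");
            done.Set();
        });
        done.Wait();

        // Option 3: Task.Run, which also runs the work on the thread pool.
        Task task = Task.Run(() => Console.WriteLine("Running inside a Task"));
        task.Wait();
    }
}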

4.1.1 Thread.Start

4.1.2 The thread pool

4.1.3 Task.Run

4.2 Accessing the same variables from multiple threads

4.2.1 No shared data

4.2.2 Immutable shared data

4.2.3 Locks and mutexes

4.2.4 Deadlocks

4.3 Special considerations for native UI apps

4.4 Waiting for another thread

4.5 Other synchronization methods

4.6 Thread settings

4.6.1 Thread background status

4.6.2 Language and locale

4.6.3 COM Apartment

4.6.4 Current user

4.6.5 Thread priority