
4 Multithreading basics

 

This chapter covers

  • Understanding threads
  • Starting threads
  • Waiting for threads
  • Accessing shared data and using locks
  • Understanding deadlocks

In chapter 1, we discussed how a system can run multiple pieces of code simultaneously, many more than it has CPU cores, by quickly switching between them. This is made possible by a hardware timer: each time the timer fires, the operating system can pause the currently running code and switch to another piece of code. If this switching happens quickly enough, it creates the illusion that all the threads are running at the same time.
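
To see this switching in action, we can start far more threads than the machine has cores and watch all of them make progress. The sketch below assumes a plain console program; the class name and the numbers are only for illustration.

using System;
using System.Threading;

class TimeSlicingDemo
{
    static void Main()
    {
        // Start several times more threads than there are CPU cores.
        int threadCount = Environment.ProcessorCount * 4;
        for (int i = 0; i < threadCount; i++)
        {
            int id = i;
            new Thread(() =>
            {
                // Every thread still gets to run because the operating system
                // keeps switching between them.
                for (int n = 0; n < 3; n++)
                {
                    Console.WriteLine($"thread {id} is running");
                    Thread.Sleep(100);
                }
            }).Start();
        }
    }
}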

In this chapter, we will explore how to use threads for parallel execution and discuss key aspects of concurrent programming. In the next chapter, we will relate this to the async-await feature.

When a process starts, it begins with one thread that runs the Main method (along with a few other system-controlled threads, which we will set aside for now). This initial thread is referred to as the main thread. We will now look at how to use additional threads to run multiple pieces of code at the same time.
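
You can see this for yourself: the code in Main runs on a thread like any other. A minimal sketch, again assuming a console program (the class name is just for illustration):

using System;
using System.Threading;

class MainThreadDemo
{
    static void Main()
    {
        Thread mainThread = Thread.CurrentThread;
        // The main thread typically has id 1 and is not a thread pool thread.
        Console.WriteLine($"Managed thread id: {mainThread.ManagedThreadId}");
        Console.WriteLine($"Thread pool thread: {mainThread.IsThreadPoolThread}");
    }
}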

4.1 Different ways to run in another thread

Now that we’ve decided we want to run code in parallel, we need to talk about how to do it. In this section, we will cover the three most common ways to run code in another thread in C#. We will start with the oldest and most flexible option: creating your own thread.

4.1.1 Thread.Start
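
As a rough sketch of this option, we create a Thread object, pass it the code we want to run, and call Start (the class and variable names here are just for illustration):

using System;
using System.Threading;

class ThreadStartDemo
{
    static void Main()
    {
        // Create a thread and tell it what code to run.
        var worker = new Thread(() =>
        {
            Console.WriteLine("Hello from another thread");
        });

        worker.Start();   // from this point the worker runs in parallel with Main
        worker.Join();    // wait for the worker to finish before exiting
    }
}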

4.1.2 The thread pool
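
A minimal sketch of handing work to the built-in thread pool with ThreadPool.QueueUserWorkItem, again assuming a console program:

using System;
using System.Threading;

class ThreadPoolDemo
{
    static void Main()
    {
        // Ask the thread pool to run our code on one of its reusable threads.
        ThreadPool.QueueUserWorkItem(_ =>
        {
            Console.WriteLine("Hello from a thread pool thread");
        });

        // Thread pool threads are background threads, so give the queued work
        // a moment to run before Main returns and the process exits.
        Thread.Sleep(1000);
    }
}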

4.1.3 Task.Run
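
And a sketch of the same idea with Task.Run, which also uses the thread pool but hands back a Task that we can wait on:

using System;
using System.Threading.Tasks;

class TaskRunDemo
{
    static void Main()
    {
        // Task.Run queues the work to the thread pool and returns a Task.
        Task work = Task.Run(() =>
        {
            Console.WriteLine("Hello from Task.Run");
        });

        work.Wait();   // block until the task completes
    }
}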

4.2 Accessing the same variables from multiple threads
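
To see why this needs care, here is a sketch of what can happen when two threads update the same variable with no coordination at all. The exact result changes from run to run, but it is almost never the 2,000,000 you might expect:

using System;
using System.Threading;

class DataRaceDemo
{
    static int _counter;

    static void Main()
    {
        void Increment()
        {
            for (int i = 0; i < 1_000_000; i++)
            {
                _counter++;   // a read-modify-write that is not atomic
            }
        }

        var t1 = new Thread(Increment);
        var t2 = new Thread(Increment);
        t1.Start();
        t2.Start();
        t1.Join();
        t2.Join();

        // Updates are lost whenever the two threads interleave their writes.
        Console.WriteLine(_counter);
    }
}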

4.2.1 No shared data
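
A sketch of the simplest option: give each thread its own data to work on, so there is nothing to coordinate (the ranges and names are just for illustration):

using System;
using System.Threading;

class NoSharedDataDemo
{
    static void Main()
    {
        var threads = new Thread[4];
        for (int i = 0; i < threads.Length; i++)
        {
            int start = i * 250;   // each thread gets its own range of numbers
            threads[i] = new Thread(() =>
            {
                int localSum = 0;   // local to this thread, never shared
                for (int n = start; n < start + 250; n++)
                {
                    localSum += n;
                }
                Console.WriteLine($"Sum of {start}..{start + 249} is {localSum}");
            });
            threads[i].Start();
        }

        foreach (var thread in threads)
        {
            thread.Join();
        }
    }
}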

4.2.2 Immutable shared data
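
A sketch of the immutable approach: the shared object is created once and never modified afterwards, so any number of threads can read it safely. The record syntax assumes C# 9 or later, and the Settings type is just an example:

using System;
using System.Threading;

// All values are set when the object is created and never change.
record Settings(string ServerUrl, int TimeoutSeconds);

class ImmutableDataDemo
{
    static void Main()
    {
        var settings = new Settings("https://example.com", 30);

        for (int i = 0; i < 4; i++)
        {
            new Thread(() =>
            {
                // Reading immutable data from many threads needs no locking.
                Console.WriteLine($"{settings.ServerUrl} ({settings.TimeoutSeconds}s)");
            }).Start();
        }
    }
}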

4.2.3 Locks and mutexes
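
A minimal sketch of protecting shared data with C#'s lock statement, using the same counter as in the earlier race example:

using System;
using System.Threading;

class LockDemo
{
    static readonly object _sync = new object();
    static int _counter;

    static void Main()
    {
        void Increment()
        {
            for (int i = 0; i < 1_000_000; i++)
            {
                lock (_sync)   // only one thread at a time can enter this block
                {
                    _counter++;
                }
            }
        }

        var t1 = new Thread(Increment);
        var t2 = new Thread(Increment);
        t1.Start();
        t2.Start();
        t1.Join();
        t2.Join();

        Console.WriteLine(_counter);   // now reliably 2,000,000
    }
}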

4.2.4 Deadlocks
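
A sketch of the classic deadlock: two threads each take one lock and then wait forever for the lock the other one is holding (if you run this, expect it to hang):

using System;
using System.Threading;

class DeadlockDemo
{
    static readonly object _lockA = new object();
    static readonly object _lockB = new object();

    static void Main()
    {
        var t1 = new Thread(() =>
        {
            lock (_lockA)
            {
                Thread.Sleep(100);   // give the other thread time to take _lockB
                lock (_lockB) { }    // waits forever: t2 already holds _lockB
            }
        });

        var t2 = new Thread(() =>
        {
            lock (_lockB)
            {
                Thread.Sleep(100);   // give the other thread time to take _lockA
                lock (_lockA) { }    // waits forever: t1 already holds _lockA
            }
        });

        t1.Start();
        t2.Start();
        t1.Join();   // never returns
        Console.WriteLine("This line is never reached");
    }
}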

4.3 Special considerations for native UI apps
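
In UI frameworks such as Windows Forms and WPF, controls may only be touched from the thread that created them, so work done on other threads has to be marshaled back to the UI thread. The sketch below assumes a Windows Forms project; it captures the UI thread's SynchronizationContext and posts the update back to it:

using System;
using System.Threading;
using System.Windows.Forms;

class UiThreadDemo
{
    [STAThread]
    static void Main()
    {
        var form = new Form();
        var label = new Label { Dock = DockStyle.Fill, Text = "Working..." };
        form.Controls.Add(label);

        form.Shown += (sender, e) =>
        {
            // Capture the UI thread's context while we are still on the UI thread.
            SynchronizationContext uiContext = SynchronizationContext.Current;

            new Thread(() =>
            {
                Thread.Sleep(2000);   // simulate slow work on a worker thread

                // Update the control on the UI thread, not from this thread.
                uiContext.Post(_ => label.Text = "Done", null);
            }).Start();
        };

        Application.Run(form);
    }
}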

4.4 Waiting for another thread
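
A sketch of the two most common ways to wait: Thread.Join for a thread you started yourself, and Task.Wait for a task (the await keyword, covered in the next chapter, is the non-blocking alternative):

using System;
using System.Threading;
using System.Threading.Tasks;

class WaitingDemo
{
    static void Main()
    {
        var worker = new Thread(() => Console.WriteLine("thread work done"));
        worker.Start();
        worker.Join();   // blocks until the thread finishes

        Task task = Task.Run(() => Console.WriteLine("task work done"));
        task.Wait();     // blocks until the task finishes
    }
}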

4.5 Other synchronization methods
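
As one example from this family, a ManualResetEventSlim lets one thread signal another that something has happened; the sketch below shows the basic signaling pattern:

using System;
using System.Threading;

class SignalDemo
{
    static void Main()
    {
        var dataReady = new ManualResetEventSlim(initialState: false);
        string data = null;

        var producer = new Thread(() =>
        {
            data = "produced value";
            dataReady.Set();   // signal that the data is now available
        });

        var consumer = new Thread(() =>
        {
            dataReady.Wait();  // block until the producer signals
            Console.WriteLine(data);
        });

        producer.Start();
        consumer.Start();
        producer.Join();
        consumer.Join();
    }
}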

4.6 Thread settings

4.6.1 Thread background status

4.6.2 Language and locale

4.6.3 COM apartment

4.6.4 Current user

4.6.5 Thread priority
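
The settings in this section are all exposed as properties or methods of the Thread object. A combined sketch, assuming Windows for the COM apartment call; the specific values chosen are just for illustration:

using System;
using System.Globalization;
using System.Threading;

class ThreadSettingsDemo
{
    static void Main()
    {
        var worker = new Thread(() =>
        {
            // Language and locale: affects how dates, numbers, etc. are
            // formatted on this thread.
            Thread.CurrentThread.CurrentCulture = new CultureInfo("de-DE");
            Console.WriteLine(DateTime.Now.ToString());
        });

        worker.IsBackground = true;                     // don't keep the process alive
        worker.Priority = ThreadPriority.BelowNormal;   // hint to the scheduler
        worker.SetApartmentState(ApartmentState.STA);   // COM apartment; must be set before Start

        worker.Start();
        worker.Join();
    }
}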

4.7 Summary