Chapter 6. Concurrency

In this chapter

  • Running code with goroutines
  • Detecting and fixing race conditions
  • Sharing data with channels

Often a program can be written as one linear path of code that performs a single task and finishes. When this is possible, always choose this option, because this type of program is usually simple to write and maintain. But there are times when executing multiple tasks concurrently provides greater benefit. One example is a web service that can receive multiple requests for data on individual sockets at the same time. Each socket request is unique and can be processed independently of any other. The ability to execute requests concurrently can dramatically improve the performance of this type of system. With this in mind, support for concurrency has been built directly into Go's language and runtime.

Concurrency in Go is the ability for functions to run independently of each other. When a function is created as a goroutine, it's treated as an independent unit of work that gets scheduled and then executed on an available logical processor. The Go runtime scheduler is a sophisticated piece of software that manages all the goroutines that are created and need processor time. The scheduler sits on top of the operating system, binding operating system threads to logical processors which, in turn, execute goroutines. The scheduler controls everything related to which goroutines are running on which logical processors at any given time.

6.1. Concurrency versus parallelism

6.2. Goroutines

6.3. Race conditions

6.4. Locking shared resources

6.5. Channels

6.6. Summary