13 Recipes and design patterns for successful concurrent programming

 

This chapter covers

  • Twelve code recipes that solve common problems in concurrent and parallel programming

The 12 recipes presented in this chapter have broad applications; you can use their core ideas as a reference when you face a similar problem and need a quick answer. The material demonstrates how the functional concurrent abstractions covered throughout this book make it possible to solve complex problems with sophisticated, rich functions in relatively few lines of code. I’ve kept the implementations of the recipes as simple as possible, so from time to time you’ll need to add cancellation support and exception handling yourself.

This chapter shows you how to put together everything you’ve learned so far, combining concurrent programming models and using functional programming abstractions as the glue to write efficient, performant programs. By the end of this chapter, you’ll have at your disposal a set of useful, reusable tools for solving common concurrent coding problems.

Each recipe is implemented in either C# or F#; for most of the recipes, you can find both versions in the downloadable code online. Keep in mind that C# and F# are both .NET programming languages and interoperate with each other: you can easily call C# code from F# and vice versa.

13.1 Recycling objects to reduce memory consumption

13.1.1 Solution: asynchronously recycling a pool of objects
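
As a rough illustration of the idea, here is a minimal asynchronous object pool in C#. The AsyncObjectPool<T> name is hypothetical, and the chapter’s recipe is richer, but the shape is the same: callers await a pooled object, use it, and return it so it can be recycled instead of reallocated.

using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

// Minimal async object pool sketch: a semaphore counts available items,
// so GetAsync waits asynchronously instead of blocking a thread.
public sealed class AsyncObjectPool<T>
{
    private readonly ConcurrentBag<T> items = new ConcurrentBag<T>();
    private readonly SemaphoreSlim available;

    public AsyncObjectPool(Func<T> factory, int size)
    {
        for (int i = 0; i < size; i++) items.Add(factory());
        available = new SemaphoreSlim(size, size);
    }

    // Asynchronously waits until an object is free, then hands it out.
    public async Task<T> GetAsync()
    {
        await available.WaitAsync().ConfigureAwait(false);
        items.TryTake(out T item);   // guaranteed to succeed: the semaphore tracks availability
        return item;
    }

    // Returns the object to the pool so it can be reused.
    public void Put(T item)
    {
        items.Add(item);
        available.Release();
    }
}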

13.2 Custom parallel Fork/Join operator

13.2.1 Solution: composing a pipeline of steps forming the Fork/Join pattern
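
The essence of the pattern fits in a few lines of C#: fork the same asynchronous work over a set of inputs, join the partial results with Task.WhenAll, and reduce them into a single value. The ForkJoin.Run helper below is a hypothetical sketch, not the chapter’s full operator.

using System;
using System.Linq;
using System.Threading.Tasks;

// Fork/Join sketch: fork the work, join the partial results, reduce to one value.
public static class ForkJoin
{
    public static async Task<TResult> Run<TIn, TOut, TResult>(
        TIn[] inputs,
        Func<TIn, Task<TOut>> work,      // fork step, applied to each input in parallel
        Func<TOut[], TResult> reduce)    // join/reduce step over all partial results
    {
        TOut[] partials = await Task.WhenAll(inputs.Select(work));
        return reduce(partials);
    }
}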

13.3 Parallelizing tasks with dependencies: designing code to optimize performance

13.3.1 Solution: implementing a dependencies graph of tasks
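
A minimal sketch of the idea (TaskGraph and WhenAllDone are hypothetical names): each registered task is chained onto the completion of the tasks it depends on, so independent branches run in parallel while dependent ones wait. Dependencies must be registered before the tasks that use them.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

// Dependency-graph sketch: a task starts only after its dependencies complete.
public sealed class TaskGraph
{
    private readonly Dictionary<string, Task> tasks = new Dictionary<string, Task>();

    public void Add(string name, Func<Task> work, params string[] dependsOn)
    {
        Task[] deps = dependsOn.Select(d => tasks[d]).ToArray();
        // WhenAll of an empty array completes immediately, so root tasks start right away.
        tasks[name] = Task.WhenAll(deps).ContinueWith(_ => work()).Unwrap();
    }

    public Task WhenAllDone() => Task.WhenAll(tasks.Values);
}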

13.4 Gate for coordinating concurrent I/O operations sharing resources: one write, multiple reads

13.4.1 Solution: applying multiple read/write operations to shared thread-safe resources
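
The chapter’s gate coordinates asynchronous I/O without blocking threads; the sketch below shows only the underlying policy (many concurrent readers, one exclusive writer) in its simplest synchronous form, using the BCL’s ReaderWriterLockSlim. The ReadWriteGate name is hypothetical.

using System;
using System.Threading;

// One-writer/many-readers policy: any number of readers may enter together,
// but a writer gets exclusive access to the shared state.
public sealed class ReadWriteGate<T>
{
    private readonly ReaderWriterLockSlim gate = new ReaderWriterLockSlim();
    private T state;

    public ReadWriteGate(T initialState) => state = initialState;

    public TResult Read<TResult>(Func<T, TResult> reader)
    {
        gate.EnterReadLock();
        try { return reader(state); }
        finally { gate.ExitReadLock(); }
    }

    public void Write(Func<T, T> writer)
    {
        gate.EnterWriteLock();
        try { state = writer(state); }
        finally { gate.ExitWriteLock(); }
    }
}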

13.5 Thread-safe random number generator

13.5.1 Solution: using the ThreadLocal object
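
A minimal version of the idea in C# (ThreadSafeRandom is a hypothetical name for this sketch): ThreadLocal<Random> gives every thread its own Random instance, so the non-thread-safe generator is never shared, and each instance gets a distinct seed.

using System;
using System.Threading;

// Each thread lazily gets its own Random, seeded uniquely via Interlocked.Increment,
// so no locking is needed and no Random state is ever shared between threads.
public static class ThreadSafeRandom
{
    private static int seed = Environment.TickCount;

    private static readonly ThreadLocal<Random> random =
        new ThreadLocal<Random>(() => new Random(Interlocked.Increment(ref seed)));

    public static int Next(int maxValue) => random.Value.Next(maxValue);
}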

13.6 Polymorphic event aggregator

13.6.1 Solution: implementing a polymorphic publisher-subscriber pattern
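
The core of the pattern can be expressed compactly with Rx (a sketch assuming the System.Reactive package; EventAggregator is an illustrative name): publishers push any message into a single Subject<object>, and each subscriber observes only the message type it cares about via OfType<T>.

using System;
using System.Reactive.Linq;
using System.Reactive.Subjects;

// Polymorphic publisher-subscriber sketch: one stream of objects,
// filtered per subscriber by message type.
public sealed class EventAggregator : IDisposable
{
    private readonly Subject<object> subject = new Subject<object>();

    public IDisposable Subscribe<T>(Action<T> handler) =>
        subject.OfType<T>().Subscribe(handler);

    public void Publish<T>(T message) => subject.OnNext(message);

    public void Dispose() => subject.Dispose();
}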

13.7 Custom Rx scheduler to control the degree of parallelism

13.7.1 Solution: implementing a scheduler with multiple concurrent agents

13.8 Concurrent reactive scalable client/server

13.8.1 Solution: combining Rx and asynchronous programming

13.9 Reusable custom high-performing parallel filter-map operator

13.9.1 Solution: combining filter and map parallel operations
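
A minimal PLINQ-based sketch of a combined operator (FilterMap is a hypothetical name): one parallel query fuses the predicate and the projection, so each element is filtered and transformed in the same pass, optionally with a capped degree of parallelism.

using System;
using System.Collections.Generic;
using System.Linq;

// Parallel filter-map sketch built on PLINQ.
public static class ParallelEx
{
    public static TOut[] FilterMap<TIn, TOut>(
        this IEnumerable<TIn> source,
        Func<TIn, bool> predicate,
        Func<TIn, TOut> transform,
        int degreeOfParallelism = -1)
    {
        var query = source.AsParallel();
        if (degreeOfParallelism > 0)
            query = query.WithDegreeOfParallelism(degreeOfParallelism);
        return query.Where(predicate).Select(transform).ToArray();
    }
}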

13.10 Non-blocking synchronous message-passing model

13.10.1 Solution: coordinating the payload between operations using the agent programming model
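
One way to picture the rendezvous (a rough sketch assuming the TPL Dataflow package; SyncChannel is a hypothetical name, and the chapter builds the recipe with agents instead): the sender’s SendAsync completes only after a receiver has taken the payload, yet no thread is ever blocked while waiting.

using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;

// Synchronous-rendezvous channel sketch: messages carry an acknowledgment
// that the receiver completes, which in turn completes the sender's task.
public sealed class SyncChannel<T>
{
    private readonly BufferBlock<(T Message, TaskCompletionSource<bool> Ack)> buffer =
        new BufferBlock<(T Message, TaskCompletionSource<bool> Ack)>();

    public Task SendAsync(T message)
    {
        var ack = new TaskCompletionSource<bool>(TaskCreationOptions.RunContinuationsAsynchronously);
        buffer.Post((message, ack));
        return ack.Task;              // completes when the receiver acknowledges receipt
    }

    public async Task<T> ReceiveAsync()
    {
        var (message, ack) = await buffer.ReceiveAsync();
        ack.SetResult(true);          // release the sender without blocking any thread
        return message;
    }
}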

13.11 Coordinating concurrent jobs using the agent programming model

13.11.1 Solution: implementing an agent that runs jobs with a configured degree of parallelism
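
A compact way to get similar behavior with TPL Dataflow (a sketch assuming the System.Threading.Tasks.Dataflow package; JobAgent is a hypothetical name): an ActionBlock queues submitted jobs and executes them with a configured MaxDegreeOfParallelism, decoupling job submission from job execution.

using System;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;

// Job-coordinating agent sketch: queued jobs run with a fixed maximum parallelism.
public sealed class JobAgent
{
    private readonly ActionBlock<Func<Task>> agent;

    public JobAgent(int maxDegreeOfParallelism)
    {
        agent = new ActionBlock<Func<Task>>(
            job => job(),
            new ExecutionDataflowBlockOptions
            {
                MaxDegreeOfParallelism = maxDegreeOfParallelism
            });
    }

    public bool Submit(Func<Task> job) => agent.Post(job);

    // Signals that no more jobs will arrive and returns a task that
    // completes when all queued jobs have finished.
    public Task Complete()
    {
        agent.Complete();
        return agent.Completion;
    }
}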

13.12 Composing monadic functions

13.12.1 Solution: combining asynchronous operations using the Kleisli composition operator
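
The operator itself is tiny; here is a hedged C# sketch for Task-returning functions (the Kleisli extension-method name is illustrative): composing a function A -> Task<B> with a function B -> Task<C> yields a function A -> Task<C>, so chains of asynchronous steps are built by composition rather than by nesting awaits.

using System;
using System.Threading.Tasks;

// Kleisli composition sketch for the Task "monad".
public static class AsyncCompose
{
    public static Func<A, Task<C>> Kleisli<A, B, C>(
        this Func<A, Task<B>> f,
        Func<B, Task<C>> g) =>
        async a => await g(await f(a)).ConfigureAwait(false);
}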

Summary