Welcome


Thank you for purchasing the MEAP for Deep Learning with JAX!

JAX is a Python mathematics library with a NumPy interface, developed by Google. It is heavily used in machine learning research, and it has arguably become the third most popular deep learning framework, after TensorFlow and PyTorch. It has also become the main deep learning framework at companies such as DeepMind, and more and more of Google’s own research uses JAX.
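To give a flavor of the NumPy-like interface mentioned above, here is a minimal sketch: `jax.numpy` mirrors the familiar NumPy API, so array code often ports directly (the specific arrays and operations here are illustrative, not from the book).

```python
import jax.numpy as jnp

# jax.numpy mirrors the NumPy API: same array creation,
# reshaping, ufuncs, and reductions you already know.
x = jnp.arange(6.0).reshape(2, 3)   # [[0. 1. 2.], [3. 4. 5.]]
y = jnp.sin(x) + x ** 2             # elementwise, just like NumPy

print(y.sum())                      # ~55.176
```

Unlike NumPy, JAX arrays are immutable and computations can run unchanged on CPU, GPU, or TPU.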

JAX promotes a functional programming paradigm in deep learning. It provides powerful function transformations: taking the gradient of a function, JIT compilation with XLA, auto-vectorization, and parallelization. JAX supports GPUs and TPUs and delivers great performance.

JAX gives you a strong foundation for building your neural networks, but the real power comes from its constantly growing ecosystem. There are many machine learning libraries built on JAX, including the high-level deep learning libraries Flax (by Google) and Haiku (by DeepMind), the gradient processing and optimization library Optax, and libraries for graph neural networks, reinforcement learning, evolutionary computation, federated learning, and more. Together with its ecosystem, JAX provides an exciting alternative to the two current state-of-the-art deep learning frameworks, PyTorch and TensorFlow.

We are going to cover all these topics in the book.
