7 Bilingual PySpark: blending Python and SQL code

 

This chapter covers:

  • How PySpark’s own data manipulation module takes inspiration from SQL’s vocabulary and way of doing things.
  • How to register data frames as temporary views or tables to query them using Spark SQL.
  • How the catalog stores metadata about registered tables and views, and how to list and delete existing references.
  • How common data manipulations are expressed in PySpark and Spark SQL and how you can move from one to the other.
  • How to use SQL-style clauses inside certain PySpark methods.

My answer to "Python versus SQL, which one should I learn?" is "yes".

When it comes to manipulating tabular data, SQL is the reigning king. For multiple decades now, it has been the workhorse language for relational databases, and even today, learning how to tame it is a worthwhile exercise. Spark acknowledges the power of SQL head-on: you can use a mature SQL API to transform data frames. On top of that, you can also seamlessly blend SQL code within your Spark or PySpark program, making it easier than ever to migrate those old SQL ETL jobs without reinventing the wheel.

This chapter is dedicated to SQL interoperability with PySpark. I will cover how we can move from one language to the other. I will also cover how we can use SQL-like syntax within data frame methods to simplify your code, as well as some of the trade-offs you may face. Finally, we’ll blend Python and SQL code together to get the best of both worlds.

7.1  Banking on what we know: pyspark.sql vs plain SQL

7.2  Using SQL queries on a data frame

7.2.1  Promoting a data frame to a Spark table

7.2.2  Using the Spark catalog

7.3  SQL and PySpark

7.4  Using SQL-like syntax within data frame methods

7.4.1  Select and where

7.4.2  Group and order by

7.4.3  Having

7.4.4  Create tables/views

7.4.5  Union and join

7.4.6  Subqueries and common table expressions

7.4.7  A quick summary of PySpark vs. SQL syntax

7.5  Simplifying our code: blending SQL and Python together

7.5.1  Reading our data

7.5.2  Using SQL-style expressions in PySpark

7.6  Conclusion

7.7  Summary
