7 Bilingual PySpark: blending Python and SQL code


This chapter covers

  • Drawing a parallel between PySpark’s instruction sets and the SQL vocabulary.
  • Registering data frames as temporary views or tables to query them using Spark SQL.
  • Using the catalog to create, reference, and delete registered tables for SQL querying.
  • Translating common data manipulation instructions from Python to SQL and vice versa.
  • Using SQL-style clauses inside certain PySpark methods.

My answer to "Python versus SQL, which one should I learn?" is "yes".

When it comes to manipulating tabular data, SQL is the reigning king. For multiple decades now, it has been the workhorse language for relational databases, and even today, learning how to tame it is a worthwhile exercise. Spark acknowledges the power of SQL head-on. You can seamlessly blend SQL code within your Spark or PySpark program, making it easier than ever to migrate those old SQL ETL (extract, transform, load) jobs without reinventing the wheel.

This chapter is dedicated to using SQL with, and on top of, PySpark. I cover how we can move from one language to the other. I also cover how we can use a SQL-like syntax within data frame methods to speed up our code, along with some of the trade-offs we can face. Finally, we blend Python and SQL code together to get the best of both worlds.

7.1 Banking on what we know: pyspark.sql vs plain SQL

7.2 Using SQL queries on a data frame

7.2.1 Promoting a data frame to a Spark table

7.2.2 Using the Spark catalog

7.3 SQL and PySpark

7.4 Using SQL-like syntax within data frame methods

7.4.1 Get the rows and columns you want: select and where

7.4.2 Grouping similar records together: group by and order by

7.4.3 Filtering after grouping, using having

7.4.4 Creating new tables/views using the CREATE keyword

7.4.5 Adding data to our table, using UNION and JOIN

7.4.6 Organize your SQL code better through subqueries and common table expressions

7.4.7 A quick summary of PySpark vs. SQL syntax

7.5 Simplifying our code: blending SQL and Python together

7.5.1 Using Python to increase resiliency and simplify the data reading stage

7.5.2 Using SQL-style expressions in PySpark

7.6 Conclusion

7.7 Summary

7.8 Exercises