7 Bilingual PySpark: Blending Python and SQL code

 

This chapter covers

  • Drawing a parallel between PySpark’s instruction set and the SQL vocabulary
  • Registering data frames as temporary views or tables to query them using Spark SQL
  • Using the catalog to create, reference, and delete registered tables for SQL querying
  • Translating common data manipulation instructions from Python to SQL, and vice versa
  • Using SQL-style clauses inside certain PySpark methods

My answer to the question “Python versus SQL, which one should I learn?” is “both.”

When it comes to manipulating tabular data, SQL is the reigning king. For multiple decades now, it has been the workhorse language for relational databases, and even today, learning how to tame it is a worthwhile exercise. Spark acknowledges the power of SQL head-on. You can seamlessly blend SQL code within your Spark or PySpark program, making it easier than ever to migrate those old SQL ETL jobs without reinventing the wheel.

This chapter is dedicated to using SQL with, and on top of, PySpark. I cover how to move from one language to the other, how to use SQL-like syntax within data frame methods to streamline your code, and some of the trade-offs you may face. Finally, we blend Python and SQL code to get the best of both worlds.
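As a taste of the blend, here is a minimal sketch of the same aggregation expressed twice: once through the data frame API and once through Spark SQL. The data frame, its column names, and the view name songs are made up for illustration (the later sketches in this chapter reuse them).

    from pyspark.sql import SparkSession
    import pyspark.sql.functions as F

    spark = SparkSession.builder.getOrCreate()

    # A tiny, made-up data frame for illustration.
    df = spark.createDataFrame(
        [("pop", 1), ("rock", 3), ("pop", 2)], ["genre", "plays"]
    )

    # Data frame API version
    df.groupby("genre").agg(F.sum("plays").alias("total_plays")).show()

    # SQL version: register the data frame as a temporary view, then query it.
    df.createOrReplaceTempView("songs")
    spark.sql(
        "SELECT genre, sum(plays) AS total_plays FROM songs GROUP BY genre"
    ).show()

Both versions produce the same result; the rest of the chapter shows how to move between the two freely.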

7.1 Banking on what we know: pyspark.sql vs. plain SQL

7.2 Preparing a data frame for SQL

7.2.1 Promoting a data frame to a Spark table
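As a preview of this section: a data frame becomes visible to Spark SQL once you register it as a temporary view. This sketch assumes the spark session and df data frame from the earlier listing.

    # Promote df to a name that SQL statements can reference.
    df.createOrReplaceTempView("songs")

    # The view is now queryable by name.
    spark.sql("SELECT * FROM songs").show()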

7.2.2 Using the Spark catalog
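The catalog is where Spark keeps track of registered tables and views. A quick sketch of the calls this section covers, continuing with the same spark session:

    # List the tables and views registered in the current database.
    print(spark.catalog.listTables())

    # Drop a temporary view once you no longer need it.
    # (Re-register the view before running the later sketches.)
    spark.catalog.dropTempView("songs")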

7.3 SQL and PySpark
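The key to mixing the two languages is that spark.sql() returns an ordinary data frame, so a SQL result plugs straight back into PySpark code. A minimal sketch, assuming the songs view from the earlier listings is registered:

    result = spark.sql(
        "SELECT genre, count(*) AS song_count FROM songs GROUP BY genre"
    )
    result.printSchema()                   # result is a regular data frame...
    result.where("song_count > 1").show()  # ...and supports every data frame method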

7.4 Using SQL-like syntax within data frame methods

7.4.1 Get the rows and columns you want: select and where
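A sketch of the correspondence, again using the toy songs view: PySpark's select() and where() accept SQL-like expression strings, and the SQL equivalent reads almost identically.

    # PySpark
    df.select("genre", "plays").where("plays > 1").show()

    # Spark SQL
    spark.sql("SELECT genre, plays FROM songs WHERE plays > 1").show()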

7.4.2 Grouping similar records together: group by and order by
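In the same spirit, groupby() and orderBy() mirror GROUP BY and ORDER BY. A sketch with the toy data:

    import pyspark.sql.functions as F

    # PySpark
    (df.groupby("genre")
       .agg(F.sum("plays").alias("total_plays"))
       .orderBy("total_plays", ascending=False)
       .show())

    # Spark SQL
    spark.sql("""
        SELECT genre, sum(plays) AS total_plays
        FROM songs
        GROUP BY genre
        ORDER BY total_plays DESC""").show()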

7.4.3 Filtering after grouping using having
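PySpark has no having() method; filtering the aggregated data frame with where() plays the same role as SQL's HAVING. A sketch:

    # Spark SQL: HAVING filters on the aggregated values.
    spark.sql("""
        SELECT genre, sum(plays) AS total_plays
        FROM songs
        GROUP BY genre
        HAVING sum(plays) > 2""").show()

    # PySpark: a where() after the aggregation does the same job.
    (df.groupby("genre")
       .agg(F.sum("plays").alias("total_plays"))
       .where("total_plays > 2")
       .show())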

7.4.4 Creating new tables/views using the CREATE keyword
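Views (and tables, given a persistent metastore) can be created directly in SQL from the result of a query. A sketch using a temporary view so it runs anywhere:

    spark.sql("""
        CREATE OR REPLACE TEMP VIEW popular_songs AS
        SELECT genre, plays FROM songs WHERE plays > 1""")

    spark.sql("SELECT * FROM popular_songs").show()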

7.4.5 Adding data to our table using UNION and JOIN
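A sketch of both keywords against the toy data; the second view and the labels lookup are made up for illustration.

    # UNION ALL stacks two compatible views, like df.union() in PySpark.
    more = spark.createDataFrame([("jazz", 5)], ["genre", "plays"])
    more.createOrReplaceTempView("more_songs")
    spark.sql("SELECT * FROM songs UNION ALL SELECT * FROM more_songs").show()

    # JOIN matches records across views, like df.join() in PySpark.
    labels = spark.createDataFrame([("pop", "Mainstream")], ["genre", "label"])
    labels.createOrReplaceTempView("labels")
    spark.sql("""
        SELECT s.genre, s.plays, l.label
        FROM songs s LEFT JOIN labels l ON s.genre = l.genre""").show()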

7.4.6 Organizing your SQL code better through subqueries and common table expressions
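A common table expression gives an intermediate result a name inside a single statement, much like assigning an intermediate data frame to a variable in PySpark. A sketch:

    spark.sql("""
        WITH totals AS (
            SELECT genre, sum(plays) AS total_plays
            FROM songs
            GROUP BY genre
        )
        SELECT * FROM totals WHERE total_plays > 2""").show()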

7.4.7 A quick summary of PySpark vs. SQL syntax

7.5 Simplifying our code: Blending SQL and Python

7.5.1 Using Python to increase the resiliency and simplify the data reading stage
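One pattern this section builds on: a short Python loop can read an arbitrary number of input files and union them into one data frame, something that is clumsy to express in pure SQL. A sketch with hypothetical file paths:

    from functools import reduce

    # Hypothetical file list; in practice you might build it with glob.
    files = ["data/part1.csv", "data/part2.csv", "data/part3.csv"]

    # Read each file, then union the pieces into a single data frame.
    frames = [spark.read.csv(f, header=True, inferSchema=True) for f in files]
    full = reduce(lambda left, right: left.union(right), frames)
    full.createOrReplaceTempView("full_data")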

7.5.2 Using SQL-style expressions in PySpark
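Finally, pyspark.sql.functions.expr() turns a SQL expression string into a Column, letting SQL snippets slot into regular data frame method calls. A sketch with the toy data:

    import pyspark.sql.functions as F

    df.select(
        "genre",
        F.expr("plays * 2").alias("double_plays"),  # SQL expression as a Column
    ).where(F.expr("plays > 1")).show()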
