This talk shows you best practices for unit testing PySpark code. Unit tests help you reduce production bugs and make your codebase easier to refactor. You will learn how to create PySpark unit tests that run locally and in CI via GitHub Actions, and best practices for structuring PySpark code so it’s easy to unit test. You’ll also see how to run integration tests on a cluster against staging datasets. Integration tests provide an additional level of safety.
Talk By: Matthew Powers, Staff Developer Advocate, Databricks
Here’s more to explore:
Big Book of Data Engineering: 2nd Edition: dbricks.co/3Xp...
The Data Team's Guide to the Databricks Lakehouse Platform: dbricks.co/46n...
Connect with us: Website: databricks.com
Twitter: / databricks
LinkedIn: / data…
Instagram: / databricksinc
Facebook: / databricksinc
Sep 18, 2024