
Databricks Jobs make it simple to run notebooks, JARs, and Python eggs on a schedule. Our customers use Jobs to extract and transform data (ETL), train models, and even email reports to their teams. Today, we are happy to announce a streamlined UI for Jobs, along with new features designed to make your life easier.

The most obvious change is that instead of a single page containing all the information, there are now two tabs: Runs and Configuration. You use the Configuration tab to define the job, while the Runs tab contains active and historical runs. This small change allowed us to make room for new features:

New Databricks Jobs UI has a cleaner look.

While doing the facelift, we added a few more features. You can now easily clone a job -- useful if you want to, say, change the cluster type without modifying a production job. We also added the ability to pause a job's schedule, which you can use to make changes to a job while preventing new runs until the changes are done. Lastly, as shown above, we added parameter variables, e.g. {{start_date}}, which are interpreted when a job run starts and replaced with actual values, e.g. "2021-03-17". A handful of parameter variables are already available, and we plan on expanding this list.
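To make the behavior concrete, here is a minimal sketch of how {{name}}-style placeholders get substituted when a run starts. This is purely illustrative -- `resolve_parameters` is a hypothetical helper, not the actual Databricks implementation; in a real job, the platform performs this substitution for you before your notebook or script receives the parameter values.

```python
import re
from datetime import date

def resolve_parameters(value: str, variables: dict) -> str:
    """Replace {{name}} placeholders with concrete values,
    mimicking how parameter variables are filled in at run start.
    (Illustrative sketch only, not the Databricks implementation.)"""
    def substitute(match):
        name = match.group(1)
        if name not in variables:
            raise KeyError(f"Unknown parameter variable: {name}")
        return variables[name]
    return re.sub(r"\{\{(\w+)\}\}", substitute, value)

# Example: a job parameter referencing the run's start date
params = {"start_date": date(2021, 3, 17).isoformat()}
print(resolve_parameters("load_from={{start_date}}", params))
# -> load_from=2021-03-17
```

Inside a notebook, the resolved value would typically arrive through `dbutils.widgets.get("start_date")` or your script's command-line arguments.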

We are excited to be improving the experience of developing jobs while adding useful features for data engineers and data scientists. The update to the Jobs UI also sets the stage for some exciting new capabilities we will announce in the coming months. Finally, we would love to hear from you -- please use the feedback button in the UI to let us know what you think. Stay tuned for more updates!
