Introducing Databricks Assistant Quick Fix

Automatically Fix SQL and Python Errors

Summary

  • Introducing Databricks Assistant Quick Fix, a new feature designed to automatically correct common, single-line code errors like syntax mistakes, unresolved columns, and type conversions
  • Users can now streamline debugging with AI for faster error resolution
  • Skip the usual steps and fix errors in one smooth motion

Today, we're excited to introduce Databricks Assistant Quick Fix, a powerful new feature designed to automatically correct common, single-line errors such as syntax mistakes, unresolved columns, type conversions, and more.

Our research shows that over 70% of errors are simple mistakes that don’t need lengthy explanations or extensive documentation searches to fix. With Assistant Quick Fix, we've created a more integrated solution to streamline your debugging process, harnessing the power of AI to enhance your coding efficiency. 

How does Assistant Quick Fix work?

Assistant Quick Fix leverages the Databricks Assistant to suggest error fixes, but it is optimized for the specific errors that users encounter most frequently while authoring SQL or Python. A key goal is speed: suggestions are returned quickly, and you can accept them without taking your hands off the keyboard.

What types of errors do we catch?

Assistant Quick Fix is capable of resolving a wide range of SQL and Python errors (a few examples follow this list), specifically including:

  • Trailing commas
  • Mistyped column names, table names, or functions
  • Missing GROUP BY clauses
  • Syntax errors
  • Data type mismatches (e.g., parsing strings into timestamps)
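
To make these concrete, here are a few illustrative snippets of the kind of single-line mistakes Quick Fix targets. These are not taken from the product; the table, column, and variable names (`sales`, `amount`, `df`) are made up, and the snippets assume a Databricks notebook where `spark` is already defined.

```python
# Illustrative only: hypothetical table/column names in a Databricks notebook.

# Trailing comma before FROM (PARSE_SYNTAX_ERROR).
spark.sql("SELECT region, amount, FROM sales")
# Quick Fix would suggest: SELECT region, amount FROM sales

# Mistyped column name (UNRESOLVED_COLUMN): `amuont` instead of `amount`.
spark.sql("SELECT amuont FROM sales")
# Quick Fix would suggest: SELECT amount FROM sales

# Missing GROUP BY: a non-aggregated column selected alongside an aggregate.
spark.sql("SELECT region, SUM(amount) FROM sales")
# Quick Fix would suggest: SELECT region, SUM(amount) FROM sales GROUP BY region

# Python NameError: `df` is defined, but `fd` is referenced.
df = spark.table("sales")
fd.show()
# Quick Fix would suggest: df.show()
```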

Keyboard shortcuts and UX

We designed Quick Fix to be as unobtrusive as possible. Within 1-3 seconds, you’ll receive an inline, single-line suggestion that you can accept (Cmd+’), accept and run (Cmd+ENTER), or reject (ESC).

Optimizing Quick Fix 

We tuned Quick Fix to focus on a specific subset of errors that users hit most frequently. Here are some of the techniques we leveraged:

  • Fuzzy matching / semantic search: For misspelled table and column names, we use the Intelligent Search API to find the right tables in real time. Intelligent Search leverages recently used and popular tables to find the right match.
  • Post-processing to validate fixes: We run the generated fix through code linters (Antlr and LSP) to ensure suggestions are valid Python or SQL before displaying it to the user.
  • Guardrails for nonsensical fixes: LLMs sometimes produce illogical suggestions, like replacing variables with themselves ("A = A") or commenting out lines. We remove these fixes during post-processing to ensure suggestions are useful (a simplified sketch of these checks follows this list).
  • Custom post-processing for specific errors: For errors like "UNRESOLVED_COLUMN.WITH_SUGGESTION," we verify that the suggested fix addresses the unresolved column issue directly, rather than applying unrelated or incorrect fixes.
  • Different strategies for SQL vs. Python errors: For SQL, we focused on schema-aware fixes like matching tables and columns using real-time search, whereas for Python, we emphasized identifying undefined variables and correcting type mismatches by analyzing the active code context.
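
As a minimal sketch of two of these ideas, the guardrail against no-op fixes and fuzzy matching for unresolved names, the snippet below shows roughly how such checks could look. It is illustrative only: the function names are invented, and it uses Python's difflib in place of the Intelligent Search API and the production linters.

```python
import difflib

def is_noop_fix(original_line: str, suggested_line: str) -> bool:
    """Guardrail sketch: reject suggestions that change nothing meaningful,
    such as a self-assignment ("A = A") or simply commenting the line out."""
    suggestion = suggested_line.strip()
    if suggestion == original_line.strip():
        return True  # identical to the original line, nothing was fixed
    if suggestion.startswith("#") or suggestion.startswith("--"):
        return True  # Python or SQL comment: the line was just disabled
    lhs, sep, rhs = suggestion.partition("=")
    if sep and lhs.strip() == rhs.strip():
        return True  # self-assignment such as "A = A"
    return False

def suggest_column(unresolved: str, known_columns: list[str]) -> str | None:
    """Fuzzy-matching sketch: pick the closest known column name for an
    UNRESOLVED_COLUMN error (the real feature uses the Intelligent Search API)."""
    matches = difflib.get_close_matches(unresolved, known_columns, n=1, cutoff=0.6)
    return matches[0] if matches else None

# Example usage with made-up names:
is_noop_fix("totl = price * qty", "# totl = price * qty")    # True -> suggestion discarded
suggest_column("amuont", ["amount", "region", "order_id"])    # "amount"
```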

After making these adjustments, we saw the following increases in acceptance rates:

| Error Type | Language | % Improvement over Diagnose Error |
|---|---|---|
| Missing/incorrect columns | SQL | 14.55% |
| PARSE_SYNTAX_ERROR | SQL | 12.31% |
| TABLE_OR_VIEW_NOT_FOUND | SQL | 20% |
| NameError | Python | 13.89% |
| TypeError | Python | 16.67% |

On top of this, we gathered additional feedback that helped us determine the optimal maximum wait time, patterns for managing active suggestions, and the best way to implement keyboard shortcuts. As a result, we were able to raise our internal acceptance rate by 25%.

Future Improvements

We’re continuing to tune which errors Quick Fix can automatically resolve. Future enhancements will include fixing multiple errors at once, fixing errors as you type, and support for the SQL Editor.

Try Databricks Assistant Today!

To see Databricks Assistant in action, check out our demo video, which shows how you can use the Assistant to build data pipelines, SQL queries, and data visualizations. To learn other ways the Databricks Assistant can increase your developer productivity, check out our blog on Tips and Tricks for using the Databricks Assistant.

 
