Collaborative Notebooks - Databricks

Collaborative data science with familiar languages and tools

Perform quick exploratory data science or build machine learning models using collaborative notebooks that support multiple languages, built-in data visualizations, automatic versioning, and operationalization with jobs.

SQL
Scala
RStudio
Python
Java

Benefits

WORK TOGETHER

Share notebooks and work with peers in multiple languages (R, Python, SQL and Scala) and libraries of choice. Real-time coauthoring, commenting, and automated versioning simplify collaboration while staying in control.

SHARE INSIGHTS

Quickly discover new insights with built-in interactive visualizations, or with any library such as matplotlib or ggplot. Export results and notebooks in HTML or ipynb format, or build and share dashboards that always stay up to date.

OPERATIONALIZE AT SCALE

Schedule notebooks to automatically run machine learning and data pipelines at scale, and create multi-stage pipelines using notebook workflows. Set up alerts and quickly access audit logs for easy monitoring and troubleshooting.

Features

Data Access: Quickly access available data sets, or connect to any data source, on premises or in the cloud.

Multi-Language Support: Explore data using interactive notebooks with support for multiple programming languages within the same notebook, including R, Python, Scala, and SQL.
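As a sketch of how this looks in practice, cells in one notebook can switch languages with magic commands such as `%sql` and `%scala`; the `sales` table below is hypothetical:

```
# Cell 1 — Python (the notebook's default language)
df = spark.read.table("sales")   # "sales" is an invented table name

# Cell 2 — switch to SQL with a magic command
%sql
SELECT region, SUM(amount) AS total FROM sales GROUP BY region

# Cell 3 — switch to Scala
%scala
val rowCount = spark.table("sales").count()
```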

Interactive Visualizations: Visualize insights through a wide assortment of point-and-click visualizations. Or use powerful scriptable options like matplotlib, ggplot, and D3.
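As a minimal sketch of the scriptable route, a Python cell might build a chart with matplotlib, one of the libraries named above; the data and labels are invented, and in a Databricks notebook the figure renders inline:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the sketch runs anywhere
import matplotlib.pyplot as plt

# Hypothetical sample data for illustration only
counts = {"bronze": 120, "silver": 80, "gold": 35}

fig, ax = plt.subplots()
ax.bar(list(counts.keys()), list(counts.values()))
ax.set_xlabel("table tier")
ax.set_ylabel("row count (thousands)")
```

The same figure object can then be exported or embedded in a dashboard alongside the built-in point-and-click charts.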

Real-Time Coauthoring: Work on the same notebook in real-time while tracking changes with detailed revision history.

Comments: Leave a comment and notify colleagues from within shared notebooks.

Automatic Versioning: Changes are tracked and versioned automatically, so you can pick up where you left off or revert changes.

Git Versioning: Integrate with Git for resilient and advanced versioning capabilities.

Runs Sidebar: Automatically log experiments, parameters and results from notebooks directly to MLflow as runs, and quickly see and load previous runs and code versions from the sidebar.

Dashboards: Share insights with your colleagues and customers, or let them run interactive queries with Spark-powered dashboards.

Run Notebooks as Jobs: Turn notebooks or JARs into resilient production jobs with a click or an API call.

Jobs Scheduler: Execute jobs for production pipelines on a specific schedule.
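A hedged sketch of what scheduling looks like through the Jobs API (2.1): the job name, cron expression, and notebook path below are placeholders, and in practice the JSON payload is POSTed to the workspace's `/api/2.1/jobs/create` endpoint with an access token:

```python
import json

# Hypothetical job spec: run a notebook daily at 02:00 UTC
job_spec = {
    "name": "nightly-etl",                        # placeholder job name
    "schedule": {
        "quartz_cron_expression": "0 0 2 * * ?",  # Quartz cron syntax
        "timezone_id": "UTC",
    },
    "tasks": [
        {
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Repos/etl/nightly"},  # placeholder path
        }
    ],
}

payload = json.dumps(job_spec)
# In practice: POST payload to https://<workspace-url>/api/2.1/jobs/create
# with an Authorization: Bearer <token> header.
```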

Notebook Workflows: Create multi-stage pipelines with the control structures of the source programming language.
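As an illustration of that idea, the driver below sequences hypothetical stage notebooks with ordinary Python control flow. A stub stands in for `dbutils.notebook.run(path, timeout_seconds, arguments)`, the workspace call it mimics, so the control structure is visible outside a workspace:

```python
def run_notebook(path, timeout_seconds, arguments):
    # Stand-in for dbutils.notebook.run; returns the child notebook's exit value
    return "OK" if arguments.get("mode") == "full" else "SKIPPED"

# Hypothetical stage notebooks for a three-step pipeline
stages = ["/pipelines/ingest", "/pipelines/transform", "/pipelines/report"]

results = []
for path in stages:
    status = run_notebook(path, 3600, {"mode": "full"})
    results.append(status)
    if status != "OK":
        break  # stop the pipeline if a stage does not succeed
```

Because the orchestration is plain source-language code, branching, retries, and parameter passing between stages all use constructs the team already knows.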

Notifications and Logs: Set up alerts and quickly access audit logs for easy monitoring and troubleshooting.

Permissions Management: Quickly manage access to individual notebooks, collections of notebooks, and experiments with one common security model.

Clusters: Quickly attach notebooks to auto-managed clusters to scale compute efficiently and cost-effectively.

Integrations: Connect to Tableau, Looker, Power BI, RStudio, Snowflake, and more, so data scientists and engineers can keep using familiar tools.

How It Works

Shared, interactive notebooks, experiments, and extended file support let data science teams organize, share, and manage complex projects throughout the lifecycle. APIs and the jobs scheduler let data engineering teams quickly automate complex pipelines, while business analysts access results directly through interactive dashboards.

Customer Stories

Reimagining Devon Energy’s Data Estate with a Unified Approach to Integrations, Analytics, and Machine Learning

See how Devon Energy is leveraging the scale of Microsoft Azure and Databricks’ Unified Data Analytics Platform to help reimagine their integration, data warehousing and analytics landscape and improve agility while moving their workloads to the cloud.

LEARN MORE

Ready to get started?

TRY DATABRICKS FOR FREE


Follow the Quick Start Guide
