Consumers increasingly expect retailers to recognize their preferences and make recommendations tailored to their needs. A variety of techniques may be employed to accomplish this, but each requires careful consideration of how to overcome challenges of scale. In this session, Bryan Smith, Technical Director at Databricks, will explore foundational approaches to building recommenders that allow organizations of any size to deliver personalized recommendations to their consumers.
In this virtual workshop, we’ll walk through how biomedical researchers are using the Databricks Unified Data Analytics Platform to efficiently curate, query, and learn from vast quantities of data in the cloud. We will look at end-to-end drug discovery and show how a unified approach to data and AI makes it possible to efficiently identify high-quality targets and develop new, well-characterized leads. We will then show you how to curate your data into a central research data lake, followed by a discussion with Biogen on how they are advancing research with data and advanced analytics.
Join us at this Financial Services Leadership Forum to learn data and AI strategies for scaling advanced analytics and powering customer-centric use cases, including personalization, segmentation, and recommendation.
Come and learn best practices for implementing a complete data science lifecycle, enabling data teams to scale effectively using Azure Databricks, MLflow, and other Azure services.
In this virtual event, we’ll cover best practices for using powerful open source technologies to build on and extend your Azure investments and make your data lake analytics-ready. You’ll learn about the security and cost advantages of cloud-based data lakes. And finally, you’ll learn how data professionals are having a huge impact: lowering costs, shortening time to market, and even revolutionizing industries.
Join media, entertainment, and marketing experts from the industry front lines to learn how they’ve implemented modern analytics solutions that continuously capture audience response from multiple data sources, so organizations can deliver the right content to the right person at the right time.
In this workshop, we’ll cover best practices for using powerful open source technologies to build on and extend your Azure investments and make your data lake analytics-ready. You’ll learn about the security and cost advantages of cloud-based data lakes. And finally, you’ll learn how data professionals are having a huge impact: lowering costs, shortening time to market, and even revolutionizing industries.
Join us on December 8 for a live launch event where you’ll learn how the Lakehouse architecture delivers the best of both worlds, data warehouse performance with data lake economics, for SQL workloads. See how Delta Lake, the open standard for data reliability, quality, and performance that simplifies data pipelines, now enables traditional BI directly on your data lake.
This in-person workshop will dive deep into the Lakehouse architecture, which combines the best elements of data lakes and data warehouses, giving your organization a single consolidated system that democratizes data and insight. It is suited for SQL analysts, developers, and enterprise architects who want hands-on experience and a deeper understanding of the benefits of using SQL on the Databricks Unified Data Analytics Platform. Seating is limited and available on a first-come, first-served basis (via registration).
The widespread adoption of Apache Spark™, the first unified analytics engine, has helped data professionals make great strides in data science and machine learning. Yet their upstream data lakes still face reliability challenges when it comes to building production data pipelines at scale to power these initiatives. Join this virtual guided workshop to learn how Delta Lake can help you build robust production data pipelines at scale. You’ll see Delta Lake in action through a demo and guided code walk-through, and you’ll have the chance to ask Databricks experts your most challenging data questions.