
Introducing Structured Outputs for Batch and Agent Workflows

Many AI use cases now depend on transforming unstructured inputs into structured data. Developers are increasingly relying on LLMs to extract structured data...

Fast, Secure and Reliable: Enterprise-grade LLM Inference

After a whirlwind year of developments in 2023, many enterprises are eager to adopt increasingly capable generative AI models to supercharge their...

Implementing LLM Guardrails for Safe and Responsible Generative AI Deployment on Databricks

Let’s explore a common scenario: your team is eager to leverage open-source LLMs to build chatbots for customer support interactions...

Build GenAI Apps Faster with New Foundation Model Capabilities

Following the announcements we made last week about Retrieval Augmented Generation (RAG), we're excited to announce major updates to Model Serving. Databricks...

Introducing Llama2-70B-Chat with MosaicML Inference

Llama2-70B-Chat is a leading AI model for text completion, comparable to ChatGPT in quality. Today, organizations can leverage this state-of-the-art model...