
What is Dust?

Build powerful workflows on top of LLMs and semantic search. Deploy them to an API endpoint or use them directly from Dust. Access version history, cache model interactions, and connect to managed Data Sources.

Features

  • Chained LLM Apps: Chain arbitrarily between calls to models, code execution and queries to external services.
  • Multiple Inputs: Iterate on LLM app design on several inputs simultaneously to avoid overfitting.
  • Model Choice: Design against models served by OpenAI, Cohere, AI21 and more. Switch models seamlessly.
  • Version History: Access iterations, model outputs and few-shot examples saved automatically.
  • Caching: Speed up iterations and reduce costs with cached model interactions.
  • Easy Deployment & Use: Deploy to an API endpoint or use directly from Dust (see the API call sketch after this list).
  • Data Sources: Fully managed semantic search engines you can query from your workflows.
  • Community Example Apps: Discover apps created by the community to get started with Dust.
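
Once an app is deployed, the deployment flow above can be exercised over HTTP. The sketch below is a minimal example of calling a deployed app's run endpoint from Python with the requests library; the endpoint path, the specification_hash, and the config/inputs payload are illustrative assumptions, so copy the exact values from your app's Deploy panel and the Dust API documentation.

    # Minimal sketch of calling a deployed Dust app over HTTP.
    # The endpoint path, payload fields, and identifiers below are assumptions
    # for illustration; copy the exact values from your app's Deploy panel.
    import os
    import requests

    DUST_API_KEY = os.environ["DUST_API_KEY"]  # API key from your Dust account
    APP_URL = "https://dust.tt/api/v1/apps/{user}/{app_id}/runs"  # hypothetical path; check the Deploy panel

    response = requests.post(
        APP_URL.format(user="my-user", app_id="my-app-id"),  # placeholder identifiers
        headers={
            "Authorization": f"Bearer {DUST_API_KEY}",
            "Content-Type": "application/json",
        },
        json={
            # Pins the exact app version to run (taken from version history).
            "specification_hash": "replace-with-your-specification-hash",
            # Per-block model configuration, e.g. which provider/model an LLM block uses
            # and whether cached model interactions should be reused.
            "config": {"MODEL": {"provider_id": "openai", "model_id": "gpt-3.5-turbo", "use_cache": True}},
            # One dict per input; several can be sent to iterate on multiple inputs at once.
            "inputs": [{"question": "What is Dust?"}],
        },
        timeout=60,
    )
    response.raise_for_status()
    print(response.json())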

Use Cases

  • Semantic Search for IPCC AR6 Report: Use semantic search to answer questions about the IPCC AR6 report (see the Data Source query sketch after this list).
  • Wedding Thank You Notes: Solve the blank page problem for wedding thank you notes.
  • Web Search Assistant: Answer questions with high factual accuracy by searching the web and compiling answers from the content of the retrieved pages.
  • Generating Code for Maths Questions: Generate code to answer maths questions.
  • Teaching an LLM to Teach Itself: Teach an LLM new tasks by having it generate its own few-shot examples from high-level task descriptions.
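
For the semantic-search use case above, a Data Source can also be queried directly over HTTP rather than only from inside a workflow. The sketch below assumes a Data Source named "ipcc-ar6" and a search endpoint of the shape shown; the path, parameters, and response shape are illustrative assumptions, so verify them against the Dust Data Sources documentation.

    # Minimal sketch of querying a Dust Data Source with semantic search.
    # The endpoint path, parameters, and the "ipcc-ar6" Data Source name are
    # assumptions for illustration; verify them against the Dust documentation.
    import os
    import requests

    DUST_API_KEY = os.environ["DUST_API_KEY"]
    SEARCH_URL = "https://dust.tt/api/v1/data_sources/{user}/{data_source}/search"  # hypothetical path

    response = requests.get(
        SEARCH_URL.format(user="my-user", data_source="ipcc-ar6"),  # placeholder identifiers
        headers={"Authorization": f"Bearer {DUST_API_KEY}"},
        params={
            "query": "What does AR6 say about sea level rise by 2100?",
            "top_k": 8,          # number of chunks to retrieve
            "full_text": False,  # return chunk text only, not full source documents
        },
        timeout=60,
    )
    response.raise_for_status()
    print(response.json())  # retrieved chunks ranked by semantic relevance (exact shape may differ)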

Dust is a powerful tool for anyone looking to build and deploy language model apps. It offers a wide range of features, easy deployment, and access to fully managed Data Sources.