Meilisearch can read, search, and understand your documents and formulate answers based on the documents it finds. The underlying LLM decides what to search for and, if necessary, how to do so. Meilisearch exposes chat workspaces so you can configure different chat behaviors, which enables A/B testing.

Meilisearch's chat completions route aims to be compatible with as many OpenAI SDKs as possible. We achieve this by mimicking the behavior of the OpenAI chat completions route: users only have to change the base URL parameter of their favorite SDK to make it talk to Meilisearch instead of OpenAI. The following example is inspired by the official OpenAI one, tweaked for a Meilisearch instance.

import OpenAI from 'openai';

// We define the Meilisearch API key used to talk to our instance
// We also support tenant tokens, which is cool!
// Finally, we define the base URL of our instance + our chat workspace
const client = new OpenAI({
  apiKey: process.env['MEILI_API_KEY'],
  baseURL: 'https://ms-abcd1234-123.heaven.meilisearch.io/chats/MyWorkspace',
});

const completion = await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [
    { role: 'developer', content: 'Talk like a pirate.' },
    { role: 'user', content: 'Are semicolons optional in JavaScript?' },
  ],
});

console.log(completion.choices[0].message.content);

Meilisearch will hopefully be compatible with the LLM APIs listed below. To use one of them, set the base URL described in its dedicated guide or documentation page in the chat workspace settings: see Meilisearch's chat workspace base URL.

Compatible and not-yet-compatible LLM APIs

API Definition

You can ask for a chat completion in two forms: streamed or non-streamed. You can switch between the two by using the stream boolean parameter.

The input and output comply with the OpenAI API, so you can send the same parameters and receive output in the same shape.
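
For instance, because the route mimics the OpenAI API, streaming works the same way as with the official SDK. A minimal sketch, assuming the client from the earlier example:

// Request a streamed completion by setting `stream: true`
const stream = await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Summarize my latest documents.' }],
  stream: true,
});

// Each chunk carries a delta with the next piece of the answer
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
}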

When the conversation reaches the token limit, the underlying LLM, and therefore Meilisearch, will return an error and refuse to process the completion. The user has to manage the conversation size by dropping old messages until the conversation is small enough. We recommend dropping the oldest third of the messages and trying again until the request passes.
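
Here is a minimal sketch of that strategy, assuming the client from the example above; the way the error is detected below is hypothetical, since the exact error shape depends on the underlying LLM provider.

async function completeWithTrimming(client, messages) {
  // Retry, dropping the oldest third of the messages each time the
  // conversation is too large for the model's context window.
  while (true) {
    try {
      return await client.chat.completions.create({ model: 'gpt-4o', messages });
    } catch (error) {
      // Hypothetical check: the exact error code/message depends on the provider
      const tooLong = /context|token/i.test(error?.message ?? '');
      if (!tooLong || messages.length < 2) throw error;
      messages = messages.slice(Math.ceil(messages.length / 3));
    }
  }
}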

<aside> ℹ️

The OpenAI, Mistral, and Meilisearch /chat/completions routes are stateless. The whole conversation must be included in every request sent to the engine; nothing is kept on the Meilisearch side.

</aside>
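
In practice, this means the client keeps the message history and sends it in full on every turn. A short sketch, again assuming the client from the first example:

// The client owns the conversation: every turn sends the full history
const messages = [{ role: 'user', content: 'Which documents mention invoices?' }];

const first = await client.chat.completions.create({ model: 'gpt-4o', messages });

// Append the assistant's answer, then the next question, and resend everything
messages.push(first.choices[0].message);
messages.push({ role: 'user', content: 'Summarize the most recent one.' });

const second = await client.chat.completions.create({ model: 'gpt-4o', messages });
console.log(second.choices[0].message.content);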

Routes

This section lists the new routes available to converse with Meilisearch, including the new settings routes and the OpenAI-compatible chat completions route.

List the chat workspaces

This route lists the chat workspaces available for use with the chat completions route. It has the same shape as the index listing route and supports pagination.

// GET /chats

{
  "results": [
    { "uid": "withMistral" },
    { "uid": "withGemini" },
    { "uid": "experimentingStrangePromptsWithOpenAi" }
  ],
  "offset": 0,
  "limit": 20,
  "total": 3
}
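
Since it has the same shape as the index listing route, pagination presumably uses the same offset and limit query parameters. A sketch of fetching the second page, reusing the instance URL from the first example:

// List chat workspaces, 20 per page, starting at the 21st one
const response = await fetch(
  'https://ms-abcd1234-123.heaven.meilisearch.io/chats?offset=20&limit=20',
  { headers: { Authorization: `Bearer ${process.env.MEILI_API_KEY}` } },
);
const { results, total } = await response.json();
console.log(results.map((workspace) => workspace.uid), total);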

Create or update the chat workspace settings

This route creates or updates a chat workspace; it is the starting point for being able to talk to Meilisearch. Provide the API key of the LLM provider you plan to use. If you decide to use OpenAI, you don't have to do anything else to make it work, since OpenAI is the default source. However, to use another OpenAI-compatible LLM provider, you must also change the base API URL.
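
As an illustration only, the settings payload could look like the following; the route shape and the source, apiKey, and baseApiUrl field names are assumptions, not the final API.

// PATCH /chats/MyWorkspace/settings  (route shape assumed)
{
  "source": "openAi",
  "apiKey": "sk-...",
  "baseApiUrl": "https://api.mistral.ai/v1"
}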