🤖 How to use different LLMs

Setting Up Local Language Models for Your App

Setting up local language models for your app can significantly enhance its capabilities, enabling it to understand and generate text without relying on external APIs. By integrating local language models, you can improve privacy, reduce latency, and keep the app functional even in offline environments. Here's a guide on how to set up language models for your application:

Steps:

Changing the LLM on the cloud version:

Step 1

Visit the chat screen, where you will be able to see the currently selected default LLM.

Step 2

Click on it to open a dropdown of the available LLMs.

Step 3

Select the LLM you want to use.

Video Demo


Changing the LLM in the open-source version:

Step 1

For the open-source version, edit the .env file and set LLM_NAME to your desired LLM provider.
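A minimal .env sketch (the provider name `openai` here is only an example; use one of the names supported in application/llm):

```shell
# .env — select the LLM provider by name
LLM_NAME=openai
```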

Step 2

All the supported LLM providers live in application/llm, where you can check which environment variables each one needs. The latest list of supported LLMs is maintained in https://github.com/arc53/DocsGPT/blob/main/application/llm/llm_creator.py

Step 3

Visit application/llm and open the file for your chosen LLM; there you will find the specific values that need to be filled in to use it, e.g. the API key for that LLM.
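For instance, if the provider file requires an API key, the .env might look like this (the values shown are placeholders; confirm the exact requirements in the provider's file):

```shell
# .env — credentials for the chosen provider
LLM_NAME=openai
API_KEY=your-api-key-here
```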

For OpenAI-Compatible Endpoints:

DocsGPT supports the use of OpenAI-compatible endpoints through base URL substitution. This feature allows you to use alternative AI models or services that implement the OpenAI API interface.

Set OPENAI_BASE_URL in your environment: either add it to the .env file with the desired base URL, or add the environment variable to the backend container in docker-compose.yml.
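As an illustration, the docker-compose.yml route might look like the following (the service name `backend` and the URL are assumptions; match them to your own compose file and endpoint):

```yaml
services:
  backend:
    environment:
      # example: a local OpenAI-compatible server
      - OPENAI_BASE_URL=http://localhost:11434/v1
```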

Make sure you also set the correct API_KEY and LLM_NAME.
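Putting it together, a sketch of a complete .env for an OpenAI-compatible endpoint (the URL and key are placeholders, not defaults):

```shell
# .env — OpenAI-compatible endpoint via base URL substitution
LLM_NAME=openai
API_KEY=your-api-key
OPENAI_BASE_URL=http://localhost:8000/v1
```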


MIT 2024 © DocsGPT