
Get Started with AnythingLLM in Docker

Pull the latest image

docker pull mintplexlabs/anythingllm:latest

Run the image

⚠️

If you do not use the command below, all of your data will be lost when the container is restarted!

The -v ${STORAGE_LOCATION}:/app/server/storage flag is required to persist your data on your host machine.
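A typical run command looks like the following (a sketch based on the upstream project's documented invocation — exact flags may differ by version, and STORAGE_LOCATION is any host path you choose):

```shell
export STORAGE_LOCATION=$HOME/anythingllm && \
mkdir -p $STORAGE_LOCATION && \
touch "$STORAGE_LOCATION/.env" && \
docker run -d -p 3001:3001 \
  --cap-add SYS_ADMIN \
  -v ${STORAGE_LOCATION}:/app/server/storage \
  -v ${STORAGE_LOCATION}/.env:/app/server/.env \
  -e STORAGE_DIR="/app/server/storage" \
  mintplexlabs/anythingllm
```

The two -v mounts are what keep your database and .env file on the host, so they survive container restarts.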

Open the application

To access the full application, visit http://localhost:3001 in your browser.

Other information

About UID and GID in the ENV

  • The UID and GID are set to 1000 by default. This is the default user in the Docker container and on most host operating systems.
  • If there is a mismatch between your host user UID and GID and what is set in the .env file, you may experience permission issues.
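You can check what your host user's IDs actually are before editing the .env file (this assumes the container reads UID and GID values from that file, per the note above):

```shell
# Print the current host user's UID and GID; if either differs from
# the value in your .env file, update the .env to match.
id -u    # host user ID
id -g    # host group ID
```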

Build locally from source (not recommended for casual use)

  • git clone this repo and cd anything-llm to get to the root directory.
  • touch server/storage/anythingllm.db to create an empty SQLite DB file.
  • cd docker/
  • cp .env.example .env (you must do this before building).
  • docker-compose up -d --build to build the image - this will take a few moments.
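The steps above can be sketched as a single sequence (the repository URL is an assumption based on the image's publisher name; substitute the one you actually cloned from):

```shell
# Build AnythingLLM from source (repo URL assumed)
git clone https://github.com/Mintplex-Labs/anything-llm.git
cd anything-llm
touch server/storage/anythingllm.db   # create an empty SQLite DB file
cd docker/
cp .env.example .env                  # must be done before building
docker-compose up -d --build          # builds the image; takes a few minutes
```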

Your Docker host will show the image as online once the build completes. The app will then be available at http://localhost:3001.


Common questions and fixes

Cannot connect to service running on localhost!

Please see How to connect to localhost services.
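In short: inside the container, localhost refers to the container itself, not your machine, so URLs like http://localhost:PORT will not reach services running on the host. A common workaround (built into Docker Desktop on macOS/Windows; on Linux it needs an extra flag) is to use the host.docker.internal hostname instead — the port below is a hypothetical example:

```shell
# Linux only: add this flag to your docker run command so the
# hostname resolves to the host machine:
#   --add-host=host.docker.internal:host-gateway
#
# Then, in AnythingLLM's settings, replace e.g.
#   http://localhost:8080
# with
#   http://host.docker.internal:8080
```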

Having issues with Ollama?

See Ollama Connection Troubleshooting and also read How to connect to localhost services. This is almost always the issue.
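Concretely: if Ollama is running on your host while AnythingLLM runs in Docker, point AnythingLLM at the host rather than at localhost. Assuming Ollama's default port of 11434, the base URL would be:

```shell
# In AnythingLLM's LLM provider settings, use:
#   http://host.docker.internal:11434
# instead of:
#   http://localhost:11434
# (on Linux, also pass --add-host=host.docker.internal:host-gateway to docker run)
```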

Still not working?

Ask for help on our Discord Community Server.


Other Deployment Options

Use the Midori AI Subsystem to Manage AnythingLLM

💡

Note: the Midori AI Subsystem Manager is currently in BETA. If you encounter any issues with the Subsystem Manager, please contact their team.

The Midori AI Subsystem Manager is not maintained by Mintplex Labs and is a community-led project. As such, any issues encountered while using it should be directed to their Discord, linked above.

Follow the setup guide in the Midori AI Subsystem documentation for your host OS.

After setting that up, install the AnythingLLM Docker backend through the Midori AI Subsystem.

Once that is done, you are all set!