Ollama Troubleshooting
AnythingLLM and Ollama Setup Guide

# Ollama Connection Troubleshooting

## Ensure Ollama is Running

Before attempting any fixes or URL changes, verify that Ollama is running properly on your device:

  1. Open your web browser and navigate to http://127.0.0.1:11434
  2. You should see a plain page with the message "Ollama is running", similar to this:

*Screenshot: Ollama running in the background*

If you don't see this page, troubleshoot your Ollama installation and ensure that it is running properly before moving forward.
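
If you prefer to run this check from a script instead of a browser, the short Python sketch below does the same thing. It assumes Ollama's default address and port (http://127.0.0.1:11434); a healthy instance answers its root path with the text "Ollama is running".

```python
# Quick reachability check for a local Ollama server.
# Assumes the default address http://127.0.0.1:11434 - adjust OLLAMA_URL if yours differs.
from urllib.error import URLError
from urllib.request import urlopen

OLLAMA_URL = "http://127.0.0.1:11434"

try:
    with urlopen(OLLAMA_URL, timeout=5) as resp:
        # A healthy Ollama instance answers its root path with "Ollama is running".
        print(f"HTTP {resp.status}: {resp.read().decode('utf-8').strip()}")
except URLError as exc:
    print(f"Could not reach Ollama at {OLLAMA_URL}: {exc}")
```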

## Automatic URL Detection (LLM & Embedding Providers)

ℹ️

AnythingLLM features automatic URL detection for Ollama. Manual configuration is only necessary if auto-detection fails.

### URL Successfully Detected

When selecting the Ollama provider, AnythingLLM attempts to auto-detect your Ollama URL. If the option to input the base URL is hidden, the URL was automatically detected by AnythingLLM.

*Screenshot: Ollama URL automatically detected*

### URL Detection Failed

When the manual endpoint input is expanded, the URL could not be detected automatically.

*Screenshot: Ollama URL failed detection*

If Ollama was not running when AnythingLLM tried to detect the URL, start Ollama and then press the Auto-Detect button. This should detect the URL automatically and let you select the Model and Max Tokens values.

## Setting the Correct Ollama URL

🚨

If AnythingLLM was unable to detect your URL automatically, the problem is most likely with your Ollama setup or configuration, **not** with AnythingLLM.

If you have confirmed that your Ollama installation is running properly and is not being blocked by a firewall or similar software, you can set the URL manually.

Choose your AnythingLLM version to find the correct Ollama URL:

### Desktop Version

Use: http://127.0.0.1:11434

*Screenshot: Correct Ollama base URL for the AnythingLLM Desktop version*
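
To double-check that this base URL is the one serving your models before you paste it into AnythingLLM, you can query Ollama's /api/tags endpoint, which lists locally installed models. The sketch below assumes the default Desktop URL shown above.

```python
# List the models served at a given Ollama base URL via its /api/tags endpoint.
# BASE_URL assumes the Desktop setup above - change it if your Ollama runs elsewhere.
import json
from urllib.request import urlopen

BASE_URL = "http://127.0.0.1:11434"

with urlopen(f"{BASE_URL}/api/tags", timeout=5) as resp:
    models = json.load(resp).get("models", [])

if models:
    for model in models:
        print(model["name"])  # e.g. "llama3.1:8b"
else:
    print("Ollama is reachable, but no models are installed yet (try `ollama pull <model>`).")
```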

## AnythingLLM Desktop: Built-in vs. Standalone Ollama

AnythingLLM Desktop offers two Ollama options:

  1. Built-in AnythingLLM LLM Provider:
     - Runs a separate Ollama instance internally.
     - Models downloaded to a standalone Ollama installation will not appear here.
  2. Standalone Ollama:
     - Run Ollama separately on your system.
     - Use the URL http://127.0.0.1:11434 (a quick way to verify it responds is sketched below).

*Screenshot: AnythingLLM built-in Ollama provider*
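
If you are unsure which instance actually has your models, a simple end-to-end check against the standalone instance is to send a minimal, non-streaming request to Ollama's /api/generate endpoint. The model name below is only a placeholder; substitute one that `ollama list` reports on your machine.

```python
# Minimal non-streaming generation request against a standalone Ollama instance.
# MODEL is a placeholder - replace it with a model that `ollama list` shows locally.
import json
from urllib.request import Request, urlopen

BASE_URL = "http://127.0.0.1:11434"
MODEL = "llama3.1:8b"  # placeholder model name

payload = json.dumps({
    "model": MODEL,
    "prompt": "Reply with the single word: pong",
    "stream": False,  # ask for one JSON object instead of a streamed response
}).encode("utf-8")

request = Request(
    f"{BASE_URL}/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urlopen(request, timeout=120) as resp:
    print(json.load(resp)["response"])
```

If this request succeeds, the standalone instance at that URL can serve the model, and the same base URL should work inside AnythingLLM.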

## Troubleshooting

If you're still experiencing issues:

  1. Confirm you're using the correct URL for your setup.
  2. Check for firewall or network issues blocking the connection (a quick connectivity check is sketched below).
  3. Restart both Ollama and AnythingLLM.
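
For step 2, a plain TCP connection test can help distinguish "nothing is listening on the port" from "the HTTP request itself is failing". The sketch below assumes the default Ollama host and port; adjust them if you changed your setup.

```python
# Rough TCP-level check of the Ollama port: separates "nothing is listening"
# from "the HTTP request itself is failing".
import socket

HOST, PORT = "127.0.0.1", 11434  # default Ollama address - adjust if you changed it

try:
    with socket.create_connection((HOST, PORT), timeout=5):
        print(f"TCP connection to {HOST}:{PORT} succeeded - the port is open.")
except OSError as exc:
    print(f"Could not connect to {HOST}:{PORT}: {exc}")
    print("Ollama may not be running, or a firewall is blocking the connection.")
```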
💡

If problems persist after trying these steps, please visit our Discord to ask your questions.