
Using Ollama Behind a Proxy

Published 23:02 Feb 06, 2025.

Created by @ezra. Categorized in #AI, and tagged as #AI.


Introduction

Ollama is a fantastic tool for running large language models (LLMs) locally. However, in some environments, you might need to access the internet through a proxy server. This blog post will guide you on how to configure Ollama to work seamlessly behind a proxy, whether you're using it directly on your system or within a Docker container.

Why Use a Proxy?

There are several reasons why you might need to use a proxy:

  • Network Restrictions: Your organization's network might require all internet traffic to go through a proxy.
  • Security: Proxies can add an extra layer of security by masking your IP address.
  • Content Filtering: Proxies can be used to filter out certain types of content.

Configuring Ollama to Use a Proxy

Ollama primarily needs a proxy for pulling models from the internet, which is done via HTTPS requests. The key environment variable to set is HTTPS_PROXY.

Important Note: Avoid setting HTTP_PROXY, as it's not used for model pulls and might interfere with client connections to the Ollama server.
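HTTPS_PROXY is read from the process environment by most HTTP client stacks, including the Go standard library that Ollama is built on. As a quick sanity check, Python's urllib exposes the same environment-based lookup, so you can see which proxy a process would inherit (the address below is a placeholder, not a real proxy):

```python
import os
import urllib.request

# Simulate the environment Ollama would inherit.
# "your.proxy.server:3128" is a placeholder address.
os.environ["HTTPS_PROXY"] = "https://your.proxy.server:3128"

# getproxies() resolves proxy settings from the environment,
# the same mechanism most HTTP clients use.
proxies = urllib.request.getproxies()
print(proxies.get("https"))  # → https://your.proxy.server:3128
```

If this prints None, the variable is not visible to the process, and Ollama would not see it either.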

Setting HTTPS_PROXY on Different Platforms

The method for setting environment variables varies depending on your operating system:

macOS

If you're running Ollama as a macOS application:

  1. Use launchctl to set the HTTPS_PROXY environment variable:

    launchctl setenv HTTPS_PROXY "https://your.proxy.server:port"
    
  2. Restart the Ollama application.

Linux

If Ollama is running as a systemd service:

  1. Edit the Ollama service file:

    systemctl edit ollama.service
    
  2. Add the following line under the [Service] section:

    [Service]
    Environment="HTTPS_PROXY=https://your.proxy.server:port"
    
  3. Save the file and exit the editor.

  4. Reload systemd and restart Ollama:

    systemctl daemon-reload
    systemctl restart ollama
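For reference, the drop-in override that systemctl edit creates (typically under /etc/systemd/system/ollama.service.d/) should end up looking like this; the proxy address is a placeholder:

```ini
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="HTTPS_PROXY=https://your.proxy.server:port"
```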
    

Windows

On Windows, Ollama inherits user and system environment variables:

  1. Quit Ollama from the taskbar.
  2. Search for "environment variables" in the Settings (Windows 11) or Control Panel (Windows 10).
  3. Click on "Edit environment variables for your account".
  4. Create a new variable named HTTPS_PROXY and set its value to your proxy server's address (e.g., https://your.proxy.server:port).
  5. Click "OK" to save.
  6. Restart the Ollama application.
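Equivalently, if you prefer the command line, the built-in setx command persists a user environment variable (a sketch; the address is a placeholder, and the new value only applies to processes started afterwards):

```bat
:: Persist HTTPS_PROXY for the current user
setx HTTPS_PROXY "https://your.proxy.server:port"
```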

Using Ollama with a Proxy in Docker

If you're running Ollama inside a Docker container, you have two main options:

1. Pass HTTPS_PROXY to the Container

When starting the container, use the -e flag to pass the HTTPS_PROXY environment variable:

docker run -d -e HTTPS_PROXY=https://your.proxy.server:port -p 11434:11434 ollama/ollama
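If you manage the container with Docker Compose instead, the same variable goes in the service's environment section (a sketch; the proxy address is a placeholder):

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    environment:
      - HTTPS_PROXY=https://your.proxy.server:port
```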

2. Configure the Docker Daemon

Alternatively, you can configure the Docker daemon itself to use a proxy. Docker provides documentation for doing this on macOS, Windows, Linux, and for the Docker daemon with systemd.
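For example, the Docker client can inject proxy variables into every container it creates via its per-user configuration file (a sketch; the address is a placeholder):

```json
{
  "proxies": {
    "default": {
      "httpsProxy": "https://your.proxy.server:port"
    }
  }
}
```

This goes in ~/.docker/config.json and applies to newly created containers, so no per-container -e flag is needed.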

Handling Self-Signed Certificates

If your proxy uses a self-signed certificate, you'll need to install it as a system certificate. For Docker, this might involve creating a custom Docker image:

FROM ollama/ollama
COPY my-ca.pem /usr/local/share/ca-certificates/my-ca.crt
RUN update-ca-certificates

Then, build and run the image:

docker build -t ollama-with-ca .
docker run -d -e HTTPS_PROXY=https://your.proxy.server:port -p 11434:11434 ollama-with-ca

Conclusion

Configuring Ollama to work behind a proxy is straightforward, thanks to its use of environment variables. By setting HTTPS_PROXY, you can ensure that Ollama can download models even in environments with network restrictions. Remember to install any certificates your proxy requires, and you'll be up and running with Ollama in no time!
