2.3.46 Satellite LocalAI
Handle: `localai`
URL: http://localhost:34451
A drop-in replacement for the OpenAI API that runs models locally and supports various model formats and architectures.
```bash
# [Optional] Pre-pull the image
# ⚠️ These images are VERY large (~18 GB for the CUDA version)
harbor pull localai

# Start the service
# --open is optional, opens the WebUI in the browser
harbor up localai --open
```
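Once the containers are up, you can confirm that the OpenAI-compatible endpoint is reachable. A minimal sketch, assuming the default `LOCALAI_HOST_PORT` of 34451:

```bash
# Should return a JSON list of installed models
# (empty until you download or place a model, see below)
curl http://localhost:34451/v1/models
```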
See the troubleshooting guide if you encounter any issues.
- Unfortunately, `localai` doesn't reuse the cache with other services providing inference. However, you can manually place models into the `$(harbor home)/localai/data` folder and follow the "Run Models Manually" guide (a rough sketch follows this list)
- If you have sufficient disk space, consider switching to the AIO image version, which comes with all supported kinds of models pre-configured in the service
- Harbor will detect `nvidia`/`rocm` capability on the host and use the appropriate image tag (configurable, see below)
- Harbor will share your HuggingFace token with the service if it is set, so that gated/private models can be downloaded
You'll perform most actions in the LocalAI WebUI; run `harbor open localai` to access it.
- After the initial start, go to the Models Gallery and download a model of your choice
- After downloading, the model will appear on the LocalAI home page, where you can start a chat with it
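Anything downloaded through the gallery is also served over the OpenAI-compatible API, so other tools can talk to it directly. A minimal sketch, assuming the default port; the model name is illustrative and should match one you actually installed:

```bash
curl http://localhost:34451/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "my-model",
    "messages": [{ "role": "user", "content": "Hello from Harbor!" }]
  }'
```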
See the official Environment Variables guide for reference.
The following options can be set via `harbor config`:
```bash
# The port on the host where the LocalAI endpoint will be available
LOCALAI_HOST_PORT       34451
# Docker image to use for the LocalAI service
LOCALAI_IMAGE           localai/localai
# Docker tag when no Nvidia/ROCm support is detected
LOCALAI_CPU_VERSION     latest-cpu
# Docker tag when Nvidia support is detected
LOCALAI_NVIDIA_VERSION  latest-gpu-nvidia-cuda-12
# Docker tag when ROCm support is detected
LOCALAI_ROCM_VERSION    latest-gpu-hipblas
# Location on the host to store files for the LocalAI service
# Should be either relative to $(harbor home) or absolute
LOCALAI_WORKSPACE       ./localai/data
```
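For example, these values can be overridden with `harbor config set`. A sketch, assuming the usual Harbor convention of mapping `LOCALAI_HOST_PORT` to a `localai.host_port` key; double-check the exact key spelling against your Harbor version:

```bash
# Move the LocalAI endpoint to a different host port
harbor config set localai.host_port 34452

# Keep model files on a bigger disk (absolute paths are allowed)
harbor config set localai.workspace /mnt/llm/localai
```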
See the environment configuration guide to set arbitrary environment variables for the service.