Support Ollama Registry #21637
You can refer to https://github.com/stonezdj/harbor/releases/tag/v2.11.0-ollama; leave a comment if there is any feature that needs to be added.
Can you break down your requirements in detail? If the Ollama models are OCI compatible, then Harbor can support them.
When we deploy Ollama models (see `ollama list`), the models are downloaded from the Internet, which is very slow, and we have many nodes that need to pull them. So we want to pull a model through one node and then push it to Harbor, so the other nodes can pull it from Harbor instead. This is similar to using Docker to pull images.
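The mirroring workflow being requested could be sketched roughly as follows. This is only an illustration: the registry host `harbor.example.com`, the project `ai-models`, and the model names are placeholders, and (as noted later in this thread) Ollama does not currently support pulling from a private registry, so the last two steps are the desired behavior rather than something that works today.

```shell
# On a node with Internet access: pull the model from the public Ollama library
ollama pull llama3

# Desired: retag the model for a Harbor project and push it there
# (harbor.example.com/ai-models is a placeholder project)
ollama cp llama3 harbor.example.com/ai-models/llama3:latest
ollama push harbor.example.com/ai-models/llama3:latest

# Desired, on every other node: pull from Harbor instead of the Internet
ollama pull harbor.example.com/ai-models/llama3:latest
```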
You can use RamaLama to push the models to Harbor. Currently, `ollama push` fails:

```
❯ ollama push demo.goharbor.io/harborbupd/olulama:latest
retrieving manifest
Error: Get "harbor?nonce=bZ3Q5RVQd3azeO5mSPqEfA&scope=&service=&ts=1739954090": unsupported protocol scheme ""
```

But pushing the same model with RamaLama seems to work fine.
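For comparison, the RamaLama route might look roughly like the sketch below. The model name `tinyllama` and the target repository are placeholders, and the exact `ollama://`/`oci://` transport syntax should be verified against the RamaLama documentation.

```shell
# Pull the model from the public Ollama library via RamaLama
ramalama pull ollama://tinyllama

# Push it to a Harbor project as an OCI artifact
ramalama push ollama://tinyllama oci://demo.goharbor.io/harborbupd/tinyllama:latest
```

Because RamaLama stores models as standard OCI artifacts, Harbor can host them without needing to understand Ollama's own manifest format.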
Even if Harbor supported Ollama, Ollama itself doesn't currently support pulling from a private registry.
Can you support Ollama models?