config.toml ignored? #3767
Replies: 4 comments
-
Dear @VictorFoxSub,
To provide you with a comprehensive overview: I have successfully configured the config.toml. However, upon further investigation and after enabling the debug logs, I discovered that the Tabby server is seemingly disregarding my config.toml.
It is intriguing to note that the config file is being read to some extent, as I have successfully activated logging within the config.toml. I would greatly appreciate your guidance on resolving this matter and enabling the Tabby server to recognize and deploy the models as detailed in the config.toml. Thank you for your time and attention to this matter; I eagerly await your suggestions and recommendations to rectify this situation promptly.
Warm regards,
King Divers
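For concreteness, the kind of model configuration being discussed belongs in the server's config.toml. A minimal sketch based on Tabby's Ollama HTTP API reference is below; the model names, prompt template, and port are illustrative assumptions, not values taken from this thread:

```toml
# ~/.tabby/config.toml — a sketch; model names and endpoints are assumptions.

# Code completion served by Ollama.
[model.completion.http]
kind = "ollama/completion"
model_name = "codellama:7b"  # any FIM-capable model you have pulled
prompt_template = "<PRE> {prefix} <SUF>{suffix} <MID>"
api_endpoint = "http://localhost:11434"

# Chat served through Ollama's OpenAI-compatible endpoint.
[model.chat.http]
kind = "openai/chat"
model_name = "mistral:7b"
api_endpoint = "http://localhost:11434/v1"

# Embeddings used for indexing.
[model.embedding.http]
kind = "ollama/embedding"
model_name = "nomic-embed-text"
api_endpoint = "http://localhost:11434"
```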
-
Same here. I can't get it working, and I see absolutely nothing in the server logs related to this.
-
Update: it turned out that in my case it was a typo in the volume mount, so the config was not found. If you're using Docker, try checking that the file is actually visible inside the container (see the sketch below). I'd like it so much if the Tabby server were more verbose and told me that the config was not found. That would have saved me tons of time.
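For anyone hitting the same issue, a quick check is to confirm the file actually made it into the container. A sketch, assuming the stock tabbyml/tabby image, its documented /data data directory, and a container name of `tabby` (the name and host path are assumptions):

```sh
# Start Tabby with the host's ~/.tabby mounted at /data inside the container;
# a typo on either side of this -v mapping silently hides config.toml.
docker run -d --name tabby -p 8080:8080 \
  -v "$HOME/.tabby:/data" tabbyml/tabby serve

# Confirm the config is really present inside the container.
docker exec tabby ls -l /data/config.toml
docker exec tabby cat /data/config.toml
```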
-
I think I don't understand how Tabby works with Ollama. Do I still need to run the Tabby server myself? I tried to run it with and without specifying --model and --chat-model, using the values from the config.toml, but without success.
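My (possibly wrong) understanding of how the pieces fit together is sketched below; the commands are assumptions based on the standard Tabby and Ollama docs, and the model name is illustrative:

```sh
# 1) Ollama is its own server on :11434 and hosts the models.
ollama serve &
ollama pull codellama:7b   # illustrative model name

# 2) Tabby is a separate server on :8080. With the models declared
#    under [model.*.http] in config.toml, it should be started
#    WITHOUT --model/--chat-model (those flags select local models).
tabby serve

# 3) The IDE plugin then connects to Tabby at http://localhost:8080;
#    it never talks to Ollama directly.
```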
-
I've set up config.toml as specified in Tabby's Ollama HTTP API reference, I've `ollama run` the 3 models, and I've emptied the endpoint URL in my PyCharm settings (as the UI says to do to use config.toml), but I still get "Cannot connect to Tabby server, please check your settings." After activating the debug logs I found the following, which seems to show that it ignores my config.toml (all my ports are 11434):
{"level":20,"time":1737987490917,"pid":5838,"hostname":"myvHost.local","tag":"TabbyApiClient","msg":"Health check request: GET http://localhost:8080/v1/health. [...]"}
BUT it is not totally ignored, as I activated the logs in the config.toml...
What should I do to get Tabby to use the models I specified in config.toml?
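One note on the log line above: the TabbyApiClient entry is the IDE plugin health-checking the Tabby server itself (port 8080 by default), not Ollama, so seeing 8080 there does not by itself prove config.toml is ignored. A quick sanity check of both servers, assuming default ports:

```sh
# Is the Tabby server itself up? (this is the URL the IDE plugin probes)
curl http://localhost:8080/v1/health

# Is Ollama up, and are all three models actually present?
curl http://localhost:11434/api/tags
```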