Package: LLM Connect
Hello developer, I want to know: does the mod support Ollama and LocalAI?
Please use the package issue tracker or CDB threads for this.
converted review into a thread
Hello Mscript
Yes, it is possible to use both Ollama and LocalAI with this mod.
To do this, go to Settings -> Mods -> LLM Connect in the main menu and set the API URL to http://localhost:11434/v1/chat/completions (for Ollama) or http://localhost:8080/v1/chat/completions (for LocalAI).
These are the default values for both.
You then need to specify the correct model. If you haven't defined an API key, just enter something like '1234'.
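If you want to confirm your local server is reachable before configuring the mod, you can send a test request to the same chat completions endpoint. This is only a minimal sketch: it assumes Ollama's default URL (swap in the LocalAI URL if needed), and the model name "llama3" and the dummy key '1234' are placeholders you should replace with your own values.

```python
# Minimal check of a local OpenAI-compatible chat completions endpoint.
# Assumption: Ollama at its default port; use http://localhost:8080/v1/chat/completions for LocalAI.
import json
import urllib.request

url = "http://localhost:11434/v1/chat/completions"
payload = {
    "model": "llama3",  # example model name; use one you have actually pulled
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
}
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer 1234",  # dummy key, as described above
    },
)
with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)
    print(reply["choices"][0]["message"]["content"])
```

If this prints a short reply, the same URL and model name should work when entered in the mod's settings.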
I hope this helps.