Hello MScript,
Yes, it is possible to use both Ollama and LocalAI with this mod.
To do this, you need to go to Settings -> Mods -> LLM Connect in the main menu and set the API URL to http://localhost:11434/v1/chat/completions (for Ollama) or http://localhost:8080/v1/chat/completions (for LocalAI).
These are the default values for both.
You then need to specify the correct model. If you haven't defined an API key, just enter something like '1234'.
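For example, with LocalAI the settings might look something like this (the model name below is just a placeholder; use whatever model you actually have loaded in LocalAI):
llm_api_key: 1234
llm_api_url: http://localhost:8080/v1/chat/completions
llm_model: your LocalAI model name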
I hope this helps.
Hi MScript, have you already added secure.http_mods = llm_connect to minetest.conf? You also need to enter the API key, endpoint, and model in the main menu under Settings -> Mods -> LLM Connect (please remove the "" quotation marks - I'll fix that soon):
It should then look like this by default for Ollama:
llm_api_key: 1234 (or your custom key)
llm_api_url: http://127.0.0.1:11434/v1/chat/completions
llm_model: your model (maybe llama3.2:3b)
For LocalAI, follow the same pattern, but with the LocalAI API URL: http://127.0.0.1:8080/v1/chat/completions.
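If you prefer to set everything directly in minetest.conf instead of through the main menu, the same values should also work there (assuming the technical setting names match the ones shown above; the model is just an example):
secure.http_mods = llm_connect
llm_api_key = 1234
llm_api_url = http://127.0.0.1:11434/v1/chat/completions
llm_model = llama3.2:3b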
Please let me know if you have any questions, and have fun ;)