Hello Developer, good morning. I can't rate the mod before testing it, but the mod does not work. Thanks.
converted review into a thread
please use the package issue tracker or CDB threads for bug reports
Hi MScript, have you already added secure.http_mods = llm_connect to minetest.conf? You also need to enter the API key, endpoint, and model in the main menu under Settings -> Mods -> LLM Connect (please remove the "" quotation marks from the values; I'll fix that soon).
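The line goes into minetest.conf as a plain key = value entry; anywhere in the file works, since it is just a flat list of settings:

secure.http_mods = llm_connect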
For Ollama, the defaults should then look like this:
llm_api_key: 1234 (or your custom key)
llm_api_url: http://127.0.0.1:11434/v1/chat/completions
llm_model: your model name (for example llama3.2:3b)
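If you prefer editing minetest.conf directly instead of using the settings menu, the same values should (as far as I know) end up there as plain key = value lines, e.g.:

llm_api_key = 1234
llm_api_url = http://127.0.0.1:11434/v1/chat/completions
llm_model = llama3.2:3b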
For Local-AI, follow the same pattern, but use the Local-AI API URL: http://127.0.0.1:8080/v1/chat/completions.
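If the mod still does not respond after that, it can help to rule out the server side first. Here is a minimal Python sketch to test the endpoint outside the game; it assumes an OpenAI-compatible chat completions API and just reuses the example values from above (Ollama typically ignores the key, Local-AI may require a real one):

import json
import urllib.request

# Example values from above; adjust to your setup.
url = "http://127.0.0.1:11434/v1/chat/completions"
payload = {
    "model": "llama3.2:3b",
    "messages": [{"role": "user", "content": "Say hello in one short sentence."}],
}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer 1234",  # ignored by Ollama, may matter for Local-AI
    },
)

# If this prints a reply, the server works and the problem is in the game setup.
with urllib.request.urlopen(req) as resp:
    data = json.loads(resp.read().decode("utf-8"))
print(data["choices"][0]["message"]["content"])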
Please let me know if you have any questions, and have fun ;)