OpenMoxie: Running Moxie on a local LLM

Chris

It is possible to use a freely available LLM with Moxie, such as DeepSeek, and run it locally on your computer, rather than sending all your data to OpenAI through the cloud.

I’m thinking of setting up both of my Moxies on OpenMoxie, but I’d prefer to use a local LLM if possible.

Has anyone successfully set this up who can share some tips or code snippets I could splice into the existing code? I’m fairly non-technical, and from what I’ve learned through Grok, it might just be easier to stick with the existing ChatGPT integration. Still, I dislike paying to have my data harvested when there are free alternatives that work just as well and give me privacy.

Any advice on reliable sources for downloading these free LLMs, along with examples of where to change the code, would be much appreciated. Thanks.
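One common approach (not specific to OpenMoxie's actual code, which I haven't verified): local servers like Ollama and LM Studio expose an OpenAI-compatible chat-completions API, so code written against OpenAI can often be redirected by changing only the base URL. A minimal sketch, assuming Ollama is running locally (`ollama serve`) and a model such as `deepseek-r1:8b` has been pulled; the function name and model are illustrative:

```python
# Sketch only: redirect an OpenAI-style chat call to a local Ollama server.
# Assumes Ollama's default OpenAI-compatible endpoint; adjust for LM Studio etc.
import json
import urllib.request

LOCAL_BASE = "http://localhost:11434/v1"  # Ollama's OpenAI-compatible API

def build_chat_request(model: str, messages: list, base: str = LOCAL_BASE):
    """Build the same request body OpenAI's /chat/completions expects,
    but aimed at the local endpoint instead of api.openai.com."""
    body = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{base}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )

# With the local server running, you would send it like this:
#   resp = urllib.request.urlopen(build_chat_request(
#       "deepseek-r1:8b",
#       [{"role": "user", "content": "Say hello like Moxie would."}]))
#   print(json.load(resp)["choices"][0]["message"]["content"])
```

In practice, finding where OpenMoxie constructs its OpenAI client and swapping in the local base URL (plus a dummy API key, since local servers usually don't check one) is the usual path; models themselves can be pulled through Ollama's library or from Hugging Face.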
