Desktop GPT with LM Studio

No support for local server

Has anyone figured out a way to connect DesktopGPT to local models hosted in LM Studio?

This app needs work. Is there a way to switch to dark mode? It's so bright I have to wear sunglasses to use it.

Seems like a decent app, but I would like to use Gemma hosted in my LM Studio. Is there a way to achieve this?
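In case it helps while waiting on official support: LM Studio's built-in local server exposes an OpenAI-compatible API (by default at http://localhost:1234/v1), so any app that lets you change its API base URL can talk to it. Below is a minimal sketch of querying that endpoint directly from Python with only the standard library. The model name `gemma-2-9b-it` is a placeholder; use whatever identifier LM Studio shows for your loaded model, and adjust the host/port if you changed the server defaults.

```python
import json
import urllib.request

# Default address of LM Studio's local server (adjust if you changed it).
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_request(prompt, model="gemma-2-9b-it"):
    """Build an OpenAI-style chat-completion request for the local server.

    The model name is a placeholder; LM Studio lists the identifier
    of whichever model you currently have loaded.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        LMSTUDIO_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def ask(prompt):
    """Send the prompt to the local LM Studio server and return the reply text."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        body = json.load(resp)
    # OpenAI-compatible responses put the text under choices[0].message.content.
    return body["choices"][0]["message"]["content"]
```

Whether DesktopGPT itself can be pointed at a custom base URL like this is exactly the open question in this thread, but the sketch shows what the app would need to send.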

Moved to DesktopGPT area

Reply #1

Hello,
Sorry to hear you are having issues. I have forwarded your problem/question to the Stardock Support Team for their assistance. Please keep an eye on this thread for any updates. We appreciate your feedback and patience.

Thank you,
Basj,
Stardock Community Assistant.

Reply #2

Thanks a lot. This could be a handy tool if I could use the AI models I have running on a local server. Also, on the theme issue: a dark theme would help.
