Desktop GPT with LM Studio
No support for local server
Has anyone figured out a way to connect local models hosted in LM Studio to DesktopGPT?
This app needs work. Is there a way to switch to dark mode? I have to wear sunglasses to use this app.
Seems like a decent app, but I would like to use Gemma hosted in my LM Studio instance. Is there a way to achieve this?
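Not a DesktopGPT fix, but for anyone who wants to talk to their LM Studio models in the meantime: LM Studio's local server exposes an OpenAI-compatible API. Here's a minimal sketch, assuming the default base URL (http://localhost:1234/v1) and a loaded model named "gemma" — adjust both to match your setup:

```python
import json
import urllib.request

# LM Studio's local server speaks the OpenAI chat-completions protocol.
# Assumption: server running on the default port 1234; the model name
# ("gemma" here) must match whatever model you have loaded in LM Studio.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt, model="gemma"):
    """Build an OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask(prompt, model="gemma"):
    """POST the prompt to the LM Studio server and return the reply text."""
    payload = json.dumps(build_chat_request(prompt, model)).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask("Say hello in one sentence."))
```

If DesktopGPT ever adds a custom base URL setting, pointing it at that same endpoint should be all it takes.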