The override should work, since LM Studio is designed to integrate with the OpenAI python library via a URL override. But I believe the local server does not support the empty query that is sent to check the API key.
That empty query is what the endpoint errors out on when you try to set the API key; Cursor then reports the API key as invalid and cannot continue.
One solution I can think of is to skip that validation and proceed anyway, but ultimately it is up to the devs how they want to handle it. Alternatively, LM Studio could try to fix this endpoint issue on their side.
The same thing happens to me; LM Studio is not working. Is there anything wrong here? Recording 2024-02-25 140411.mp4 (Google Drive)
Local setups do not work, whether it's LM Studio or Ollama: Cursor requires an API key, which is not available. However, I managed to run popular LLMs through openrouter.ai. It's not free, but quite cheap.
The issue is not whether the service provides an API key. The LM Studio API server mimics the OpenAI API exactly, so you can use the OpenAI python library with a custom URL and it just works. You set the API key to whatever you want (I use the string "not-needed"). The developers of Cursor just need to make sure their service behaves the way the OpenAI python library does.
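To illustrate the point above: against an OpenAI-compatible local server, the "API key" is nothing more than a Bearer header, so any non-empty string satisfies a server that ignores it. This is a minimal stdlib-only sketch of the request the OpenAI python library would send; the base URL is LM Studio's usual default and "local-model" is a placeholder for whatever model you have loaded.

```python
import json
import urllib.request

# LM Studio's local server speaks the OpenAI chat-completions protocol.
# This base URL is LM Studio's usual default; adjust to your setup.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(base_url, api_key, model, messages):
    """Build (but do not send) an OpenAI-style chat completion request.

    The API key only ever travels as an Authorization: Bearer header,
    so any string (e.g. "not-needed") works for a server that ignores it.
    """
    payload = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = build_chat_request(
    BASE_URL, "not-needed", "local-model",
    [{"role": "user", "content": "Hello"}],
)
print(req.full_url)                     # http://localhost:1234/v1/chat/completions
print(req.get_header("Authorization"))  # Bearer not-needed
```

Sending `req` with `urllib.request.urlopen(req)` (or making the equivalent call through the openai client with `base_url=BASE_URL`) hits the same endpoint; nothing about the key itself is ever validated locally.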
Hello! Inference happens through our backend, which cannot access servers running locally on your computer. You'll need to provide a publicly accessible URL.
So we can't use a local LLM, right? We'd need to use OpenAI or another hosted LLM service like openrouter.ai.
Thank you for the transparency. I understand how to resolve this issue now.
If you understand how to resolve it, then please help others by describing the way. I think you are talking about adding 127.0.0.1 example.com to the hosts file at C:\Windows\System32\drivers\etc\hosts
No, I think you need to host your own server publicly, then set up an API key so only the Cursor backend can access it. It takes a bit of technical know-how. I might upload a GitHub repository that does this with Ollama if I have some time.
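The idea above can be sketched as a small authenticating proxy that sits in front of the local model server and only forwards requests carrying your chosen key. This is a stdlib-only sketch, not a production setup: the key, port, and upstream address are all placeholders (the upstream shown is Ollama's usual default), and you would still need to expose the machine publicly yourself.

```python
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

# Hypothetical values -- pick your own secret and point UPSTREAM at
# whatever Ollama / LM Studio is listening on locally.
API_KEY = "change-me"
UPSTREAM = "http://127.0.0.1:11434"  # Ollama's usual default port

def is_authorized(auth_header, api_key):
    """Accept only requests carrying the expected Bearer token."""
    return auth_header == f"Bearer {api_key}"

class AuthProxyHandler(BaseHTTPRequestHandler):
    """Reject unauthenticated requests; forward the rest upstream."""

    def do_POST(self):
        if not is_authorized(self.headers.get("Authorization"), API_KEY):
            self.send_error(401, "invalid API key")
            return
        # Relay the request body to the local model server unchanged.
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        upstream = urllib.request.Request(
            UPSTREAM + self.path, data=body,
            headers={"Content-Type": "application/json"}, method="POST",
        )
        with urllib.request.urlopen(upstream) as resp:
            data = resp.read()
            status = resp.status
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

# To run (on a publicly reachable host or behind port forwarding):
# ThreadingHTTPServer(("0.0.0.0", 8080), AuthProxyHandler).serve_forever()
```

You'd then give Cursor the public URL of this proxy and the same key you set in `API_KEY`; only requests bearing that key ever reach the local model.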