Please allow the maximum number of tokens supported by the models

Hi,
Apologies in advance if I'm wrong, but I'm under the impression that, regardless of which model we use, Cursor limits the number of tokens (i.e. the context window) to 10k. This is based on various messages I came across in the forum. My understanding is that the limit applies even when we use our own API subscriptions. This takes away the advantage of improvements to the models, and in some cases it makes a task impossible through Cursor even though the API would otherwise support it.
I'd gladly pay not to have a limit while using my own subscription.