I’m loving Cursor.
Though I don’t understand what “fast” requests are.
I see that there are only 500 per month, which obviously won’t be enough.
What happens after that?
How can one “save” fast requests and use regular gpt-4 requests?
Fast requests get first priority on our backend. Slow requests will be queued behind fast requests in times of particularly high load. Otherwise, they’ll run at the same speed.
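To give a rough picture of what “queued behind” means, here’s a toy priority-queue sketch in Python. It’s purely illustrative and not our actual backend code: fast requests jump ahead whenever both kinds are waiting, and when nothing is queued both kinds are served right away.

```python
# Illustrative sketch only -- not the real backend. "Fast" requests are
# dequeued before "slow" ones; within each tier, requests stay in FIFO order.
import heapq
import itertools

FAST, SLOW = 0, 1            # lower number = higher priority
_arrival = itertools.count() # arrival order, keeps FIFO within a tier

queue = []

def submit(request, fast=False):
    tier = FAST if fast else SLOW
    heapq.heappush(queue, (tier, next(_arrival), request))

def next_request():
    # Any waiting fast request is always popped before a slow one;
    # slow requests only wait when fast ones are ahead of them.
    if queue:
        _, _, request = heapq.heappop(queue)
        return request
    return None

# Example: two slow requests are queued, then a fast request arrives.
submit("slow-1")
submit("slow-2")
submit("fast-1", fast=True)
print(next_request())  # -> fast-1 (served first despite arriving last)
print(next_request())  # -> slow-1
print(next_request())  # -> slow-2
```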
Thanks for clarifying - is there any way to save fast requests for the times we feel are critical, when we think we might need fast responses?
Yeah, when creating code using Command + K, it’d be helpful to have an option to toggle between fast and slow… similar to how you can toggle between gpt-3.5 and gpt-4 in the chat.
It’s already there. I don’t know when it was added, but it has appeared a couple of times for me.
(I think it appears if you have gpt-4 fast requests available)
After taking a look at this again, I realised that the gpt-3.5/gpt-4 setting in the Chat applies to all commands, not just the chat.
Hey @truell20,
I’m now using Pro for the first time and loving the experience!
Is there a way to toggle between gpt-4 fast and slow queries? It’d be nice to have an extra option for the Pro subscription next to the usual gpt-3.5 and gpt-4, i.e. gpt-4-fast.
This way I can choose when to use up my 500 fast queries per month.
Maybe the name should be “priority” rather than “fast”, since fastness as a concept is probably already hogged by gpt-3.5 vs gpt-4.
Plus, if there are no prioritized tasks in the queue, a non-prioritized task won’t be any slower, if I understand right.
We don’t currently have a way for users to toggle between slow/queued requests and fast/priority requests. There seems to be a decent number of requests for this, so we will likely add it in a future release.
I second this feature and would also like it to be called priority queue/request rather than slow vs fast.
Thx ^^