Dude, please do a longer video series building technical applications to really showcase the power of the tools you present. It’s kind of hard to understand what is possible with these tools when all you build is minesweeper. Love and appreciate the content, just food for thought with this comment.
Thanks for the Supermaven info. I used Ollama, but small models are not good enough for autocompletion. The only small LLM that worked was deepseek-coder v1 1.3B, but it was too heavy for my 8GB MacBook RAM 😅 So I ran Ollama on my desktop PC and used its API instead.
Actually, Llama 3.1 8B outputs great code with the right settings, parameters, and system prompt. I am a programmer, so I know the quality of the output; this one can be run locally and has nice speed.
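For anyone wondering what "the right settings" might look like: here is a minimal Ollama Modelfile sketch. The temperature, context size, and system prompt are illustrative guesses, not the commenter's actual values:

```
FROM llama3.1:8b
# Lower temperature makes code output more deterministic (illustrative value)
PARAMETER temperature 0.2
# Larger context window so it can see more of the file (illustrative value)
PARAMETER num_ctx 8192
SYSTEM """You are a coding assistant. Output only code unless asked otherwise."""
```

Build and run it with `ollama create coder -f Modelfile` and `ollama run coder`.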
First, lesgoo. I actually tried this stack yesterday after my Cursor Pro ran out. I regret to say that I honestly liked the experience on Cursor more. My issue is that with aider I consistently have formatting issues, and I also think it is actually working out to be more expensive than Cursor (skill issue).
@dytra_io Like the search/replace blocks not being correct, so it has to retry. Also, it doesn't stick to the coding style of the rest of the file, so every line shows as replaced in git, even existing lines.
Pure sauce! My first impression of Cursor was pretty good. The bubble vanished after I went into the AI settings and tried to set API keys. Then I realized what this project is.
I'm almost at the end of my Cursor trial, and I hate that I like it so much. I feel the same as you, but it just works: no hassle, just a very good productivity boost, resulting in more free time for me. I can do the same or more work in less time. That's the bottom line, IMO. And yes, I believe that if they made a similar Cursor-esque integration for aider, it would be better. For me, that $20 per month is chump change relative to the gains I've made using the product.
Exactly. $20 per month for a major jump in productivity? I'm in. But it's cool that King brings us alternatives. Good to keep on top of the evolving tech and learn about new things.
Pretty cool. Gotta figure out a way to make the 'Petals' inference setup more public. Peer-to-peer distributed models would be the ultimate open-source setup for this.
Holy shiz, looking at the leaderboards, has anybody else noticed that DeepSeek is basically the second-best coder and an insane 53 times cheaper than the other models at the top? That's insane. Why is this not discussed more? You can get the second-best coder for such a tiny price and save a fortune compared to Claude, with no rate limit.
Depends which leaderboards you are talking about. Livebench shows deepseek coding capabilities as being FAR behind Sonnet. For me I'll pay the extra for Sonnet to get to working solutions faster. Time is money.
Which DeepSeek? Right now I see DeepSeek v2, the small version, is barely able to handle decent autocomplete. (I wonder what Cursor is using, because their autocomplete is actually quite good, much better than DeepSeek, GitHub Copilot, or Replit.)
Personally, I can't calculate a lower cost than $20. A fairly simple code request costs me 4 cents in Claude Dev, so it's the same price for 500 requests, just like in Cursor. But in Cursor that limit only applies to the fast premium queue; after that you still have unlimited requests for Sonnet 3.5. I mean, do most users need more than 500 requests? Probably not; many much more cost-efficient models can do the basic stuff. But it's still nice to have those unlimited requests.
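Spelling out the break-even arithmetic behind that comparison (the 4-cent figure is the commenter's own observation, not a published price):

```python
cost_per_request = 0.04   # USD: commenter's observed Claude Dev cost per request
cursor_price = 20.00      # USD: Cursor's monthly subscription

# Number of 4-cent API requests that $20 covers when paying per request
breakeven = round(cursor_price / cost_per_request)
print(breakeven)  # 500, the same count as Cursor's fast-premium quota
```

So at that observed per-request cost, paying per API call only beats the subscription if you stay under roughly 500 requests a month.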
Yeah, $20 doesn't seem too expensive, but I guess it would be cheaper with Continue and switching between models: only use Claude for harder requests and a free model for everything else.
I feel that people pay for convenience, and while it might annoy you that you can do all these things manually for free, for many people convenience rules. It's the Apple vs. Samsung argument: Apple is built on open source too, but it's convenient, so people use it, while another school prefers Android. Both are cool. I appreciate your videos, just wanted to give another viewpoint.
I wonder if we can build a VS Code extension that makes it easier to do this with the already existing GUI. It would be nice to be able to give it access to the full codebase through o1.
Double the price for 10x the productivity. I already paid my annual Copilot sub, been using it since Alpha... but I've already recovered that lost sub in time saved using Cursor.
How does this work for adding a feature to an existing codebase? These multi-file additions are something people drool over with Cursor, although allegedly the more unusual the codebase is the less it works.
Hi! Liked your content. I tried this stack and it worked pretty well. But the main feature of Cursor is that it rewrites your existing code instead of generating new code; for example, if I pass a variable to a template, it knows all the props. Is it possible to do that with your solution? Currently I use DeepSeek for chat and Mistral for autocomplete. Thanks for the reply.
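For reference, that split (DeepSeek for chat, Mistral for autocomplete) maps to something like this in Continue's config.json. This is a sketch; the model names and the API key placeholder are illustrative, not the commenter's actual config:

```json
{
  "models": [
    {
      "title": "DeepSeek Chat",
      "provider": "openai",
      "model": "deepseek-chat",
      "apiBase": "https://api.deepseek.com",
      "apiKey": "YOUR_DEEPSEEK_KEY"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Mistral (local)",
    "provider": "ollama",
    "model": "mistral"
  }
}
```

With this layout, chat requests go to the hosted DeepSeek endpoint while tab completions stay on a local Ollama model, which keeps the fast, high-volume completions free.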
You know what’s stupid? Paying Cursor for inference with their models and getting rate limited by some kind of firewall or something, because it also blocks my own API key.
I am using Cursor, and there is a feature in it I really like that I haven’t seen you demo in your setup: if I go into a file and make a change, say to a TypeScript type, and then go into another file that uses it, Cursor will propose changes based on the edit I just made in the first file, and I can just hit Tab to accept. Super neat and efficient. Can I get that feature with this setup?