
Amazing New VS Code AI Coding Assistant with Open Source Models 

Dave Gray
318K subscribers · 42K views

Web Dev Roadmap for Beginners (Free!): bit.ly/DaveGrayWebDevRoadmap
Learn how you can use an amazing new VS Code AI coding assistant with open source models and keep your code 100% local. We'll use Ollama to run an open source LLM trained on code, plus an open source AI code assistant integrated with VS Code.
💖 Support me on Patreon ➜ / davegray
⭐ Become a full-stack web dev with Zero To Mastery Courses:
- Complete Next.js Developer: bit.ly/CompNextJSDev
- Advanced React: bit.ly/AdvReactDev
- Junior to Senior Dev Roadmap: bit.ly/WebDevRoadmap-JrtoSr
🚩 Subscribe ➜ bit.ly/3nGHmNn
📬 Course Updates ➜ courses.davegray.codes/
❓ Questions - Please post them to my Discord ➜ / discord
☕ Buy Me A Coffee ➜ www.buymeacoffee.com/davegray
👇 Follow Me On Social Media:
GitHub: github.com/gitdagray
Twitter: / yesdavidgray
LinkedIn: / davidagray
(00:00) Intro
(00:23) Install Ollama
(01:20) Add an Open Source Model / LLM
(03:26) Continue for VS Code
(06:19) Code Chats & Autocompletions
(09:32) Wrap up
🔗 Blog Post: www.davegray.codes/posts/bye-...
📚 Tutorial References:
🔗 VS Code: code.visualstudio.com/
🔗 Ollama: ollama.com/
🔗 EvalPlus Leaderboard: evalplus.github.io/leaderboar...
🔗 CodeQwen: ollama.com/library/codeqwen
🔗 Continue: www.continue.dev/
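The setup in the video boils down to pulling a model with Ollama and registering it in Continue's `~/.continue/config.json`. As a sketch (the model name and config keys follow the Continue docs at the time of recording; treat them as assumptions and check the current docs):

```json
{
  "models": [
    { "title": "CodeQwen", "provider": "ollama", "model": "codeqwen" }
  ],
  "tabAutocompleteModel": {
    "title": "CodeQwen",
    "provider": "ollama",
    "model": "codeqwen"
  }
}
```

With Ollama running locally (run `ollama run codeqwen` once to download the model), both Continue's chat and its tab autocomplete stay on your machine.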
Was this tutorial about how to set up your own local VS Code AI Coding Assistant helpful? If so, please share. Let me know your thoughts in the comments.
#ai #coding #assistant #vscode

Published: Jul 2, 2024

Comments: 126
@MunyaradziRangaIncrease · 3 days ago
Exactly what I was looking for. Thank you very much!
@BrettCooper4702 · 6 days ago
Thanks, that worked really well. Used it to write some Node.js / Express / EJS code, a combo I had not done before, and it got the code running nicely.
@DaveGrayTeachesCode · 6 days ago
Glad to hear that!
@aghil_shoja · 7 days ago
Thank you, Dave, for these great tutorials. As a non-native English speaker, I find you speak English so clearly that I can comprehend it. I'm still learning HTML and CSS with your tutorials, Dave; they are invaluable 🌻❤
@DaveGrayTeachesCode · 7 days ago
Glad I can help!
@yacineelhakimhaddouche6805 · 7 days ago
Thank you Dave, good quality content right there ❤
@DaveGrayTeachesCode · 7 days ago
You're welcome!
@MOJICA7257 · 7 days ago
Great work Dave!!! 🎉🎉🎉🎉❤❤❤
@DaveGrayTeachesCode · 7 days ago
Thanks so much!!
@louicoder · 4 days ago
Thanks so very much. Local LLMs are the next big thing; so thankful for this video. Just earned a new sub here 🌟🌟🌟🌟🌟
@DaveGrayTeachesCode · 4 days ago
You're welcome!
@alexanderkomanov4151 · 7 days ago
Thanks Dave!
@DaveGrayTeachesCode · 7 days ago
You're welcome!
@boopfer387 · 6 days ago
Great Dave!
@DaveGrayTeachesCode · 6 days ago
Thank you!
@bwsstha8864 · 7 days ago
Thanks a lot, Dave. As for requests for upcoming videos, I would be happy if you could provide a Hono.js, React/Next.js, and Postgres tutorial.
@DaveGrayTeachesCode · 7 days ago
Many possibilities there! Thanks for the request!
@mhl_5 · 6 days ago
thank you dave, you are amazing
@DaveGrayTeachesCode · 6 days ago
You're welcome!
@helloworldcsofficial · 7 days ago
Great! More of this please!
@DaveGrayTeachesCode · 7 days ago
Thanks for the feedback! I do want to compare more of these solutions. I really like this one because it is local and you choose your own model - which allows you to upgrade as models improve.
@helloworldcsofficial · 7 days ago
@@DaveGrayTeachesCode 🙏
@mikevaleriano9557 · 7 days ago
Will take a look, but I'm finding it hard to believe there's something out there to replace Supermaven. Copilot is trash compared to it now.
@DaveGrayTeachesCode · 7 days ago
Yes so many new things. I like this because you can keep changing the models as they improve and the extension is constantly being updated as well.
@Getfit-us · 7 days ago
I agree supermaven is awesome. Switched from copilot
@RabahTaib-mn4fs · 7 days ago
I tried it and instantly loved it-it's even better than my Tabnine Pro subscription! Thanks for letting me know it exists
@coders_rant · 6 days ago
What's the model's name?
@lazymass · 6 days ago
Supermaven? It's fast, but the code it produces is... Not that great... It completely neglects my coding style. Cursor is much better
@bernardiho · 5 days ago
From Nigeria Africa! Love your content.
@DaveGrayTeachesCode · 5 days ago
Thank you!
@twd2 · 7 days ago
Awesome 😍....
@aymenbachiri-yh2hd · 7 days ago
thank you so much
@DaveGrayTeachesCode · 7 days ago
You're welcome!
@MunyaradziRangaIncrease · 3 days ago
With the advent of *Apple Intelligence*, I am thinking there may be a way to integrate your code editor into the model like this. If and when it's possible, PLEASE make a video on that. 😄
@slimbennasrallah2351 · 6 days ago
Thank you, this is an amazing video! 🎉 I have a question concerning its ability to understand the full context of a codebase/project/folder: is it limited to selected code or a limited number of open files?
@mhl_5 · 6 days ago
From the Continue extension docs: "Completions don't know about my code: We are working on this! Right now Continue uses the Language Server Protocol to add definitions to the prompt, as well as using similarity search over recently edited files. We will be improving the accuracy of this system greatly over the next few weeks."
@DaveGrayTeachesCode · 6 days ago
Which model you choose and how many tokens it supports will impact that, but yes, Continue supports that. Reference: docs.continue.dev/walkthroughs/codebase-embeddings
@JonBrookes · 7 days ago
Thanks Dave, you're always, as ever, on the money, so to speak. Local LLMs may well be the next thing, and privacy is king, so this is well worth the time and investment. Thanks again, you're a ⭐ cheers
@DaveGrayTeachesCode · 7 days ago
You're welcome and I agree!
@JonBrookes · 7 days ago
@@DaveGrayTeachesCode Yep, I've got deepseek-coder-v2 running already, thanks to your prompting me to take a look. Up to now I've been running on WSL, so to get this to work on Windows I first had to stop and disable Ollama in WSL with systemctl commands. But that aside, it's running in Windows now, which is terrific. It's even working with Dart / Flutter, which is amazing.
@BilalAulakh23 · 7 days ago
Legend
@canardeur8390 · 7 days ago
You definitely deserve your entry to the Nirvana! (The thing is: you bring so much enlightenment to our world that if you do so, you will be missed a lot! Hopefully you volunteer to reincarnate on this planet to bring your light!)
@aakashswastik9458 · 3 days ago
Can we do a video on the details of ECMAScript as used for JS?
@zenguitarankh · 3 days ago
Mine is "indexing" for my first code generation. I'm guessing it's a one time thing though.
@hbela1000 · 7 days ago
Thanks. Which is the best Ollama LLM for Next.js 14, free or licensed?
@DaveGrayTeachesCode · 7 days ago
I don't think you can target a framework like that unless someone specifically creates an LLM for it. Just go with the rankings for coding like I do with EvalPlus in this video.
@MilindP · 7 days ago
Can we set the token size, since there is a limit with Copilot? Thank you for another wonderful tutorial.
@DaveGrayTeachesCode · 7 days ago
It will only be limited by the limits of the model you choose.
@MilindP · 7 days ago
@@DaveGrayTeachesCode Thank you.
@RobertMcGovernTarasis · 7 days ago
Cheers. Haven't tried CodeQwen yet. I prefer LM Studio to Ollama, if nothing else because it keeps the downloaded file in a form that's usable by other programs, whereas Ollama just hides the download in weirdly named files.
@DaveGrayTeachesCode · 6 days ago
Good info! Thanks!
@togya4 · 7 days ago
Dave, will you ever make a course about the new Next.js features?
@DaveGrayTeachesCode · 7 days ago
I'm waiting until Next.js 15 is promoted beyond release candidate. Then I will consider it.
@louicoder · 3 days ago
I tried this and it's amazing. The only drawback is the response is really slow. Do you have any idea about that? I'm running it on a 16-inch 2019 MacBook Pro (Intel) with 32 GB RAM and a 2.6 GHz 6-core CPU.
@mikrowizja1130 · 11 hours ago
I have the same issue. Did you manage to fix that?
@timtanhueco1990 · 5 days ago
Hi! Thanks for this video! Are Ollama and CodeQwen 100% free, and do they have IntelliSense and autocomplete aside from code chat?
@DaveGrayTeachesCode · 5 days ago
Yes to all
@habib.prodev · 6 days ago
Hey Dave, can you share the hardware specs of the machine where you tried this? I am curious if my MBA M1 with 16GB can handle it or not.
@DaveGrayTeachesCode · 5 days ago
I'm using a Windows PC that's about 3 years old. Your M1 is good, but RAM might be a concern. I added a lot of RAM to render my videos faster.
@user-kf4zl7lu8o · 7 days ago
I was wondering if you could make a postgresql tutorial
@DaveGrayTeachesCode · 7 days ago
Nice request! I've been thinking about that 🙌
@Peacemaker.404 · 7 days ago
Hey Dave, do you use Linux, and if so, which distro? I'm trying to switch from Windows.
@DaveGrayTeachesCode · 7 days ago
I used to. My favorite for a long time was Debian. Then everyone went to Ubuntu. Last I knew, Ubuntu was easiest to switch to.
@Peacemaker.404 · 7 days ago
@@DaveGrayTeachesCode thanks, i'll try that.
@drkgumby · 7 days ago
Everybody has an opinion on which Linux distro is best, and all of us are 100% right. :) I use PopOS as a daily driver and suggest you give it a try. Zorin and Mint are also often suggested for somebody just coming over from Windows. Ubuntu is a good choice as well. Depending on your hardware, you should be able to boot from a USB stick and try any of these before you commit to an installation.
@Peacemaker.404 · 7 days ago
​@@drkgumby Yes, exactly. I was confused between PopOS and Mint, also I've used Ubuntu in the past.
@drkgumby · 6 days ago
​@@Peacemaker.404 If you can boot from USB for a test drive then it's easier to try them all and pick the one that best meets your needs. Unless you are doing something that requires some specific software that requires Windows, you will probably find them to all work for you. Then you can pick the one that looks/feels the best.
@EdmundCiegoBelize · 6 days ago
Would this be able to access all the code in the codebase? Or only the opened files?
@DaveGrayTeachesCode · 6 days ago
Yes you can use @codebase or @folder. Reference: docs.continue.dev/walkthroughs/codebase-embeddings
@visheshbajpayee9308 · 7 days ago
I am using a MacBook M1 and did everything as mentioned in the video. It seems like VS Code is lagging after applying all the configuration. Also, auto suggestion is not working for me. Is there anything I'm missing?
@DaveGrayTeachesCode · 7 days ago
I didn't add it to my Mac yet to compare, but installed locally shouldn't create a lag. I think you've got plenty of power, too. Maybe a quick restart of VS Code? As mentioned, I did have to restart Windows.
@nix7705 · 7 days ago
Hello, I can see that you have Python and JS lessons on your channel, but which language has more opportunities to start a job, at least for free or for food? :/ I learned Python and Django before but dropped them because people said they were too slow, and I'm practicing MERN now; of course it feels a bit harder than Django. I'm a bit confused now: was it the right choice or wrong to drop Python web?
@DaveGrayTeachesCode · 7 days ago
There are jobs for both. Difficult to say which would be better. Both are among the most popular programming languages.
@hornickt · 7 days ago
Thank you for the great content. I have a private Ollama server running in my company with a dedicated video card. Is it possible to connect Continue to this server on my LAN? Ollama is not installed on my local workstation; it is on a server in my environment.
@DaveGrayTeachesCode · 7 days ago
I don't know, but if you find out, please share here. Interesting!
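For anyone exploring this: Continue's Ollama provider can reportedly be pointed at a remote host via an `apiBase` URL in `~/.continue/config.json`. A hedged sketch — the key name, the example IP address, and Ollama's default port of 11434 are assumptions to verify against docs.continue.dev:

```json
{
  "models": [
    {
      "title": "CodeQwen (LAN server)",
      "provider": "ollama",
      "model": "codeqwen",
      "apiBase": "http://192.168.1.50:11434"
    }
  ]
}
```

The server would also need Ollama listening on its LAN interface rather than only on localhost.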
@user-nl4fd3yd9s · 7 days ago
Does it have inline suggestion?
@DaveGrayTeachesCode · 7 days ago
Yes it does
@snivels · 7 days ago
Does this all happen locally? No posting your code to some server in Vietnam somewhere?
@DaveGrayTeachesCode · 7 days ago
Right! And yes, 100% local.
@andromilk2634 · 7 days ago
@@DaveGrayTeachesCode This assumes we have a really good computer?
@tusharphb6596 · 7 days ago
Want to see Docker and Docker Compose from scratch.
@rahu1gg · 6 days ago
Will the assistant be able to autocomplete React and Next.js code?
@DaveGrayTeachesCode · 6 days ago
Yes, it has auto complete. The setting I change in the video is tab auto complete.
@christerjohanzzon · 7 days ago
I've tried a few models locally, and I really like having an AI assistant locally. Too bad I can't run it on my laptop. Anyone know of any good free or low cost alternatives?
@DaveGrayTeachesCode · 7 days ago
Yeah running locally will take some power. If your laptop runs short on that, you might want to look at services that don't run locally.
@christerjohanzzon · 7 days ago
As an answer to myself, I just found out about Chat RTX from Nvidia... it can run local LLMs and train on your local data as well. Now I only need an extension to integrate it into VS Code.
@yoJuicy · 6 days ago
Dave, you missed the entire install on VSCode 4:18
@DaveGrayTeachesCode · 6 days ago
I installed at 3:56 so not sure what you mean? I do mention that I previously had it installed so I did not get the splash screen again. I talk about what you should see.
@yoJuicy · 6 days ago
@@DaveGrayTeachesCode Thanks for the reply. It's a screen that tells users to install llama3 and starcoder2 and run them. Easy spot to get lost. Thanks!
@Mrplayall8849 · 7 days ago
Hello sir, I'm learning Python from your channel, but I forgot everything after some days, and after a long time I'm learning Python again. Please give some suggestions for learning Python.
@DaveGrayTeachesCode · 7 days ago
Don't rush. Just learn one thing and then try to apply it. The more you use it, the easier it is to remember. Just learn a little something new every day.
@Mrplayall8849 · 7 days ago
@@DaveGrayTeachesCode OK, thank you sir, but in the Python course I couldn't understand the operators; you didn't explain some of them. How should I understand that, and where did you learn Python and other new technologies?
@DaveGrayTeachesCode · 7 days ago
@@Mrplayall8849 there are many python resources available. If what I said or taught did not stick with you, sometimes it is good to reference other sources of information. Putting all of these together will help your understanding.
@douglaskipyegon2183 · 6 days ago
@@Mrplayall8849 you don't have to understand everything. Be kind to yourself and everything will fall into place the more you code. So don't stress if you don't understand something
@eleah2665 · 7 days ago
Did I miss something? I did not see code completion as you type your own code.
@DaveGrayTeachesCode · 7 days ago
There is tab auto completion. That is the setting I changed in the config - but yeah, trying to fit it all into 10 minutes or less, I didn't demo everything.
@nileshgosavii · 7 days ago
It is so cool, but it crashes my PC every few minutes, even though my PC meets the requirements.
@DaveGrayTeachesCode · 7 days ago
Hmm, I'm on a PC and that hasn't happened. Strange indeed. It does seem to be a bit power hungry according to some comments, but mine is far from a recent PC.
@personone6881 · 6 days ago
Excuse me please - *Oi!* - :D - thank u, hello! At 3:07 you explain "...go back to your Terminal in VS Code and run that with Ollama" - and I understand that, and that's fine... But what NOBODY EVER EXPLAINS, or more accurately what EVERYONE FAILS TO RESPONSIBLY ADVISE, is: in which directory? It's one of these, so let's try multiple choice for ease and efficiency for all. Do I run the install command in: A) .../Users//, B) .../Users//CODE/, C) .../Users//CODE/Projects/, D) .../Users//CODE/Projects/myProjectsRootDir, or simply E) C:/ ? Or are model packages installed as global dependencies, so it doesn't matter what directory your terminal is pointing to when sending the ollama run command? If so, is this true for all (or most) AI chat models, or just in Ollama's case? THANK YOU IN ADVANCE TO ANYONE WHO CAN CLARIFY THIS FOR ME
@personone6881 · 6 days ago
wow - erm there's a part in there that translates a bit harsh lol - wasn't tryna be... wasn't pointing fingers at anyone... sorry if it came across any bit a little fiery
@DaveGrayTeachesCode · 6 days ago
Someone else commented about how Ollama stores the LLM models with weird filenames, but they didn't say where. They did mention another choice of theirs that makes the models available to other software, too. Might look for that comment from earlier today.
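To the directory question above, a hedged answer based on Ollama's documented behavior (the paths below are defaults and may differ per install): models are installed into one global store, not into your current folder, so the directory you run the command from does not matter.

```shell
# Ollama stores pulled models in a single global directory, so it does not
# matter which folder your terminal is in when you run `ollama pull` or
# `ollama run`. Default store locations (verify against the Ollama FAQ):
#   macOS / Linux: ~/.ollama/models
#   Windows:       C:\Users\<you>\.ollama\models
# The store can be relocated with the OLLAMA_MODELS environment variable:
export OLLAMA_MODELS="$HOME/llm-models"
echo "Ollama will store models in: $OLLAMA_MODELS"
```

This is specific to Ollama; other tools (LM Studio, for example) keep models in their own locations.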
@darwinmanalo5436 · 7 days ago
You need a high-end computer to run that locally though. My Mac M1 lags, lol, so I'll stick to Copilot.
@DaveGrayTeachesCode · 7 days ago
Hmm, I have a mid PC running it without issues. I do have extra RAM though. Might make a difference.
@Kricke87 · 7 days ago
Can confirm as well. I did the same test as Dave and it took about 1-2 minutes for the entire code to be written using CodeQwen. Don't know if it's the CPU or GPU that slows it down. I have 32 GB RAM, a 9400K, and an AMD RX 580, so I guess my PC is not as fast as Dave's. But it's a fun little piece of software to use. I also use the free version of Cody as an alternative to Copilot, as I'm still learning programming and only do a little coding at my current job, so I don't feel it's worth forking out $ yet.
@alexeyfilippov42 · 6 days ago
Thank you so much!
@DaveGrayTeachesCode · 6 days ago
You're welcome!
@mrelqori7931 · 7 days ago
But you need a really strong CPU.
@DaveGrayTeachesCode · 7 days ago
Mine isn't too strong. I'd say mid.
@9622AX · 7 days ago
Well, it's good, but it takes up a lot of system resources.
@DaveGrayTeachesCode · 7 days ago
Could be a drawback of keeping everything local depending on machine power. I'm not usually running many other tasks while coding/chatting with it.
@ashutosh9 · 18 hours ago
This is lagging so much on my M2 MacBook Air.
@DaveGrayTeachesCode · 16 hours ago
From what I'm hearing in the comments, it seems like more RAM helps. I have a PC that's a few years old but lots of RAM.
@vivekkaushik9508 · 7 days ago
I think Codeium is better, even with the free version.
@DaveGrayTeachesCode · 7 days ago
You've already had time to compare both? I want to compare others. Can you choose your own free model with Codeium? If so, it comes down to the extension features and UI comparison.
@vivekkaushik9508 · 7 days ago
@@DaveGrayTeachesCode Good sir: 1. Codeium doesn't require me to install and run a local Ollama instance, which hogs the compute and memory of my MBA, making it unbearable to code. 2. The Codeium free version doesn't give the ability to choose models, but the Pro version can choose GPT-4 models. Haven't tried that, don't have that kind of money, but free is fast and good enough for my use case: web dev. 3. Codeium setup is just 1 click. 🙂
@unknotmiguel · 5 days ago
The goal usually is to run locally due to privacy of code concerns, I believe...
@vivekkaushik9508 · 4 days ago
@@unknotmiguel Umm, if someone actually reads through the privacy section of these Copilot-as-a-service apps, they don't send your code back to their servers, lol. But they do send metrics, and depending on the paid version that can be turned off! Also, PRIVACY is a MYTH, and privacy in code is a joke. My poor 2 cents.
@palashjyotiborah9888 · 7 days ago
It's old news. 😢
@DaveGrayTeachesCode · 7 days ago
Definitely not for everyone. But yeah, we do hear about things at different times.
@nuttbaked · 7 days ago
first time hearing about this