AI controls the COMPUTER - Open-Interpreter FULL TUTORIAL!!!

1littlecoder · 79K subscribers · 15K views
We'll learn how to use Open Interpreter
1. Installation
2. Troubleshooting
3. OpenAI API
4. Local Models
5. Use-Cases
Open-Interpreter github.com/KillianLucas/open-...
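The installation and the two run modes covered in the video boil down to a few commands. A minimal sketch; exact flags may differ between releases, so check the project README:

```shell
# Install Open Interpreter from PyPI (use pip3 only if pip points at Python 2)
pip install open-interpreter

# Start an interactive session backed by the OpenAI API
# (expects the OPENAI_API_KEY environment variable to be set)
interpreter

# Or run fully offline with a local Code Llama GGUF model instead
interpreter --local
```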
❤️ If you want to support the channel ❤️
Support here:
Patreon - / 1littlecoder
Ko-Fi - ko-fi.com/1littlecoder
🧭 Follow me on 🧭
Twitter - / 1littlecoder
Linkedin - / amrrs

Published: 10 Sep 2023
Comments: 89
@spicer41282 · 11 months ago
Like you said: amazing! I'd say amazingly good, and without using a GPU! One thing I was waiting for was to see the end result of the subtitling of your target video; you might've just forgotten? I can relate to your high expectations of what it can do. Thank you for all the error-trap prevention 👍!
@twobob · 11 months ago
Yeah, I noticed that "answer the first question" bug too. I seem to recall the very smallest Llama didn't suffer from it; have to test again. Decent write-up.
@1littlecoder · 11 months ago
Thanks. Always good to hear a positive review from a critic 😊
@killianlucas · 10 months ago
Amazing video!! Really cool use cases
@1littlecoder · 10 months ago
Glad you like them! Thanks so much for your work on this. I hope it grows more and more!
@DeonBands · 11 months ago
Would be pretty cool if one could use an OpenAI fine-tuned model; in theory one could fine-tune on chain-of-thought for a specific system or SDLC?
@justrobiscool4473 · 4 months ago
Hey, I love the tutorial! I've been having a hard time running it on Windows and keep getting stuck with the API key when running it through LM Studio... Do you have a Windows tutorial for CrewAI? I never know when I should be in Anaconda, Terminal, VS Code, or whatever.
@mateocarino · 11 months ago
What's the difference from ChatGPT's Code Interpreter if we have to use our API key to use it?
@UserB_tm · 11 months ago
Great video. I'm going to try this with a local LLM.
@1littlecoder · 11 months ago
Great!
@LuciferHesperus · 10 months ago
Finally... it has begun.
@MrMoonsilver · 10 months ago
Hey man, thank you so much for the content. What if I'm using a multi-GPU setup locally? Will the base model run parallelized, or would that need to be set up separately?
@aimademerich · 8 months ago
Phenomenal, thank you 🙏🏽
@1littlecoder · 8 months ago
You're so welcome!
@MrWuzey · 11 months ago
Open Interpreter is a revolution, but it's just bad with local LLMs, and I don't think a 30B model will do any different (though I could be wrong). And using it with the OpenAI API is really costly.
@sekhardhana3453 · 11 months ago
Awesome video, bro
@1littlecoder · 11 months ago
Thanks 😁
@aimademerich · 8 months ago
Thank you 🙏🏽
@1littlecoder · 8 months ago
You are so welcome
@ChefDomein · 11 months ago
Thanks for this video bro, you're making my day with every video! Keep it up like this (L)
@1littlecoder · 11 months ago
Thanks bro, I try every day; some days I'm very tired, and some days the code just doesn't work.
@ronyrufus950 · 11 months ago
This is great
@KevinKreger · 10 months ago
Pretty amazing. I just wish they had support for more non-local APIs and a configurable system prompt. I had a look at the code, and it wouldn't be easy to refactor either; the hardcoded user prompt looks pretty lame.
@1littlecoder · 10 months ago
Couldn't agree more!
@killianlucas · 10 months ago
Hey Kevin! Dev of Open Interpreter here. This is coming by the end of the week :) We're working internally on the best way to support all these non-function-calling models (the reason we started with GPT-4): Claude, Cohere, a localhost server, HF Inference, Together, etc. System prompt config is coming soon; it will just be a config.yaml you can easily edit (or edit from the command line with `interpreter --config`).
@KevinKreger · 10 months ago
Hi @killianlucas. Yes, supporting 'function' calling without an API for it must be a challenge, but excellent news that you're working on it. I'm already amazed, and now flattered to be chatting with Mr. Open Interpreter.
@andreduval7359 · 10 months ago
@killianlucas Hi, I installed Open Interpreter and set my API keys. However, I accidentally deleted the API keys in my OpenAI account. I need help changing my API keys in Open Interpreter; I'm not sure what argument to run to do this. Can you help?
@arthurperini · 10 months ago
I'm building a chatbot with OpenAI function calling. I'm already done with web-search and weather functions, news, and some others. Can I use this interpreter as a new function? It would be amazing.
@sexyface007 · 10 months ago
What is the download size of the Llama model?
@zyxwvutsrqponmlkh · 10 months ago
So, this is cool, but also scary. I would like to run this in a VM, but I must run everything locally, and I don't know how well running the models inside a Windows VM will work.
@sexyface007 · 10 months ago
Is there a Colab version?
@johnnyskelton783 · 11 months ago
Could this be used in combination with Aider for codebase context?
@1littlecoder · 11 months ago
The code is open source, so there is some potential.
@freedom_aint_free · 11 months ago
I'd only bother to install and try it if someone can tell me: is it *significantly* better at coding than GPT-4 Plus, or at least as good but with a bigger context window? Otherwise, never mind.
@nmstoker · 10 months ago
Another great video. However, you shouldn't need to mention pip3 anymore; it's just noise that you could skip so people aren't distracted. (You've plenty of good things to share without overloading on things that stopped being needed for most people years ago.)
@1littlecoder · 10 months ago
Thanks for the info! I just thought someone on Windows or Linux might not be familiar with this. I'll try to minimise it or figure out a way to display it as text.
@jackflash6377 · 10 months ago
Don't assume everyone knows what you know. I'm new to Python but not new to coding. Many people, like myself, need step-by-step instructions with no detail omitted. Perhaps make a step-by-step list of all commands and post it on a website or add it to the description.
@nmstoker · 10 months ago
@jackflash6377 Understood, but I'm saying it's pointless to tell people about pip3, as pip works in 99.9% of cases, and that's not impacted by how much or how little you know. It's 2023; we can move on. If he really wanted to go into micro detail, he could mention that strictly you should actually be using "python -m pip", but that's not worth side-tracking for either.
@adamrodriguez7598 · 10 months ago
@jackflash6377 Agreed
@doge1931 · 11 months ago
They enabled API access, so you can use GPTQ models.
@jawadmansoor6064 · 11 months ago
Once you have downloaded a GGUF with it, will it try to download the same model again after shutting it down (or shutting down/restarting the computer)? I am talking about a non-standard model (any other GGUF model, not the Llama GGUF which is standard for it).
@1littlecoder · 11 months ago
Nope, it looks for the model in that folder, and it won't download it again.
@jawadmansoor6064 · 11 months ago
@1littlecoder So, can I put my downloaded model into the interpreter's downloads folder? And where is the interpreter's downloads folder, anyway?
@jersainpasaran1931 · 10 months ago
Can we currently use Open Interpreter with ChatGPT 3.5 Turbo? How and where would the user be prompted?
@1littlecoder · 10 months ago
I mentioned that in the video; I think you need to use an additional argument, fast or something like that.
@jersainpasaran1931 · 10 months ago
@1littlecoder It is true. Always grateful!
@1littlecoder · 10 months ago
Sorry, I think the argument changed: `interpreter --model gpt-3.5-turbo` is how you need to enable it in the CLI.
@HB-kl5ik · 11 months ago
Buddy with GPT-4 it just works like magic!
@1littlecoder · 11 months ago
Is Buddy the tool name?
@HB-kl5ik · 11 months ago
@1littlecoder Buddy is what I'm calling you 😭
@HB-kl5ik · 11 months ago
Love you for making a video on Open Interpreter :)
@NoNamenoonehere · 11 months ago
GPT-3 is not too bad either, using a GPU and larger models in my case.
@1littlecoder · 11 months ago
My bad, I thought you were talking about some new tool that I didn't know :D
@andreduval7359 · 10 months ago
Hi, I installed Open Interpreter and set my API keys. However, I accidentally deleted the API keys in my OpenAI account. I need help changing my API keys in Open Interpreter; I'm not sure what argument to run to do this. Can you help?
@1littlecoder · 10 months ago
You can overwrite the API key by setting the environment variable again; the new value simply replaces the old one.
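Concretely, re-exporting the variable is all that's needed. A minimal sketch (the key values are placeholders, not real keys):

```shell
# Whatever key was exported earlier in this shell session...
export OPENAI_API_KEY="sk-old-key-placeholder"

# ...is replaced by simply exporting the variable again; Open Interpreter
# reads OPENAI_API_KEY from the environment when it starts
export OPENAI_API_KEY="sk-new-key-placeholder"

echo "$OPENAI_API_KEY"
```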
@andreduval7359 · 10 months ago
@1littlecoder I've tried; how do I overwrite it?
@felipeguarin2631 · 11 months ago
Do you think Apple is working on something like this?
@1littlecoder · 11 months ago
Honestly, I don't think so, but tomorrow we might get some idea if they are!
@justindressler5992 · 11 months ago
Probably worth trying WizardCoder Python 34B.
@1littlecoder · 11 months ago
I couldn't find a 7B WizardCoder GGUF, and my Mac is Intel, so I didn't try the 13B GGUF. That's definitely worth a shot!
@justindressler5992 · 11 months ago
@1littlecoder I've been experimenting with larger GGUF models using a mix of GPU and CPU offload, loading as many layers as possible into the GPU. Not sure how to set up Open Interpreter to load the model on both GPU and CPU, though.
@1littlecoder · 11 months ago
@justindressler5992 What's your GPU?
@justindressler5992 · 11 months ago
@1littlecoder I'm using a 3080 with 12 GB; that's good enough to load most 13B models in VRAM. But the 13B models are kind of simple; the 30B+ models behave a lot more like GPT-3. I'm also using Runpod to test the biggest models. There have been some recent improvements in coherence in the smaller models, especially the Llama 2 ones. When splitting the model over GPU and CPU you have to be very careful not to over-allocate VRAM, because modern systems will start to page VRAM, which is far worse than just running on CPU. I try to leave 1-2 GB of VRAM free in case the system needs it.
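For reference, the GPU/CPU split described above is what llama.cpp exposes through its layer-offload flag. A hedged sketch of plain llama.cpp usage rather than Open Interpreter itself; the model filename and layer count are illustrative:

```shell
# Offload 35 transformer layers to the GPU and keep the rest on the CPU.
# Lower -ngl (--n-gpu-layers) if VRAM paging starts, leaving 1-2 GB of VRAM free.
./main -m models/codellama-13b.Q4_K_M.gguf -ngl 35 -p "Write hello world in Python"
```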
@priyankak5589 · 11 months ago
Is it possible to use the interpreter with an Azure API key?
@1littlecoder · 11 months ago
Yes:
interpreter.use_azure = True
interpreter.api_key = "your_openai_api_key"
interpreter.azure_api_base = "your_azure_api_base"
interpreter.azure_api_version = "your_azure_api_version"
interpreter.azure_deployment_name = "your_azure_deployment_name"
interpreter.azure_api_type = "azure"
@priyankak5589 · 11 months ago
@@1littlecoder Thanks a lot 🙏
@priyankak5589 · 11 months ago
Do I need to add this piece of code after `import interpreter`... in the command prompt itself?
@1littlecoder · 11 months ago
@priyankak5589 Yes, right.
@gbengaomoyeni4 · 11 months ago
Before you start a tutorial, try to first explain what the terms involved are and why we should watch it. You just started by mentioning Open-Interpreter without first explaining what it was. There might be beginners watching your tutorial. By the way, I like your past tutorials. Thank you!
@1littlecoder · 11 months ago
My bad, I'm sorry. I missed that here. I had covered that intro in my previous AI news update, so I kind of picked up from where I left off there.
@gbengaomoyeni4 · 10 months ago
@1littlecoder Oh ok!
@gwhnsa · 10 months ago
I am an idiot... Can't get it to work at all... Lol :)
@gunasekhar8440 · 10 months ago
I'm a very big fan of yours. Please make a detailed video about how to use the MetaGPT framework. It's an interesting AI product that can handle all the domains of a software company.
@matbeedotcom · 11 months ago
This looks painful but definitely on the right path. I know you're hungry for content, but this is a super early side project at best.
@1littlecoder · 11 months ago
Do you mean it's not working fine?
@matbeedotcom · 11 months ago
@1littlecoder No, it has too many moments of error to be called fine.
@TheReferrer72 · 11 months ago
A side project with over 17k stars.
@matbeedotcom · 11 months ago
@TheReferrer72 I had a side project with a million users. Stars don't indicate stability.
@TheReferrer72 · 11 months ago
@@matbeedotcom Sure you did....
@bleo4485 · 10 months ago
Hey, thanks for the video. Would this work with Colab Pro?