I appreciate the effort to deliver a well-structured and very informative course. I just want to point out that rather than using multiple if statements for pet_color, as in the snippet below:

    if animal_type == "Dog":
        pet_color = st.sidebar.text_area(
            label="What color is your dog?", max_chars=15
        )
    if animal_type == "Cat":
        pet_color = st.sidebar.text_area(
            label="What color is your cat?", max_chars=15
        )
    ...

you could do the following to avoid multiple if statements:

    pet_color = st.sidebar.text_area(label=f"What color is your {animal_type}?", max_chars=15)
@@nicknico4121 Here's a pro tip: do not constantly learn new ones. Pick one you think you might like, and build those skills until you can complete an application or project that you designed and implemented yourself. There is no need to learn every new framework and language every 2 minutes; you cannot keep up, and even the best developers in the world don't keep up. Get the core skills first; then you can build applications in any language or framework your project calls for.
@@nicknico4121 You shouldn't be worried about that; learn at your own pace and you'll be grateful afterwards. Also, you should really choose just one programming language you find interesting and stick to it.
@@defaultdefault812 Bro, that's not the point of my comment. I just admire how these people so passionately create videos as soon as possible. I know I'm not that far along, but what I am sure of is that I'm consistent at my own pace. Good luck in your journey.
Thanks for such a helpful course. The section for the YouTube Assistant is much too dense and a bit all over the place. You don't run the langchain helper to check if the file is okay (for a noob like me, I have to), and then you bounce between tabs, which also makes things more confusing. Break that section down into specific chunks in the video, so that those of us who are only starting out at coding can follow.
You took most of the statements in the introduction of this video from the 8-month-old video on LangChain from Rabbitmetrics. You should have the decency and courtesy to at least mention and cite that. It is very bad practice to copy material from others and not cite it.
Around minute 23, how about:

    st.title("Pets name generator")
    animal_type = st.sidebar.selectbox("What is your pet?", ("Cat", "Dog", "Cow", "Hamster"))
    pet_color = st.sidebar.text_area(label="What color is your " + str.lower(animal_type) + "?", max_chars=15)
So much complexity could have been resolved with f-strings, right? Instead of using the LLM template, just use an f-string; instead of using if statements for each animal type, use an f-string; ...
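To make the suggestion concrete, here is a minimal sketch of the f-string approach (the function name is illustrative, not from the video): one function replaces the entire per-animal if-chain.

```python
# Hypothetical sketch: a single f-string covers every animal type,
# so no per-animal branch is needed.
def make_color_question(animal_type: str) -> str:
    """Build the question label for any animal type."""
    return f"What color is your {animal_type.lower()}?"

# One call handles every case the if-chain handled:
for animal in ("Dog", "Cat", "Cow", "Hamster"):
    print(make_color_question(animal))
```

The same idea applies to the prompt itself: any string that varies only by a couple of values is a candidate for an f-string rather than duplicated branches.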
Why does the OpenAI LLM not respond with all the fluff like "Certainly! Finding a fitting name for a pet is a difficult process, and I'm happy to help in this regard. With this said, here are five examples of names that might fit your cat, which is black of color: 1. Shadow - Shadow is a common name for any black animal, so it would fit really well for your pet cat. 2. Midnight - The name midnight refers to the time of day at 12 am when it's really dark outside. The darkness is a reference to your cat's color! 3. etc etc etc Always remember that it's a big responsibility to choose a proper name for a pet. It's not easy to make such a decision lightly!"
Not sure, but I do know that if you tell ChatGPT to provide the output in a particular format, it will do so. E.g. tell it to "provide the output in a numbered list format and do not include any other text than the numbered list" and it will do that.
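As a sketch of that tip, the format instruction can simply be baked into the prompt string before it is sent to the model (the function name and exact wording here are illustrative; any clear, explicit instruction works similarly):

```python
# Sketch: constrain the model's output format via the prompt itself.
def build_name_prompt(animal_type: str, pet_color: str, n: int = 5) -> str:
    return (
        f"Suggest {n} names for my {pet_color} {animal_type}. "
        "Provide the output in a numbered list format and do not include "
        "any other text than the numbered list."
    )

print(build_name_prompt("cat", "black"))
```

With an instruction like this, the model generally skips the "Certainly! ..." preamble and the closing advice, returning only the list.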
Hey, this was a good learning experience, but one question: can we do these things with Microsoft Azure OpenAI instead of the OpenAI chat API? Can you give some notes on it?
I tried the agents as per your example: I added both wikipedia and llm-math as tools and asked the exact same question, but the response starts with Action: Calculator and it tries to compute the math first rather than searching Wikipedia first. The agent is not reasoning... May I have your views? @rishabincloud
Failing at the start unfortunately when running the dog-name-generating script. I can print the model name, so things are set up correctly package-wise, but when the code reaches name = llm("Write 5 dog names") it throws the error: "module 'openai' has no attribute 'error'".
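This error usually means a version mismatch (my assumption based on the message, not something from the video): the openai Python SDK removed the openai.error module in its 1.0 release, while older LangChain versions still reference it. One possible fix is to pin the SDK to the pre-1.0 line the course code appears to target, or to upgrade LangChain instead:

```shell
# Assumption: the course code targets the pre-1.0 OpenAI SDK.
# Option 1: pin openai below 1.0 so openai.error still exists.
pip install "openai<1.0"

# Option 2 (alternative): upgrade to LangChain releases built for the 1.x client.
# pip install -U langchain langchain-openai
```

Either direction should work; mixing a 1.x openai package with pre-1.x LangChain code is what triggers the attribute error.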
Thanks for the brilliant video. There is a small bug: when using the lch.get_response_from_query method, please pass the k variable a value. Also, do you use any extension for your terminal? Thanks
6:03 "Python was not found; run without arguments to install from the Microsoft Store, or disable this shortcut from Settings > Manage App Execution Aliases." I get this even though I have installed Python on my system. Please resolve this issue. Thank you!
Can anyone explain whether it sends 4,000 words at a time or 4,000 words in total, because of the token limit? If it only sends 4,000 words when k=4, how does it reach a conclusion without reading the whole transcript? Thank you for the help; very informative and interesting video.
Based on my understanding, the YouTube assistant finds the 4 most similar parts of the transcript*, merges them, and then feeds the merged text to text-davinci-003. So, based on those ~4,000 characters, text-davinci-003 tries to answer the user's question. *Each part contains about 1,000 characters.
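That flow can be sketched without any API calls. The chunk size and k below mirror the settings discussed in the thread; the word-overlap score is a stand-in for real embedding similarity, and all function names are illustrative:

```python
# Toy sketch of the retrieve-then-answer flow: split, rank, keep top-k.
# Real code would embed chunks with an embedding model and use a vector DB;
# here a simple word-overlap score stands in for cosine similarity.
def split_into_chunks(text: str, chunk_size: int = 1000) -> list[str]:
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]

def overlap_score(chunk: str, question: str) -> int:
    return len(set(chunk.lower().split()) & set(question.lower().split()))

def top_k_chunks(text: str, question: str, k: int = 4) -> str:
    chunks = split_into_chunks(text)
    best = sorted(chunks, key=lambda c: overlap_score(c, question), reverse=True)[:k]
    # Only these k chunks (about k * 1000 characters) are sent to the LLM,
    # which is why the whole transcript never has to fit in the context window.
    return " ".join(best)
```

So the model never "reads" the full transcript; it answers from the k most relevant slices, which is usually enough when the question is specific.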
Hi, no. You need to provide your card info, and then, at the end of the month, OpenAI will charge you for what you spent. BTW, they don't take any money when you've used only a small amount, for example 2 cents :)
Hi, very good. I need a course on finding clients, i.e. a client-hunting crash course for every purpose, with extremely deep techniques and everything you know. If you or anyone has made such a course, tell me. Thanks ❤❤❤
Very informative, thanks, but this ugly bit of code around 23:50 made me feel extremely uncomfortable. Instead of copy-pasting the same code multiple times, why not use a simple f-string, f"What color is your {animal_type}?", and drop all the "if" statements completely?
This was incredible!! Thank you so much for this video, it was really easy to understand and follow! I can't wait to start doing my own projects with langchain!!
A question here: when I was following the agent part, I did use the wikipedia and llm-math tools, but the agent only chose to use the calculator, never wikipedia, throughout the process. For the first part it gives:

    I need to find the average age of a dog and then multiply it by 3
    Action: Calculator
    Action Input: 3 * (12 + 15 + 10 + 8 + 5) / 5
    Observation: Answer: 30.0

which is very weird, because I expected it to use wikipedia instead. Anyone know why?
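Part of the answer may be that a zero-shot ReAct agent picks tools purely from their names and descriptions as rendered into the prompt, so a question that "looks like math" routes to the calculator. Here is a toy sketch of that prompt construction (the tool descriptions and function name are illustrative, not LangChain's actual internals):

```python
# Toy sketch: how a ReAct-style prompt presents tools to the LLM.
# The model chooses a tool by matching the question against these
# descriptions, so number-flavored questions tend to hit the calculator.
TOOLS = {
    "Wikipedia": "Useful for looking up facts about people, places, and things.",
    "Calculator": "Useful for when you need to answer questions about math.",
}

def build_react_prompt(question: str) -> str:
    tool_lines = "\n".join(f"{name}: {desc}" for name, desc in TOOLS.items())
    return (
        "Answer the following question. You have access to these tools:\n"
        f"{tool_lines}\n"
        f"Question: {question}\n"
        "Thought:"
    )

print(build_react_prompt("What is the average age of a dog? Multiply the age by 3."))
```

In practice, rephrasing the question to make the lookup explicit ("First find the average lifespan of a dog on Wikipedia, then multiply it by 3") often nudges the agent to use the search tool first.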
Getting an error like the one below when running pip install langchain; my installed Python version is 3.12:

    ERROR: Ignored the following versions that require a different python version: 0.55.2 Requires-Python
I still can't see why we need LangChain. We can do templating with Jinja and use vLLM for serving your LLM. Integrating with APIs is basic programming. Getting back structured data is much better with Guidance, LMQL, or Jsonformer. So why use LangChain? I seem to not get it.
Thanks for your video. I want to connect 7b-chat-hf to LangChain for summarization, but neither map-reduce nor refine responds. In the last step, map-reduce took 2 hours without responding, and refine gives me a blank document. Have you faced this problem?
Can someone else just appreciate with me that at approx 16:00 we learn it takes 28 GB of memory to choose a cat name? I died laughing. Great video; I shall now continue watching.
It would be helpful if someone could help with the answers. Why do we need to use an embedding model, rather than just asking the GPT-4 model to answer our question based on our custom data? What is the use of an embedding model over GPT-4? And what should I use if I want to create a text classifier based on my custom data?
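The short answer is that GPT-4 can only answer from what fits in its context window, so embeddings are used first to find which pieces of your custom data are worth sending. A toy illustration of that ranking step (the 3-dimensional vectors are made up; real embeddings come from an embedding model and have hundreds or thousands of dimensions):

```python
# Toy illustration: embeddings let you rank your own documents by
# similarity to a question BEFORE anything is sent to the chat model.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

docs = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.2],
}
question_vec = [0.8, 0.2, 0.1]  # pretend embedding of "How do refunds work?"

best = max(docs, key=lambda name: cosine(docs[name], question_vec))
print(best)  # prints: refund policy -- the chunk that would be sent to the model
```

For the classifier question: embeddings are also a common starting point there — embed each text, then train a small classifier on the vectors — rather than asking a chat model to classify every item.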
Really cool tutorial and very helpful for beginners. Best on YouTube, I would say. Just a quick tip for starters: start by doing a few no-code tools like Langflow or Flowise. They really help to visualize what you are doing. For me it really helped me understand the fundamental concepts of which components to use.
Question to the crowd: what are the main Python libraries to know apart from LangChain? Hugging Face? OpenAI? Is AutoGPT a library? Sorry, I am a bit lost.
Just learn one and stop trying to run before you can walk. LangChain is a framework. Hugging Face is a platform for deploying LLMs. OpenAI is a service provider. AutoGPT is a library. Go start with the OpenAI APIs.
Just fantastic! Thanks a lot. Some questions that come to mind:
- How to use it with Hugging Face models or gpt4free?
- How to use it with graphics- or video-based models like DALL-E?
- Let's imagine I have a PDF that I convert to a vector DB. What is the difference between asking an AI based only on the information in this PDF, as opposed to the total knowledge of ChatGPT plus the information in the PDF? How to combine and compare them?
- Since you are an Amazon pro: show how to deploy everything in the cloud with Beanstalk or the other web services.
This is brilliant! Definitely the best LangChain course for beginners. I saw several other courses on YouTube and still couldn't fully understand how all of its tools work together. Only after this one did I finally get it! Thank you so much!