With Ollama, you can easily run local, open-source LLMs on your own computer for free. This tutorial walks through installing and using Ollama, accessing it through its local REST API, and using it in a Python app with a client library such as Langchain.
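As a taste of what the tutorial covers, here is a minimal sketch of calling Ollama's local REST API from Python. It assumes the Ollama server is running on its default port (11434) and that a model has already been pulled; the `llama3` model name is illustrative.

```python
import json
import urllib.request

# Ollama's default local generate endpoint
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(prompt: str, model: str = "llama3") -> bytes:
    """Build the JSON body for a non-streaming /api/generate request."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()


def generate(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to the local Ollama server and return the response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(prompt, model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(generate("Why is the sky blue?"))
```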
👉 Links
🔗 Ollama GitHub: github.com/ollama
🔗 LLM Library: ollama.com/library
🔗 RAG + Langchain Python Project: • RAG + Langchain Python...
📚 Chapters
00:00 How To Run LLMs Locally
01:07 Install Ollama
02:45 Ollama Server and API
04:15 Using Ollama Via Langchain
30 Jun 2024