
Perplexica: This 100% LOCAL PERPLEXITY CLONE is NEW, FREE & OPENSOURCE (works with Llama 3) 

AICodeKing
10K subscribers
7K views

In this video, I'll be discussing Perplexica, a new, 100% local, open-source alternative to Perplexity. You can self-host this clone to stop paying Perplexity's hefty membership fees. It's super easy to install, and it works with any open-source LLM served locally, such as Mixtral 8x22B or Mixtral 8x7B (hosted models can also be plugged in through their APIs).
[Resources]
Perplexica Github Repo: github.com/ItzCrazyKns/Perple...
[Key Takeaways]
🚀 Perplexica is a game-changing alternative to Perplexity that offers a free and open-source solution for searching the internet.
🔒 With Perplexica, you can enjoy private and secure searches without worrying about your data being sold to third-party companies.
🤖 Perplexica uses local LLMs through Ollama, allowing you to choose the best option for your specific use case.
📊 The platform offers six focus modes: All mode, Writing Assistant mode, Academic Search mode, YouTube Search mode, Wolfram Alpha Search mode, and Reddit Search mode.
🎯 Perplexica is fully customizable, allowing you to tailor it to your specific needs and preferences.
💻 You can install Perplexica using Docker or without Docker, making it easy to get started.
📈 Perplexica is still a relatively new project, but its active GitHub community is shipping changes fast to fix any issues that arise.
📁 Perplexica is not just limited to searching the web, but also allows you to search your local files and build your own AI-powered applications.
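The Docker install mentioned in the takeaways can be sketched roughly as follows (the repository URL comes from the resources section above; the sample config file name and default port are assumptions based on the project's README and may change between releases):

```shell
# Clone the Perplexica repository
git clone https://github.com/ItzCrazyKns/Perplexica.git
cd Perplexica

# Copy the sample config and fill in your Ollama URL / API keys first
cp sample.config.toml config.toml

# Build and start the stack in the background
docker compose up -d
# The UI is then typically served on http://localhost:3000
```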

Science

Published: May 3, 2024

Comments: 36
@PhilippeVanLoo · a month ago
First! ;)
@PhilippeVanLoo · a month ago
Man, your videos are amazing! Clear explanations of the concept, followed right away by an actionable tutorial on how to install it! Love it! Thank you so much!
@AICodeKing · a month ago
Thanks buddy!
@RedOkamiDev · a month ago
May the 4th be with you Mr. King, thanks again for your work.
@jonyfrany1319 · 15 days ago
100% in love with this project!
@andreas7181 · a month ago
Thank you for the information.
@siddhantashtekar5806 · a month ago
You are amazing ❤️
@siddhantashtekar5806 · a month ago
I just want to donate to you for your work ❤️❤️
@AICodeKing · a month ago
Your love is enough! Thanks, buddy!
@PhilippeVanLoo · a month ago
@@AICodeKing Agree with @siddhantashtekar5806 you deserve a Patreon ;)
@hacknslashpro9056 · a month ago
What is the difference between this one and the Streamlit one, that AI research assistant with Groq + Tavily?
@aimademerich · a month ago
Phenomenal
@mabidan · a month ago
perplexity hates you 😅 but we love you ❤😍
@warlockassim4240 · a month ago
My favorite words: open source and completely free
@irkedoff · a month ago
💜
@atypocrat1779 · a month ago
I somehow got my i5 with 16 GB of RAM to run the last one with Docker. It was not so fast, lol. I don't feel like getting an expensive desktop that needs a 1 kW power supply.
@johnbramich · a month ago
Going to install this. How do I keep it updated?
@AICodeKing · a month ago
You can keep it updated with the "git pull" command
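A rough sketch of that update flow for a Docker install (the compose rebuild step is an assumption; adjust it to however you run Perplexica):

```shell
cd Perplexica

# Pull the latest changes from the repository
git pull

# Rebuild and restart the containers so the update takes effect
docker compose up -d --build
```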
@PhilippeVanLoo · a month ago
OK! I just tried to install it on Windows, where the commands are different. Everything seems to work, but on Perplexica's settings panel I get this message under the embedding model: "Invalid provider, please check backend logs". I have entered a few model addresses and API keys in the settings (Llama 3, Groq, and an OpenAI key) but I still have the same issue. Do you know why? Maybe you can help :)
@AICodeKing · a month ago
It should work. Maybe you can check whether the Ollama ports are correctly configured in config.toml. If it still doesn't work, you can raise an issue on their repo and they can help you out much better.
@PhilippeVanLoo · a month ago
@@AICodeKing Ollama already works perfectly with other web UIs like open-webui
@PhilippeVanLoo · a month ago
@@AICodeKing Thank you
@PhilippeVanLoo · a month ago
@@AICodeKing In config.toml I have put: host.docker.internal:11434
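For reference, the relevant config.toml entry looks roughly like this when Perplexica runs in Docker and Ollama runs on the host machine (the section and key names are a sketch; check the sample config shipped with your version):

```toml
[API_ENDPOINTS]
# From inside the Perplexica container, host.docker.internal resolves to
# the host machine, where Ollama listens on its default port 11434
OLLAMA = "http://host.docker.internal:11434"
```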
@dra.soniavillarreal843 · a month ago
@@PhilippeVanLoo open-webui uses port 3000 and Perplexica uses it too. How do you solve that conflict?
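One way to resolve the clash, assuming both apps run under Docker Compose, is to change the published host port of one of them. A hypothetical snippet for Perplexica's frontend service (the service name and container port are assumptions; check your actual docker-compose.yaml):

```yaml
services:
  perplexica-frontend:
    ports:
      # host:container — publish Perplexica on host port 3001 instead of 3000,
      # leaving port 3000 free for open-webui
      - "3001:3000"
```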
@dominiccogan945 · a month ago
Is it possible to use Coolify to host this on your computer?
@AICodeKing · a month ago
Yes, I think you could.
@codelucky · a month ago
Why use a local LLM when you can use Groq? It would be helpful to see how to enter a Groq API key in the video.
@AICodeKing · a month ago
It's pretty easy and can be done by yourself.
@codelucky · a month ago
@@AICodeKing Oh yeah, it's pretty easy via config.toml, thanks.
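Assuming the config layout discussed in this thread, adding a Groq key would look something like this (the section and field names are assumptions; use whatever fields your sample.config.toml shows):

```toml
[API_KEYS]
# Paste your Groq API key here, then pick a Groq-hosted model
# in Perplexica's settings panel
GROQ = "gsk_your_key_here"
```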
@df4privateyoutube722 · a month ago
You should actually dig into the code if you're gonna make a video on it; that would be helpful as well.
@danield9368 · a month ago
Best channel
@gabrielkasonde367 · 13 days ago
Can you please also do the no-Docker video? That might be really cool 🫠