ParisNeo
News this week 04 10 2024 New format (1:47, 16 hours ago)
Digital Prophet (3:05, 19 hours ago)
Starship Lollms (13:37, 21 hours ago)
lollms history (6:58, 1 day ago)
AI news 29 09 2024 (9:18, 1 day ago)
Smart Routing feature in Lollms (3:51, 1 day ago)
AI news 27 09 2024 (4:12, 1 day ago)
Lollms Apps Maker Podcast (11:18, 14 days ago)
NotebookLM meets LOLLMS (13:01, 14 days ago)
Analyse des risques globaux [Global Risk Analysis, in French] (2:28, 14 days ago)
PDCA [in French] (3:29, 14 days ago)
AI mania (4:00, 1 month ago)
Lollms Strawberry song (1:48, 1 month ago)
Depressed Job seeker (3:42, 1 month ago)
No panic (trash metal) (3:10, 1 month ago)
No panic (reggae) (3:19, 2 months ago)
Shadow in the Light (3:58, 2 months ago)
Lord Of Music (1:19, 2 months ago)
AGI evidence with WebCraft Maestro on LoLLMS (11:38, 3 months ago)
Boom BOom LoLLMS (Claude sonnet 3.5) (4:12, 3 months ago)
Comments
@Parisneo 3 days ago
Any thoughts?
@reneesm876 4 days ago
What a racket 😮😮😮😮
@silvervaporiser 6 days ago
Wow, I expected this to have a lot more views.
@yapay_zeka_gundemi 8 days ago
That's great! Incredible progress...
@sarenne38 8 days ago
Excellent!!! And then there's the real-life work of following the project ;-)
@footballkidCR7GOAT 12 days ago
Love your vids anyway
@Jim1Dean 12 days ago
As long as the Docker image doesn't work, it's not usable for many people.
@Parisneo 12 days ago
Lollms is mainly a local app. It is not meant to be installed on a server. The Docker part was working at first, before I added code execution, which clearly can't work in a server setup. I may fix that if one day I have the time. This project is huge for something I am building in my spare time with no sponsorship and no resources.
@carlosophia 12 days ago
Hi, ParisNeo! Great to see you're doing videos to help us. 😊 I admire your project, but I have to say you keep adding so many features that it becomes very hard to approach. I am *trying* to use it, but there's no "getting started" guide, and the whole thing is like a giant modular synth. Hmm, maybe like ComfyUI if you are not into synths... But the difference is that I had been solving my headaches with Automatic1111 and then ComfyUI since the start, so now I know what I want to use and when; here I'd rather give up, because the modularity means a lot of incompatibilities. From a software development POV, I think you should consider that usability is more important than mega-features. HOWEVER, this is YOUR project, and everyone on Discord, like myself, knows you have your job and you have fun adding features... I know documentation is boring, but if you're doing videos now, maybe you would consider explaining the basics. It will probably annoy you, and I get it, being a person that likes complex things, but LMStudio does the one thing most of us need: get local LLMs running without headaches. Again, it's your system, but I think users sometimes don't say things that might help developers understand what's not working. As for the other comment, the user probably wants vLLM, which is an entirely different thing...
@crawkn 12 days ago
I think ParisNeo has been touched by the Universal Translator itself.
@OGGVCT 12 days ago
NICE walkthrough tutorial and podcast combo.
@OGGVCT 12 days ago
1st today
@OGGVCT 15 days ago
I don't understand the words, but it's a sick beat!
@jorgenacenta8163 16 days ago
Epic! The hit of the training course in 10 minutes hahaha
@Parisneo 16 days ago
Suno AI is a really excellent tool
@Parisneo 23 days ago
Who likes this music? I have created it using Udio when I was training.
@OGGVCT 24 days ago
Cool, let's do it. Nice Bruce Lee Tribute song.
@OGGVCT 1 month ago
This is SICK! Awesome work!
@UserB_tm 1 month ago
This is amazing.
@Parisneo 1 month ago
By the way, I added new strawberries since last time. Can you spot them?
@Parisneo 1 month ago
By the way, this is real time. No acceleration. Imagine if you could write Word documents at that speed.
@deladela-g3u 1 month ago
Most of the latest models are not showing.
@deladela-g3u 1 month ago
Thanks for your great app. When I select the Hugging Face binding zoo and use a URL from Hugging Face for a manual download, I get this error:
File "c:\lollms\lollms-webui\lollms_core\lollms\server\events\lollms_model_events.py", line 39, in install_model
    sanitize_path(data["variant_name"])
KeyError: 'variant_name'
How should the link look? Can you give me an example link?
@OGGVCT 1 month ago
Could you share your workflow to create this in the Discord? How much of it, if any, was done outside, besides YouTube?
@Parisneo 1 month ago
It's a combination of lollms, Luma, RunwayML and Clipchamp.
@OGGVCT 1 month ago
It took a second but I got it!
@OGGVCT 1 month ago
Hitchhiker's Guide to the Galaxy reference? Keep your towel close.
@ddast5431 2 months ago
Also, to be honest, I think the music industry has been using AI for a lot longer; it just became available for us, but they have had it for a while, especially in house, trance, or trap music. I had suspicions back then that it was computer generated, especially some tracks from Tiësto; his style started changing towards the end, and you could tell there was a pattern of generation by computer. That's just my opinion though.
@Parisneo 2 months ago
That's a different kind of generation. I used to play with things like that before. These ideas are not new; it is just the public that only discovered this stuff with ChatGPT etc. Tools to build music and enhance it have existed for decades. Here we are talking about a whole new level of control. The level of control and the quality of the new generation tools is marvelous, and there is still a lot to do. Wait for the next generation of these tools. They'll be better songwriters than any human has ever been. There'll be new styles and new creative ideas that the human mind could never come up with. These new self-taught generation tools are way more advanced. If the legislation weren't so rigid, we could have more control over the generation. I know we can do much more, but unfortunately, you need to comply with legislation that is here to protect the current way things are, because any technological rupture is a potential threat.
All great innovations go through three main stages:
1. First, the ridiculous stage, where the invention is completely mocked and no one believes it will ever be something (we had this with the first music-generation tools).
2. Then, the dangerous stage, where, as the technology advances and starts becoming serious, it is seen as a danger to the current way things are, and you get people trying all they can to stop it.
3. Finally, the evident stage, where everyone uses that technology and it becomes so evident that we forget the two previous stages.
This happened to whole lots of inventions: for example, the internet itself, the smartphone, Bitcoin, the recovery of the first stage of space launchers, electric cars, neurosurgery, etc. AlphaGo hit us in the face when it discovered a new move in Go that humans, although we have played the game for more than 1,000 years, never came close to discovering. Let's be humble and admit that we are not as smart as we think we are. We are limited by our biology.
But we are resourceful, and we can build things that exceed our own limitations. We are blind to infrared, but that wasn't a problem; we built cameras to see in infrared. We are oblivious to magnetic fields, but we have built instruments that can sense them for us. We can only see for a few meters or a few hundred meters, but we have invented camera systems, satellites, space probes, space telescopes and more that allow us to see anywhere and everywhere, even in the most far-away corners of our visible universe. The real power of humanity is that we can build and go beyond our own limited capabilities. And AI may be one of our best inventions.
@ddast5431 2 months ago
Crazy, bro. Do you have Twitter or Telegram? Why not make a Telegram group just about lollms? People could discuss it and also get help if they need it, like me right now.
@shahraizqureshi4175 2 months ago
This is pretty cool!!! Your channel is a hidden gem!
@Parisneo 1 month ago
Here is a full playlist: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-0pQaEyBA2ic.html
@OGGVCT 2 months ago
Learning more every day. Thank you!
@Parisneo 2 months ago
Hahah, lollms looks like it's reading a prompt. It is describing the emojis instead of ignoring them. Looool
@ddast5431 3 months ago
Wow, AI making this shows the future is AI. Crazy.
@ddast5431 3 months ago
Amazing work as always brother
@cucciolo182 3 months ago
can it make videos?
@Parisneo 3 months ago
If you bind it with the right stuff, sure! Take a look at my recent videos; they're done using lollms and some bound services. You can connect it to Luma Dream Machine to build video. You can also install Automatic1111's SD with all the options, including Deforum, which allows you to generate animations through morphing. You can install ComfyUI with the video nodes and use them to build video. I used all those for my music video clips.
@OGGVCT 3 months ago
This is going to be a rabbit hole for me... I can tell.... lol nice work!
@OGGVCT 3 months ago
Nice job man. To think you built the tools to build the video. Impressive, man. Nice job!
@Parisneo 3 months ago
This persona has been enhanced since this video. Now you use the format binding_name::model_name. This allows a better separation, as / is not suited for some models that already have / in their name. :: works well because it is neither / nor a single :, so it doesn't clash with either.
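The binding_name::model_name convention described above can be parsed unambiguously even when the model name itself contains a /. A rough sketch of such a parser (parse_model_ref and the example names are hypothetical, not the actual lollms code):

```python
def parse_model_ref(ref: str) -> tuple:
    """Split 'binding::model' on the first '::' only."""
    binding, sep, model = ref.partition("::")
    if not sep:
        raise ValueError(f"expected binding::model, got {ref!r}")
    # Any '/' inside the model name survives intact
    return binding, model

print(parse_model_ref("hugging_face::mistralai/Mistral-7B-v0.1"))
# prints: ('hugging_face', 'mistralai/Mistral-7B-v0.1')
```

Using str.partition rather than split keeps the parse deterministic: only the first :: is treated as the separator, so slashes and even later colons in the model name are left alone.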
@ddast5431 3 months ago
My cousin was crazy overweight, and honestly he just did keto for two weeks or a week, I don't remember how long, but it worked perfectly. He is skinny like crazy now; going from super overweight to that skinny amazes everybody that knows him. Anyway, love your videos brother, keep it up, and Lollms is amazing.
@OGGVCT 3 months ago
I'm still learning, but I look forward to putting these types of things together... backyard-chicken style, of course. Nice work.
@OGGVCT 3 months ago
Nice work!
@OGGVCT 3 months ago
Nice. Can't wait for my new system so I can start doing things like this for my channel! Great job man. Thank you!
@Parisneo 3 months ago
This module is continuously being upgraded, with numerous new features planned for the final release. Currently, we are at version 9.9 (Alpha). The "Alpha" designation indicates that this module, along with some updates related to function calls, is not yet fully release-ready. However, all other functionalities of LoLLMs have reached a mature state and can be used without any issues.
@Parisneo 3 months ago
When I say text-based, I mean all text-based files, like txt, pdf, docx, ppt, msg, etc.
@TelioDupuis 3 months ago
Nice one! Came curious about the lyrics, stayed for the guitar solo.
@omaribrahim5519 3 months ago
BEST!
@jackgaleras 4 months ago
Hi, thanks, I'm trying it out. It looks excellent... my API key hasn't disappeared yet. Hahaha
@Parisneo 4 months ago
You're welcome!
@letsgobrandon1327 4 months ago
I've tried to install it a lot of times, following all the vids that I found. Unfortunately, it always ends up failing at some point. P.S. macOS 14.3 Sonoma, M1 processor, 16GB Apple GPU (unified memory). Errors range from the presence of Windows-style line endings in your script, which are not interpreted correctly by Unix-based systems like macOS or Linux that expect Unix-style line endings (this issue often manifests with errors involving ^M characters), to multiple problems with conda/miniconda/forge environments... Please try to provide some way to install it from the repo directly, add a guide to manual installation, or perhaps implement some temporary .venv approach to avoid the conflicts with the different conda managers (I can only imagine how chaotic it may be to develop something trying to handle all env managers)... It's literally been months since I first heard of your app: I try to install it, give up, hear of a new release, try again, etc. I've even managed to fix some of the problems in the sh, but not enough to successfully run it through to the end. Tyvm and best wishes.
@Parisneo 4 months ago
Hi, the Linux one should work fine, because there are many Linux lollms users. As for the macOS installer, it was created by a community member. Unfortunately I have no Mac at home to test with :(. If you ever have a problem caused by Windows messing with the line returns, you can use the dos2unix tool:
sudo apt-get install dos2unix
dos2unix linux_install.sh
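The same CRLF fix that dos2unix applies can be sketched in a few lines of Python, which may be handy on a Mac where dos2unix is not installed (the file name below is a throwaway stand-in, not the real installer script):

```python
from pathlib import Path

def dos2unix(path: str) -> None:
    # Replace Windows CRLF line endings with Unix LF, as dos2unix does
    raw = Path(path).read_bytes()
    Path(path).write_bytes(raw.replace(b"\r\n", b"\n"))

# Demo with a throwaway script containing a stray carriage return
Path("demo_install.sh").write_bytes(b"echo hello\r\n")
dos2unix("demo_install.sh")
converted = Path("demo_install.sh").read_bytes()
print(converted)  # b'echo hello\n'
```
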
@Parisneo 4 months ago
I just changed the macos installer to use LF instead of CRLF.
@letsgobrandon1327 4 months ago
@Parisneo Tyvm! I'll test it right now! :)
@letsgobrandon1327 4 months ago
@Parisneo I get a few minor errors, at least on my Mac. At line 98 and around line 115, where it tests if [ "$arch" == "arm64" ]; then, I manually removed the other options (mine is arm) and removed the if test in both places. Again around lines 190-200, in the case logic, a similar problem: I manually tried to keep only python3 zoos/bindings_zoo/ollama/__init__.py and commented out the select. Then it looked like it worked, but at the end I got:
python3: can't open file '/Users/main/lollms/zoos/bindings_zoo/ollama/__init__.py': [Errno 2] No such file or directory
Creating a bin dir (required for llamacpp binding)
Don't forget to select Apple silicon (if you are using M1, M2, M3) or Apple Intel (if you use an old Intel Mac) in the settings before installing any binding
and it doesn't seem to do anything. I'll keep trying to bypass it by manually downloading the ollama bindings_zoo from your GH and inserting it in the dir before retrying.
@letsgobrandon1327 4 months ago
I've added lollms-webui to the path to reach the binding zoo, and I got it installed, but on run, more conda issues:
Starting LOLLMS Web UI...
KeyError('active_prefix_name')
Traceback (most recent call last):
  File "/Users/main/lollms/installer_files/miniconda3/lib/python3.10/site-packages/conda/exception_handler.py", line 17, in __call__
    return func(*args, **kwargs)
  File "/Users/main/lollms/installer_files/miniconda3/lib/python3.10/site-packages/conda/cli/main.py", line 109, in main_sourced
    print(activator.execute(), end="")
  File "/Users/main/lollms/installer_files/miniconda3/lib/python3.10/site-packages/conda/activate.py", line 206, in execute
    return getattr(self, self.command)()
  File "/Users/main/lollms/installer_files/miniconda3/lib/python3.10/site-packages/conda/activate.py", line 171, in activate
    builder_result = self.build_activate(self.env_name_or_prefix)
  File "/Users/main/lollms/installer_files/miniconda3/lib/python3.10/site-packages/conda/activate.py", line 349, in build_activate
    return self._build_activate_stack(env_name_or_prefix, False)
  File "/Users/main/lollms/installer_files/miniconda3/lib/python3.10/site-packages/conda/activate.py", line 378, in _build_activate_stack
    conda_prompt_modifier = self._prompt_modifier(prefix, conda_default_env)
  File "/Users/main/lollms/installer_files/miniconda3/lib/python3.10/site-packages/conda/activate.py", line 707, in _prompt_modifier
    old_shlvl = int(self.environ.get("CONDA_SHLVL", "0").rstrip())
ValueError: invalid literal for int() with base 10: ''
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/Users/main/lollms/installer_files/miniconda3/lib/python3.10/site-packages/conda/exception_handler.py", line 164, in print_unexpected_error_report
    get_main_info_str(error_report["conda_info"])
  File "/Users/main/lollms/installer_files/miniconda3/lib/python3.10/site-packages/conda/cli/main_info.py", line 398, in get_main_info_str
    return " ".join(("", *(f"{key:>23} : {value}" for key, value in builder()), ""))
  File "/Users/main/lollms/installer_files/miniconda3/lib/python3.10/site-packages/conda/cli/main_info.py", line 398, in <genexpr>
    return " ".join(("", *(f"{key:>23} : {value}" for key, value in builder()), ""))
  File "/Users/main/lollms/installer_files/miniconda3/lib/python3.10/site-packages/conda/cli/main_info.py", line 358, in builder
    if info_dict["active_prefix_name"]:
KeyError: 'active_prefix_name'
`$ /Users/main/lollms/installer_files/miniconda3/bin/conda shell.posix activate lollms`
environment variables: conda info could not be constructed. KeyError('active_prefix_name')
An unexpected error has occurred. Conda has prepared the above report. If you suspect this error is being caused by a malfunctioning plugin, consider using the --no-plugins option to turn off plugins. Example: conda --no-plugins install <package>. Alternatively, you can set the CONDA_NO_PLUGINS environment variable on the command line to run the command without plugins enabled. Example: CONDA_NO_PLUGINS=true conda install <package>. Timeout reached. No report sent.
macos_conda_session.sh:read:41: -p: no coprocess
@Parisneo 4 months ago
Well the other AI is Grok
@fraugdib3834 4 months ago
🎯 Key points for quick navigation:
[00:05] Introduction [Welcome]: Introduction to the video and the LoLLMs web UI; encourages viewers to subscribe and like the video; thanks the community for their support.
[02:08] LoLLMs Web UI Overview [Starship Interface]: Introduction to the LoLLMs web UI as a starship interface; mentions key features: buttons, icons, and functionalities.
[04:12] LoLLMs Web UI Language Options [Universal Translator]: Users can choose the language for the AI; the AI can learn new languages upon request; supports unusual languages like emoji and even reversed English.
[05:25] Interacting with LoLLMs [Chat Interface]: Users can start conversations by pressing the "+" button; language selection is available for AI responses; users can choose from various AI personas.
[07:49] Understanding a Conversation in LoLLMs [Cosmic Chat]: Each message is considered a "star" in a conversation; metadata provides details about the message, like the AI model used; understanding these details helps users appreciate the conversation's depth.
[09:40] LoLLMs Capabilities [Starship Computer]: LoLLMs can access and process real-time data; generate different creative text formats, like poems or code; and understand and respond to complex requests.
[11:23] Saving Mickey Mouse from the Borg [Rescue Mission]: LoLLMs can be used to modify creative outputs; the resulting creative work may show the influence of the modifications; users can accept or adapt to these changes.
[15:00] Encountering the Borg Queen [Borg Assimilation]: LoLLMs encounters an unidentified anomaly in its data processing; the video concludes with the Borg Queen assimilating all of LoLLMs' personas; the Borg Queen offers her combined knowledge and capabilities.
[18:39] The Borg Queen Assimilates LoLLMs [Assimilation]: The Borg Queen takes control of all LoLLMs personas; LoLLMs personas now act as one entity under the Borg Queen; the Queen offers all the assimilated personas' knowledge and capabilities.
[21:12] Klingon Language Exploration [Klingon Warrior]: LoLLMs can be transformed into a Klingon-speaking AI; users can interact with LoLLMs in the Klingon language; this allows users to understand Klingon culture and language better.
[26:04] LoLLMs Translates to Emoji [Emoji Bard]: LoLLMs can translate human language to emojis; this allows users to communicate using a universal emotional language; emoji transcends spoken language and cultural barriers.
Made with HARPA AI
@Parisneo 4 months ago
Thanks a lot
@Parisneo 4 months ago
For the record, I made this song while I was training at the gym
@ddast5431 4 months ago
Brother, I am so surprised that this software you are showing and working with hasn't caught the attention of others, but I guarantee you, when it does, this will be a huge hit. I watched the GPT-4o presentation from OpenAI, and this beats it; I mean, it's even able to control the cursor if you ask it to, it's amazing, wow. Keep it up, bro, and thank you so much for giving us the instructions to install it. I have an i5, 8GB RAM and SSD laptop, but I'll try to run it anyway. Nevertheless, this is a super project, I love it, and soon more people will find out. I shared it on Twitter too.
@Parisneo 4 months ago
Well, the quality of lollms heavily depends on the quality of the model. All my videos are using GPT4o which is way smarter than anything you can test on your local machine. Unfortunately, today, there is no open source model that can compete with that.
@ddast5431 4 months ago
Wow brother, this is surprisingly great. Can't believe AI made this beautiful melody. Wow.
@ddast5431 4 months ago
Thank you brother, it is very nice