🔗 Try Here For Free: bolt.myaibuilt.app
🔗 If the above is down you can try this too: 9356f8c3.bolt-2xk.pages.dev/
🔗 GitHub Repo: github.com/wonderwhy-er/bolt.new-with-download-project-as-zip

Chapters:
00:00 - Intro
00:17 - Bolt.New from StackBlitz
00:28 - A few other options
01:28 - It's fully open-source and extendable!
02:03 - I was so excited that I made a fully free version of it!
02:58 - You can add your own key to use premium models!
04:08 - But it works fine with Gemini 1.5 Flash, which is FREE!
06:25 - Project download and other features I added to the open-source version
07:22 - Growing community around Bolt.New
09:18 - What are we doing next time? You decide!
@@ChrisMateo-q6s There is another popular repo; I am thinking about whether it's worth merging efforts or not, considering there is interest in running it locally and in a self-hosted variant.
While I am figuring out how to deploy it on my own server in a stable way, I deployed it to Cloudflare. You can try it there as an alternative: 9356f8c3.bolt-2xk.pages.dev/
Lol, that's too much praise :D So many came before me: StackBlitz, OpenRouter, Google, and on and on. Thanks! Still things to do for this to be more useful.
Sadly the code is written to run on Cloudflare Workers... and I am trying to run it on my DigitalOcean instance, which proved to be hard; I have spent too much time on it so far. Can't decide between refactoring it to run on Node or figuring out how to run it with github.com/cloudflare/workerd. I restarted the server and will make a video of how to run it locally tomorrow.
While I am figuring out how to deploy it on my own server in a stable way, I deployed it to Cloudflare. You can try it there as an alternative: 9356f8c3.bolt-2xk.pages.dev/
Excellent video, this is amazing. Thank you for sharing that... I downloaded the git repo but now I don't know the steps to open it in my browser. Can you help me? I subscribed to your channel.
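For anyone else stuck at this step, here is a minimal sketch of getting the cloned repo running in a browser. The pnpm/Wrangler toolchain and script names are assumptions based on the upstream bolt.new setup, so check the repo's README and package.json for the actual commands.

```shell
# Hedged sketch: local dev setup, assuming the upstream bolt.new toolchain
# (Node.js 18+, pnpm, Cloudflare Wrangler). Script names may differ; check
# the repo's README and package.json.
git clone https://github.com/wonderwhy-er/bolt.new-with-download-project-as-zip.git
cd bolt.new-with-download-project-as-zip
pnpm install        # install dependencies
pnpm run dev        # start the local dev server
# Then open the localhost URL the dev server prints in your browser.
```

You will likely also need to provide a model API key (for example in a local env file) before prompts work; the exact variable name depends on the provider wired into the fork.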
You mean local LLMs running on your machine? My focus is on hosted solutions that do not require local setup. I wonder if one can use a hosted solution that connects to localhost as a provider. I use LM Studio myself but never found local models good enough.
When deploying on an EC2 instance and using it, we are getting a core issue due to which sometimes the container, code, and preview don't work. When we use small models we also get the same error. If you have a solution for this, it would be nice to have. Also, have you tried integrating with AWS Bedrock Anthropic models?
The preview seems not to work due to problems with running terminal commands; it's on my list of things to fix next. As for AWS Bedrock models, no; I saw some other forks adding that, so check the forks if you need it in a timely manner. I hope we can organize into a community that adds such things together, plug-and-play style. Need AWS models? Just flip a switch.
No, Bolt uses WebContainers, which can run backend JavaScript. In theory Python could be added too; there are web interpreters for it. There is another product called e2b.dev/ that has Python in a similar setup, but there it runs in containers on the server side, which is a much more challenging thing to self-host. Still, if you are interested in server-side Python + open source, take a look in that direction. It just runs slower, since it uses server-side containers, and is more expensive to run.
Yes. Pool resources. Yes. I want to contribute to yours or Cole Medin's. But... for the life of me I could not get this thing running on my Windows box or in a Docker container... frigging Cloudflare Wrangler. Yes, a video of how to get it running locally... I can then push a PR to add the Docker container stuff.
Yeah... I am suffering from that Wrangler/Cloudflare thing too... I have spent most of my time around that so far and will need to spend more. I am running it in dev mode currently and it crashes 2-3 times a day. There is a way to run it in development mode locally, and there is a way (not yet added) to run it in production mode too, using github.com/cloudflare/workerd. I can't yet decide whether I want to refactor it to Node.js or try to use workerd; both have their pros and cons...
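For anyone weighing the same Node vs. workerd question: a hedged sketch of the two Wrangler-based local options. The build output path and script names are assumptions taken from upstream bolt.new, so verify them against the repo's wrangler.toml and package.json.

```shell
# Option 1: development mode (hot reload; the crash-prone mode mentioned above).
pnpm run dev

# Option 2: closer to production. Build the app, then serve the Pages output
# with Wrangler's local runtime, which wraps workerd under the hood.
# The ./build/client path is an assumption from the upstream project layout.
pnpm run build
npx wrangler pages dev ./build/client
```

Option 2 avoids a hand-rolled workerd setup, since `wrangler pages dev` already runs the same runtime locally; the trade-off is staying tied to the Cloudflare tooling rather than refactoring to plain Node.js.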
@@HonorHasCome It is possible to add; it's not at the top of my list yet. With multiple-model selection, part of the problem is that not all models support it. But noted, I will look at it a bit later.
What do you like about bolt.new-any-llm? I would love for the community to sync on 1-2 forks; currently there are a dozen forks where some work is being done. I am exploring whether it makes sense to merge with bolt.new-any-llm. Not sure yet, as there are two "directions" with this that are a bit hard to synchronize currently. One is using it with local things like Docker, Ollama, and LM Studio, which is not a good fit for total non-developers. The other one I am interested in is a hosted solution on a website that requires no setup to run. And I am not yet sure how to allow a single project to serve both directions.
Is there a limit to the number of prompts we can do? I keep getting this error after the first prompts: "there was an error processing your request". What is the solution to this?
While I am figuring out how to deploy it on my own server in a stable way, I deployed it to Cloudflare. You can try it there as an alternative: 9356f8c3.bolt-2xk.pages.dev/
"there was an error processing your request" when i put anything on that box. So if i say create a landing page... and press enter, i get that message. It won't work.
While I am figuring out how to deploy it on my own server in a stable way, I deployed it to Cloudflare. You can try it there as an alternative: 9356f8c3.bolt-2xk.pages.dev/
There is a rate limit, but it is still free. I just rechecked: OpenRouter for free models gives 20 requests per minute and 200 per day, but this is per API key; you can add your own API key to get more and still use free models. As for the 70%, are you referencing some kind of comparison for Gemini 1.5 Flash?
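To make the per-key limit concrete, here is a hedged example of calling OpenRouter directly with your own key. The model slug is an assumption (check openrouter.ai/models for the currently available free models); the 20/min and 200/day numbers are the limits quoted above and may change.

```shell
# Calls OpenRouter's chat completions endpoint with your own API key, so the
# free-tier rate limits (20 req/min, 200 req/day per key, as quoted above)
# apply to YOUR key instead of the shared demo key.
# The model slug below is an assumption; check openrouter.ai/models.
curl https://openrouter.ai/api/v1/chat/completions \
  -H "Authorization: Bearer $OPENROUTER_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "google/gemini-flash-1.5",
        "messages": [{"role": "user", "content": "Say hello"}]
      }'
```

Adding the same key in the app's settings routes its requests through your key, which is what lifts the shared limit without leaving the free models.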
I did: contacted, connected, looked into his repo, and am considering merging efforts. There are also StackBlitz office hours tomorrow that I plan to attend, to ask them about their views on the open-source part of Bolt.