If you implement an easy, user-friendly interface with voice recognition and text-to-speech, and then turn it into an app, this could be good software for someone with early-onset Alzheimer's.
@@AllAboutAI Also, if this allowed the outputs and inputs to interact in a loop, then in theory shouldn't you be able to create a universal Turing machine with this concept? Correct me if I'm wrong.
I'm blown away by how well-organized your content is. As someone who's interested in AI, it's amazing to see how you've categorized your videos into different topics and subtopics. It really makes it easy to navigate and find exactly what I'm looking for.
Chris, I'm glad RU-vid recommended your page to me. I'm a former statistics data scientist in the US. Now I'm in Moscow with my choreographer wife, Irina, whom I met in NYC in 1993 when she rescued 5 shanghaied girls from her 40-person Russian rock & roll backup dance troupe "Leader". I will visit NYC soon to do long-term math substitute teaching and find out how to legally send my wages back to my wife. Your AI blogs are extremely instructive and the most thorough that I've run across. Keep up your very helpful, instructive blogs. You're helping many of us AI Muggles become ChatGPT Wizards. Lol. Best regards, George
I have just discovered Obsidian for note-taking and have been looking for a tutorial on using GPT-3 with it, just like you have done in this video! Amazing!
@@AllAboutAI If you could show me, a n00b, how to connect the semantic search to Notion, I would literally learn enough Python to implement it. Seems like a great opportunity.
The reason I am not using OpenAI systematically is the same reason I am not using Siri or the like: it runs on their systems, and I don't want to be profiled to the point where I am predictable. I will once I am running my own instance on my own secured server.
This is awesome. I asked myself how to build a searchable database for the company I work for. We have so much text information, but it isn't categorized.
@@AllAboutAI Yes, as you did here. I think this could be an awesome method to dive into our information. The next step is to give this information to our clients via a website.
Great video! A question about the information provided to your personal or a client's GPT-3 model: is it confidential? I mean, if you have secret data to fine-tune your model, will it be shared with OpenAI, or will it stay confidential, accessible only to the client? Thank you very much!
Excellent, succinct, and well-explained as always. There are so many uses I can think of. If I assign an identification key to an article, will the key be retrieved in the summary, or perhaps several keys if the text is integrated? This is to identify the source of the information. I have been struggling to prepare PDFs for fine-tuning, such as converting to text and then JSON, especially when batch processing. I will head over to the subscription page and hopefully learn more. Preparing unstructured data may be a helpful future video 😉
Very good! I think something like this might be useful in an app for people with short-term memory loss. I do think we need to begin discussing the ethics and methods of digital twinning, as it will become a very popular thing. I could see some legal implications of compelling your second brain or digital twin to testify against you. Would it take a warrant for the information, or would it be treated as an extension of yourself? Anyway, great to see this kind of thing being done.
How large a dataset could you use to create the vectors/brain? I've kept ad hoc notes about my life, ideas, projects, and client interactions in OneNote for several years. It's megabytes of text. Would it be possible to create a brain from all that?
@@keithprice3369 I think you would need some kind of structure to have a reliable output. Freeform would work but I'm not sure what use you would get out of it. In the video his data is labeled with dates, times, etc. So when he asks when an event occurred the AI can return that. If you don't have any structure I'm not sure what question you would ask.
@@samuelflippin1890 I don't know the requirements. But in my example, each OneNote page has a date and time attached, so that should cover time-based queries. As for the rest, it seems it COULD work just on keywords. Like, "When did I talk to Mary Thompson about getting access to their server?" Or "What's the connection string for ABC's server?" "Show me all the recipes for chicken." I'd GUESS NLP could determine the intent and find the applicable data. But, seriously, I know next to nothing about this. I'm just intrigued. That said, OneNote has a pretty dang good search engine already. It's why I use it.
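On the megabytes-of-notes question above: before embedding, large freeform text is typically split into smaller chunks. A minimal sketch of that step, where `chunk_notes`, the chunk size, and the overlap are all illustrative assumptions rather than settings from the video:

```python
def chunk_notes(text, max_chars=1500, overlap=200):
    """Split a long text into overlapping chunks small enough to embed one at a time."""
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + max_chars, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        start = end - overlap  # overlap preserves context across chunk boundaries
    return chunks
```

Each chunk would then get its own embedding, so a query only needs to match one small, focused piece of the archive rather than the whole text.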
So basically, you get the embeddings for an article, use cosine similarity to match your query to a doc, and then feed the query and search results back to a GPT (Davinci) completion along with a prompt? For the diary of things you did, did you split up the subtexts to get multiple groups of embeddings (like separate embeddings for the text of each day), or just get embeddings for the whole diary?
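The pipeline that comment describes (embed the chunks, match the query by cosine similarity, then feed the best chunk plus the question to a completion model) can be sketched without any API calls. The vectors below are toy stand-ins for real OpenAI embedding outputs, and the chunk texts and question are made up for illustration:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_match(query_vec, chunk_vecs):
    """Index of the chunk embedding most similar to the query embedding."""
    scores = [cosine_similarity(query_vec, v) for v in chunk_vecs]
    return scores.index(max(scores))

# Toy stand-ins: in practice each vector would come from the embeddings endpoint.
chunks = ["2023-01-12: talked to Mary about server access",
          "2023-01-15: chicken recipe from mom",
          "2023-01-20: project kickoff notes"]
chunk_vecs = [(1.0, 0.1, 0.0), (0.0, 1.0, 0.2), (0.2, 0.0, 1.0)]
query_vec = (0.9, 0.2, 0.1)  # pretend embedding of "when did I talk to Mary?"

best = top_match(query_vec, chunk_vecs)
prompt = f"Context: {chunks[best]}\n\nQuestion: when did I talk to Mary?\nAnswer:"
# `prompt` would then be sent to a completion model such as Davinci.
```

Splitting into per-day chunks (rather than embedding the whole diary at once) keeps each embedding focused on one entry, which is what lets a date-specific question retrieve the right piece.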
Personalised study notes with interesting possibilities for linkages; add in self-testing, and then you can tutor yourself (if not now, then soon). Or am I too optimistic?
Hello Kris, I'm a 3D student really interested in learning about A.I. I have already played a little with ChatGPT and Midjourney, and I follow some channels like yours, but I want to understand the system beneath and how it really works. What do you recommend I do? I have zero experience in programming (only basic C++).
Man, there are plenty of people on YT doing these tutorials, but it's like we don't get anything. We look at what you do, but you don't provide a step-by-step guide. Is there a guide for this one? I am considering joining the highest tier you have if there's a tutorial on how I can do it too. P.S. Also, can you use any kind of prompts? And does it have memory of what you have asked it in the past?
Great video! One question: when enhancing GPT through the use of a custom dataset with semantic search, does the model still retain all the previously learned training data from OpenAI? Would I still be able to ask random questions that fall outside of the "2nd brain"? Thank you!
This is awesome!! I have a question. When using ChatGPT I realized that the model doesn't know some specific topics, and I want to create a model that contains that knowledge. I have books on that topic (at least 20 of them), and I can access transcriptions of lectures about the matter. What kind of model should I use? A second brain or a fine-tuned GPT? Imagine that I want to give it two uses: 1. People ask questions about the concepts and examples of their applications. 2. Be able to evaluate and correct phrases that the user gives the model.
You can't use fine-tuning to add knowledge to a model. Fine-tuning is just for learning new tasks or formats! So semantic search might be a good option. I will be doing a tutorial on my membership if you are interested :)
@@AllAboutAI I just joined to see that, but it is not online yet, or is it? The 'membership' tab is a little convoluted for newcomers. Do 'membership' videos get listed in the videos tab too? How can I find that? The membership tab only contains a timeline.
Really interested in this, so thanks. Question: what membership tier gives me access to the tutorial on how to get started on this myself? How much programming experience would I need? (Tricky to answer, I know, but could a novice do this?)
@@AllAboutAI Sure man, awesome work. I got an idea from this: I can create a resume about me and just share the website, so the recruiter can just ask questions about me.
I have been writing digitally using an app called Day One. Would something like this work with it? It's all stored in .json files on my computer and online.
@@AllAboutAI Pinecone does look quite interesting. I wish they had a better summary of what it does and how it works. I'll dig deeper. I could also see a huge benefit from dumping years of emails into it, but I don't really know how I'd get the emails into a single text file.
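On getting years of emails into a single text file: if the mail client can export an mbox archive (Thunderbird and Gmail Takeout both can), Python's standard-library `mailbox` module can flatten it. This is just a sketch under that assumption, and the header fields kept here are one possible layout, not a requirement:

```python
import mailbox

def mbox_to_text(mbox_path, out_path):
    """Flatten an mbox archive into one plain-text file, one email per block."""
    box = mailbox.mbox(mbox_path)
    with open(out_path, "w", encoding="utf-8") as out:
        for msg in box:
            out.write(f"Date: {msg.get('Date', '')}\n")
            out.write(f"From: {msg.get('From', '')}\n")
            out.write(f"Subject: {msg.get('Subject', '')}\n")
            if msg.is_multipart():
                # keep only the plain-text parts of multipart messages
                parts = [p.get_payload(decode=True) or b"" for p in msg.walk()
                         if p.get_content_type() == "text/plain"]
                body = "\n".join(p.decode("utf-8", errors="replace") for p in parts)
            else:
                body = msg.get_payload()
            out.write(str(body) + "\n\n---\n\n")
```

The resulting text file (with dates and subjects preserved as lightweight structure) could then be chunked and embedded like any other notes archive.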