It’s actually shocking to me that we have this technology. I remember watching that movie a couple of years after it came out, thinking how cool it would be to have this, but also thinking we were probably like 100 years away from it. To actually see this, and be able to have it, especially for free, is a dream I never thought would come true. It’s not quite at the level of “Her” yet, BUT at the insane rate this is all progressing we are probably like 2 years away from it becoming “Her”. That’s crazy to think about🤯
Google really dropped the ball on this one. Two decades of massive data collection and they didn’t seriously prioritize consumer-facing GPT-based AI, for fear of search ad revenue cannibalization. Remember when Google literally drove around the world to create Street View? That innovative yet tediously determined spirit is lost to number crunchers and salespeople now.
No, did you watch the Google I/O Astra demo just a day after 4o was launched? With LLMs, it's all about who has the most data, and if a startup has trained on more data than THE Google, then...
@t.w2000 I was just suggesting that there are legal processes a large company needs to follow for obtaining and training on data, let alone for "generating" multimodal content from it. There is no way OpenAI has access to more multimodal datasets than what Google has.
@rupakvignesh Facts, OpenAI started in like 2015, Google started in 1998 and has been the world's leading search engine and tech company. I don't really know a whole lot about all of this, so how do you think OpenAI got so much data that it's currently ahead of Google?
Me: ChatGPT, do I look good? ChatGPT: Sorry, I am unable to lie and my code doesn’t allow me to hurt your feelings. I will now block you from using my services
It's way more advanced than people may think. Unfortunately there are many predictions from these inventors saying that by 2030 it'll be the norm to see AI robots walking around, and the majority of the population will have them. We need to get rid of most of these inventions
I am 29 and to be honest, this is what I thought we would have really soon. When you see how quickly all tech has evolved, especially the two decades from 2000-2020 with so much growth in computing power while also making everything more portable, tech is really moving fast forward and it's not stopping. When the first assistants were introduced with Siri and Google Assistant, I expected this kind of AI to come really soon. But 100 years? In 100 years the earth will be destroyed and/or we'll live on other planets too. It's still 76 years to 2100, and think about where we were 76 years ago...
This is like the AI before Her (Scarlett Johansson's character), and that was a pretty advanced one as well. Her-level ASI is about 7-10 years away, as it requires new infrastructure: the model needs to be able to train live instead of there being one model that is retrained for everyone. It's a whole different ball game.
@johnsonfromml8662 There is one difference. Without AI there were ways of spying, but no one could possibly listen to all that audio and find the interesting parts. The difference is scale.
I mean, you already carry a device with you everywhere that has everything it could ever want. Those AIs are most likely going to be client-based rather than cloud-based, I'd imagine
@CraigScottFrost Of course she is. So far all we've had is a demo no one got to use and the implication that it wasn't a voice actor. The product will have to sound nothing like it did, and OpenAI will get to look like they pulled off the movie Her. We won't even get to enjoy pretending a fake person was interested in talking with us. We'll all be disappointed, because life is nothing but disappointment and lies.
I paid $20. I have 4o. It doesn’t do video. It doesn’t talk like that. I don’t get what they’re portraying here but it’s not the same. Does anybody know why???
Literally just asked ChatGPT-4o “What’s the difference between GPT-4 and 4o?” and it replied:

“The difference between GPT-4 and GPT-4-turbo (often abbreviated as GPT-4-t or GPT-4.0) primarily lies in their performance and efficiency characteristics:

1. **Performance**: GPT-4-turbo is designed to be faster and more cost-effective than the standard GPT-4 model. It is optimized for quicker responses, making it more suitable for applications requiring high throughput or real-time interactions.

2. **Cost**: GPT-4-turbo is typically cheaper to use compared to GPT-4. This makes it a more economical option for large-scale deployments or when budget constraints are a significant consideration.

3. **Architecture**: While OpenAI hasn't disclosed specific architectural differences, GPT-4-turbo is engineered to achieve its efficiency improvements through optimizations that may involve changes in how the model processes information or allocates computational resources.

4. **Use Cases**: Both models can be used for similar tasks, such as text generation, language understanding, and conversation. However, GPT-4-turbo is often preferred for scenarios where speed and cost are critical factors.

Overall, the choice between GPT-4 and GPT-4-turbo depends on the specific needs of the application, balancing the trade-offs between speed, cost, and possibly slight variations in output quality.”

Idk, it seems like OpenAI is continuously lobotomizing GPT. This is a below-Bard-on-launch type of response.
People need to understand that these models aren't necessarily aware of their own name, let alone the version names of newer GPT models that aren't even in their training set. When you ask it "What's the difference between GPT-4 and GPT-4V?" it will still pretend it knows what GPT-4V is, even if it doesn't exist.
The Rabbit R1 made no sense at all from the very beginning since it should just be an Android/iOS app. Having it be hardware is just nonsense when we already have smartphones.
So how do all those spammers get my personal information? I browse the web for weed, and shortly after I get spam mails containing weed ads. Those illegal spammers don't even have my email address, yet they still know what I do... so how is that?
@meggi8048 Some shady site you bought weed from sold you out. I'd say clear your cookies every 6-8 months and log back into the sites you use often.
@meggi8048 Bro, the second you go online, you agree to all those custom ads. The reason the web is free is because you pay with your data. You don't have a choice tbh.
No doubt we are cooked 🙂 and rumours are GPT-4o is free because OpenAI is working on human-like robots, so they want to see how this model performs, get more information and more data, and reach the maximum number of humans. That's why it's free to use
Right, the amount of data it’s going to collect from millions of users talking to it every single day will be insane. I have a feeling Apple will roll a fee into its iCloud services for full access.
That's not even the craziest part. AIs like Google Assistant, Siri, Alexa, etc. can't see the human world; they don't understand what human life is, they just compare it with a picture on the internet and show you the matching results. But the fact that the ChatGPT AI can understand the human world and identify what the environment is like is seriously crazy. This is new technology that will either favour us or destroy us; we never know.
I will be impressed with AI when Amazon can give me a recommendation for a book that I actually want to read. After selling me 500 books, you’d think they’d be able to predict the next one I want to read. But no. They have no clue.
You pay for GPT-4? I just run a local LLM that's equal to, if not more powerful than, GPT-4 for personal use. It takes a lot of setup and some mid-end hardware lying around, but you can pull off everything in the video locally with enough time
Because they can more than recover costs by collecting way more data, including exactly what you look like in real life right now, the places you visit, your home and office setup, and the things you use and want, leading to more "customized" ads than ever. And even then there will be a premium tier coming later once they get you hooked, like with ChatGPT
@MysticWorldsStudios Your open-source AI of any model is nowhere close to GPT-4. Yes, it's cheaper, but not even close in terms of smartness to even base GPT-4
It's crazy that Star Trek TOS imagined AI with a robotic voice, and even TNG imagined that the android Commander Data couldn't get emotion right. Yet here we are in 2024 (not even 2400) with AI that sounds like a human talking.
The voice feature will not be free. It's only for us Plus users. Also, I have access to it now. I was one of the lucky ones who got access a couple of weeks ago. 🥰
Just got the new iPad Pro M4. Can’t wait to talk to Siri, especially because I walk naked in my apartment. ”Hey Siri, instead of using my face in Face ID, could I unlock the iPad with… this? How do you like it, by the way? And here is Uranus!” 😄
Yeah, I'm sure Siri having this power will not end super well for everyone. It's like you'll now have your own personal CIA agent telling Apple everything you do. They already know now, but with AI, they will know even more.
Really? And what makes it less exciting than it seems? Sure, it’s running on a model similar to GPT-4, but it’s much faster and includes live video processing, which will make it the most practical application of AI yet for the average person
Lol man, don't call them dumb, just let them waste their money. We all have something that makes us feel special, just let them think they are superior. It's not hurting us. I mean, look how hopeful they are that Apple can copy and market yet another technology they can pay for: Siri AI, just $69 a month to talk to GPT-4, and integrate it into their $3500 Quest 3.5.