
Intel Advances in AI: Brain-Like Computing and Spiking Neural Networks Explained 

Anastasi In Tech
Subscribers: 192K
Views: 91K

Published: 22 Aug 2024

Comments: 344
@AnastasiInTech · A year ago
Let me know what you think !
@danielricardolizicastro5332
Thanks for the always great content! Happy holidays and an amazing 2023 for you! Cheers!
@masternobody1896 · A year ago
bad news i became homeless cause inflation please help
@Knapweed · A year ago
OK, I "think" you're awesome.
@sulimanibra5332 · A year ago
Kindly..let me contact you in private! I have serious scientific subject to discuss...
@BabbittdaWabbitt · A year ago
I thought holography was going to fit into all of this AI in a major way, No ?
@kipling1957 · A year ago
We need to get this guy on the Lex Fridman podcast.
@martin-fc4kk · A year ago
would be great!
@Stan_144 · A month ago
Excellent idea ..
@armartin0003 · A year ago
This comment is for the algorithm. As an AI enthusiast, I rarely see someone this close to the bleeding edge of this technology. Great content. I had been curious how we were going to get faster when chips are already at the molecular level - and BAM a heroine comes to the rescue. Hooray! I'd subscribe to anyone for this, but it makes me especially happy to see a young lady delivering the information. Pretty and smart CAN go hand in hand! I hope Anastasi becomes far more famous than the Kardashians of the world. Girls need more role models such as this.
@mistycloud4455 · A year ago
Combining ai and biology is important
@treborobotacon · A year ago
My like of your comment is also for the algorithm. Well put.
@panpiper · A year ago
Couldn't agree more.
@runeoveras3966 · A year ago
Well put! Support this 100%! 😊
@tedmoss · A year ago
Who are the Kardashians?
@Slavigrad · A year ago
Explain neuromorphic chip. ChatGPT Response:

A neuromorphic chip is a type of computer chip that is designed to mimic the structure and function of the human brain. It is inspired by the way neurons and synapses in the brain work and is intended to enable more efficient and flexible computation. Neuromorphic chips are typically made up of a large number of artificial neurons and synapses that are connected in a way that allows them to process information in a similar way to the human brain. They can be used to perform a wide range of tasks, such as image recognition, natural language processing, and decision-making.

One of the key advantages of neuromorphic chips is that they are highly energy efficient, as they are able to perform complex tasks using much less power than traditional computer chips. They are also able to operate in real-time and can learn and adapt to new information without the need for explicit programming.

There are a number of different approaches to designing neuromorphic chips, including using traditional semiconductor materials, such as silicon, and more novel materials, such as memristors. Research in this area is ongoing, and it is hoped that neuromorphic chips will eventually be able to perform tasks that are currently beyond the capabilities of traditional computer chips.
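The "artificial neurons" described above are often modeled as leaky integrate-and-fire (LIF) units. As a rough illustration (the threshold and leak constants below are made up for the example, not taken from any particular chip), a minimal LIF neuron in Python might look like:

```python
def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Minimal leaky integrate-and-fire neuron.

    The membrane potential integrates incoming current and decays
    (leaks) each step; when it crosses the threshold the neuron emits
    a spike (1) and resets -- discrete events instead of a continuous
    output, which is why no input means essentially no activity.
    """
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0  # reset after firing
        else:
            spikes.append(0)
    return spikes

# Weak, steady input: charge accumulates until one spike fires,
# then silence while there is nothing to integrate.
print(lif_neuron([0.4, 0.4, 0.4, 0.0, 0.0, 0.4]))  # → [0, 0, 1, 0, 0, 0]
```

The energy argument in the comment falls out of this event-driven behavior: work happens only when spikes occur, rather than on every clock tick.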
@therealb888 · A year ago
This is awesome. A neuromorphic approach to AI is something that I imagined even before joining engineering because it felt so intuitive. Here I am today with more questions after watching this video. This is the kind of research that excites me and makes me light up like a neuron lol.
@1q2w3e4r5t6zism · A year ago
A neuromorphic approach was started in 1996 at the Institute of Neuroinformatics in Zürich (Rodney Douglas).
@noldor__ · A year ago
As a long time software developer who has not yet worked with AI, this type of revolutionary news can be a little scary for me =) but it is also exciting and motivational to finally start learning AI so that I won't be obsolete in the near future.
@Crunch_dGH · A year ago
The only channel I default playback to .75x, sometimes .5x & still have to repeat multiple times. That’s value in my book!
@prabakark6791 · 9 months ago
😀
@JamesAwokeKnowing · A year ago
The format of this video is FANTASTIC. Way better than just the raw interview and better than just the pure "report". If your channel were a stock, I'd be buying. Thank you for investing so much effort in this video. You are bringing the AI, journalism, and media world in a beautiful direction. I hope the future is full of videos like this.
@AnastasiInTech · A year ago
Thank you :))
@janpancake · A year ago
This is an interesting take on this topic from the chip and architecture level that I was not expecting when I clicked on the video!
@moosethompson · A year ago
Two points you made that I found very interesting. The first is the low power requirements. The second is how neuromorphic chips are targeting the distributed market as opposed to big machine-learning data centers. It seems to me that between these two factors this could make a huge impact across a wide variety of mobile devices. Maybe even stop people looking at their phones from stepping into traffic. Now wouldn't that be a wonder. :)
@JonS · A year ago
There are quite a few other companies working on spiking neural network chips (BTW, the idea dates back to 1952!). Prophesee in France is using their SNN for event-based vision. Sony even released a couple of image sensors with Prophesee's technology built in. It's going to be interesting to watch this over the next few years. Thanks for the video.
@narrativeless404 · A year ago
Lots of discoveries were made a long time ago and then forgotten for quite a while
@JamesAwokeKnowing · A year ago
I've been following neuromorphic computing since IBM's project over a decade ago. I feel a bit sad that they were 'ahead of their time', but they did validate the concept at scale in a "commercial" way (not a lab with wires all over) and inspire the industry. It's wonderful to see this tech now on the verge of coming of age, now that way more AI engineers are out there and commercial demand for AI in edge devices is high. Excellent video.
@narrativeless404 · A year ago
Well, most tech arises ahead of its time and fades away, either finding no use or being flat-out discarded. Then it gets brought up again when it becomes actually practical and finds demand.
@sirus312 · A year ago
@@narrativeless404 TLDR yolo IBM/INTEL for the next decade?
@narrativeless404 · A year ago
@@sirus312 Uh, what?
@accesser · A year ago
Beautiful holidays, that's very nice. Thank you for your videos. Hoping for lots more next year.
@YogeshBaisyet · A year ago
Didn't Intel just announce the Brainchip Akida IP foundry deal? What does that mean?
@AnastasiInTech · A year ago
Intel will manufacture Brainchip's chips
@godsgod1677 · A year ago
Anastasi is like an ASMR AI to me. I sometimes don't know what she's saying as I'm literally just listening.
@416dl · A year ago
Another interesting exploration into this subject...looking forward to the coming year as the field and your channel grows. Buon Natale
@WXSTANG · A year ago
I wonder... about 20 years ago I reprogrammed an assembly line that had some problems with data sets and was slow. I set up a comparator to check the data coming in against the data retained, in 8-bit chunks, and if nothing changed I would skip resending the information until there was data that needed an update. It was massively faster and more efficient. It almost sounds like I set up a primitive neural net without even knowing it. BTW... it worked fantastically.
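The trick this comment describes, transmitting only when the incoming data differs from what was last sent, is essentially event-driven (delta) updating, the same principle spiking systems exploit. A hypothetical sketch (the function and chunk values are invented for illustration):

```python
def changed_chunks(previous, current):
    """Compare the incoming data against what was last sent, chunk by
    chunk, and return only the (index, value) pairs that actually
    changed -- unchanged chunks are never re-sent."""
    return [(i, cur)
            for i, (prev, cur) in enumerate(zip(previous, current))
            if prev != cur]

last_sent = [0x00, 0xFF, 0x10, 0x10]
incoming  = [0x00, 0xFE, 0x10, 0x11]
print(changed_chunks(last_sent, incoming))  # → [(1, 254), (3, 17)]
```

When most chunks are static, the amount of data moved drops to just the changes, which is where the speedup came from.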
@danko5866 · A year ago
Thanks for making this, I wanted to see exactly this type of content about Neuromorphic computers
@shadowarez1337 · A year ago
Thanks for these videos. I'm starting to think this will be the history-defining video series: once AI takes over, this series will show future civilizations how the AI takeover started 🤣 Great work as always; thank you for always giving us a glimpse into the future.
@AnastasiInTech · A year ago
Thank you :)
@yonaguska2050 · A year ago
This is one of your most intriguing videos to date. I’ve been following neural network architectures for a long time. My first digital circuit designs were asynchronous, to gain the speed advantage of propagation over clocking. This new architecture seems to have a natural advantage over conventional deep learning systems. I look forward to hearing more about its future.
@brandonvinson5255 · A year ago
Best channel for deep insight.
@Ty_Mathieson · A year ago
Informative as always, thanks.
@russadams3008 · A year ago
Another very interesting video. Thanks for your content. 👍👍👍
@sergefrancois7405 · A year ago
Thank you Anastasi for this interesting information.
@jesselistarseed · A year ago
This was an absolutely fascinating video Anastasi!!!! The future of the chip...the NN
@AlexGGener · A year ago
Pretty nice video!! Well organized and clear. From someone researching SNNs and spiking architectures, some extra info: the Loihi chip is not asynchronous per se. All the logic is clock-based since it is digital; it is asynchronous at the message-protocol level between neurons. Although it gives amazing improvements in power consumption, there is still room for SNNs to improve. The Akida chip is more focused on feedforward SNNs converted directly from CNNs for simple tasks such as classification. Also, only LIF neurons are supported. In the end, events are packed into frames and the temporal information of the spike is lost. However, they consume less than GPUs. I'm glad that you mentioned memristors and RRAM architectures; not a lot of people know about them. Just to mention, there are already some teams researching photonic architectures; check Ipronics and the UPV in Spain. Glad to see someone talking about neuromorphics on RU-vid! Keep going!!
@stevefreeman6646 · A year ago
Every now and then I stop by. I think it's because there are so few content creators for engineering and hardware on the deeper topics. The associations made with the chemistry field are interesting too. On spiking systems, I've noticed the similarity with how humans make snap decisions, a situation I'm familiar with. Maybe it was luck, but mine were usually the right ones. My point is, using these systems with others (e.g., GPT), along with expanded hardware speed and integration, could solve more complex problems like this, ones that appear unpredictable. QC? Anyway, thanks a bunch and have a great holiday.
@dchdch8290 · A year ago
Simply great video! Thanks for sharing these insights !
@Nilmoy · A year ago
amazing, I see there is an ingenious Revolution going on!
@WickedGamerCollector · A year ago
Whenever you start talking about AI... instant Terminator / Skynet flashback 😆 Wondering, could something like this happen? 🤔
@qjtvaddict · A year ago
Yes only a matter of when 😊 not if
@akanhakan · A year ago
You should have your own podcast.
@Intellectualodysseyai · A year ago
Thanks for always bringing quality information, I enjoy your videos and I feel your breakdowns are always easy to follow :)
@Karla-Flr · A year ago
Brilliant presentation! Your videos are magnificent 🎊 🎉
@MrCtfx · A year ago
Merry Christmas to you and the cat
@chad0x · A year ago
Anastasi's voice is so nice. I could listen to her all day talking about anything :)
@DanielWolf555 · A year ago
Interesting interview with the Intel guy.
@micwin2 · A year ago
This text was written in German and translated into English using chatGPT. Anastasi, thanks again for such a factual, competent and (can I say?) charming contribution. I feel validated by science. I am a software developer, administrator and AI enthusiast and now I finally have a name for where AI needs to go. You are right, classical approaches (and chatGPT probably also belongs to this category) follow a von-Neumann architecture, while the neuromorphic approach is more promising; I am a fan of decentralized structures and peer-to-peer anyway. I hope the name catches on because it hits the mark. If the project leaves the lab, Intel will probably have to make concessions to classical needs of chips and boards simply to lower costs for the end customer with third-party compatibility - if it ever gets to the end customer or prosumer! I would definitely be in if someone gave me a corresponding PCI-based card to play with :-) or even better - a USB or Lan box that I could connect to my Raspberry cluster,... my goodness, imagine if every household had an AI box, what that would change in the world,... Anyway, thanks again for the contribution, I always enjoy watching you 🙂
@narrativeless404 · A year ago
Yea, classical computers aren't doing well for AI because they weren't designed for it to begin with. They were just machines meant to execute commands sequentially and consistently, in sync with the entire system.
@jerimiahsmedley3722 · A year ago
I really love your videos! 🥰🥰🥰 Please keep up the hard work. I wish I could at least do coding, but I fail or am unable to problem-solve it. My education is limited and my background did not set me up for success. I wish I was as capable as you. I truly enjoy your work, your field of research, and your energy for this world of tech/engineering!
@neotower420 · A year ago
whoah, this is fascinating. Can't wait to see what other unique ideas come from these developments
@BruceNewhouse · A year ago
Another amazing episode, thank you. Question: we are steadily moving ahead with advances in every aspect of computer technology, but it appears it is only a matter of time before the newest technologies come together and achieve improvements significantly beyond expectations. Based on your knowledge, awareness, education, intuition, gut instincts, educated guesses, etcetera, when do you predict the next major advance in home and business computers will hit the mainstream market? Any thoughts on what the top-of-the-line home and business computers will be like in five years? I hope you have a Very Merry Christmas and Happy Holy Days.
@centuriomacro9787 · A year ago
Awesome video. I almost skipped it; it wasn't clear to me that it's about hardware.
@waynemorellini2110 · A year ago
This guy has it right. This follows what I came up with when I was around 10 years old, about how to emulate intelligence. Lots of progress over the decades. But my model had greatly expanded, and he hasn't talked about any of the additional stuff needed for intelligence! This is still complicated Lego blocks!
@emilythorne8181 · A year ago
Wonderful, What will happen in the future 💙
@mnomadvfx · A year ago
Spiking neural nets in silicon form are not new at all. IBM was at this back in 2015 on the 28nm process with their TrueNorth chip. It had 4,096 cores, each with 256 'neurons' that each connected to the others for 65,536 synapses per core. So 1,048,576 neurons and 268,435,456 synapses in a single chip, all pulling just 70 milliwatts. Also, it used SRAM, which isn't the most area-efficient memory, so there was huge room for improvement in a design predicated on an MRAM- or ReRAM-based compute-in-memory architecture.
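The TrueNorth figures quoted above are internally consistent and easy to check: 256 neurons per core, each fanning in from 256 axons, gives 256 × 256 synapses per core.

```python
cores = 4096
neurons_per_core = 256
# Each of a core's 256 neurons receives from 256 axons:
# 256 * 256 = 65,536 synapses per core.
synapses_per_core = neurons_per_core * 256

print(cores * neurons_per_core)    # 1,048,576 neurons per chip
print(cores * synapses_per_core)   # 268,435,456 synapses per chip
```

Both products match the numbers in the comment (1,048,576 and 268,435,456).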
@LorenzoVDH1996 · A year ago
This is so exciting, what a time to be alive.
@Ryan256 · A year ago
Because timing, tempo, and cadence are inherent to neuromorphic AI, I imagine it will be better at composing original music than traditional AI, which basically just repackages the music it was trained on that was created by human composers. Neuromorphic AI will have an intrinsic feel for music, as opposed to traditional AI that just shuffles around preexisting musical phrases.
@easye8354 · A year ago
Thank you and merry Christmas 🎅. As a shareholder of Brainchip, what is the significance of the recent partnership with Intel Foundries?
@doug3691 · A year ago
Merry Christmas!
@gama3181 · A year ago
One of the main topics of interest for me is organic computing. I expect that in the future we can use DNA to encrypt and store digital information, proteins to build logic gates and neural networks, and whole cells as computing nodes. There are a couple of very interesting scientific papers about this, but they are incipient and expensive.
@malindrome9055 · A year ago
That can also help us live for thousands of years! I want to live to see fusion fully take place and traveling outside of the solar system! I'm 15 so i hope that happens in my lifetime!
@Sven_Dongle · A year ago
@@malindrome9055 The way things are going, you would get to see the world decline into tribal ghet-toe stupidity for thousands of years, trapped on this mudball drowning in human feces. Sounds dandy.
@malindrome9055 · A year ago
@@Sven_Dongle not if AI takes over
@gama3181 · A year ago
@@malindrome9055 I hope so! I'm 26 and I also want to see the new arrival at the Moon and Mars... But certainly, these are bad times, especially for certain countries/peoples. I highly recommend this short film about the changing world :/ ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-xguam0TKMw8.html
@Sven_Dongle · A year ago
@@malindrome9055 Takes over what? The ghet-toe?
@MrFoxRobert · A year ago
Thank you!
@sullenstudent48 · A year ago
Neuromorphic computing is a type of computing that is inspired by the way the human brain works. It involves the use of hardware and software systems that mimic the structure and function of neurons and synapses in the brain. The goal of neuromorphic computing is to create more efficient and powerful computing systems that can perform complex tasks, such as pattern recognition, decision making, and learning, in a more energy-efficient and robust way than traditional computers.

One of the key features of neuromorphic computing systems is that they use parallel processing, which allows them to perform many tasks at the same time. This is in contrast to traditional computers, which typically use sequential processing and can only perform one task at a time.

Neuromorphic computing systems also tend to be more energy efficient than traditional computers because they use less power and are able to perform tasks using fewer transistors. This makes them well suited for use in devices that need to be small and portable, such as smartphones and wearable devices.

Overall, neuromorphic computing has the potential to revolutionize the way we think about computing and could lead to the development of new technologies and applications that were previously unimaginable.
@FredPlanatia · A year ago
Another excellent video on a fascinating topic. I think you combine the excerpts from interviews with your own commentary well. Mr. Davies is eloquent in his explanations, but I think I would get overloaded if there were long uninterrupted monologues from him. This way the information is quite palatable for lay persons.
@augustinelopez1508 · A year ago
Merry Christmas to all! Real-time, energy-efficient, high-adaptation and optimization solutions ring like jingle bells to me. And it sounds great. Keep up the excellence. 😎
@wplaat · A year ago
Thanks. Really interesting topic.
@Aboriginal_Alien · A year ago
You have become the number one source for the highest-quality info and current reporting on developments in one of the most important sectors in the world. Microchip technology is the new nuclear weapon. It is behind drones, which will be the future of warfare, but more importantly we will see that a country's sovereignty is tied to its economic health; ultimately, using chips for AI, blockchain technologies, transport, drones, etc. will prove to be what determines whether a country eats or is eaten by other countries.
@sirousmohseni4 · A year ago
Thanks for making this video.
@PythonAndy · A year ago
Good video again :)
@visiter127 · A year ago
We're not far away from the day when you intro someone on your channel as the head of a company or an engineer who is actually just something an AI has dreamed up and deepfaked on video, and we won't know the difference until you tell us lol. Happy Christmas, and hopefully 2023 will be an amazing year for this channel.
@markmalonson7531 · A year ago
Excellent presentation.
@Rising_Pho3nix_23 · A year ago
I swear as soon as I get a handle on what a neural network neuron is, they come up with something totally new haha
@KazenoniKakuremi · A year ago
We have to solve for inference and this sounds like an interesting approach
@tedviens1 · A year ago
Our brain is primarily a massively parallel pattern matching system. Our physical body sensors provide the input to our brain's big data set. Most data fields are sensory images. Other fields are subsets containing thoughts, feelings, and inference relationship processing. Our brain has a secondary purpose of being a physical body monitoring system with feedback control of body functions. All of this processing by our brain is output at varying levels of awareness to our mind. Our mind is not located within our brain. Our mind is the envelope that wraps around our awareness, our sense of self. It is from our awareness that we relate to other beings and the physical world we share with them. It is within our awareness that we hold an imperfect knowledge of our self and our stumbling relationship to everything outside ourselves. The functioning of our mind is not indicated on any EEG fMRI, PET Scan, or other brain activity tool.
@imconsequetau5275 · A year ago
MRI scans clearly show that decisions are made subconsciously first, and then we consciously become aware of the decisiveness -- as we act. A great deal of practice is required to bring decision making to a methodical slow conscious act of listing the merits and relative weights of each input. This is not conducive to survival when snap decisions are required, but it is imperative for engineering.
@tedviens1 · A year ago
​@@imconsequetau5275 You are simply giving a great example of how lies are built in science. MRI, fMRI, PET scans and other brain activity sensing systems merely reveal brain activity localized to areas in the brain determined to be specific to certain thought activity of the subject. These regions were first determined by invasive wire stimulation and expanded by thousands of hours of studies of brain activity monitoring with external stimulation of photos, sounds, external spoken remarks, or requested subject thought phrases. It is pure vanity for any researcher to declare that the measured activity is related to the brain's formation of a specific thought. Just as it is vanity for me to assert that all the measured activity, cognitive and pre-cognitive, is limited to pattern matching activity around the topic generated or about to be generated in the subject's mind, which is external to the brain.
@springwoodcottage4248 · A year ago
Super interesting & well presented, but until we see some practical applications it is impossible to know whether these architecture ideas are practical & powerful or just very early stage technology developments that greatly lag conventional architecture in practical performance. Ideally we would need to see direct comparison on tackling several edge computational tasks between the architectures. Maybe such data exists, but as of now I haven’t seen it & that makes me suspicious about the practicality of these approaches. Thank you for sharing!
@fluiditynz · A year ago
Scary changes for anyone hoping for a stable future!
@cm24624 · A year ago
This guy sounds exactly like the guy Ian Malcolm from Jurassic Park was talking about: "Your scientists were so preoccupied with whether they could, they didn't stop to think if they should."
@MozartificeR · A year ago
You have your space man in the background. Very nice:)
@solosailorsv8065 · A year ago
SUBBED: My senior project was a digitally controlled network of op-amps. 32 op-amps with 3 digital potentiometers each was a big board (in the '80s I used resistor ladders). It could be programmed to perform nearly instantaneous analog calculations/processing, and could feed back to 'weight' each node on the fly. Do you know of anyone pursuing this approach? The new chip stacking can make these nodes as dense as FPGAs. (Op-amps and digital are different FAB processes.) (PS, I've had so many patents stolen I don't care anymore and just went SAILING!, LoL)
@solosailorsv8065 · A year ago
@@marmosetman That's an interesting (worthy) application: a digitally controlled analog array might be used to create the training data across a Monte Carlo of variable analog inputs for AI training extremely quickly
@kipling1957 · A year ago
Such temporal filters could be a significant step towards relevance realization while avoiding combinatorial explosion in novel complex environments.
@Waynesification · A year ago
You mean Chuck Moore's asynchronous patent, which a lot of companies, including Intel, were signed onto. Which Chuck used to design his low energy recied holding processors and processing arrays. You could simulate this AI network type on that fully.
@BanXxX69 · A year ago
First only comment🤓🤘🏽🥺❤️🤗 This is insanely interesting :D
@danielhull9079 · A year ago
Yay, an unpredictable clock!
@MozartificeR · A year ago
Timing controls the context in which the data is interpreted by the brain, by the sounds of it. That makes sense as well, because we interpret things from multiple I/O devices connected to the central and peripheral nervous systems, e.g. touch, sight, sound, etc. Could you imagine if we did not do the same consciously when interacting with the world?
@dreamphoenix · A year ago
Thank you.
@michaelmindman5897 · A year ago
Good video!
@qkktech · A year ago
Neuromorphic analysis is OK, but it can be better when the neuromorphic node can somehow explain to a digital computer what it did. So a top layer must be built that can feed and retrieve info from such a computer.

Twenty years ago I built one very simple node on a Xilinx FPGA and used numbers sorted into Gray code to address the similarity of things. The tree built on top of Gray code is backed by a term known in psychology as priming, and it also explains why such a computer must sleep: to reorganize and optimize its coding. When you look at that structure, it classifies problems, and when there is no place to put a thing, it is stored in short-term memory and reorganized during sleep.

And what is amazing: all such computers have different ideas, since Gray code can have many variations and the coding is also different. It is like raising a child: when the child understands the first thing, it starts to build a similar tree that is unique and does not change as much as the computer's tree of ideas.
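Gray code, which the comment uses to address similarity, has the property that consecutive integers differ in exactly one bit, so numerically close values also map to codes that are close in Hamming distance. The standard reflected-binary conversion is a one-liner:

```python
def to_gray(n):
    """Convert a non-negative integer to its reflected Gray code.

    Adjacent integers map to codes differing in exactly one bit,
    which is what makes Gray code usable as a similarity-preserving
    addressing scheme."""
    return n ^ (n >> 1)

print([to_gray(i) for i in range(8)])  # → [0, 1, 3, 2, 6, 7, 5, 4]
```

Note the sequence: each neighboring pair (e.g. 3 → 2, 6 → 7) flips a single bit, unlike plain binary where 3 → 4 flips three bits at once.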
@towerparanormalinvestigations
It's definitely one of my new favorite channels!
@beautifulsmall · A year ago
Asynchronous, capacitance, AC-coupled: a far more direct way than event cameras. Mythic is back; very happy to see it. The research will soon be keeping an eye on us.
@TonyDiCroce · A year ago
It's interesting... when I learned NN I asked my teacher about the temporal issue and he had no idea what I was talking about.
@light7126 · A year ago
Anastasi is like an AI whispering in my ear
@YasinNabi · A year ago
The easiest way to succeed in any field is to work on it consistently; consistency is the key to success. I experienced it myself. By the way, thanks for the uploads, from a fellow creator.
@kingmaker007. · A year ago
Wish you all the best. Nicely reported.
@darrellrobinson707 · A year ago
Happy holidays
@josephmcmanus6385 · A year ago
Thanks!
@AnastasiInTech · A year ago
Thank you !
@griffnut666 · A year ago
My first robot will have Anastasi’s voice
@mistavoid · A year ago
now to find a way to not end up like the Aperture Science test subjects
@lil-zeta · A year ago
If you pay close attention you might spot the memory stitch.
@martin-fc4kk · A year ago
awesome content!
@orion7592 · A year ago
Wow! Back in the days we used to call this type of research "biomimicry". So, essentially, we tap into the way our brains function, and design algorithms around the synaptic actions. Gee wizz - nature is still light years ahead in efficiency and efficacy of function. Okay - not light years, maybe, but impressively advanced.
@silvomuller595 · A year ago
What a time to be alive
@2pist · A year ago
We get to witness the creation of our own prison.
@nitinhshah · A year ago
So much advancement! Which companies will bring advanced chips to market first? Nvidia seems to be ahead, but IBM, Intel, Apple, Adobe, AMD all seem to be forging ahead ...
@benh7475 · A year ago
Thanks for everything you do.
@callejondorado · A year ago
You cannot say that in our brain "memory is distributed". Scientists still don't know, or cannot agree on, what memory is, let alone whether it is distributed. Memory and thoughts cannot be conflated with the brain, and there is no certainty that they are generated by the brain. Some people even say that consciousness, including memory, exists without the brain, for instance when a person dies and their spirit remains around.
@imconsequetau5275 · A year ago
Nonsensical conclusion. *Memory* and *Consciousness* are emergent properties of correctly functioning living brain tissue, and *they do not exist* in the absence of life.
@coolninjabroguy · A year ago
it seems like making it easy to talk about is key. for memory and people
@sswpp8908 · 10 months ago
I've watched a bunch of Mike Davies videos. The tech looks promising. My only doubt is that Intel is so focused on catching up in the CPU market that I fear Loihi will never get enough attention from leadership to become any more than a research device.
@konstantinavalentina3850 · A year ago
I'm still of the opinion that we're likely to see less monolithic physical-layer architecture philosophies in the future, as opposed to more composite systems that exploit the most efficient use of chip/architecture types for the tasks and operations they're best suited to. For instance, we may see a resurgence in, and growth in the developmental sophistication of, analog systems, and those systems get bolted onto a larger system that's also running neuromorphic, quantum, and other architectures.

A composite system like this would be even more brain-like, in that a human brain isn't a solid mass of undifferentiated grey matter. Our brains have many different systems, all evolved to perform their tasks efficiently. Motor neurons aren't going to play a huge part in reading a book. The neocortex is going to be better at neocortex work than the hippocampus or the optic chiasma, although all of those may play a part in working together. Hind-brain structures will be better and more efficient at hind-brain-related input and output than the frontal lobe.

In this sense, I suspect we'll see something similar in the future with the developmental sophistication of AI. After all, we already do some level of composite architecture in traditional computing, as we've got graphics cards dedicated to graphics, networking chips for internetworking, and everything else tied together and working as a whole with the CPU.
@compuglobalhypermeganet · A year ago
The memory is a morphogenetic field, around you and the earth. Chemistry, electricity and magnetism create your awareness.
@tedmoss · A year ago
Cybernetics goes back to James Clerk Maxwell in 1868; I read about it in 1954 in Norbert Wiener's 1948 book. There is nothing new under the sun. Essentially, you learn from nature and apply that to machines, then apply that back to nature... It's an endless loop.
@luizmt2 · A year ago
I just like this voice!
@MM-eu9hm · 11 months ago
Absolutely amazing...what happens when AI meets quantum computing?🤪👀
@johnmanderson2060 · A year ago
Great video 😊 A directional mic or tie mic would be great, as it would suppress the room ambience. Can you give it a test? Merry Christmas 🎄
@dadsonworldwide3238 · A year ago
After being raised in the wake of the industrial revolution's dogmatic mindset, it's wonderful to see all this idealism being put into testable practice right before our very eyes. Out with the dark, bland materialism and hello to the revival of the idealistic age. lol
@ye849 · A year ago
Neuromorphics are not what you think they are. They are meant to solve specific intuitive tasks like pathfinding, walking, and keeping balance. They are not meant to replace complex thinking. That is still ML.