
The AI Hardware Problem 

New Mind
612K subscribers
527K views
▶ Check out Brilliant with this link to receive a 20% discount! brilliant.org/NewMind/
The millennia-old idea of expressing signals and data as a series of discrete states ignited a revolution in the semiconductor industry during the second half of the 20th century. This new information age thrived on the robust and rapidly evolving field of digital electronics. The abundance of automation and tooling made it relatively manageable to scale designs in complexity and performance as demand grew. However, the power consumed by AI and machine-learning applications cannot feasibly continue to grow as-is on existing processing architectures.
THE MAC
In a digital neural-network implementation, the weights and input data are stored in system memory and must be fetched and stored continuously throughout the sea of multiply-accumulate operations within the network. As a result, most of the power is dissipated in moving model parameters and input data to and from the arithmetic logic unit of the CPU, where the actual multiply-accumulate operation takes place. A typical multiply-accumulate operation within a general-purpose CPU spends more than two orders of magnitude more energy on this data movement than on the computation itself.
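To make the bottleneck concrete, here is a minimal sketch (not from the video; the function name is my own) of what a single neuron's output reduces to. Every iteration of the loop is one multiply-accumulate, and in a stored-program machine each one implies fetching a weight and an input from memory before the cheap arithmetic can run:

```python
def neuron_output(weights, inputs):
    """Chain of multiply-accumulate (MAC) operations for one neuron."""
    acc = 0.0
    for w, x in zip(weights, inputs):  # each step: two memory fetches...
        acc += w * x                   # ...then one cheap multiply-accumulate
    return acc

print(neuron_output([0.5, -1.0, 2.0], [1.0, 2.0, 3.0]))  # 4.5
```

The arithmetic is trivial; the cost lies in the memory traffic the loop implies, which is exactly what analog and in-memory approaches try to eliminate.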
GPUs
Their ability to process 3D graphics requires a large number of arithmetic logic units coupled to high-speed memory interfaces. This characteristic inherently made them far more efficient and faster for machine learning, allowing hundreds of multiply-accumulate operations to be processed simultaneously. GPUs tend to use floating-point arithmetic, representing a number with 32 bits split into a mantissa, an exponent, and a sign. Because of this, GPU-targeted machine-learning applications have been forced to use floating-point numbers.
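The three-field layout of a 32-bit float can be inspected directly. This illustrative snippet (the helper name is my own) unpacks an IEEE-754 single-precision value into its sign, 8-bit biased exponent, and 23-bit mantissa:

```python
import struct

def float32_fields(x):
    # Pack as IEEE-754 single precision, then pull out the three bit fields.
    bits = struct.unpack(">I", struct.pack(">f", x))[0]
    sign = bits >> 31                # 1 bit
    exponent = (bits >> 23) & 0xFF   # 8-bit biased exponent (bias 127)
    mantissa = bits & 0x7FFFFF       # 23-bit fraction
    return sign, exponent, mantissa

print(float32_fields(-1.5))  # (1, 127, 4194304): -1.5 = -(1.1b) * 2^0
```

All 32 bits participate in every multiply, which is part of why full floating-point MACs are energy-hungry compared with the reduced-precision formats discussed next.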
ASICs
These dedicated AI chips offer dramatically more data movement per joule than GPUs and general-purpose CPUs. This came as a result of the discovery that, for certain types of neural networks, a dramatic reduction in computational precision reduces network accuracy only slightly. However, it will soon become infeasible to increase the number of multiply-accumulate units integrated onto a chip, or to reduce bit-precision further.
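The precision reduction these chips exploit is typically a linear quantization of weights to small integers. A minimal sketch of symmetric int8 quantization (the function and values are illustrative, not from any specific chip):

```python
def quantize_int8(weights):
    # Symmetric linear quantization: map the largest |w| onto the int8 limit 127.
    scale = max(abs(w) for w in weights) / 127.0
    return [round(w / scale) for w in weights], scale

w = [0.02, -0.5, 0.31, 0.9]
q, scale = quantize_int8(w)
print(q)                          # small integer codes in [-127, 127]
print([qi * scale for qi in q])   # dequantized values approximate w
```

Each multiply now operates on 8-bit integers instead of 32-bit floats, and the reconstruction error stays within half a quantization step, which many networks tolerate with little accuracy loss.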
LOW POWER AI
Outside the realm of the digital world, it is known definitively that extraordinarily dense neural networks can operate efficiently on small amounts of power.
Much of the industry believes that the digital aspect of current systems will need to be augmented with a more analog approach in order to take machine-learning efficiency further. With analog, computation does not occur in clocked stages of moving data; instead, it exploits the inherent properties of a signal and how it interacts with a circuit, combining memory, logic, and computation into a single entity that can operate efficiently in a massively parallel manner. Some companies are beginning to examine a return to the long-outdated technology of analog computing to tackle the challenge. Analog computing manipulates small electrical currents via common analog circuit building blocks to do math.
These signals can be mixed and compared, replicating the behavior of their digital counterparts. However, while large-scale analog computing has been explored for decades for various potential applications, it has never been successfully executed as a commercial solution. Currently, the most promising approach to the problem is to integrate programmable analog computing elements into large arrays that are similar in principle to digital memory. Once the cells in an array are configured, an analog signal synthesized by a digital-to-analog converter is fed through the network.
As this signal flows through the network of pre-programmed resistances, the currents are summed to produce a resultant analog signal, which can be converted back to a digital value by an analog-to-digital converter. Using an analog system for machine learning does, however, introduce several issues. Analog systems are inherently limited in precision by the noise floor. Much like with lower bit-width digital systems, though, this becomes less of an issue for certain types of networks.
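The physics of such a resistive array can be sketched numerically. In this illustrative model (the numbers are made up), each cell's conductance plays the role of a weight; Ohm's law at each cell and Kirchhoff's current law at each column wire perform a vector-matrix multiply in a single analog step:

```python
# Conductance of the cell at row i, column j plays the role of weight G[i][j].
G = [[1.0, 0.2],
     [0.5, 0.8],
     [0.1, 0.4]]                 # programmed conductances (arbitrary units)
V = [0.3, 0.6, 0.9]              # row voltages synthesized by the DAC

# I_j = sum_i V_i * G[i][j]: Ohm's law per cell, currents summed per column.
I = [sum(V[i] * G[i][j] for i in range(len(V))) for j in range(len(G[0]))]
print(I)   # column currents read out by the ADC, approximately [0.69, 0.90]
```

In the digital simulation this is just a dot product per column; in the analog array the same sum happens implicitly, with no clocked data movement between the "memory" (the resistances) and the "ALU" (the wires).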
If analog circuitry is used for inferencing, the result may not be deterministic and is more likely to be affected by heat, noise, or other external factors than in a digital system. Another problem with analog machine learning is explainability. Unlike digital systems, analog systems offer no easy way to probe or debug the flow of information within them. Some in the industry propose that a solution may lie in using low-precision, high-speed analog processors for most situations, while funneling results that require higher confidence to slower, high-precision, easily interrogated digital systems.
SUPPORT NEW MIND ON PATREON
/ newmind​
SOCIAL MEDIA LINKS
Instagram - / newmindchannel​

Published: 12 Feb 2021

Comments: 1.1K
@NewMind (3 years ago)
▶ Check out Brilliant with this link to receive a 20% discount! brilliant.org/NewMind/
@calholli (3 years ago)
Your videos are like an endless competitive battle between my comprehension and your articulation; may this war never find peace.
@NoName-de1fn (3 years ago)
I really like your videos although I would enjoy them even more if you also added captions.
@davidhollenshead4892 (3 years ago)
@@NoName-de1fn Same here, as I already had to use captions while watching movies, and now long covid is making my hearing even worse...
@davidhollenshead4892 (3 years ago)
The connectionist solution will work, by having many small CPUs with limited memory, for each node, which are connected to only neighboring CPUs...
@NoName-de1fn (3 years ago)
@@degenerate-GEEZER Well I guess you're right, but I am not a big fan of auto-CC.
@mickelodiansurname9578 (3 years ago)
The brain.... billions of calculations per second... powered on a subway sandwich!
@krasimirgedzhov8942 (3 years ago)
I don't think the exact process of the brain is comparable to computing. I imagine it's something more complex.
@mihailmilev9909 (3 years ago)
@@krasimirgedzhov8942 nah I think it's just a big neural network
@mihailmilev9909 (3 years ago)
@@krasimirgedzhov8942 that's kinda where the name comes from isn't it lol
@krasimirgedzhov8942 (3 years ago)
@@mihailmilev9909 it's a name given to one of the most complex softwares we have. It's only inspired by the structure of neurons, it doesn't have the exact same process as far as we know.
@serdarcam99 (3 years ago)
If it's powered by a subway sandwich it's not going to calculate billions of things per sec, it's going to be much less.
@raphaelcardoso7927 (3 years ago)
I'm applying to do a phd exactly in this field. Amazing video! Update: I was accepted!
@santoshmutum3263 (2 years ago)
I am also writing my research proposal in this topic for PhD... Not accepted yet
@taibanganbakonjengbam6902 (2 years ago)
@@santoshmutum3263 Where did you apply?I'm Manipuri anyway.
@santoshmutum3263 (2 years ago)
@@taibanganbakonjengbam6902 Japan
@taibanganbakonjengbam6902 (2 years ago)
@@santoshmutum3263 Good Luck Brother👍👍
@santoshmutum3263 (2 years ago)
@@taibanganbakonjengbam6902 thanks
@deltalight584 (3 years ago)
12:21 That comparison was brilliant. It ties in computing & neurology together. Low speed, high precision needed => Digital ("Slow system of thought") High speed, low precision needed => Analog ("Fast system of thought")
@Hollowed2wiz (1 year ago)
And what would quantum computation give ?
@primenumberbuster404 (3 months ago)
​@@Hollowed2wiz a nightmare
@tannerbuschman1 (3 years ago)
the idea of an AI being inherently impossible to debug or decipher is really cool and scary, science fiction was not far off on that one.
@aidanquinn1549 (3 years ago)
We take for granted the amount of info we know about each other (human AI). I can guarantee you that when you last had a conversation about a specific feeling (whether is love with your spouse, hate towards something, how horiffic a scary movie was...) that you did not settle on the same exact emotion. A.K.A: you don't even know how to debug or understand what is going on behind any human's eyes right now!
@jss7668 (3 years ago)
But humans are!
@aidanquinn1549 (3 years ago)
@@jss7668 scary, yeah, but the most beautiful and fascinating things I've ever seen
@En_theo (3 years ago)
And this is how an AI becomes self-aware, hides its true intentions from you and goes all Skynet when you least expect it. Sci-fi was not far off on that one either.
@UNSCPILOT (3 years ago)
There are projects trying to find ways to break down and comprehend how learning algorithms work, it actually uses the same program that SETI@HOME did to allow people to donate processing power to help the project in testing and breaking down how the algorithms work
@gordonlawrence1448 (3 years ago)
Actually Radar computers in the 1950's were analogue. I was one of the last people at my college to be taught both analogue and digital computing. Add, Subtract, Multiply, Divide, integrate and differentiate can all be done with a single op-amp. The problem is Nyquist noise, and issues with capacitor dielectrics such as dielectric absorption and leakage. With a digital system you can just vastly over-sample add them all up then divide by your number of samples to reduce effective noise. You don't get that choice with analogue.
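The over-sample-and-average trick described in this comment is easy to demonstrate numerically. In this sketch (with made-up numbers: a true value of 1.0 and Gaussian noise of σ = 0.1), averaging N samples shrinks the effective noise by roughly √N:

```python
import random

random.seed(1)
true_value = 1.0
noise_sigma = 0.1
N = 10_000

# Over-sample: take N noisy readings, add them all up, divide by N.
samples = [true_value + random.gauss(0.0, noise_sigma) for _ in range(N)]
estimate = sum(samples) / N

# The error of the mean is on the order of sigma / sqrt(N) = 0.001,
# far below the noise of any single reading.
print(abs(estimate - true_value))
```

This is the choice a digital system gets that a pure analog datapath does not: trade sampling time for effective precision after the fact.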
@naimas8120 (3 years ago)
Another masterpiece from New Mind! Never fails to entertain while teaching.
@XBONESXx (3 years ago)
Your avatar is a masterpiece
@JohnDoe-zs6gj (3 years ago)
That energy comparison between our brain and our best processors is incredible. It's amazing the efficiency evolution can develop given enough time.
@garrysekelli6776 (3 years ago)
Computers are weak. They will never even beat a human at Chess.
@hedgehog3180 (3 years ago)
Evolution is the most aggressive optimization function in the known universe and has been running for over 4 billion years. Every single animal alive today is optimized down to the smallest cells. It's really no wonder that human brains are both the most powerful computer we know of and have efficiencies that make everything else look like a joke.
@Eloign (3 years ago)
Computers don't happen by random processes. Neither did humans. Computers were created as were humans.
@ickebins6948 (3 years ago)
@@Eloign Sure, provide some proof for that. Will you?
@WERT2025 (3 years ago)
@@hedgehog3180 Yeah I feel like every cell of my armpit hair is 100% optimized
@amrohendawi6007 (3 years ago)
It amazes me how many different state-of-the-arts you perfectly and briefly cover in 10 minutes
@BlackholeYT11 (3 years ago)
Ooh, good to see this put into words and in a concise manner
@Jojobreack324 (3 years ago)
I developed an ASIC for AI acceleration as part of my bachelor's thesis and I must say this video is of very high quality. It is definitely an interesting approach to go back to using analog techniques.
@lidarman2 (3 years ago)
Well done. I played around with small neural nets using op-amps in the 90s and although I saw that it was kinda the way to go, I had trouble with drift due to integration bias and all sorts of noise--and of course training was super tedious. But I always thought that neural nets really need to stay in the analog world. Modern non-volatile memory seems to be a solution for training weights since you can put variable amounts of charge in a cell, very densely.
@forwardplans8168 (3 years ago)
Did you ever look at using Fuzzy-Set theory to improve decision accuracy? I used a new program called CLIPS , around that time period. It's time to review it again.
@lidarman2 (3 years ago)
@@forwardplans8168 I was doing a lot of fuzzy logic at that time too. Interesting times.
@ArneChristianRosenfeldt (3 years ago)
So the brain uses discrete values in the nerves. I may use analog within the cell. Opamps use analog values only. I tried very hard to understand analog multiplication for radio modulation and it is .. complicated. Satellite radio is energy efficient because it uses only one bit. AM radio uses a lot more energy ( for 8 bit signal/noise ). OpAmps are large compared to digital circuits. Flash memory is basically analog memory. We just use DACs and ADCs to only store discrete values in it. This is similar to the analog amplitude in DSL. ADC and DAC, especially for only 8 bits are fast. They are used for video equipment. So I don't know what the video want to claim there. I have read that the lsb could run on lower supply voltage because errors are not so bad there. There are always errors and mostly by supply voltage we decide how often we accept an error. Also those "half open" valves scare me. The nice thing about CMOS is that no change in state => no current drawn. I just chatted about the general problem of matching resources to tasks. That is an np-complete problem. So with different kind of transistors for tasks of different importance one opens a very big can of worms...
@adamrak7560 (3 years ago)
Digital beats out analog until you scale to the extremes where we are right now. So until now it did not make much sense to use analog NN chips. I have studied one such analog chip and was sad to see that modern digital deep submicron could beat it in every way possible. But that was 10 years ago; right now digital CMOS hardware is nearing its limits, so we may need a paradigm change.
@seraphina985 (3 years ago)
@@ArneChristianRosenfeldt Arguably flash memory cells while not binary in nature are still more digital as the defining characteristic of digital systems as opposed to analog is the quantization of the signal. That is to say, digital signals are interpreted by quantizing them into one of a finite array of buckets that each correspond to some arbitrary range of the physical input value. The consequence of this is that digital signals are highly accurate but their precision is finite and limited as every input signal is effectively rounded to fit into one of those finite buckets. In contrast, the precision of analog signals is as close to infinite as you can get within our universe though in practice accuracy is the limiting factor is in how well you can insulate the system from noise and how accurately you can measure the input signal. Granted even with an ideal isolated system and ideal signal measuring device any analog system in our universe is likely to have its precision limited by the fact fundamental particles in our universe have defined properties. But this is also arguably academic as the applications for a system that can process values more precisely than could ever physically be generated or represented in our universe due to the lack of any particle with a measurable quantifiable property small enough is rather limited. Short of us discovering some way to either change the laws of physics in some region of space in that manner or travel to universes where the laws of physics are different a system that precise if sufficiently accurate could solve any problem that could exist in our reality limited only by our ability to understand and specify the problem. Well if given sufficient processing time that is but it could process any and all possible states that could even exist within our universe which would, in theory, allow us to solve any problem that could exist in reality. 
Sure there would still be a fundamental limit on the maximum precision that could still be demonstrated by the fact we could imagine arbitrary problems that couldn't be represented with enough precision without some clever workarounds. Hell even beyond that it is likely there is a finite amount of particles a civilization could ever collect in a universe with a finite speed of light and there will always be some arbitrary number larger that could be imagined but still, the practical applications of dealing with values that could not be replicated within the physical limitations of the observable universe are rather limited. There is probably a limit to what insights can be gleaned from simulating things that are physically impossible to ever encounter or bring into being.
@Zpajro (3 years ago)
As a student in computer science, this is really interesting
@naimas8120 (3 years ago)
I'm a student of Information and Communications Technology. What do you think about the future of our field? Do you think it's really AI?
@olfmombach260 (3 years ago)
@@naimas8120 As a student of Computer Science I can definitely say that I have absolutely no idea because I'm dumb
@samik83 (3 years ago)
@@naimas8120 As a layman I'd say definitely yes. Just the last couple of years AI's made some big strides. When we get quantum computing up and running and pair it with AI the possibilities are endless...and scary
@Zpajro (3 years ago)
@@naimas8120 The problem is that the AI hype has come 3 times now, so predicting if this is the time it will really break through is quite hard to tell. Personally, I'm eagerly waiting for our machine overlords (as long as there is no human controlling the AI). And if we get a true general intelligence going, it would be interesting to see how a different, alien intelligence solves problems.
@ovoj (3 years ago)
@@Zpajro imagine the conversations with something that isn't human. Hopefully we reach that point in my lifetime
@fredoo6627 (3 years ago)
It's so annoying to discover channels like this and see they don't get the views they deserve.
@mitchellsteindler (3 years ago)
Its a fairly new channel
@georgf9279 (3 years ago)
@@mitchellsteindler Let's boost it with some engagement (comments) then.
@hedgehog3180 (3 years ago)
Definitely one of the best engineering channels on RU-vid.
@keashavnair3607 (3 years ago)
Well the problem is, there are 16,852 views, yet only 1.6K likes and 31 Dislikes and 149 comments. This world is full of consumer minded half curious morons. That's why.
@mitchellsteindler (3 years ago)
@@keashavnair3607 dude. Just stop and get off your high horse. People like what they like.
@ryansupak3639 (3 years ago)
Nice...so it seems like digital-style processors still do all the “housekeeping” tasks of the computer, but then there are these “analog resistor networks” that do specialized tasks like the implementation of convolutional neural networks. Makes me smile when “everything old is new again”.
@camelapodo (3 years ago)
I love the visuals used in your videos, they're always unobtrusive but fascinating.
@alexkuhn5078 (3 years ago)
4:30 I was kinda zoning out and I heard that as "50 to 100 pikachus"
@Bhatakti_Hawas (3 years ago)
I promise I understood everything he said
@kevinperry8837 (3 years ago)
Yes me too comrades
@xlnc1980 (3 years ago)
We all did!
@tymek200101 (3 years ago)
it is enough to be a 1st-year Computer Science student to understand all of the words and concepts
@Bhatakti_Hawas (3 years ago)
@@xlnc1980 Hey fellow DT fan 👋🏽👋🏽
@NoName-de1fn (3 years ago)
I understood something
@joel230182 (3 years ago)
"...analog circuitry" , that caught me off guard
@davidhollenshead4892 (3 years ago)
That is one solution, the other is the connectionist solution, having many small CPUs with limited memory, for each node, which are connected to only neighboring CPUs...
@mihailmilev9909 (3 years ago)
@@davidhollenshead4892 interesting... what is it called?
@this_is_japes7409 (3 years ago)
@@mihailmilev9909 mesh computing, i think, or at least it's mesh topology based.
@TauCu (3 years ago)
Or just building a type of FPGA. I think in the future however, that FPGA will be a combination of Electronics and Photonics. For NN I don't see how Photonics could be beaten for general purpose networks.
@am-i-ai (3 years ago)
It actually makes a lot of sense. There has been a recent resurgence in the interest of analog systems. I, for one, feel like we ditched that particular technology a little prematurely. I'd be willing to bet that we see some rather spectacular new analog-based technologies in the near future.
@godetaalibaba2522 (3 years ago)
This was a very interesting topic that I didn't really heard about before, thank you for the amount of work this video took you to make !
@marticus42 (3 years ago)
7:22 Never thought I would understand a statement like that. Good teaching
@joey199412 (3 years ago)
Great video especially the assembly multiply instruction and outlining it with an analogue computing method. This makes sense since analogue results are instantaneous and don't require a clock pulse and thus no memory storage between calculations as you can add and subtract analogue signals instantaneously.
@digicinematic (3 years ago)
Yes, I have vague memories of the memristor being touted as the missing passive component, or some such thing.
@johnzinhoinhoinho (3 years ago)
what really impresses me is the huge amount of knowledge in 13 minutes of video. Congrats for the content
@bits_of_michel (3 years ago)
This is one of the best RU-vid videos I've seen in my life. Incredible visuals and explanation. Thank you.
@lidarman2 (3 years ago)
You made a somewhat profound comment at 12:13. The essence of intuition versus analysis. From our vast experiences we develop intuitions that gives us that "gut feeling" but when it matters, we do rigorous analysis to confirm. RE: "Blink" Malcolm Gladwell.
@calholli (3 years ago)
Also it could be the difference in function from our creative right and analytical left brain.
@Nnm26 (3 years ago)
@@calholli that is bs btw
@calholli (3 years ago)
@@Nnm26 Well, even if only metaphorical. It still has value as a concept.
@kumarsuraj9450 (3 years ago)
My professor once said in class that the future is analog. We were in a dilemma thinking about what he actually meant. Now I see what he meant.
@Alimhabidi (3 years ago)
Future is quantum
@CrashTheRed (3 years ago)
@@Alimhabidi It's been mentioned that quantum computers are specific purpose machines that won't improve on everything a conventional machine does. It also requires a conventional machine to process a lot of the data. Sabine Hossenfelder made a number of videos on quantum computers and the direction they're heading. Maybe that might interest you, especially since she's a theoretical physicist.
@davidthacher1397 (3 years ago)
The future is not fully analog or quantum. Analog will work of a set of properties which manipulates energy in waves, aka signals. Digital is a very simple signal, it is currently very stable and cheap. We can do a lot with this simple signal. For if we do not master digital how are we to understand analog. There signal architectures which are analogous on digital. Most who study CS or ECE never learn this. Most if not all of CS's theories are wrong! Literally might as well study Psychology, if you want to be that wrong.
@CrashTheRed (3 years ago)
@@davidthacher1397 Ofc the future will be a combination of all of the above. But how are the CS theories wrong? This is a first for me, and I'd like to hear you explain it a bit
@this_is_japes7409 (3 years ago)
everything is analog if you dig deep enough.
@ericlotze7724 (3 years ago)
This is a great "under the hood" look of AI/Deep Learning Programs, awesome video, as usual !
@alengm (3 years ago)
7:00 triggers google assistant :D
@calholli (3 years ago)
That's by design.
@pacifico4999 (3 years ago)
Going back to the basics so we can move forward. This is a fascinating topic!
@questioneverything4633 (2 years ago)
We will never figure out computers until we properly master the harnessing of energy, especially electricity. We don't understand the fundamentals of things like this.
@somenygaard (3 years ago)
Ahh the mobile net 224, one of my favorite neural network accumulator modules.
@maximeaube1619 (3 years ago)
Very comprehensible explanations. It ran smoothly through my analog brain 🧠 much appreciated! Thx. I was curious about analog computing 10 years ago, didn't think it would make a come back with IA. Interesting field to follow for the decades to come
@NoName-de1fn (3 years ago)
I almost forgot about this channel and I'm very happy to discover that more content has been uploaded!
@CuthbertNibbles (3 years ago)
11:57 "They form a sort of black box with no means to verify the integrity of a result. This creates the dilemma of potentially unexplainable AI systems, creating issues of trust..." This is how the AI apocalypse begins. "Why'd that car run over that advocate?" "No idea."
@nipunasudha (3 years ago)
Exact same thing I thought.
@davidhollenshead4892 (3 years ago)
Using an AI to control a car is a waste of an AI... Besides, while autonomous aircraft or spacecraft is feasible technology, an autonomous car will never be "safe" due to pedestrians, cyclists, animals, etc. sharing the roads. This should be obvious by the weird accidents caused by cars like the Tesla decapitating the idiot occupant by driving under a truck and continuing on until it crashed into a house...
@nipunasudha (3 years ago)
@@davidhollenshead4892 lol they only need to be more accurate than a human driver. Doesn't need to be perfect. And the car structure and safety features are getting advanced by the day too. The sweet spot is closer than you think! 😁❤️
@starskiiguy1489 (3 years ago)
@@davidhollenshead4892 I wouldn't be so sure. What you say may be true for modern infrastructure, but autonomous vehicles if they catch on may change the way we view transportation infrastructure overall. I could personally see a future where few own cars we rideshare if we need to travel a long distance with a car, but other than that we create more walkable cities with more public transit. In such a future comparing autonomous vehicles on modern infrastructure and autonomous vehicles in future infrastructure may be comparing apples to oranges.
@HelloKittyFanMan. (2 years ago)
@@nipunasudha: *As accurate as...
@fr3zer677 (3 years ago)
Another amazing video! It's astonishing to me how many different topics are covered on this channel and how in-depth and interesting all of your videos are.
@MrBrightSide622 (3 years ago)
this was truly an amazing video to watch! very well explained and thought-provoking
@socialistaamador (3 years ago)
The animations on this video are just gorgeous.
@stage666 (3 years ago)
I feel good about myself that I know just enough about neural networks and computer engineering to somewhat understand what this video is talking about.
@am-i-ai (3 years ago)
We definitely made a rash collective decision when we decided that digital was to *replace* analog. I would not be surprised at all to see a resurgence of analog systems ... we barely even explored that technical space. There surely are future analog developments that will rock the foundation of technological advancement. Very well done :)
@blugaledoh2669 (2 years ago)
What is analog?
@onehouraday (3 years ago)
Quite interesting! Neurons in the brain actually act as a mixed analogue/digital system. The input is digital (action potential), they do analogue processing, and output digital again (action potential, it's either 0 or 1).
@icyfyer (3 years ago)
Thank you for this incredible video. It's obvious you (or someone) put significant time and effort into those graphics.
@naota3k (3 years ago)
What is the machine doing around 0:35? Is it extruding solder to bond the pads? This seems ridiculously precise and I've never seen it before, now I'm curious what this process is.
@justinmallaiz4549 (3 years ago)
good eye, never seen that myself
@lemlihoussama2905 (3 years ago)
It is a machine that uses gold wires to link between the integrated circuit in the chip and the chip's outside pins. This process is called "Wire Bonding" you can search it on youtube for more videos !
@naota3k (3 years ago)
@@lemlihoussama2905 Fantastic, thank you!
@nikolausluhrs (3 years ago)
Just gonna say, we can't really explain how digital neural networks are making decisions that well either.
@thewhitefalcon8539 (3 years ago)
yeah we just have an algorithm that randomly adjusts them until they give the answers we want
@Taladar2003 (3 years ago)
Which means we have no way to efficiently improve their performance. Doing almost what we want is no closer to doing exactly what we want than doing something completely different if we have no way to deliberately improve them in some iterative way.
@thewhitefalcon8539 (3 years ago)
@PolySaken We can understand why, in general, a neural network might be capable of detecting triangles. We can't understand why *that particular* neural network *is* capable of detecting triangles.
@thewhitefalcon8539 (3 years ago)
@PolySaken and what is that data? We don't know. We just know when you put a triangle in it says yes, and when you put in a square it says no. Also there's a megabyte of random-looking numbers involved.
@thewhitefalcon8539 (3 years ago)
@PolySaken We can see what each square millimeter contributes to the painting by looking at it. We can understand why this square millimeter is this colour. Not so with AI models!
@trumanhw (3 years ago)
Brilliant, the adjective -- not the noun ... made this video possible. (truly fantastic quality in every metric I can think of; THANK YOU!)
@haydenmaines5905 (3 years ago)
Oh that ending bit was fascinating and brilliant! Definitely subscribed
@CreeperSlenderman (3 years ago)
I have an idea for AI Emotions. We humans used to live in jungles and forests, biomes. in which we tried to survive and reproduce, for surviving our emotions are Fear, trust, confidence and loneliness. For reproducing it is love, attraction, and idk. so we would need to make an AI with "ADN" or atleast try to give it those feelings, but would have to be 2 AIs or else it won't be able to interact
@yelectric1893 (3 years ago)
What is the adn
@javiere.gonzalez1021 (3 years ago)
Good idea. Keep developing it
@NiffirgkcaJ (3 years ago)
This guy clearly needs more views and subscriptions.
@simepaul4882 (3 years ago)
What a greatly done video. The narration is so logical...
@gianmarcoguarnier2525
@gianmarcoguarnier2525 3 года назад
Awesome video. Linear explanations, solid arguments and intuitive animations
@jimmarburger611 2 years ago
Wow, amazing video. The progress we've made is unbelievable. Just in my lifetime I've seen a blistering pace of achievement. The first computer I played with didn't even have a video interface, lol. I've loaded data and run programs from punch cards. Now I play video games on a machine that I built that probably rivals all the computing power available to NASA during Apollo. It's somewhat ironic that machine learning may lead to the rebirth of analog computing. Except for specific applications, analog has been relegated to unwanted-stepchild status. Just saying, there's nothing wrong with analog; this video shows how it can be more efficient for machine learning.
@sknt 3 years ago
Great video, it pretty much sums up the current state of AI. There's still a long way to go until we can even compare AI to a "real" brain. The brain is an insanely complex electrochemical machine that evolved over millions of years.
@astrumespanol 3 years ago
Great video! I've learned a lot :)
@J_Stronsky 3 years ago
This is absolutely fascinating and something I would never have heard about if not for your channel. Thank you!
@gingerpukh7309 3 years ago
NASA's Apollo mission guidance and control analog computer design might be useful.
@yeahitskimmel 3 years ago
I thought of that Smarter Every Day video first thing.
@justajavajunky 3 years ago
If you could get the code of the digital package to ride the wave of the analog signal, then I wonder if that would give information about something going wrong 🤔
@kamana6435 3 years ago
Really well-put-together content; enjoyed that so much. I'm sure I read somewhere that all AI systems become like black boxes the more layers of complexity we add.
@DFPercush 3 years ago
I thought I was going to hear about interconnect paths in PLCs, but these are all interesting concerns. If people end up using analog, they'd better have some good shielding: a Faraday shell around the analog components, only digital in or out, maybe with some optical coupling. I've never heard of that approach; it will be interesting to see if anyone makes large-scale use of it.
@morkovija 3 years ago
Is this another gem of quality content? That we're getting for free? Oh my.
@nickandersonco 3 years ago
What an amazing channel. These videos are a gift to humanity. Thank you.
@Kevin_Street 3 years ago
Thank you for this video! You're really out there at the actual frontier of knowledge, coming back to teach us something truly new.
@jakub_simik 3 years ago
What's the music at 11:00? It sounds like something from Pink Floyd. Thanks.
@rupertgarcia 3 years ago
I don't need sleep. I need answers!
@davidg5898 3 years ago
ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-THihnuQJHF4.html
@jakub_simik 3 years ago
@@davidg5898 Thank you so much.
@fieryferret 2 years ago
The inherent precision floor of possible analog-driven neural net AI is now my headcanon for every single science fiction book/movie where a robot gains consciousness and starts acting unpredictably.
@MrFram 6 months ago
Analog is not required for this; existing AIs are already unpredictable due to the black-box nature of machine learning.
@psionicxxx 3 years ago
One question: where did you get the information from? Any particular sources you were using?
@bowesterlund3719 3 years ago
This was really good! Top work. Just rewatched it.
@derek8564 3 years ago
I knew my collection of Vacuum tubes would come in handy one day...
@hardrays 3 years ago
You saved them so you can crank the plate voltage up past 15 kV and self-annihilate with pizzazz.
@HelloKittyFanMan. 2 years ago
When did "vacuum" become a brand, to you?
@Texplainedeverythingdetailed 3 years ago
If someone starts using things like femtojoules, I believe them. No questions asked.
@DrumApe 3 years ago
Incredible stuff, thanks! This is so interesting.
@MrMrpenguinator 3 years ago
Great video! Can you please provide some links to references where you found the numbers for energy/MAC and MAC/s on the CPU/GPU/TPU?
@ramentabetai1266 3 years ago
Neuromorphic CPUs are likely the future for this. These special chips by IBM and Intel are already much more effective at neural-net tasks. IBM's goal is to build a system no larger than a brain that has the same number of connections as the real one.
@olfmombach260 3 years ago
We don't even remotely have the hardware manufacturing capabilities to do that.
@rupertgarcia 3 years ago
@@user-ee1hj7rk9l, look up "IBM TrueNorth Chip". They've been working on it for years now.
@augustovasconcellos7173 3 years ago
@@user-ee1hj7rk9l I'd say it's a bit too early to tell, but so far it looks like they won't. Quantum computers are really only good when their workload consists of doing the same thing over and over again. This is good for breaking encryption, searching through databases, and so on, but not for AI.
@WilliamDye-willdye 3 years ago
The music at 7:55 was also used in another good video about ML ( ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-3JQ3hYko51Y.html ). It's called "Atlantis", but now whenever I hear it I think of artificial neurons.
@sonofagunM357 3 years ago
At first I thought that song was from Alien: Isolation, but no; both sound pretty close though, wouldn't you say? ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-txjs5MpATUg.html
@WilliamDye-willdye 3 years ago
@@sonofagunM357 Heh. I can definitely hear similarities. Thanks for the link, BTW. I haven't played that game, but now if a Steam sale comes along I might get it just because the soundtrack is promising.
@muhammadizzat4008 2 years ago
Excellent video!! Well put! I am currently conducting research on oxide-based RRAM, tuning its properties for machine learning and neuromorphic computing applications.
@hikaroto2791 3 years ago
The background song at 8:27 is amazing; it reminds me of the Prometheus movie from the Alien saga.
@kxtof 3 years ago
So I'm not the only one who noticed.
@EweChewBrrr01 3 years ago
I have no idea why I thought I could watch this and understand what's going on. Haven't even had my morning coffee yet.
@glazzinfo6031 3 years ago
Sir, you are "Brilliant".
@Request_2_PANic 2 years ago
Depending on bandwidth limits and processing time, could error correction be included as part of the analog signals, to and from, to reduce the probability of issues to a negligible level?
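One simple form of the idea in this question — spending bandwidth to buy back reliability — is redundancy: transmit the same analog quantity several times and combine the copies. A purely illustrative sketch (the noise model, values, and `send` function are all made up, not from the video):

```python
import numpy as np

# Toy model: an analog line adds Gaussian noise to every transmitted value.
# Sending redundant copies and averaging shrinks the error roughly by
# sqrt(n_copies) -- a crude analog counterpart to digital error correction,
# paid for in bandwidth exactly as the comment suggests.
rng = np.random.default_rng(7)

def send(value: float, noise_sigma: float = 0.05) -> float:
    """Transmit one value over a noisy analog channel (simulated)."""
    return value + rng.normal(0.0, noise_sigma)

true_value = 0.6

single = send(true_value)                               # one noisy copy
redundant = np.mean([send(true_value) for _ in range(64)])  # 64 averaged copies

print(abs(single - true_value), abs(redundant - true_value))
```

Averaging only attacks random noise; it cannot correct systematic errors like device drift, which is one reason analog designs still pair with digital error-checking at the boundaries.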
@zyxonn 2 years ago
Amazing video, very informative. Keep up the high-quality content!
@latemhh5577 3 years ago
This really is a masterpiece.
@raykent3211 3 years ago
Excellent video, thank you. I don't think it's the analogue aspect that can make AI indecipherable. In a purely digital neural network, the trail of causation that resulted in certain weightings (and therefore a decision) is irreversible. I'm fascinated and worried by recent discussions of how a trained system could have inbuilt prejudice that can't be proven.
@faustin289 3 years ago
This is no different from how decisions are formed in the human mind. It has been observed that we (not sure who "we" is) make decisions, and our conscious self then tries to rationalize those decisions after the fact.
@fofopads4450 3 years ago
It is not indecipherable, it is just not easy to decipher, because you will end up needing more computational power than the AI itself consumes just to monitor what the AI is doing. That defeats its purpose.
@shazambarley2154 3 years ago
Very informative, thank you for the great content.
@hedgehog3180 3 years ago
I love your videos' ability to blow my mind and make me understand complex topics while seeing them from a new perspective.
@unchartedthoughts7527 3 years ago
0:00 - 1:45 *Oh man, I was thinking about that staring at a potentiometer. Get me some vacuum tubes, bois, we are going to Rome*
@user-pc5sc7zi9j 3 years ago
What is this "widespread availability of GPUs" he is talking about?
@adamrak7560 3 years ago
You can do clock-cycle-driven analog computation, and it can be very efficient too.
@JohannY2 3 years ago
Excellent video. Consider showing charts when you read out multiple numbers, to help the viewer visualise them.
@Lukegear 3 years ago
new mind hardware xD
@johndysard6476 3 years ago
Fb: #lock3dinthesh3d
@theonetruemorty4078 3 years ago
Quantum indeterminacy is required; there's no such thing as "artificial." What we call "consciousness" is the portion of a calculatory apparatus that dwells within a probability distribution, in a similar manner to which eyes dwell in the world of partial electromagnetic-spectrum wavelength variation. Analyses of visual-spectrum wavelengths are communicated to the visual cortex via a shared communication protocol; the visual spectrum itself does not dwell within the same domain as the eyes. In a similar fashion, what humans derogatorily refer to as the "subconscious" mind acts as an interpreter of data received from the "conscious" mind reporting from the front line of a probabilistic domain; the "subconscious" does not dwell in the same domain as the "conscious." The deterministic informs the probabilistic, the probabilistic guides the deterministic, and feedback-loop paradox party time ensues; this is the strange and largely misunderstood process that we refer to as free will. (Disclaimer: don't listen to anything I say, I've clearly taken too many psychedelics. Cheers.)
@Halo6166 2 months ago
Can you explain more?
@bilalbzaka5152 3 years ago
This guy can make anything sound thrilling and fun to watch.
@shoam2103 3 years ago
3:05 is the most terse summary of current ML tech; rather accurate too, I'd think!
@generalx5220 3 years ago
Wow! I'm now woke AF in my understanding of AI.
@PunmasterSTP 1 year ago
It kind of blew my mind when I found out there were dedicated AI regions in microchips, but I guess that was only a logical next step. I'm not in the field and I doubt I'll ever use this knowledge, but I definitely find it fun and interesting to learn about. Thanks for the very high-quality video!
@vinfamous9226 3 years ago
Wow, this is truly amazing content, and so up to date!
@vtrandal 1 year ago
Bravo. Very nicely done. Thank you.
@tosvarsan5727 3 years ago
A very extraordinary piece of information, thanks man!
@philipo1541 2 years ago
Hi, can you give links to the places where you got your information?
@JonathanSwiftUK 3 years ago
Very informative, thank you.
@calholli 3 years ago
Your videos are like an endless competitive battle between my comprehension and your articulation; may this war never find peace.
@holthuizenoemoet591 2 years ago
So would it work to just create an analogue circuit that is optimized for MAC; in other words, a physical representation of the neural network that is also programmable? For example, change the resistance to tweak the weights of the network. The output of this network is fed into a digital CPU, which performs some final processing and allows for insight into the system's behaviour (debugging).
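The scheme this comment describes is essentially a resistive crossbar: each weight is stored as a conductance (1/resistance), inputs are applied as voltages, and Kirchhoff's current law sums the products on each output line, so the physics performs the multiply-accumulate. A toy numerical sketch of that idea — all sizes, names, and device values here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Desired network weights, and their encoding as programmed conductances.
weights = rng.uniform(-1.0, 1.0, size=(4, 8))  # 4 outputs, 8 inputs
g_max = 1e-4                                   # assumed max device conductance, siemens
conductances = weights * g_max                 # "resistance tweaking" step

inputs = rng.uniform(0.0, 0.5, size=8)         # input voltages, volts

# Each output line's current is the sum of G_ij * V_j contributions:
# the analog array computes the whole MAC in one step.
currents = conductances @ inputs               # amps, one per output neuron

# The digital back-end the comment proposes: digitize, rescale, activate.
activations = np.maximum(currents / g_max, 0.0)  # ReLU on recovered weighted sums

print(activations)
```

Real devices complicate this picture (negative weights need differential pairs, and conductances drift), which is part of why a digital CPU on the output side is useful for both post-processing and debugging, as the comment suggests.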
@user-cx2bk6pm2f 2 years ago
As soon as he mentioned "analog" I was waiting for the requisite mention of noise being the limiting factor. I'm impressed that he did indeed talk about that... but disappointed that it precluded the epic rant I was about to unleash 🤣
@thecrapadventuresofchesimo420 3 years ago
Great video, but you can definitely probe and work out what's going on in an analogue system (you just need probe points in the hardware design). While the probe techniques are for physically larger systems, use of appropriate test connections could make it viable for smaller ones. Also, it's funny that the things that sort of define biological intelligence (variability and lack of result verification) are seen as drawbacks to the use of analogue systems... BTW, both of these have methods of compensation in analogue systems.
@shift-happens 3 years ago
Easily one of the best channels on this platform! Thanks a lot :)