
Future Computers Will Be Radically Different (Analog Computing) 

Veritasium
15M subscribers
12M views

Visit brilliant.org/Veritasium/ to get started learning STEM for free, and the first 200 people will get 20% off their annual premium subscription. Digital computers have served us well for decades, but the rise of artificial intelligence demands a totally new kind of computer: analog.
Thanks to Mike Henry and everyone at Mythic for the analog computing tour! www.mythic-ai.com/
Thanks to Dr. Bernd Ulmann, who created The Analog Thing and taught us how to use it. the-analog-thing.org
Moore’s Law was filmed at the Computer History Museum in Mountain View, CA.
Welch Labs’ ALVINN video: • Self Driving Cars [S1E...
▀▀▀
References:
Crevier, D. (1993). AI: The Tumultuous History Of The Search For Artificial Intelligence. Basic Books. - ve42.co/Crevier1993
Valiant, L. (2013). Probably Approximately Correct. HarperCollins. - ve42.co/Valiant2013
Rosenblatt, F. (1958). The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain. Psychological Review, 65(6), 386-408. - ve42.co/Rosenblatt1958
NEW NAVY DEVICE LEARNS BY DOING; Psychologist Shows Embryo of Computer Designed to Read and Grow Wiser (1958). The New York Times, p. 25. - ve42.co/NYT1958
Mason, H., Stewart, D., and Gill, B. (1958). Rival. The New Yorker, p. 45. - ve42.co/Mason1958
Alvinn driving NavLab footage - ve42.co/NavLab
Pomerleau, D. (1989). ALVINN: An Autonomous Land Vehicle In a Neural Network. NeurIPS, (2)1, 305-313. - ve42.co/Pomerleau1989
ImageNet website - ve42.co/ImageNet
Russakovsky, O., Deng, J. et al. (2015). ImageNet Large Scale Visual Recognition Challenge. - ve42.co/ImageNetChallenge
AlexNet Paper: Krizhevsky, A., Sutskever, I., Hinton, G. (2012). ImageNet Classification with Deep Convolutional Neural Networks. NeurIPS, (25)1, 1097-1105. - ve42.co/AlexNet
Karpathy, A. (2014). Blog post: What I learned from competing against a ConvNet on ImageNet. - ve42.co/Karpathy2014
Fick, D. (2018). Blog post: Mythic @ Hot Chips 2018. - ve42.co/MythicBlog
Jin, Y. & Lee, B. (2019). 2.2 Basic operations of flash memory. Advances in Computers, 114, 1-69. - ve42.co/Jin2019
Demler, M. (2018). Mythic Multiplies in a Flash. The Microprocessor Report. - ve42.co/Demler2018
Aspinity (2021). Blog post: 5 Myths About AnalogML. - ve42.co/Aspinity
Wright, L. et al. (2022). Deep physical neural networks trained with backpropagation. Nature, 601, 549-555. - ve42.co/Wright2022
Waldrop, M. M. (2016). The chips are down for Moore’s law. Nature, 530, 144-147. - ve42.co/Waldrop2016
▀▀▀
Special thanks to Patreon supporters: Kelly Snook, TTST, Ross McCawley, Balkrishna Heroor, 65square.com, Chris LaClair, Avi Yashchin, John H. Austin, Jr., OnlineBookClub.org, Dmitry Kuzmichev, Matthew Gonzalez, Eric Sexton, john kiehl, Anton Ragin, Benedikt Heinen, Diffbot, Micah Mangione, MJP, Gnare, Dave Kircher, Burt Humburg, Blake Byers, Dumky, Evgeny Skvortsov, Meekay, Bill Linder, Paul Peijzel, Josh Hibschman, Mac Malkawi, Michael Schneider, jim buckmaster, Juan Benet, Ruslan Khroma, Robert Blum, Richard Sundvall, Lee Redden, Vincent, Stephen Wilcox, Marinus Kuivenhoven, Clayton Greenwell, Michael Krugman, Cy 'kkm' K'Nelson, Sam Lutfi, Ron Neal
▀▀▀
Written by Derek Muller, Stephen Welch, and Emily Zhang
Filmed by Derek Muller, Petr Lebedev, and Emily Zhang
Animation by Ivy Tello, Mike Radjabov, and Stephen Welch
Edited by Derek Muller
Additional video/photos supplied by Getty Images and Pond5
Music from Epidemic Sound
Produced by Derek Muller, Petr Lebedev, and Emily Zhang

Published: 19 Apr 2024

Comments: 13K
@anishsaxena1226 · 2 years ago
As a young PhD student in Computer Science, your explanation of how neural networks came to be and evolved, and the math behind them, is the cleanest and most accessible that I have come across. As I focus on computer architecture, I came to this video without much expectation of learning anything new, but I am glad I was wrong. Keep up the great work!
@deepblue3682 · 2 years ago
From USA?
@alex.g7317 · 2 years ago
There’s a reason he has 11, 000, 000, 000 subscribers after all 😏
@unstable-horse · 2 years ago
@@alex.g7317 Wow, that's more than the population of Earth. Where does he find all those subscribers??
@exoops · 2 years ago
@@unstable-horse Mars
@alex.g7317 · 2 years ago
@@unstable-horse omg, lol 😂. That was a typo! I meant 11, 000, 000!
@KraftyB · 2 years ago
Fun fact: that graphics card he's holding at 18:00 is a Titan Xp with an MSRP of $1200. He says it draws 100W, but it actually draws about 250W, so that tiny chip that only draws 3W is even more impressive.
@Matthew-rl3zf · 2 years ago
Let's hope these new analog chips can solve our GPU shortage problem 😂
@justuskarlsson7548 · 2 years ago
In general, when doing machine learning you are only using the CUDA cores of a graphics card, so the wattage never gets close to its maximum. A lot of the processing units are simply not being used, for example shaders and 3D processing units. On my GTX 1080 I sit between 60-90W out of 200W when doing PyTorch machine learning. So I think 100W out of a maximum power of 250W seems reasonable.
@chrisoman87 · 2 years ago
You can underclock GPUs; that's what they do in cryptomining to improve their profit margins. Depending on the chip, they can operate efficiently at a fraction of their nominal power.
@AC3handle · 2 years ago
man, I'm old enough to remember when a $1200 card was considered EX PENS >IVE< And not...'going price'.
@chrisoman87 · 2 years ago
@@AC3handle $1200 won't buy you enough power for a decent DL rig either. An RTX 3090 goes for ~$3000 USD.
@avinashkrishnamurthy6251 · 1 year ago
Analogue was never meant to die; the technology of that time was the limiting factor IMO. It appears like Analogue - Digital hybrid system can do wonders in computing.
@DigitalJedi · 2 months ago
I know this is an old comment, but I figured I'd add that as far as physical packaging goes, nothing stops us from putting one of these next to a conventional CPU. Cooling it would be the hard part as the temperature would swing the outputs by introducing noise. Might be better as an M.2 PCIE device.
@goldenhate6649 · 2 months ago
@@DigitalJedi It's incredibly unlikely this will ever expand into the home. These would likely be built entirely differently from traditional computers.
@DigitalJedi · 2 months ago
@goldenhate6649 As we've seen, they can be built on NAND processes already, which are widely adopted in consumer electronics. The use case provided of low-power wake-word and condition detection seems like a great application if they can find the right product in the consumer space.
@williamtell1477 · 1 year ago
AI researcher here, you did a great job on this. For anyone interested, the book Perceptrons by Minsky/Papert is a classic with many proofs of limitations and explorations of the limits of the paradigm. It still holds up today, and it's fascinating to read what scientists were thinking about neural networks during the year of the moon landing!
@musbiq · 6 months ago
Great recommendation. Thanks.
@christophertown7136 · 4 months ago
A Logical Calculus of the Ideas Immanent in Nervous Activity
@Septimius · 2 years ago
I see Derek is getting into modular synthesizers! Also, funny to see how the swing in musical instruments to go from analog to digital and back is being mirrored in computing generally.
@paradox9551 · 2 years ago
My first thought when he pulled out the analog computer was "Hey that looks like a modular synth!"
@toddmarshall7573 · 2 years ago
Witness Audio Modeling (search for it on YouTube).
@p1CM · 2 years ago
Music has always been an AI task
@theisgunvald4219 · 2 years ago
As a semi-professional music producer with almost half a decade of working with professional musicians I would agree - and this is mainly because people feel a lack of “soul” in music. Those small human errors that we’ve spent decades trying to get rid of with Autotune, Drum machines, Sequencers, digital synthesiser and digital samplers (the last two CAN create sounds that will always come out the same way as long as the input stays the same - however there are exemptions). This is probably something the people I know in the music industry refer to as “The generation-rule”, in brief the music today is a result of what our parents and grandparents heard combined with new technologies and pop culture. - If you’re interested in music and maybe want to stay ahead of the game look it up. Some refer to it as the “30 year rule” as well.
@PetraKann · 2 years ago
@@p1CM AI has no tasks
@jeffc5974 · 2 years ago
One of the first things I learned in Electrical Engineering is that transistors are analog. We force them to behave digitally by the way we arrange them to interact with each other. I'm glad there are some people out there that remember that lesson and are bringing back the analog nature of the transistor in the name of efficiency and speed.
@jasonbarron6164 · 2 years ago
At the expense of accuracy?
@JKPhotoNZ · 2 years ago
Well, semi-analogue. Don't forget the bias (voltage drop) before you get current amplification. Also, the claim that analogue computers are more power efficient than digital is pretty hard to back up. A $2 microcontroller can run on a few mA for the desired task, then sleep on µA. You'll need at least 5mA for an analogue computer to start with, and you can't make it sleep.
@danimayb · 2 years ago
@@JKPhotoNZ Great point. And with current nano-transistor technology, that efficiency (along with raw power) is going far beyond what a true analogue system could produce.
@rahulseth7485 · 2 years ago
Yeah, but then you'll never know which zone it's operating in, because amplification happens differently for different input parameters. And not all transistors from the same batch perform the same, i.e. it will lack repeatability (as Derek mentioned).
@mycosys · 2 years ago
The insoluble (even in theory) problems of analog are noise and signal integrity, which is why he didn't even mention them. This channel has gone to poop honestly.
@YolandaEzeagwu · 1 year ago
I had a business analysis course that tried to explain the perceptron and I didn't understand anything. I don't have a strong maths background. This video is pure genius. The way you explain ideas is amazing and easy to understand. Thank you so much. This is my favourite channel.
@NoahSpurrier · 1 year ago
I remember seeing an analog differential calculator in high school in my physics and electronics teacher’s lab. It was more of a museum piece. It was never used. RIP Mr. Stark
@elliott8596 · 1 year ago
To be fair, many of the tools we use are analog. We just don't call them "analog computers"... even though, they kind of are.
@rogerphelps9939 · 1 year ago
Exactly. Museums is where analog computers belong.
@certainlynotmalo1.0.06 · 1 month ago
@@rogerphelps9939 The words of someone who knows nothing but his own little world. And he is content with it. Honestly, I'm jealous. For real, stay that way or life will get an awful lot harder. I would give everything I have to acquire such luxury.
@5MadMovieMakers · 2 years ago
Hyped for the future of computing. Analog and digital could work together to make some cool stuff
@teru797 · 2 years ago
True AI is going to be the end of us. Why would you want that?
@kalindibang9578 · 2 years ago
@@teru797 True AI won't be possible for the next 200 years, and by then, if humanity keeps on living how they are, we ain't gonna survive anyway.
@michaelschiller8143 · 2 years ago
@@teru797 it would still take quantum computers to be able to have the memory necessary to run
@jpthepug3126 · 2 years ago
@@teru797 cool
@jonathanthomasjohn8348 · 2 years ago
@@teru797 we are already the end of us
@belsizebiz · 2 years ago
For amusement only: My first day at work was in 1966, as a 16-year-old trainee technician, in a lab dominated by a thermionic valve analogue computer (or two). These kept us very warm through the winter months and cooked us during the summer. The task was calculation of miss distances of a now-obsolete missile system. One day I was struggling to set up a resistor/diode network to model the required transfer function, but the output kept drifting. I found the resistors were warming up and changing value. Such are the trials and tribulations of analogue computing...
@_a_x_s_ · 2 years ago
Thus the temperature coefficient is very important for recent precision devices. And a high accuracy low ppm resistor is expensive, which is one of the reasons why it costs so much for high-end electronics instruments.
@mikefochtman7164 · 2 years ago
I was going to comment that one disadvantage of analog computers is keeping them calibrated. If you want a precise amount of 'voltage' or movement to represent a real-world value, you have to keep it calibrated. Older mechanical ones had wear/tear, electronic ones have issues as well.
@stefangriffin2688 · 2 years ago
Ah!? But what if the resistors were warming up, digitally?
@Cat-ir8cy · 2 years ago
@@stefangriffin2688 you can't have a digital resistor
@aravindpallippara1577 · 2 years ago
@@stefangriffin2688 Yeah, digital signals work with gates - on or off.
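The drifting-resistor problem described in this thread can be sketched numerically. This is a toy model, not any real device: the nominal resistance, the 100 ppm/°C temperature coefficient, and the inverting op-amp stage are all assumed purely for illustration.

```python
# Temperature drift of a resistor in an analog computing stage.
# Model: R(T) = R0 * (1 + alpha * (T - T0)), alpha = temperature coefficient.
R0 = 10_000.0   # hypothetical nominal resistance at 25 C (ohms)
ALPHA = 100e-6  # hypothetical 100 ppm/C temperature coefficient

def gain(r_feedback, r_input=10_000.0):
    # Ideal inverting op-amp stage: gain = -Rf / Rin.
    # If Rf drifts with temperature, the computed "multiply" drifts too.
    return -r_feedback / r_input

for temp_c in (25.0, 45.0, 65.0):
    r = R0 * (1 + ALPHA * (temp_c - 25.0))
    print(f"{temp_c:4.0f} C: gain = {gain(r):.4f}")
```

A 40 °C warm-up shifts the gain by 0.4%, which is exactly the kind of slow output drift the comment describes.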
@asg32000 · 6 months ago
I've watched a lot of your stuff for years now, but this is the best one by far. Great job of explaining something so complex, difficult, and relevant!
@zebaerum7 · 6 hours ago
I used this content and visuals for my Electronics Engineering final year technical seminar. I loved the content, and the way it's put together. Thanks for choosing the most interesting stuff to put in my feed.
@TerryMurrayTalks · 2 years ago
As a 70-year-old boomer my technical education involved building and testing very basic analog devices. Thanks for this video, it helped me to a better understanding of neural networks.
@magma5267 · 2 years ago
You must be really healthy because you don't even look close to 70! :D
@TerryMurrayTalks · 2 years ago
@@magma5267 Thanks for the heads up. I've got a good woman, 6 children, 8 grandchildren and a recently placed stent in my heart that keeps me going :)
@vedkorla300 · 2 years ago
@@TerryMurrayTalks Good for you my man! I am still 20 and don't know what to do in life :(
@vource2670 · 2 years ago
Yep, you're 70 mate.
@TerryMurrayTalks · 2 years ago
@@vedkorla300 You have plenty of road left to travel, follow what you love, enjoy the journey, don't let the bumps in the road stop you and if you can get a soul mate to share it with you it will all be good.
@PersonaRandomNumbers · 2 years ago
My professor always said that the future of computing lies in accelerators -- that is, more efficient chips that do specific tasks like this. He even mentioned analog chips meant to quickly evaluate machine learning models in one of his lectures, back in 2016! It's nice seeing that there's been some real progress.
@xveluna7681 · 2 years ago
That's pretty much where things have always been: using basic building blocks that do specific functions. A linear voltage regulator has the job of maintaining a constant output voltage for a given set of current levels and different input voltages. You can buy an op-amp and use resistors to make a circuit called a Schmitt trigger. Or you might just buy a Schmitt trigger from Texas Instruments and put it onto a board with less board space consumed. Or a Schmitt trigger might be embedded for free in certain other ICs (integrated circuits). The major computing engines I have seen so far have been effectively GPUs, CPUs, and FPGAs. Xilinx & Altera (now Intel) have specialized in making FPGAs. An FPGA's basic internal components are logic elements with flip-flops with reset, async reset inputs, a 4-input look-up table, etc. Cascade these to make larger units like a multiplexer, floating-point arithmetic unit, etc. It's programmable, so you can effectively emulate a worse-performing specialized CPU; a CPU is still more efficient at doing CPU-type functions. A GPU does specific stuff as well. The idea of doing analog computations honestly just sounds like another building block to add into a complex system. Only there simply hasn't been a large enough demand to require the generation of specialized hardware like what was described in this video. That one start-up sounds like it's developing a chip that will do a series of very specific functions and will need to be integrated into larger systems to accomplish a specific task.
@matsv201 · 2 years ago
Well, that has sort of always been the case. I don't know what the first accelerator was, but one of the fairly early ones was the FPU; we now just take it for granted. Sprite accelerators were also fairly early. Then graphics accelerators, then video decoders/encoders, then MMU accelerators, then 3D accelerators, then SIMD accelerators, then T&L accelerators, then physics accelerators, then ray-tracing accelerators, then deep learning accelerators.
@LundBrandon · 2 years ago
ASIC devices have existed for decades...
@calculator4482 · 2 years ago
@@LundBrandon They will soon become obsolete though, due to reconfigurable computing devices like FPGAs.
@LundBrandon · 2 years ago
@@calculator4482 FPGAs have also been around for decades, plus they draw more power. I'm a computer engineering student right now currently designing a CPU to be synthesized onto an FPGA. I'm not dumb.
@lc5945 · 1 year ago
I remember the first time I heard the term "deep networks", it was back in 2009 when I was starting my MSc (using mainly SVMs), a guy in the same institute was finishing his PhD and introduced me to the concept and the struggles (Nehalem 8 cores era)...the leaps in performance made in NN since then thanks to GPGPU are enormous
@HrLBolle · 8 months ago
Mythic's approach reminds me of the copper filament memory planes, with ferromagnetic rings representing the bits, used as memory for the AGC (Apollo Guidance Computer). The video this memory is based on was released by Destin, aka Smarter Every Day, and accompanied his and Linus Sebastian's meeting with Luke Talley, a former IBM employee and, at the time of the Apollo missions, a member of the teams responsible for the analysis, evaluation and processing of the telemetry data received from the Apollo instrument ring.
@SandorFule · 1 year ago
I am a process control engineer, born in '63. In the '80s we used analog computers to calculate natural gas flow for the oil and gas company. A simple flow computer was around 10 kilos, full of op-amps and trimmer pots. It was a nightmare to calibrate it. :)
@deang5622 · 1 year ago
Op amps with their offset voltages and input bias currents leading to inaccuracy. Sounds like a nightmare. Constant recalibration required?
@rogerphelps9939 · 1 year ago
Absolutely.
@victorblaer · 1 year ago
@@rogerphelps9939 just calculating the uncertainty at each step sounds like a nightmare.
@percutseituan · 1 year ago
but you can mix with digital control for adjusting and decision
@benoitroehr4100 · 1 year ago
I think NASA (or was it still NACA at the time?) was able to simulate flight characteristics with analog circuits too. I'm thrilled to see this tech coming back!
@suivzmoi · 2 years ago
As a NAND flash engineer, that bit about the usage of floating gate transistors as analog computers is interesting, particularly because in flash memory there is a thing known as "read disturb": even low voltages applied to the gate to query its state (like during a flash read) can eventually cause the state itself to change. You would think it is a binary effect where, if the voltage is low enough, it would just be a harmless read, but no: eventually there will be electron build-up in the gate (reading it many times at low voltage has a similar effect to programming it once at a high voltage). In this particular application, the weight would increase over time the more you query it, even though you didn't program it to be that high in the beginning. It's interesting because it's sort of analogous to false memories in our brains, where the more we recall a particular memory, the more inaccurate it can become.
@donkisiko · 2 years ago
Underrated comment!
@Xavar1us · 2 years ago
Absolutely love this comment! This has been on my mind for at least an hour now, the point you make is intriguing and a bit haunting, thanks for that!
@JeyeNooks · 2 years ago
Fkin right on!!
@Lassana_sari · 2 years ago
Very interesting.
@sampathsris · 2 years ago
Underrated comment. Then in Eternals style we will have to reprogram the memories of our servants every now and then.
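The flash-cell scheme this thread discusses, weights stored as conductances, multiplication done by Ohm's law, and accumulation done by Kirchhoff's current law, can be sketched numerically. The array size, conductance range, and noise magnitude below are made-up illustration values, not Mythic's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical crossbar: 4 input lines, 3 output lines.
voltages = rng.uniform(0.0, 1.0, size=4)            # inputs encoded as voltages (V)
conductances = rng.uniform(0.0, 1e-6, size=(4, 3))  # weights stored as conductances (S)

# Ohm's law per cell (I = G * V) and Kirchhoff's current law per output
# line (currents sum) give an analog multiply-accumulate: a matrix product.
currents = voltages @ conductances

# Analog readout is imprecise; model it as small additive Gaussian noise.
measured = currents + rng.normal(0.0, 1e-9, size=currents.shape)

print("ideal   :", currents)
print("measured:", measured)
```

The whole matrix-vector product happens "for free" in one step of physics, which is where the claimed power savings over shuttling digits through a multiplier come from.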
@bishalpaudel5747 · 10 months ago
This is a very well explained video on analog computing. Never could I have thought the topic of analog computing could be put into a 20-minute video with such phenomenal animation and explanation. Respect your work and effort to make science available to all for free. Respect 🙏
@Psrj-ad · 1 year ago
This makes me want Derek to talk about neural networks and AI-related topics a lot more. It's not just extremely interesting but also constantly developing.
@funktorial · 2 years ago
started watching this channel when I started high school and now that I'm about to get a phd in mathematical logic, I've grown an even deeper appreciation for the way this channel covers advanced topics. not dumbed down, just clear and accessible. great stuff! (and this totally nerd-sniped me because i've been browsing a few papers on theory of analog computation)
@gautambidari · 2 years ago
Absolutely. Love the way he covers the concept for everyone. Those who don't know in depth about it can still go away with a sort of basic understanding. And those who do understand it in depth will enjoy discovering new areas of invention that they can further explore. Looking forward to reading some papers on using analog computing in neural network applications
@victorymorningstar · 2 years ago
I'm smart too.
@mentaltfladdrig · 2 years ago
Same here. But I didn't go to high school, my life became a total mess and I haven't graduated whatsoever :)
@SteveAcomb · 2 years ago
“nerd-sniped” lmao I feel exactly the same and here I was thinking I was way ahead of the curve on alternate computing 😂 jokes on me
@melanezoe · 2 years ago
Freaked me out to see that opening analog plug board. That's how I learned programming in my first data processing class at Fresno State University, in 1964. Eerie to have that memory rise.
@Ozhull · 2 years ago
Damn you're old! Glad you're still kicking around
@jimmysyar889 · 2 years ago
@@Ozhull he’s only like 75 chill
@amaan06 · 2 years ago
Lol
@beesharp9503 · 2 years ago
Woo fresno!
@PaulJosephdeWerk · 2 years ago
I graduated Fresno State in 1993 with a BS in CS (after a stint in the military). I even took an artificial intelligence class. I still have my perceptron book.
@activision4170 · 3 months ago
Great video. Never knew this was a thing. Very useful. Might one day just be an extra part on the motherboard designed for fast approximation calculations
@di380 · 3 months ago
I agree, one point I was going to mention regarding analog computers is that they are susceptible to voltage fluctuations, environmental noise and the accuracy of your results are directly dependent on the accuracy of your equipment reading the output voltages. There is that, but this makes sense when talking about specific applications like this one 👌
@quietcanadian5132 · 2 years ago
I’ve been an engineer for 44 years. Great video. I actually worked on analog computers in the 70s when digital processing was still new. Never to this level though. Great job!
@apollochaoz · 2 years ago
🇨🇦🏳‍🌈
@raijuko · 2 years ago
Its amazing how fast all of this is evolving. Looking at this, and comparing it to facial recognition software in simple phone apps we have now really shows how much all of this has influenced what kids and teens easily use today.
@johndoh5182 · 2 years ago
I didn't catch the part where he quit talking about analog systems though when he went to the logic systems being used for matrix operations, because that was digital. There may have been analog inputs into the system, but there's an A to D conversion, and everything he showed at the end was strictly digital, so a bit misleading there. Current systems for AI are digital.
@GeovanniCastro666 · 1 year ago
@@raijuko Yes, but I still believe in God. And I am a fan of science experimenting and inventing.
@Ave117 · 2 years ago
This actually helped me a lot to understand how neural networks work in general. For me it was kinda like black magic before. It still is to an extent, but knowing that modern neural networks are essentially more complex multi-layered perceptrons helped a lot.
@chibicitiberiu · 2 years ago
Yes, indeed. Something I find fascinating are recurrent networks, where some neurons feed back into the network which would allow some information to be saved from one image to the next one. This would allow the AI to process things that change in time, like music and video. For example, if you're tracking a subject with a camera and he turns around, a recurrent AI would be able to continue tracking the subject.
@connorjohnson4402 · 2 years ago
Yeah, in the end some of it is kind of voodoo black magic though. I mean, they call them black boxes for a reason.
@Blox117 · 2 years ago
speak in english pls
@aladdin8623 · 2 years ago
The video seems to contain some quite biased info though. The top-5 error rate of 5.1% for humans is of course not accurate; if human beings were that bad, we would accordingly have much, much higher car accident rates. Those kinds of inaccurate percentages come from statistics based on captchas, and several conditions distort the results there:
- Human users often don't bring up the concentration and attention needed to solve captchas as well as they actually could. In fact they are angered by them and often just click quickly through them. In traffic, while driving a car, human beings are much more alert and make tremendously fewer mistakes. Here human beings still beat autonomous-driving computers by several orders of magnitude, measured in car accidents per million driving hours.
- The captchas often do not match actual human perception. The captcha images are often unclean, low resolution and distorted. In the real world humans perceive much higher quality from their surroundings than some crippled captchas; a clearer image increases recognition dramatically.
It is really crucial in educational videos from whom and from where you take your numbers. Science is not always as objective as we are told, especially when corporations fund it with their own interests. Other than that the video is quite interesting. I also wish many common misconceptions had been cleared up; for example, many people still believe that computers work like human brains. This is plain nonsense, mostly spread by science fiction. The brain still poses big mysteries to us, especially "the big problem of consciousness".
@anteshell · 2 years ago
It is still kind of voodoo or black magic. While the overall working mechanisms are well known and the output can be estimated based on the input, how the neural network exactly reaches to the answer is nearly impossible to inspect because of the sheer amount of variables. In essence, you feed a black box with something and you can expect it to give you a particular answer with some confidence, but no-one has any damn idea what exactly happens inside the black box.
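To demystify the "black box" from this thread a little: the "more complex multi-layered perceptron" view really is just repeated matrix multiplies and nonlinearities. A minimal sketch, where the layer sizes and random weights are arbitrary stand-ins for a trained model:

```python
import numpy as np

def relu(x):
    # Standard nonlinearity: pass positives, zero out negatives.
    return np.maximum(0.0, x)

rng = np.random.default_rng(1)

# Hypothetical network: 3 inputs -> 4 hidden units -> 2 outputs.
W1 = rng.normal(size=(3, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 2)); b2 = np.zeros(2)

def forward(x):
    # Each layer is a weighted sum (matrix multiply) plus bias,
    # passed through a nonlinearity: a stack of perceptron-like units.
    h = relu(x @ W1 + b1)
    return h @ W2 + b2

x = np.array([0.5, -1.0, 2.0])
print(forward(x))  # two raw output scores
```

Those `x @ W` products are exactly the operations an analog chip can do in one physical step, which is why matrix multiplication dominates the video's discussion.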
@jasonturner1045 · 1 year ago
How have I not found this channel before now?? Fascinating topics.
@fierybones · 3 months ago
I happened to watch this just after playing with a modular (audio) synthesizer. In these, each module is controlled by a voltage, originating from an oscillator, a keyboard, or a "sequencer". The concept that makes a modular synth interesting is, the voltage pattern (waves) output from a module can either be used as an audio signal (if it's in the audio spectrum), or to control another module. In the simplest case, output from a voltage controlled oscillator (VCO) can be routed to a speaker to produce a sound. But it can also be routed to a module that filters a signal in some way, based on the output voltage of its predecessor. Maybe the thing that makes "ambient" music's slowly-shifting textures interesting is that they mimic the neural networks of our brains.
@certainlynotmalo1.0.06 · 1 month ago
A lot of them actually do! You can even (kind of) help your brain waves to synchronise with the oscillations. It's not by brute force (you have to play along or it doesn't work very well), but it can greatly help with sleeping, learning and related stuff. Reminds me of old hardware synths, where you had to connect each of the synth parts with cables. But that gave you an amazing amount of flexibility! BUT no one cared to write the configurations down... That was the funniest and most awful part at the same time...
@dust7962 · 2 years ago
The problem with this system of computing is that interference is a huge factor. When you only test whether there is voltage or not, you don't need to worry about interference. But when you build systems that use varying voltages, say 0V, 0.5V, 1V, then you need to worry about interference, and the more levels you add, the bigger an issue this becomes. Interference could come in the form of microwaves, radiation, mechanical vibration (think fans cooling off a server rack), and the list drags on, as almost anything can cause interference. The oscilloscope used in the example is an expensive piece of equipment that minimizes interference, but the cost is far higher than with a binary number system.
@introprospector · 2 years ago
Binary computers have to deal with interference too, that's handled by error correction. Error correction is already baked into the infrastructure of every digital component, to the point where we don't realize it's there. They suggested one method of error correction in the video, and they're probably not even scratching the surface of what's possible.
@fltfathin · 2 years ago
I think the crux is the medium. The AI models the brain, which is so good at rebuilding itself, and it uses electrons and chemicals to convey information. Our transistors are too limited to mimic that interaction. For example, the new 3W chip needs to be custom-made for each model, if I got that right.
@dust7962 · 2 years ago
@@introprospector Yes, but with binary error correcting is simpler as interference isn't as much of a burden on the architecture. When the job is to check if there is, or isn't voltage it is a lot less complex than checking 8 different voltage thresholds.
@dust7962 2 years ago
@@fltfathin This is called an ASIC (application-specific integrated circuit). The computer is pretty much just sent to the landfill after it's outlived its usefulness instead of being repurposed. Which is another concern about where computing in general is heading: as PCBs use fewer and fewer semi-precious or precious metals, there is less incentive to recycle.
@JayJay-dp8ky 2 years ago
@@dust7962 Yeah but I put my mobo in the case first and then the radiator wouldn't fit, so I had to take it out and install the radiator first. It was really annoying. I didn't watch this video, but I'm assuming this is what he was talking about.
@IronAsclepius 1 year ago
My undergraduate work was actually with a professor who did research on the brain as an analog computer, using neural networks and analog computing in an attempt to achieve super-Turing computation. A researcher whose name is worth looking into in all this is Hava Siegelmann. At the time I understood much less about the problem. My task was essentially to try to prove that analog computation could be modeled with a neural network on a digital computer. Not sure if my comment will be buried or not, but it's an area worth looking into if you're more deeply interested in this problem.
@AayushPatel-gc3fw 1 year ago
I have never done much/extremely deep research on a topic, but this seems very interesting.
@raystir98 1 year ago
I'd like your comment to not be buried.
@noblenessdee6151 1 year ago
0s and 1s, highs and lows, voltage and no voltage (digital representations of numbers) have absolutely nothing to do with brain neurons. It's complete BS. For all we truly know about the brain, there could be a near endless amount of information in every firing of a neuron. We have no idea what format consciousness information is in, and we likely never will as "in time" humans.
@AayushPatel-gc3fw 1 year ago
@@noblenessdee6151 Engineers: well, I will approximate everything a neuron is saying to just two numbers. 🙂...
@beeswaxlover 1 year ago
@@AayushPatel-gc3fw Words are the limitation, not numbers: all words can be expressed in code, but not all humanity can be expressed in words.
@snerttt 6 months ago
I'd be interested to see a digital computer adopt an analogue component, possibly utilized for physics simulation, much like how a GPU is used to create graphics independently of the CPU.
@joaoluizpestanamarcondes6219
Bro, this channel is crazy top-shelf stuff. I'm amazed, thank you for that.
@FragEightyfive 2 years ago
Using an analog computer to demonstrate differential equations is a perfect teaching tool. I really wish tools like this were used in colleges more often.
@Elrog3 2 years ago
They already use crap like this far too often. This isn't something used for a differential equations course. Maybe it would be OK for a circuits course, or even a computer science course focused solely on analog computers. In math, just give us the numbers and the logic... Don't waste our time with this stuff.
@Elrog3 2 years ago
@@JackFalltrades I am an engineering student.
@Elrog3 2 years ago
@@JackFalltrades I'm not calling letting students know of use-cases for things crap. What I'm calling crap is taking up class time that is meant for teaching students the logic of how to solve differential equations (because that is the class the original poster said it would be good for) and instead using it to teach something that only a tiny fraction of the class would ever use.
@quotidian8720 2 years ago
It is used in control systems.
@Noootch 2 years ago
@@Elrog3 He never said it should be used in a differential equations course. You just sound like the type of student who goes to university and asks which courses they need in order to get a high-salary position in industry.
@koborkutya7338 1 year ago
I recall our control systems teacher at university in the '90s said the Space Shuttle flight controls contained analogue computing because they had to process something like several thousand sensors' inputs to produce outputs, and digital was just too slow for the job.
@rogerphelps9939 1 year ago
He lied.
@TARS-CASE 9 months ago
@@rogerphelps9939 The Space Shuttle did indeed use analog computing for some of its flight control systems: it used a hybrid digital/analog system for flight controls. Most of the high-level control logic was handled by digital computers, but critical low-level control functions were performed by analog circuits. The analog components were able to process sensor inputs and produce control outputs much faster (on the order of microseconds) compared to even the fastest digital computers of the era, which took milliseconds. This speed was essential for stability during flight.
@user-tg5sv5ps2i 1 month ago
I can imagine it also being just more fault tolerant. Discrete = hard, continuous = easy. An overflow in digital can literally crash a whole system. In analog there is more room for error.
@emmateedub9672 1 year ago
An interesting video covering some of the beginnings of AI, how computers work, and also environmental considerations. I would like to find out more about Rosenblatt; however, I was expecting something along the lines of mechanical computers. Good information, good video, thanks!
@davidchristensen4643 3 months ago
It's interesting how circular technology is. Back in the 1970s my first job out of uni was with a national research association focused on all things to do with ships in the UK. Whilst the primary focus of my work was providing QA services to the various research teams, including maintaining and enhancing language systems like RATFOR, and system management of the ICL, IBM, Perq, CV & DEC systems, I was also involved in developing two specific analogue/digital hybrid projects. One was focused on managing and monitoring loading balances for bulk cargo ships, and the other on simulating ship navigation into ports in real time. Both of these projects involved interfacing the analogue data from real-time sensors with digital monitoring and mapping algorithms. Unfortunately, at that time, analogue was seen as a historical burden and both were eventually canned. Now, almost 50 years later, it's great to see that our ideas of the '70s are coming back into fashion.
@lonewulf0328 2 years ago
This was one of the best layman's explanations of neural net training models that I have ever seen. Awesome content!
@duongchuc1834 2 years ago
ok
@patakk8145 2 years ago
But it isn't; he literally said he's going to skip backpropagation (which is how models are trained nowadays).
@PaulAVelceaVSC 2 years ago
I am a layman and I did not understand a bit of it, pun intended.
@robertb6889 2 years ago
As a guy who helps manufacture flash memory, I find this really intriguing, especially because flash memory is continuing to scale via 3-D layering, so there's a lot of potential, especially if you can build that hardware for multiplication into the chip architecture.
@ravener96 2 years ago
You are still struggling with interconnects from one side to the other.
@Zeuskabob1 2 years ago
@@ravener96 With many ML algorithms you can split problems into multiple sub-problems for different networks to handle. I wonder if developing that area of ML would be helpful to make effective analog systems? For an example, in image processing a pixel at the top left of the image has little interaction with a pixel in the bottom right of the image compared to nearby pixels. If you wait to compare them until multiple layers later, it speeds up processing the image and allows for algorithms to become more adept at finding sub-patterns in the image.
@martiddy 2 years ago
@@Zeuskabob1 Depends on what kind of image processing the neural network is doing. If the computer wants to identify a face in a person, maybe it doesn't need to process all the pixels once it has processed the pixels near the face. But in some cases distant pixels can indeed be correlated, like images from a camera in an autonomous car identifying the white lines of a street, where it could be 99% sure a line is straight but the corner pixels clearly indicate it is curved.
@robertb6889 2 years ago
@@ravener96 Yeah, but interconnects can be designed around with clever architecture to an extent. It's still quite interesting.
@seldompopup7442 2 years ago
Flash cells are micron scale, while the AI accelerators doing integer operations are built with the latest 4 nm technology. And floating gates have a really limited life compared to pure logic circuits.
@spoidermon2515 1 year ago
Damn Man!! You explained it pretty well!! All that history and theory wrapped in 22 mins! Incredible!
@kasparsiricenko2240 1 year ago
When I was at the institute back in 2016, as an undergraduate I was thinking of these specific "gates" too. I knew someone was already implementing it, but I still miss the time when I could have been part of the innovation. What a genius way of reimplementing circuits for neural networks. Maybe that's what the future of FPGAs is: neural networks.
@DomDomPop 1 year ago
It’s funny, for those of us who are into electronic music production, analog never left! There are lots of great analog synthesizers out there that can produce all kinds of complex waveforms, and some of us have been known to tack an oscilloscope on to modular gear to view and use those waveforms. Even some relatively simple gear can produce complex, “3D” structures with the correct cable patches. A lot of what you described at the beginning is the backbone of synthesis for music, and the same principles obviously apply to mathematical operations.
@rogerphelps9939 1 year ago
You can do everything digitally that an analog system can do, and more. An example is resampling in order to change the frequency scale of a recording. This can be done in real time using digital methods, not so much with analog methods.
@DomDomPop 1 year ago
@@rogerphelps9939 Depends on what you're doing and what's important to you. Analog synths are great for experimenting with the knobs and patch bay (if available) and learning what effect each change has on the overall waveforms. They're really great for learning what exactly you're doing and what you're getting as a result. Yeah, there are software synths meant to emulate hardware knobs and a patch bay, but I haven't found clicking through all that as valuable as plugging in and experimenting yourself. That stuff really depends on the person, though. What doesn't depend on the person, and is arguably more important, is the fact that aliasing can end up being a problem on digital synths. When you start doing some crazy cross modulation between sources and/or you're dealing with lots of harmonics, if the processor can't keep up, your sound will suffer. Same with super high frequencies. Depends on the synth, of course, but analog synths can tend to have a warmer, purer sound to them as well, because you don't have to emulate all those harmonics. It really comes down to the same arguments being made here regarding analog computers: there's no processor overhead needed to create some very complex shapes, and to do so perfectly accurately, on analog. I use both types of synths, as lots of people do, and I would never say that one somehow makes the other unnecessary. Hell, there are hybrid synths that give a mostly analog signal path while allowing for, say, a digital sample-and-hold circuit and the ability to save certain parameters. People make those kinds of things for a reason, you know?
@victorpereira8000 1 year ago
Pythagoras discovered math through music, I think, right? Really like your comment.
@RAndrewNeal 1 year ago
@@rogerphelps9939 The difference is that you need billions to trillions of transistors to do digitally what can be done with tens to hundreds of transistors in analog.
@rogerphelps9939 1 year ago
@@RAndrewNeal Wrong. The errors arising from component tolerances, noise and temperature dependent offsets make anything complicated pretty much impossible in analog. Transistors in digital processors are extremely cheap. Provided you have good DACs and ADCs you can do anything to whatever precision you need in digital.
@SIBUK 2 years ago
The most interesting thing I found in this was when he was saying that in the chip they had to make it alternate between analog and digital signals to maintain coherence. It's interesting because the brain does something similar where it alternates between electric pulses and chemical signals.
@chrisfuller1268 2 years ago
The problem is machine learning is still not capable of being used commercially in general environments (e.g. security cameras) because it can't handle unpredictable situations. The brute-force method of AI is still the only solution for general environments (e.g. self-driving cars).
@riskyraccoon 2 years ago
@@chrisfuller1268 people also suffer from the brute force nature of processing information, aka confirmation bias. Thankfully we can take steps to correct this, but many people lack the tools and mindset to make these self corrections.
@chrisfuller1268 2 years ago
@@riskyraccoon Yes, humans are flawed, but we are capable of recognizing objects no matter what else is in our field of view. This is a task machine learning will never be able to solve in 100% of all possible (infinite) environments. Brute-force AI requires more development effort but is capable of identifying objects in many environments. This is why machine learning is a step backwards in technology and why it should never be used in life-critical applications.
@chrisfuller1268 2 years ago
@Adam H Amen, I never thought of the beast as an AI! The beast will be cast into the lake of fire so I believe he will be flesh and blood human with a soul, but the 'image of the beast'!
@chrisfuller1268 2 years ago
@Adam H Yes, that is a very interesting way of looking at it! I think we're a very long way from an AI being able to reason, but we've been using AI to kill people for decades.
@santiagojimenezpinedo3473 11 months ago
This is really cool, and there is another startup that has a different approach using analog, but instead of voltages and currents they use light. So it is really interesting how analog is coming back. I would really appreciate it if you would make a video about this. The startup is Lightelligence. As always, thanks for these videos.
@frightenedsoul 3 months ago
Terrible name, though lol. Lightelligence. I get the idea behind it but it just doesn’t work as a satisfying portmanteau
@gregseljestad2793 11 months ago
I just found out that the SR-71 engines had a hydraulic computer that ran the system. That would be amazing to see. I worked at Caterpillar, and a friend of mine was tasked with converting a scraper transmission module from a hydraulic base to electronic. It was a very old design and all the original engineers had passed on. They had a team of engineers that had to replicate all the hydraulic functions with electrical equivalents. It was fascinating to me. One of the functions they had to replicate was going up a steep hill with a full load and being able to shift without rolling backwards: holding the load, sharing the load between two clutches, and increasing one clutch while reducing the other to make it a seamless shift. So I enjoy this topic. Thanks!
@SaanMigwell 3 months ago
Most nuclear power plants are pneumatic computers. Well, the old subs and breeder reactors anyway.
@nicholasjayaputra5754 2 years ago
I thought there was no other part to the first part, thank you for the satisfaction you have given me through the knowledge I got from this video
@zaksmith1035 2 years ago
Can't wait to watch this with my kids. I forgot it was coming, we were waiting so long for it.....
@AxxLAfriku 2 years ago
NO! NO! NO! Many people say I am sick in the head. NOOOO!!!! I don't believe them. But there are so many people commenting this stuff on my videos, that I have 1% doubt. So I have to ask you right now: Do you think I am sick in the head? Thanks for helping, my dear nico
@byronvries3826 2 years ago
@@AxxLAfriku 0
@nicholasjayaputra5754 2 years ago
@@zaksmith1035 That's awesome mate
@nicholasjayaputra5754 2 years ago
@@AxxLAfriku I'm honoured to have your bot-like reply in my comment. As for the answer, well, I don't know, but have a good day mate!
@BrianBoniMakes 2 years ago
I used to calibrate analog computers that ran experiments and test equipment. They were often odd mixtures of analog and digital technologies. Near the end I had to keep a few machines alive as they aged out of tolerance; there was always a way to tweak out some more performance by shifting the calibration away from areas you didn't need, in a much more forgiving way than with anything new and digital.
@nenmaster5218 2 years ago
Anyone know a good science channel for me to check out?
@yash1152 2 years ago
thanks a lot Brian Boni for your valuable input (keys: computers: mix of analog and digital)
@yuro5833 2 years ago
@@nenmaster5218 Nile red and Nile blue
@nenmaster5218 2 years ago
@@yuro5833 Thx! Know Hbomberguy?
@yuro5833 2 years ago
@@nenmaster5218 I actually thought I didn’t then realized I had seen several of his videos and forgot about him so thank you as well
@grabdoel 3 months ago
I learned more through your video than I did in engineering class :(. Thanks a lot; it opens a great perspective on new innovations where analog is combined with digital. I will dive into it.
@javierperea8954 1 year ago
That's so beautiful. Using a photocell as an analog-to-digital interface, with the advantages of both systems applied effectively in one system.
@The1wsx10 2 years ago
Wow, that analog chip sounds extremely competitive. I'm surprised they already have something that good. Mad props to the guy who figured out the hack with the flash storage.
@dorusie5 2 years ago
I wonder how temperature sensitive it is.
@hughJ 2 years ago
@@dorusie5 I'm mostly curious about the write-cycles and lifespan of the flash cells. Is the network going to get Alzheimer's after a few days?
@SharienGaming 2 years ago
Integrated circuits like that have always been really efficient; the downside is that they are extremely specialized. As the guy said, it's not a general computation chip: it can literally only do matrix multiplication, but that it can do really damn efficiently (though slightly imprecisely, which is likely still good enough for neural network purposes, since they aren't interested in the exact value of the result). So sure, that's competitive for that one purpose, but useless for anything that isn't that purpose. But if the type of calculation they can do is in high demand, they can likely sell a lot of specialized hardware, either for specific devices or as plug-in cards for computers that supply fast matrix multiplication operations.
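The "slightly imprecise but good enough" trade-off above can be imitated in a few lines: below, each multiply in a matrix-vector product picks up a small random relative error, standing in for a drifting analog cell. The matrix, vector, and the 1% error figure are all made up for illustration:

```python
import random

# Sketch of the trade-off described above: a "matrix multiply only" unit
# that computes each product with a small analog-style relative error.
# All numbers here are invented for illustration.

def exact_matvec(W, x):
    """Reference digital matrix-vector product."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def noisy_matvec(W, x, rel_err=0.01, seed=0):
    """Same product, but each multiply picks up up to 1% relative error."""
    rng = random.Random(seed)  # fixed seed so the sketch is repeatable
    return [sum(w * xi * (1 + rng.uniform(-rel_err, rel_err))
                for w, xi in zip(row, x)) for row in W]

W = [[0.2, -0.5], [1.0, 0.3]]
x = [0.8, -0.4]
print(exact_matvec(W, x))
print(noisy_matvec(W, x))  # close to the exact result, never identical
```

For a neural network layer the small per-element wobble is usually lost in the activation nonlinearity, which is why the imprecision is tolerable.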
@wouterhenderickx6293 2 years ago
I've been wondering about analog usage of SSDs for a long time. It's an oversimplification, but each cell holds a voltage which can also be interpreted as an analog signal. Taking music as an example, you could basically write the value of one sample point to a cell, storing 16 bits' worth of information in one NAND cell. This of course makes it impossible to compress the music, but it would allow storing music "losslessly" at the same cell usage as a compressed 256 kb/s file on TLC storage. Of course, NAND reproduction isn't perfect (and as such, the music reproduction wouldn't actually be lossless), but I wonder how close this would come to the compressed digital file. I think this could be potentially useful for offline caches and downloads from Spotify, for example, as the data can be checked and corrected when a high-speed network connection is actually available.
@JustNow42 2 years ago
Already? We did this before 1960.
@harrybarrow6222 2 years ago
Rosenblatt's Perceptron was essentially a one-neuron network, although it could perform logical operations on the binary data inputs before passing results, which gave it more power. Minsky and Papert at MIT were concerned that Rosenblatt was making extravagant claims for his Perceptron and scooping up a lot of the available funding. In their book, "Perceptrons", Minsky & Papert proved that one-neuron networks were limited in the tasks they could perform. You could build networks with multiple Perceptrons, but since Perceptrons had binary outputs, nobody could think of a way to train such networks. That killed funding for neural networks for decades. In the late 1980s, interest was rekindled when John Hopfield, a physicist, came up with a training technique that resembled the cooling of a physical spin-glass system. But the big breakthrough came when the error back-propagation technique was developed by Rumelhart, Hinton & Williams. In this, neurons were modified to have a continuous non-linear function for their outputs, instead of a thresholded binary output. Consequently, the outputs of the network were continuous functions of the inputs and weights. A hill-climbing optimisation process could then be used to adjust the weights and hence minimise network errors. The rest is history.
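The one-neuron threshold unit described above can be sketched in a few lines. This is an illustrative toy, not Rosenblatt's hardware: it learns AND with the classic perceptron update rule, and (per the Minsky & Papert limitation mentioned above) the same loop would never converge on XOR:

```python
# Toy sketch of a perceptron: one threshold neuron trained with the
# classic update rule w += lr * (target - prediction) * x.

def predict(w, b, x):
    """Hard-threshold neuron: fires 1 if the weighted sum exceeds 0."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def train(samples, epochs=20, lr=1):
    w, b = [0, 0], 0
    for _ in range(epochs):
        for x, target in samples:
            err = target - predict(w, b, x)   # -1, 0, or +1
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(AND)
print([predict(w, b, x) for x, _ in AND])  # -> [0, 0, 0, 1]
```

Swapping AND for XOR in the same loop leaves the errors oscillating forever, which is exactly the single-layer limitation that stalled the field.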
@3nertia 2 years ago
And now, we're "evolution" but with awareness and intent heh
@slatervictoroff3268 2 years ago
Critically wrong. Not one-neuron - that doesn't even make sense. One *layer*.
@brunsky277 2 years ago
@@slatervictoroff3268 I have to disagree. A perceptron is one neuron (one neuron that receives multiple inputs and produces one output). This also makes it a one-layer network, I would say.
@meateaw 2 years ago
@@brunsky277 Thinking about it though, the inputs all had their own weights, and those weights correspond to a neuron. A modern AI model has inputs, and the weights exist on the layers. Therefore the perceptron had 400 inputs, 400 weights, and 1 output signal. That implies to me 400 neurons in a single layer, leading to a single output value.
@WilisL 2 years ago
@@slatervictoroff3268 No, one layer can be multiple perceptrons; it's technically one neuron (which is technically one layer, though).
@NR-bt7yz 6 months ago
I've recently started learning ML and this video helps so much. You just made me a Patreon supporter. Thanks Derek!
@user-ou8qw2sg3d 1 month ago
This blows my mind. Thank you. It's so cool to learn this way about algorithms.
@carterbentley9030 1 year ago
Back in the mid-1960s my uncle, Joseph Grandine, designed a combination analog/digital computer that could optimally combine the two modes to solve complex problems in signal processing and data analysis. He called his computer the Ambilog 200. At that time, digital computing won the day. Now it sounds like he was a few generations ahead of his time.
@dinoschachten 1 year ago
Amazing. Just found two articles about it in the Internet Archive.
@IAreBean 1 year ago
That is awesome
@LeKhang98 1 year ago
That's amazing. We should show him this video and ask him what he thinks about it.
@stwessboi 1 year ago
cap
@hamzahbalogun4220 1 year ago
I would love to know him
@Deveyus 2 years ago
A couple of missed points: things like Google's Coral are also pushing incredibly high values, and to my knowledge are doing it digitally as an ASIC. Large models are expensive to train; there's no contention here, from Mythic, you, or the wider AI community. But several advancements have been made in the last couple of years that are letting models be compressed and refined to less than 1% of their original size. This makes them incredibly small and efficient, even on traditional CPUs.
@Zeuskabob1 2 years ago
I'd love to read about that! I've been dipping my toes in ML algorithms and many of the really interesting networks require an immense amount of memory to function, on the order of tens of gigabytes. I'm curious why those models require such an immense amount of memory, and what can be done to improve that.
@siddharthagrawal8300 2 years ago
@@Zeuskabob1 You don't really need tens of gigabytes to get a good model that can perform well on a task (usually). Most people still use models smaller than 5 GB or so.
@vigilantcosmicpenguin8721 2 years ago
+
@flightrisk7566 2 years ago
thanks for pointing this out 🙄 seems like it was deliberately ignored for the sake of promoting this dumb startup
@moonasha 2 years ago
just another case of Veritasium making a bait video to make experts respond
@mbharatm 1 year ago
Amazing, thought provoking 2 part video on analog computing. Veritasium never disappoints!
@steveipsen6293 2 years ago
One of my first "computer" classes in engineering school was learning to wire up an analog computer and solve differential equations. Because I had to "assemble" the hardware for the process, it felt much more hands-on than when I took a punch deck to the little window, and waited for up to 20 minutes for the compiler to tell me I had no idea how Fortran worked. At the time, I really appreciated that parameters on the analog could be changed quickly in order to see how different currents, voltages, resistance, etc. affected the outcome. Of course, now with the speed of digital processors, the efficiency of Python libraries, and the Interwebs, I have largely gotten to appreciate the digital world. Now, Derek has got me jazzed to buy a portable analog. $200 on Ebay?
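The integrator patch described above can be mimicked digitally. Here is a sketch of the classic damped mass-spring problem wired as two integrators in a loop, the standard analog-computer exercise; all parameter values are arbitrary illustration:

```python
# Digital sketch of the classic analog-computer patch for a damped
# mass-spring system: a summer feeding two integrators in a loop,
# computing x'' = -(c/m) x' - (k/m) x. Parameters are illustrative.

def simulate(m=1.0, c=0.5, k=4.0, x0=1.0, v0=0.0, dt=0.001, t_end=10.0):
    x, v = x0, v0
    for _ in range(int(t_end / dt)):
        a = -(c / m) * v - (k / m) * x   # the "summer" stage
        v += a * dt                      # first integrator: a -> v
        x += v * dt                      # second integrator: v -> x
        # on the real machine you'd watch x on a scope instead
    return x

print(simulate())  # the oscillation has decayed most of the way to zero
```

On the analog machine, turning a coefficient knob and watching the trace change is the instant-feedback experience the comment is describing; here each change costs a re-run.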
@neeneko 2 years ago
Yeah, my computer classes in engineering school had a similar thing, though with us it was opamps. It was not a full class, but we did it around the same time as learning FPGAs and having to implement complex programmable digital logic, so it was a good reminder of 'digital logic with an ADC/DAC pair is not always the best or simplest solution'
@swapode 2 years ago
While it's absolutely not the same thing, I encourage newish programmers to write a 6502 emulator. It's about as close as one can realistically get to building your own CPU hands on, which IMHO gives a worthwhile different perspective to the field than the now common approach to never leave the comfort of interpreters and virtual machines.
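In that spirit, the core of such an emulator is just a fetch-decode-execute loop. A toy sketch covering three 6502-style opcodes (LDA #imm, ADC #imm, BRK), with flags, addressing modes, and cycle counts all omitted, so it is a starting point rather than a faithful 6502:

```python
# Toy fetch-decode-execute loop in the spirit of the suggestion above.
# Only three 6502-style opcodes; no status flags, no addressing modes.

def run(program):
    a, pc = 0, 0                 # accumulator and program counter
    mem = list(program)
    while True:
        op = mem[pc]; pc += 1    # fetch
        if op == 0xA9:           # LDA #imm: load accumulator
            a = mem[pc]; pc += 1
        elif op == 0x69:         # ADC #imm (carry flag omitted in this toy)
            a = (a + mem[pc]) & 0xFF; pc += 1
        elif op == 0x00:         # BRK: stop and hand back the accumulator
            return a
        else:
            raise ValueError(f"unimplemented opcode {op:#04x}")

# LDA #$05; ADC #$03; BRK
print(run([0xA9, 0x05, 0x69, 0x03, 0x00]))  # -> 8
```

Growing this toward a real 6502 (status register, memory-mapped I/O, the full opcode matrix) is exactly the hands-on exercise being recommended.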
@TheWhatnever 2 years ago
This is missing any mention of the other big alternative: photonics. Startups like Lightmatter have shown that this is another very potent alternative. And I believe its benefits, of not being limited by electronic bandwidth/losses and of being able to use one circuit to compute the same calculation multiple times at once by using multiple colors/wavelengths, are just astonishing. It was also left out that a big problem for these systems is the bottleneck in the conversion from general compute to these analog domains.
@Xenko007 2 years ago
Hopefully he covers this topic in the future
@perc-ai 2 years ago
how are u so smart
@KWifler 2 years ago
Probably because it is also an emerging system. But also because photonics uses photons, like electrons, as the actor (a new actor), while the video is explaining two fundamentally different ways to act.
@ChristopherCricketWallace 2 years ago
I was waiting for him to get to photonics, too. It's a HUGE opportunity for crazy amounts of parallel processing. And then there's the quantum computing white whale, too...
@blueredbrick 2 years ago
I want my positronic brain patch
@coleballenger4595 1 month ago
12:48 They did my man Alex so wrong there lol! Great video as usual.
@timobakenecker7314 11 months ago
This video has really added new aspects to my knowledge of AI. Thanks for that!
@NotWhatYouThink 2 years ago
Great episode. Hadn’t considered the mix of digital and analog computers in a complementary fashion. I guess it’s not what I thought!
@WeponizedAutism 2 years ago
True, but the actual impact of this is not what you think.
@mushin111 2 years ago
Jesus, could you astroturf a bit harder please?
@LeoStaley 2 years ago
Until the 90s, US warships used mechanical calculators to compute the aiming of the guns, something that would be perfect for your channel.
@deusexaethera 2 years ago
I see what you did there.
@dieSpinnt 2 years ago
BS! Fourier ... ROTFL
@joesterling4299 2 years ago
The biggest issue is distortion. Inexact calculations due to imperfect components, degradation of the data when transmitted (wired or wireless), external EM interference, all conspire to make the use of analog a special challenge. Mixing digital and analog to play to the strengths of each along the way intrigues me. I'm old enough to have experienced the full evolution of digital computing. My mindset is therefore quite biased toward it. What you propose would be quite the eye opener for me, if it actually can be made to work as prolifically as current digital technology.
@WilcoVerhoef 2 years ago
I assume there's a lot to be discovered on the topic of self-correcting algorithms, or even error-correcting analog circuits that partially compensate for the inaccuracies, like what Hamming codes are for digitally transmitted data.
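The Hamming codes mentioned above are easy to sketch for the digital case. Hamming(7,4) carries 4 data bits with 3 parity bits and corrects any single flipped bit; the nonzero syndrome directly names the errored position:

```python
# Sketch of the Hamming(7,4) code mentioned above: 4 data bits plus
# 3 parity bits; any single flipped bit can be located and corrected.

def encode(d):                           # d = [d1, d2, d3, d4]
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4                    # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4                    # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4                    # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]  # codeword positions 1..7

def correct(c):
    c = list(c)
    s = 0
    for check in range(3):
        # each parity check covers the 1-based positions whose index
        # has the corresponding bit set
        parity = 0
        for pos in range(1, 8):
            if pos & (1 << check):
                parity ^= c[pos - 1]
        s |= parity << check
    if s:                                # nonzero syndrome = error position
        c[s - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]      # extract the data bits

word = [1, 0, 1, 1]
noisy = encode(word)
noisy[4] ^= 1                            # flip one bit "on the wire"
print(correct(noisy) == word)            # -> True
```

The error-correcting analog circuits the comment imagines would need a continuous analogue of this discrete trick, which is an open question rather than settled practice.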
@slippio 2 years ago
nature exists in chaos, technology is more and more approaching the chaos orchestra.
@StevenSiew2 2 years ago
Distortion, really? I was under the impression that the biggest problem with analog computers is NOISE. You can never get rid of noise in an electrical system. Even if the hardware had no distortion, the inherent thermal noise in the system would cause some small calculation error.
@leftaroundabout 2 years ago
@@StevenSiew2 That's true, but noise is something that AI needs to deal with anyway, because the inputs will always be noisy to begin with. It can actually be useful to _add artificial noise_ while training a digital NN, to avoid overfitting issues. (Stochastic gradient descent can also be seen as a way of making the training "noisy".) As long as the perturbations are small and random, training won't be affected negatively. Distortions, however, are hard to deal with. You may be able to train a model on a particular chip that has such-and-such distortion; because the distortion properties don't fluctuate and act as constant-but-unknown biases, the weights will ruthlessly overfit to this particular chip, and then it probably won't work at all on another copy.
@Opsse 2 years ago
As a PhD student in this field, I can answer some of your questions. Yes, we usually talk more about noise than distortion. And thermal noise is not the only issue: there is read and write variability, resistance drift over time, the resistance of interconnections, ... However, it is true that neural networks can sometimes take advantage of noise to avoid overfitting, but only a reasonable amount of noise and only in some cases. Self-correcting algorithms and error correction are options, but it's not that easy. Usually this kind of method sacrifices performance or requires more energy (which is the opposite of what we want). About mixing digital and analog: they presented it nicely in the video, but the digital/analog converters require a lot of energy (sometimes more than the vector-matrix multiplication itself). So we don't want to do it too often.
@Paul-rs4gd
@Paul-rs4gd Год назад
I can see this analog technology being used in special purpose AI processors attached to normal digital computers. It makes sense - they could provide very large scale, cheap and energy efficient Neural Net acceleration. Since it appears that 'scale' is the most important thing for AI, it is really important to bring down the cost and energy consumption, so we can all run GPT3 on our laptops :)
@wattafakka4186
@wattafakka4186 2 месяца назад
great video, I always wondered about neural networks. Now I got it!!👍👍
@aetre1988
@aetre1988 2 года назад
My dad's "Back when I was your age" stories on computing were about how he had to learn on an analog computer, which, according to him, you "had to get up and running, whirring at just the right sound--you had to listen for it--before it would give you a correct calculation. Otherwise, you'd input 100+100 and get, say, 202 for an answer." He hasn't been able to remember what make/model that computer was, but I'm curious. Any old-school computer geeks out there know what he may have been talking about? Era would have been late 60s or early 70s.
@GDScriptDude
@GDScriptDude 2 года назад
It sounds like your dad's computer was from before the invention of the transistor. There was an analog computer with moving parts in the electronics lab at the University of Hull, UK (when I was a student there in the 80s). I remember when it became unstable and the professor sprinted across the lab to shut it down before it self-destructed. Something spinning suggests a sine wave generator, for example.
@sapinva
@sapinva 2 года назад
Yeah, just like analog synthesizers. You have to let them warm up to a stable temperature first or they would constantly drift out of tune while playing. This was later solved with digital controllers.
@murmamirrmohaimen2271
@murmamirrmohaimen2271 2 года назад
Maybe the older mechanical calculators. Linus Tech Tips did a video on those. Super interesting stuff.
@urlkrueger
@urlkrueger 2 года назад
I can't address your question directly, but in the latter half of the 1960s I worked on a helicopter simulator, used to train military pilots, in which all computations simulating flight were performed by analog circuits made up of transistorized (no ICs) operational amplifiers and servo motors with feedback. This whole machine was housed in a 40-foot-long semi trailer.

In the rear of the trailer was a cockpit from a CH-46 helicopter, including all the controls and instruments, but the windows were frosted over so you were always flying IFR in a fog, i.e. no visuals. Next, as you moved forward, was an operator's station where you could control parameters such as air pressure and temperature, and activate failures such as engine fire or hydraulic failure. The remainder of the trailer contained a row of electronics racks on each side housing the amplifiers, servos and other circuits that performed all the calculations.

We can look at main rotor speed as an example of how it worked. Rotor speed was represented by the position of a servo motor from 0 to 120 degrees. The position of the motor was determined by the output of an amplifier whose inputs were derived from many variables such as engine power (there were two engines), collective control position and altitude. Attached to the servo motor was a potentiometer whose output drove a cockpit instrument but was also fed back to the amplifiers/servos used to calculate engine power and such.

There were many such subsystems with feedback loops interconnecting them, so failures were very difficult to diagnose. Often the only way to resolve a problem was to take a guess at which part might have failed and replace it. Routine maintenance was also very labor intensive, as the many potentiometers would wear and need to be cleaned and then realigned, which might take an hour for each one.

As a young man I was totally amazed and fascinated by this technology. As an old man I can't believe that it really worked at all.
But it did, at least some of the time.
@dick7540
@dick7540 2 года назад
Back in the day, circa 1957, I was an Electrical Engineering student at the City College of New York. In one of the labs we constructed an analog computer using physical components like motors, gears, etc. There was absolutely nothing binary/digital involved except whether you passed or failed the course. A couple of years later I worked with a Bendix G15 computer with an optional DDA (Digital Differential Analyzer). The DDA was an analog computer; input and output were analog. You can look it up on Google: search for "Bendix G15 computer with dda".
@tenou213
@tenou213 2 года назад
I'm a little disappointed by the title but impressed by the content. It's less "we're building computers wrong" and more "old method is relevant in a niche application". There's also the eventual plans for fully commercial quantum supercomputing clusters and ever faster internet connections which might further limit the applicability of these chips going forward. However, building processing-specialized chips instead of relying on graphics cards seems really promising in the short term so long as the market stabilizes.
@johnbotris8187
@johnbotris8187 2 года назад
Derek actually made a video a few years ago explaining why Veritasium would start using clickbait titles (to appease the YouTube algorithm)
@internettoughguy
@internettoughguy 2 года назад
It got you to click didn't it?
@dinglesworld
@dinglesworld 2 года назад
It’s for the click bro. And for good reason. If any channel deserves to clickbait, it’s this one.
@alwinsebastian7499
@alwinsebastian7499 2 года назад
@@dinglesworld agreed 100%
@Blaketarded
@Blaketarded 2 года назад
its not really niche when ai and algorithms are used everywhere.
@ChrisWalker-fq7kf
@ChrisWalker-fq7kf 4 месяца назад
That analog neural network was really interesting. But to me it's still essentially digital, i.e. discrete. In a normal digital solution you might have 16 possible values for the weights, which would be encoded as 4 bits and would need to undergo addition/multiplication. But in the "analog" solution you encode the weights by setting one of 16 distinct voltage levels. The available voltage levels are quantised, not continuous, so it's still a discrete system. It's great that you can do addition by just summing currents and multiplication by changing resistance. But you can even do this with binary: AND gates are multipliers and OR gates are adders if you only have 1 bit of data (1 OR 1 gives an overflow condition, but the "analog" design also needs enough voltage levels to avoid overflow; e.g. 7 + 13 would give an answer of 16 if that were the highest voltage level). I'd say it's still digital but it's not binary. It's multi-level logic.
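The 16-level point above can be made concrete with a toy model; a sketch (the weight values and the 4-bit level count are made-up illustrations) of a multiply-accumulate where quantised weights multiply inputs and the products sum like currents on a shared wire:

```python
import numpy as np

LEVELS = 16  # 16 distinct levels ~ 4 bits per weight

def quantize(w, w_max=1.0):
    """Snap continuous weights in [-w_max, w_max] onto one of 16 evenly
    spaced levels, mimicking a cell programmed to a discrete conductance."""
    step = 2 * w_max / (LEVELS - 1)
    return np.round(w / step) * step

w = np.array([0.83, -0.41, 0.07, 0.99])   # desired weights
x = np.array([1.0, 0.5, -1.0, 0.25])      # input activations
y = quantize(w) @ x   # analog-style MAC: products sum on one wire
print(round(y, 3))
```

A 1-bit version degenerates to the AND/OR picture in the comment; more levels just trade noise margin for precision.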
@mikegiles1821
@mikegiles1821 Год назад
Very informative. Thanks for posting!
@TimeBucks
@TimeBucks 2 года назад
Amazing video!
@sheikhsumibegum2108
@sheikhsumibegum2108 2 года назад
Wow nice video
@kishungamer4036
@kishungamer4036 2 года назад
Nice video
@aidanl.9946
@aidanl.9946 2 года назад
I've always mused about this to myself; I always thought, 'why not use analogue to calculate certain things?' There's lots of stuff in physics that's extremely hard to calculate but just 'happens' in the real world in an efficient way. The surface of a bubble, for instance, minimises surface area very rapidly in a way that takes no effort on the bubble's part, but is incredibly hard for a digital PC to calculate. The tricky part (and the reason the people doing this are smart scientists/engineers and I'm not) is figuring out how to wrangle "the bubble" into a portable and responsive piece of hardware, and it's super cool to see efforts made in this direction are having success
@jimmysyar889
@jimmysyar889 2 года назад
Same thought. I used this technique to figure out a way to solve mazes super efficiently with flowing water. I think that’s what’s happening with quantum computers also.
@yoshienverde
@yoshienverde 2 года назад
It always comes back to the drawbacks Derek mentions at the beginning of the video: analog processing is single-purpose, error-prone, and hard to repeat. As such, for your physics example, it would invalidate A LOT of the data you get back, since you cannot guarantee a certain level of falsifiability, auditability, and error margins. You CAN get there, but you start requiring A LOT of boilerplate circuitry around the actual problem-solving hardware. As a silly and basic example that is almost trivial nowadays, but still there: think of the necessity of adding a lot of surge protection and current stabilization to a circuit to ensure that the natural unsteadiness of current in the power grid won't skew your results. And that's just taking into account discrete and "simple" issues to calculate. Imagine processing data for some chaos-related physics theory and basically getting pure rubbish at the end, because even the slightest microvolt-level disturbance automatically distorts everything. How about external interference? Or electromagnetic interference between the actual wires in the circuit? As I said, not impossible to tackle, but you suddenly have an overhead of 90% boilerplate just to make the results useful for anything practical. I can't even imagine all the engineering that must have gone into those Mythic semi-analog chips for AI, just to keep everything tidy. The fact that a Realtek-sized chip can give you one third the performance of some Nvidia Quadro (or similar) card, for a fourth of the power consumption of a cheap entry-level mobile Core i3, is just astounding!
@yoshienverde
@yoshienverde 2 года назад
To be clear, these Mythic chips point towards a future resurgence of analog processors, not dissimilar to what digital ones brought in with their unparalleled versatility. Outside of very bespoke chips for very high amounts of money, probably in the realms of high-end research and science, I can see a general idea of modularity at a functional level. Say you manufacture analog chips that can do some very important but expensive math calculations that are common for most science in some specific branch (say, a lot of transformations, or integration, maybe some Lorentzians, and so on). Then, at research groups, institutes and universities, people do the same as electronic engineers do with good old breadboards: DIY some complex formulae on the fly, test their hypotheses, and iterate over the formulae as needed. Imagine those astrophysicists with 2k-term polynomials being able to duct-tape a dozen chips together, the same way electronic engineers use logic gates as basic digital units, and getting the results out in a couple of hours, instead of having to write a piece of software that will take a couple of days to run, a week to write, and, for any mistake or failed result, another week to debug just to make sure it failed because you were wrong and not because you input a 5 where a 6 should have gone when writing all 1500 terms for one of the formulae.
@squeakybunny2776
@squeakybunny2776 2 года назад
Yes I've always thought this too. Aside from the negatives mentioned in the vid and comment above: "if you can't calculate it, let nature do it" I've used the term 'calculate' here, but I think it applies in a broader sense. If something is too hard to manufacture / produce precisely maybe nature can do it better.
@DrVonJay
@DrVonJay 2 года назад
@@yoshienverde wish I understood what you were saying but great rebuttal
@michaelperry9180
@michaelperry9180 4 месяца назад
Funnily enough, this video series helped me understand a bit better how analog music production works. "Modular setups" look a lot like the computer you used to model the Lorenz System.
@photorealm
@photorealm 7 месяцев назад
When I started thinking about artificial neural nets, I just assumed they would really only happen on specialized analog computers in the future. Then google and others along with more powerful digital computers made it work pretty darn great. I love being in this time of history, watching so much science fiction slowly become reality.
@siemensmolders4131
@siemensmolders4131 2 года назад
Interesting video, but it felt a little too hyped up for me ^^ The discussed challenge appears to be a highly specific application: matrix multiplication. The solution shown here was an analog ASIC (application-specific integrated circuit), which is a type of chip we've been making for over half a century. Once a task becomes both computationally expensive and very specific, the fastest method has always been to make a specific chip for it. Nor is analog multiplication anything new; I remember being taught the little analog multiplier circuit with the Gilbert cell over a decade ago.
@matteod2567
@matteod2567 2 года назад
most of his videos are like this lol
@aceman0000099
@aceman0000099 2 года назад
I believe Derek found a little niche to focus on since he did the video on the ancient Greek analogue computer, which had an almost identical conclusion
@ejpmooB
@ejpmooB 2 года назад
I feel he is on to something here ... maybe the real benefit is that you don't have to make all these specific chips, because in principle one fairly big analog one could do everything you threw at it. But it feels a bit scary to me too, because you are getting closer to biological systems.
@danielraymond3045
@danielraymond3045 2 года назад
Yeah, the reduction in power consumption I'd imagine is mostly due to it being an ASIC, not being analog. There are quite a few digital AI inference ASICs coming onto the market as well - I'm curious to see which ones will reign supreme
@StratEdgyProductions
@StratEdgyProductions 2 года назад
This was a banger of an episode. I was enraptured the entire time. Tight story telling with a great hook and title. You're a pro, man.
@Strawberry_ZA
@Strawberry_ZA 2 года назад
Fancy seeing you here ❤️
@oDxrk
@oDxrk 2 года назад
hm
@trec_log
@trec_log 2 года назад
hook, line and thinker
@memyselfandi6364
@memyselfandi6364 2 года назад
Damn Canadians keep blowing my mind. TELL THEM TO STOP IT!
@granitfog
@granitfog 17 дней назад
A small point, referring to the sum of inputs needed to stimulate a neuron: you called it "bias," but "threshold" is a better descriptor of the phenomenon. In fact, the official term is "threshold potential" (potential referring to the charge needed to do work, the work being depolarization of the membrane and propagation of an impulse).
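The bias/threshold relationship is easy to see in code: a unit that fires when its weighted input sum reaches a threshold theta is the same unit as one with bias b = -theta compared against zero. A minimal sketch (the weights and inputs are invented for illustration):

```python
import numpy as np

def neuron_threshold(x, w, theta):
    """Fire when the summed weighted inputs reach the threshold."""
    return 1 if np.dot(w, x) >= theta else 0

def neuron_bias(x, w, b):
    """Equivalent form: fold the threshold into a bias b = -theta."""
    return 1 if np.dot(w, x) + b >= 0 else 0

x = np.array([1.0, 0.0, 1.0])
w = np.array([0.5, 0.5, 0.5])
print(neuron_threshold(x, w, theta=0.9))  # 1: the sum 1.0 clears 0.9
print(neuron_bias(x, w, b=-0.9))          # 1: same decision
```

So "bias" in the ML literature is just the negated threshold folded into the weighted sum, which is why the two terms get used interchangeably.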
@dominikhauk4638
@dominikhauk4638 Год назад
This has to be the most insightful and entertaining channel on youtube
@keithsmith3118
@keithsmith3118 2 года назад
When I was in the Navy I worked on the Fresnel Lens Optical Landing System. There was no 1% error. It was .005 VDC tolerance over a minimum 5 VDC base. The computers had a lot of math to solve to target the hook touch down point for each aircraft. It was completely analog and op amp driven and has been around for over half a century. I've witnessed many many old analog machines in manufacturing since then. Analog technology isn't new technology, it's forgotten technology pushed aside by the digital technologies. I'm happy to see it hasn't completely died.
@KarthiSrinivasan
@KarthiSrinivasan 2 года назад
There's an entire field of research called neuromorphic computing/engineering looking into this very problem. It was pioneered by Carver Mead in the 90s and has seen a lot of interest lately.
@lxschwalb
@lxschwalb 2 года назад
I was waiting for him to either mention the words "neuromorphic" or "memristors"
@jecelassumpcaojr890
@jecelassumpcaojr890 2 года назад
I remember reading about Mead's analog stuff in the 1980s, something related to hearing. Perhaps my memory is wrong.
@rule1dontgosplat
@rule1dontgosplat Месяц назад
Holy crap… I remember seeing the ALVINN van somewhere in the 1980s. Not sure if it was on PBS or something like that. That’s hilarious.
@laikavoid3364
@laikavoid3364 11 месяцев назад
Such an amazing video! Great work!
@pavanagrawal6397
@pavanagrawal6397 2 года назад
Fantastic video, and I learnt a lot as a biologist. Small correction: neurons (the real ones) are indeed analog in the sense that they can tweak their output and fire, fire more, or fire less, just like an analog computer. This happens through a combination of changes in neurotransmitters, their dumping at the synapses, and neuropeptides that can change the 'gain' of the neural networks.
@michaelmeichtry316
@michaelmeichtry316 2 года назад
Exactly! The analog behavior of neurons is closely modeled by the analog current/voltage exhibited by the tweaked transistor cells, as so well demonstrated and visualized in the video.
@kalliste23
@kalliste23 2 года назад
Neurons have a lot going on inside, and things are happening outside, that affect what they do and when they do it. It amazes me that computer neural networks work at all, let alone as well as they do.
@vyor8837
@vyor8837 2 года назад
Ya, so take what he's wrong about in the field you know and apply it to the field I know(comp sci) and suddenly the entire video is a load of rubbish.
@grumpystiltskin
@grumpystiltskin 2 года назад
@@kalliste23 Don't get me started about the neurons in a squid vs a human... they have fewer, bigger and more complex neurons.
@blucat4
@blucat4 Год назад
@@vyor8837 Not a load of rubbish, just amazingly primitive compared to what it's trying to mimic. And also use specific. And not really capable of learning new kinds of tasks. ;-)
@Anomynous
@Anomynous 2 года назад
"Simple tasks like telling apart cats and dogs." You can find more difficult tasks, but this is already an incredibly complex task, especially when they are images
@henrypetchfood
@henrypetchfood 2 года назад
This is exactly the point though. Trivial for a human to do, hard for a computer.
@Cyrribrae
@Cyrribrae 2 года назад
@@henrypetchfood I literally just had a friend tell me a story about their mother misidentifying a pomeranian as a cat haha. Maybe not always trivial.
@anders5611
@anders5611 2 года назад
@@henrypetchfood It's trivial for a human because evolution produced neural circuits capable of solving this very hard problem. Our own minds are the least aware of what they do best.
@duckseverywhere8119
@duckseverywhere8119 2 года назад
True, but Derek's point is that in the grand scale of what we'd hope to achieve with analogue computers (in the future), telling apart cats and dogs is a simple expectation - yet it's still hard to do with current technology.
@user-lx3xc6ti3p
@user-lx3xc6ti3p 8 месяцев назад
amazing episode, well explained!
@dt-wq7ql
@dt-wq7ql 7 месяцев назад
Excellent presentation. My brain never got much past my spirograph set. It was functional at some stage. 😮
@faux_grey
@faux_grey 2 года назад
In this video you have also given the most simple, straightforward explanation for AI training and inference I've ever seen.
@masterbulgokov
@masterbulgokov 2 года назад
"Better suited" is the key. Quantum computing will fall into the same category: there are some things quantum computing is "better suited" for.
@BreaksFast
@BreaksFast 2 года назад
quantum computers (ones that use physical qubits) are only hypothetical, but people talk as if they already exist in reality. They don't; there is not a single fully functional quantum computer on the planet, and there might never be.
@ninjafruitchilled
@ninjafruitchilled 2 года назад
@@BreaksFast Sure they exist, they just don't have very many q-bits.
@RyanGrissett
@RyanGrissett 2 года назад
@@BreaksFast The computers do exist, but there is a lack of understanding in programming them to do classical computing problems.
@scyfrix
@scyfrix 2 года назад
@@BreaksFast They can and do exist, albeit with very limited qubit counts. The first experimental demonstration of one was in 1998. D-Wave Systems are selling computers with 2048+ qubits right now.
@jamesx9881
@jamesx9881 2 года назад
@@BreaksFast Tell that to IBM?
@simon1386
@simon1386 10 дней назад
Most beautiful and clear explanation of AI and neural networks to date.
@ProfRvS
@ProfRvS Месяц назад
Thanks for this video (and many others you are making) - I have become a big fan of your channel! However, watching this video in particular in connection with your clickbait video got me thinking. I will be recommending this video to my students because of the parts on AI, ANNs and ImageNet, and I never would have expected to find these excellent sections from your (clickbait...) title. I only saw them by chance, since I watched the video out of general interest. This led me to ask myself whether it wouldn't make sense to have something like alternate titles for different (target) groups - something YouTube could spend a bit of AI research on, maybe? Other than that: keep up the good work!
@dekev7503
@dekev7503 2 года назад
This just goes to show that no knowledge is useless. When I was in my final year of my undergraduate degree ( Electrical Engineering) I took a course on analog computers and the general consensus was that this field was obsolete. That year was the last year that the course was taught as it was phased out in the new curriculum.
@scottmarquardt8770
@scottmarquardt8770 2 года назад
Yeah, the old Navy fire control systems - along with directional aspects of sonar/radar - were analog from beginning to end, and the math required to come up with a fire-control solution that was stabilized in 3d on a moving ship, was intrinsic. It didn't compute as we think of it today - the problem and the solution were just a single feedback loop. I remember early in my training when I grasped this, it seemed like magic. Completely steeped in digital computation in my current work, it still seems more magical.
@manuelsilva8528
@manuelsilva8528 Год назад
This all went a little over my head, as I'm not a computer guy. But as far as I understand, neural networks are a sort of hybrid of digital and analog, with current between the neurons (transistors/bits) doing the analog computing and the neurons firing or not doing the digital part of the computing. What I didn't understand was: 1. Are there programs/algorithms to this, or is it all based on teaching the network? 1.1. If so, are the algorithms/programs saved in a sort of adjacent storage unit that works with the neural network (doing the inputs and/or outputs), or are they built into the network itself by analogy, which would make the neural network a single-purpose device? 2. If one has to "teach" these neural networks, can their learnings be replicated for mass production of those neural networks, or would you have to teach every single one of them?
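On question 2, at least for digital networks the answer is simple: the learned weights are just numbers, so you train once and copy them to every unit. A toy sketch (the random "trained" weights stand in for a real training run); the per-chip distortion discussed elsewhere in the thread is exactly what complicates this clean copyability for analog chips:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for weights found by a training run
trained_weights = rng.normal(size=(3, 2))

# "Mass production": every digital copy gets the exact same numbers
copy_a = trained_weights.copy()
copy_b = trained_weights.copy()

x = np.array([1.0, -0.5, 2.0])
print(np.array_equal(x @ copy_a, x @ copy_b))  # True: identical outputs
```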
@pbinnj3250
@pbinnj3250 5 месяцев назад
I cannot express all of my appreciation for this video. I understood it and I gained an enormous amount from it. If I sound unduly excited, it’s because I thought this stuff was beyond me. Thank you.
@WahPony
@WahPony 2 года назад
Processors have always been slapping new modules on to cover different types of problems. The GPU was added to do repetitive operations; radio receivers usually have an analog demodulator to make signal processing on them practical; quantum computers are right around the corner, where 12 or so qubits could be connected to your processor in a separate box via Thunderbolt to perform verification and encryption tasks; GPUs now have internal tensor cores to perform the operations you discussed (and even a bit more general ones) at lower bit depth; and even processes like "multiply then add" have separate modules inside the processor to compute more efficiently than multiplying first, then adding as separate operations.
@joefish6091
@joefish6091 2 года назад
Speaking of radio receivers, think SDR. software defined radio, a revolution in radio..
@chychywoohoo
@chychywoohoo 2 года назад
Quantum computers are not right around the corner
@Kyrator88
@Kyrator88 2 года назад
Quantum computers are not gonna be beside your computers unless you have a massive shed with space for a state of the art helium/nitrogen cooling system on hand
@SimonBuchanNz
@SimonBuchanNz 2 года назад
@@Kyrator88 I'm not completely discounting the possibility of solid state room temperature qubits, but yeah, excitement about personal quantum computers is pretty silly.
@TheDXPower 2 years ago
A quantum computer is not required for performing quantum-safe encryption/decryption. NIST is very close to standardizing one of many candidates that provides this functionality.
@Crowald a year ago
So, this was Harold Finch's solution in Person of Interest. His ability to create an autonomous, observant AI to identify dangerous behavior was the result of Rosenblatt's work, and he did it 15 or 20 years before anyone else would even attempt to do so. The video missed an opportunity to mention him via PoI. Von Neumann was the mathematics, Turing is the father of modern computing, but Rosenblatt was a maverick on the nature of neural networks.
@johndawson6057 a year ago
Oh my god, thank you for bringing this up. Ever since I watched that show I have been set on learning everything and anything about AI. It inspired me and set me on my current course in Comp Science.
@stargaming1635 a month ago
You're an absolute legend! I love your videos ❤
@yourright4510 11 months ago
While it may be true that we are reaching a limit, we're not quite certain what computational power some new neural networks will need for the outputs future applications require. That hints at the new analog computation coming into the forefront.
@ElectroBOOM 2 years ago
Awesome information!
@Mani_Umakant23 2 years ago
I gave you your first like 😁
@N____er 2 years ago
@@Mani_Umakant23 Why would you like such an unoriginal comment that provides so little value or thought?
@Mani_Umakant23 2 years ago
@@N____er No reason, it just looked cool.
@40.vedantdubey8c6 2 years ago
@@N____er Don't say anything bad about ElectroBOOM, he is such a wonderful creator.
@40.vedantdubey8c6 2 years ago
Hi sir, I am a big fan of yours.
@sternis1 2 years ago
I remember my friend once made a completely analog line-follower robot. He implemented a PID controller using op-amps, trimming the parameters with three variable resistors. It actually worked quite well!
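An op-amp PID computes the same control law that a few lines of code express digitally: the op-amp integrator and differentiator correspond to the integral and derivative terms, and the three trim pots set the gains. A rough digital sketch (the gains, time step, and error value are hypothetical):

```python
# Rough digital equivalent of an analog op-amp PID controller.
# In the analog version, kp/ki/kd are set by resistor values instead of code.

def make_pid(kp, ki, kd, dt):
    state = {"integral": 0.0, "prev_error": 0.0}

    def step(error):
        state["integral"] += error * dt                   # op-amp integrator
        derivative = (error - state["prev_error"]) / dt   # op-amp differentiator
        state["prev_error"] = error
        return kp * error + ki * state["integral"] + kd * derivative

    return step

pid = make_pid(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
correction = pid(0.3)  # error = offset from the line; output steers the robot back
```

The analog version computes this continuously with zero latency, which is part of the appeal the video describes.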
@gamedominatorxennongdm7956 2 years ago
That's pretty creative and clever of him.
@allowambeBOWWAMB 2 years ago
opamps are wonderful
@omniverideus 2 years ago
That sounds like the micromouse project. In the UK, at schools in the late 90s, it was a challenge for electronics students (electives, years 10-12). It was really fun finding solutions to the problem of navigating a maze, either on paper (a line), in a 3D maze, or both.
@EpicVideoGamer7771 2 years ago
Your comment was stolen :/
@dickslocum 11 months ago
Looks just like the breadboard programming process I was taught during the late 60s in my Introduction to Digital Programming associate degree program. If you are using electricity to produce the output, it is not an analog computer; it is using digital technology to regulate the voltage, current, and amperage for your O-scope.
@NovaPax a year ago
From my understanding, we're kinda doing the same thing in quantum computing. The "quantum time crystals" made by Google are somewhat like the analog chips, and we run digital checks on the data once it's done. I've also seen people layer GPT with Wolfram Alpha and other more specialized programs to improve accuracy.
@seeker_of_knowlage3568 a year ago
Interesting
@akinoshimo 2 years ago
When I was an engineering student in the 1970s, courses on analog computers were required after a first course in linear differential equations (using slide rules was allowed in class, but not electronic calculators). On your description of analog computers compared to digital computers: we used operational amplifiers to create integrators and differentiators, so there are transistors (tubes, before bipolar transistors) in analog computers, not just resistors and capacitors.
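What an op-amp integrator computes can be checked numerically: an inverting integrator produces Vout(t) = -(1/RC) ∫ Vin dt, so a constant input yields a linear ramp. A short simulation sketch (the component values and step size are illustrative, not from the video):

```python
# Numerical sketch of an inverting op-amp integrator:
# Vout(t) = -(1/RC) * integral of Vin dt  (component values are illustrative).

R, C = 10e3, 1e-6        # 10 kOhm, 1 uF  ->  RC = 0.01 s
dt = 1e-4                # simulation time step (0.1 ms)

def integrate(v_in_samples):
    v_out, outputs = 0.0, []
    for v_in in v_in_samples:
        v_out -= v_in * dt / (R * C)   # inverting: output ramps opposite the input
        outputs.append(v_out)
    return outputs

# A constant 1 V input for 10 ms produces a ramp down to about -1 V,
# exactly the sawtooth-building behavior you'd see on an oscilloscope.
ramp = integrate([1.0] * 100)
```

The analog circuit does this continuously and instantaneously; the loop above only approximates it one time step at a time.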
@akinoshimo 2 years ago
@africancheez Engineering is a great profession. Along with analog computers we had to take classes with digital computers. As a student, I used the HP1100 (CPU, paper tape reader, 1 MB hard disk), IBM 370 (teletype and punch cards), and PDP-11 (punch cards) to learn Fortran IV, Pascal, and BASIC. My professors were experimenting with the TRS-80, Timex, and Apple I personal computers (most arrived as kits) as a hobby (top speed was 1 MHz... 1977 timeframe). In the computer lab there was a desktop PC made by Commodore (a Commodore PET with a built-in cassette tape for mass data storage), but only grad students were allowed to use it. It was good to know the difference between solving certain problems with a program on a digital computer versus solving the same problem with a far simpler analog computer. Good luck to ya.
@theonetrueanthonylong1843 2 years ago
I can only speak for my associate degree program, but we still do that. This past semester we used IC op-amps to make integrator and differentiator circuits. They were some fun labs. I love seeing some of the signal art people make on O-scopes.
@thomasmaughan4798 2 years ago
@africancheez "we have had the technology for a long time, but the world just wasn't ready for it yet." It was never just a technology problem. Do you want your data in someone else's warehouse? Very often, no. Actually, I believe it is never optimal, since you lose some 4th Amendment rights regarding search and seizure. On the other hand, a small company that cannot justify the expensive, reliable talent needed to guard company secrets has no good choices and might as well go "in the cloud"; but even there, dividing up the cloud (some to Amazon, some to Microsoft) would prevent or reduce a complete and total disaster.