I have to disagree that we live in an analogue world. The world is made up of discrete particles, which you can count precisely. There is also a minimum meaningful length, i.e. the Planck length. An analogue world would be infinitely divisible, which ours clearly isn't. It's certainly a very fine resolution and gives the appearance of being analogue on the macro scale, but at its core it is digital nonetheless.
Hi, may I know what the future holds for current fresh VLSI engineers, especially for frontend VLSI? Moore's law has also nearly ended. Please comment on this.
Yeah, I tried explaining this on her prior episodes and in other threads. Photonics is to binary signal bandwidth what digital speed was to the Morse code days.
You always do great videos with concise, accurate and understandable information about a subject area that you obviously know so much about. I'm a computer scientist and physicist and I can't imagine my life without the excitement of watching computer and other technologies evolve. For me the added extra that your videos bring to this technology-following corner of the YouTube universe is how you manage to express the sheer joy and excitement that technology followers like me have in seeing all of this progress. What a time to be alive! I really do think that the biggest reason why I try to look after my health is so that I can live as long as possible to experience as much of this journey as I possibly can.
Can you go into a little more detail on each topic? Boolean logic through gates is something many of us learned in basic digital classes. A little more on how these new structures work to solve logic would be great.
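For anyone wanting a quick refresher on the gate-level basics this comment mentions, here is a minimal Python sketch of Boolean gates composed into a half-adder; the function names are just illustrative, not from the video:

```python
# Minimal refresher: Boolean gates on one-bit inputs (0 or 1).
def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

def XOR(a, b):
    return a ^ b

def half_adder(a, b):
    """Compose gates into a half-adder: returns (sum, carry)."""
    return XOR(a, b), AND(a, b)

# Print the full truth table.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a}+{b} -> sum={s} carry={c}")
```

Chaining two half-adders plus an OR gate gives a full adder, which is the basic building block of binary arithmetic in conventional chips.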
Awesome, thank you! Many years ago I was involved in many different types of direct photonic sensors and actuators (no electrons involved) and demo projects, but with electronic compute/control. Now we're getting close to all photons all the way! Sweet!
This is an inspiring video! I send this message as constructive feedback: it would be helpful if the speed of narration were slowed by, e.g., 20%. I have shared this with a family member who has hearing challenges, and the captions have to be read at such high speed that it is not possible to also look at the wonderful images 🤗
Great info Anastasi, but please use the gerund to describe the activity: 'computing power', 'biological computing', ... Computing, not compute. Computing = activity, computer = physical object; both are nouns.
Your enthusiasm is palpable. But I wanted to correct the statement made around 1:20: it's not the movement of electrons but rather the electromagnetic waves that transmit information. It all travels at close to the speed of light, depending on the transfer medium.
A mine of knowledge and information, really fascinating and compelling. I imagine that among the computers you mentioned, the quantum computer is the closest one.
Very interesting and well-presented video. Has there been any testing to see if "quantum entanglement" occurs between a brain cell on a chip and the host brain it came from? It would make for some weird Science and Scifi if it actually connected the host to the computer at a distance.
I always thought it was interesting tech, but learning that they can run truly parallel processes by using different colors of light opened my eyes to how powerful it can be. Definitely looking forward to seeing how this advances.
SUPERB. The analogue devices you briefly showed resemble 'gate kit' teaching aids made in the 1990s in Cambridgeshire, England, which demonstrated how PCs basically function across a confusing range of manufacturers back then. The BBC Acorn computers were especially resilient and gave colour gamuts far superior to anything else, so the 'gate kits' were made to resemble them and helped young people transition from the learner stage to usage (including simple programming back then) at a level of understanding which is missing today unless students take it up at tertiary level.

Ironically, Acorn was 'bought out', after failed cooperation with Apple, by IBM and a part of Microsoft in order to close it down. Acorn used RISC while the Americans took the CISC direction which, as we now know, required ever more powerful (and hotter) PUs. The heart of the irony is that Acorn's RISC team remained intact around Cambridge University and eventually formed the ARM design company, upon which most phone PUs and Apple's M processing units are based.

Given that ARM's people (mostly university teachers/researchers) had been bitten by US companies in the past, they protected each and every patented change so as to make a small profit for further development, thereby not falling foul of commercial fads, hell-bent greed, and reduced choice of computer types suiting different purposes well into the future. They also reserve the right, by charter, to deny licences to companies harming people and the planet, as a legacy to succeeding generations.

Most of the original Acorn engineering team are either retired or deceased by now, but the principles of ongoing research and a business 'ecosystem' were ready to make big strides in fog and dew technologies which would have taken conventional corporates years to harmonise with. They do not have a 'box' to think outside of!
Very good video. Anastasi, do you know of any content about RISC-V: references on how to use it, development kits, languages, etc.? One other thing: do you think analog processors are a good field, especially now with these AI accelerator chips?
Search 'RISC-V Intel'. Intel has embedded a RISC-V core in one of their CPUs. Regardless, I have "develop an app using ASM for RISC-V" on my bucket list, simply because there's an app for that and I'm down with anything where FOSS meets bare metal.
That thing with punishing brain cells sounds extremely terrifying. Like the start of every science fiction dystopia where the machines eventually riot and kill all humans.
This may be way off topic, but I came across your videos while surfing the net. One of my main interests is quantum physics and its relation to classical physics. Trust me, I am in no way at the level of the scientists out there; I am just fascinated by puzzles. I see the difference between quantum physics and classical physics through an analogy: if I were to look into the sky and see a passenger jet way off in the distance, it would appear to move slowly compared to how fast it would seem if I were standing in close proximity as it passed me by. So when I observe the physical world from my own perspective, it appears to move through time slowly compared to how it would look if I observed it from the quantum level. In essence, this is why we get the irregularities between the two physics.
I think the "analog" computers that are coming along now (they still work in discrete states, just multiple ones) will be the next big thing. Basically, if you can make a computer that thinks in multiple bits at its base level, you can dramatically simplify the interconnections between parts by having each wire in a chip transmit multiple values at once.
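To illustrate the multiple-values-per-wire idea, here is a rough Python sketch of PAM-4-style signalling, where each symbol on a wire takes one of four levels and so carries two bits; the level map and function names are simplified assumptions for illustration, not any real chip's encoding:

```python
# Four voltage levels per symbol -> two bits per symbol on one wire.
LEVELS = {0b00: 0.0, 0b01: 1.0, 0b10: 2.0, 0b11: 3.0}

def encode(bits):
    """Pack pairs of bits into 4-level symbols (len(bits) must be even)."""
    symbols = []
    for i in range(0, len(bits), 2):
        pair = (bits[i] << 1) | bits[i + 1]
        symbols.append(LEVELS[pair])
    return symbols

def decode(symbols):
    """Recover bits via a nearest-level decision (real links add noise margins)."""
    bits = []
    for s in symbols:
        pair = round(s)
        bits += [(pair >> 1) & 1, pair & 1]
    return bits

data = [1, 0, 1, 1, 0, 0, 1, 0]
wire = encode(data)   # 4 symbols on the wire instead of 8 binary ones
assert decode(wire) == data
```

The trade-off is exactly the noise/linearity problem mentioned elsewhere in this thread: more levels per wire means smaller margins between them.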
Question, for anybody who might be able to help me with this: are those analog chips being designed to be able to restructure themselves to run different AI software programs? If so, fantastic; if not, what value is added by a chip that is forever fixed to performing one type of software action? Also, is it just me, or do biological chips and photonic chips seem more like an evolution of the current existing chips? The biological ones will have the ability to restructure themselves to run different software programs (right now we have different specialised chips for this), while the photonic ones will improve the speed and reduce the energy usage of chips, which seems to be something we have always been searching for.
Thank you, Anastasi. As always, great presentation! What impact will A.I. have on the world 20 years from now? Via advanced generative transformers and their successors, all trainable creativity will be synthesizable. Given that ALL of IT is just binary patterns, all of IT will be driven by design documents or written/spoken specifications. Think about it. Let's take a fairly simple example: Soon, some enterprising A.I. group will download a bunch (thousands) of apps from the app stores and work out methods for inferring apps, either from examples, design documents, written or spoken features and actions lists, or all of the above. This particular example will happen before 2030 in my view.
@@hygrobiology Agreed! I was merely using her suggested metric. Indeed, things are happening so fast already that it's becoming hard to precisely target "when". A clear signal that the Technological Singularity (TS) is near, if not that we've already crossed its event horizon. (Here using the sense of TS related to becoming increasingly less capable of forecasting technological advance, rather than the TS sense of having achieved AGI.)
I agree that the future of machine learning lies within neuromorphic photonics; GPUs are already heavily bottlenecked. I'm sure the big players will continue with conventional silicon for the next decade until they hit the limits of the transistor (0.3 nm for silicon, I believe, and I doubt they will even get there, with tunneling/overheating etc.). Great video btw!
Great video! It will be interesting to see where it goes, but I think analog computing will be a part of it. open gate/close gate can only take you so far.
I'm subscribing because she is intellectually capable and also because her conscience is amazing. And of course she's extremely beautiful. I'm a subscriber.. 😏
On a less serious matter (well, sort of): you should check out the mathematics behind wavelets and their applications to tech. This will be crucial for light-based technology as well as radio, just to mention a couple of crucial technologies that will be deeply improved thanks to it.
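As a taste of the maths this comment points at, here is a minimal Python sketch of one analysis level of the Haar transform, the simplest wavelet, splitting a signal into coarse averages and fine details; the function names are illustrative:

```python
import math

def haar_step(signal):
    """One Haar analysis level: returns (approximation, detail) coefficients.
    Expects an even-length list of floats."""
    s = 1 / math.sqrt(2)
    approx = [(signal[i] + signal[i + 1]) * s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) * s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Perfect reconstruction from one analysis level."""
    s = 1 / math.sqrt(2)
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) * s, (a - d) * s]
    return out

x = [4.0, 6.0, 10.0, 12.0, 8.0, 8.0, 0.0, 2.0]
a, d = haar_step(x)
assert all(abs(u - v) < 1e-9 for u, v in zip(haar_inverse(a, d), x))
```

Recursing on the approximation coefficients gives the multi-resolution decomposition used in compression and in analysing radio and optical signals.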
Oh, and I would like to use something like at least two deep neuralink interfaces linked together externally using a massive "self-programming" neural net to link them that would work by reacting to or acting on the brain. Basically an attempt at merging an AI neural network with a biological neural network.
“Our platform enables the largest most powerful…. But we don’t care about making your life better, we care about giving your leaders and your ruling class the tools they need to control you more easily. And we’re good for the ENVIRONMENT too!
Analog's problem now (in integrated designs), and from the beginning (like when digital memory was still discrete shift registers!), is noise and linearity. Photon technology might reach the requirements, or clever error correction using near-current-scale integration might. Who knows, but don't count Intel or IBM out; they know how to build stuff. To get an insight into the kind of human mind required to create the AI algorithms needed in the future, Midjourney or DALL·E by OpenAI brings it home in a visual way everyone can understand. It's actually a little spooky. Qubits: we have to solidly verify what we think we're creating.
I see AGI advancing through the use of home-scaled Dojo-NAS-compute hybrid systems. Attach these to distributed I/Os, cars, security sensors, and a second-skin onesie that includes embedded sonars and transmitters. Human-machine interface.
Computers for home or office need common sense and learning. This will enable them to fix themselves and be more helpful to the user. They will know and work toward the main goal of what you are trying to do. Maybe a neural net/ conventional computer hybrid?
We live in exciting times where a single lifespan has seen the growth of computers to help mankind. Let’s root for and encourage continued support for those engineers that will help this wave grow and make the future of computing possible.
In modern times, the perception of computer/artificial entities has been one of metallic/hard structures. The truth is, they will eventually look like exact replicas of flesh forms, rendered with neuro-access brain 🧠 structures.
I remember hearing, back in the late 90s, someone theorizing that one day it will no longer be about solid-state computing; it's just a crutch for us. That one day, biotechnology will surpass hardware: creating organisms to think, process, etc.
1st time viewer. Came here because of the algorithm. It would appear you deliver good content. Do you ever get used to her voice? 😄 (And I don't mean her accent.)
I love your channel. Does the newsletter you put out cover the same subjects as your vids? The reason I ask is that you are hard for me to understand because my hearing is damaged.
I love the optimism about how AI can help humanity solve disease, climate change, etc. I wish the media would jump on that bandwagon instead of focusing on potential negative impacts on humanity.
Huge ethical questions with what Cortical Labs is doing. Very disturbing with the punishing of the neurons, etc. I mean, right now it might seem OK, because it's only a handful of neurons playing Pong. But the goal is obviously to scale well, well beyond that, and that's where the ethical issues will start becoming absolutely glaring.
I thought along the same lines. What happens if the system (14:55) becomes sentient, can eventually feel emotion, and we're punishing it on a regular basis? Horrible concept. Unforgivable.
Young lady, I'm quite intrigued by you. A question for you: if everything is set at AC voltage and is calculated in watts per square foot, with each square foot calculated at 2 1/2 W, what would the energy consumption be if DC were applied instead of AC? I'm a theoretical scientist and a sophomore in Applied Mathematics. In India, where I teach, we call this a 'doubt'; we never use the word 'question', only 'doubt', so that it can be explained at that exact second. :-) Thank you, young lady. Just an old cowboy saying hi.
@anastasi The first transistors were germanium; then, six years later, came the silicon type. "Compute" is a verb; I think you mean to say "computer", a noun?