Your videos are always great. They give me a bigger picture of what's happening in the world of technology right now and where it's headed. Keep up the good work 👍🏻
The chip could be designed in a way that enables capturing and restoring 'snapshots' of the individual neuron states. It could also be hooked up to traditional storage media and 'taught' to use them the way you would record data on a piece of paper.
Well, yes and no... in a complete system it can of course simply store weights and biases off-chip and access them later... a kind of neuromorphic cache kept on a standard digital retrieval system. And a stored machine learning model really is just that: text in the form of weights and biases. But internally? Yes, the processor will immediately 'forget' its previous state on every single iteration... it's forgetting things millions of times a second, faster than you can... lol. But ultimately the 'memory' is a system that emerges from the current state of the voltages between axons on the chip... and those do change over time... so yes, it can forget things, that's a thing. Unlike your brain, though, which also has issues with recall? You use paper and pen or a smartphone to remind yourself... the machine, as I said, can store such things too, and it's even pretty much psychic, since it can also rely on the machines around it to help it out. You can't access someone else's head to get a clue about something you never learned... these machines, of course, can.
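To make the snapshot idea above concrete, here is a minimal sketch in Python. Everything in it is an illustrative assumption (NeuronState, its fields, and the save/restore helpers are invented names, not an actual TrueNorth interface), but it shows why a stored model 'really is just text': the volatile on-chip state serializes down to ordinary weights, biases, and voltages.

import json
from dataclasses import dataclass, asdict

# Hypothetical neuron state -- the fields and names here are
# illustrative assumptions, not a real TrueNorth interface.
@dataclass
class NeuronState:
    membrane_potential: float  # the volatile 'voltage' the chip forgets
    threshold: float           # firing threshold
    leak: float                # per-tick leak value

def snapshot(neurons):
    """Capture the chip's volatile state as plain data."""
    return [asdict(n) for n in neurons]

def save_off_chip(neuron_states, weights, biases, path):
    """Park a snapshot on standard digital storage; it really is
    just text in the form of weights and biases."""
    with open(path, "w") as f:
        json.dump({"neurons": neuron_states,
                   "weights": weights,
                   "biases": biases}, f)

def restore(path):
    """Read the snapshot back so the state could be reloaded
    instead of being 'forgotten'."""
    with open(path) as f:
        blob = json.load(f)
    return ([NeuronState(**n) for n in blob["neurons"]],
            blob["weights"], blob["biases"])

Calling save_off_chip(snapshot(neurons), weights, biases, path) would park the state on disk, and restore(path) would hand back values a controller could write into the chip, assuming the hardware exposes a way to do so.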
Why not create a supercomputer with thousands of these chips to mimic a whole human brain? We could call it the Heuristically Programmed Algorithmic Computer.
Thanks so much for the comment! My visual style is heavily influenced by Nerdwriter1, kaptainkristian, and the Vox playlist Earworm. The best advice I can give is to just start creating content, and you will build your skills over time!
Great advice and help, thank you. I produced a catalog for 20 years and built websites with MS FrontPage. Now I'm learning WordPress. My practice site is GeeWhiz.store, a fun test site; my old site is RMConnection.com.
This is awesome!! So, is this "as simple" as replacing a GPU with an MSC160 board, setting aside the need to rewrite the CUDA kernels in whatever language this uses, or to wait for Google to update TensorFlow? How does data get into the TrueNorth chips? Can the system be set up to stream raw video data directly from a camera into the MSC160, with some HW interrupt triggering "the kernel" to run?
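On the "how does data get in" question: spiking chips like TrueNorth consume discrete spike events rather than raw frames, so some front end has to encode the video into spikes first. Below is a minimal rate-coding sketch in Python; encode_frame, its parameters, and the random-frame example are my own illustrative assumptions, not IBM's actual toolchain.

import numpy as np

def encode_frame(frame, ticks=16, rng=None):
    """Rate-code one grayscale frame into spike events.

    frame: 2D uint8 array of pixel intensities (0-255).
    Returns a bool array of shape (ticks, H, W); True means the
    'pixel neuron' fires on that tick, so brighter pixels spike
    more often.
    """
    if rng is None:
        rng = np.random.default_rng()
    p = frame.astype(np.float64) / 255.0   # per-tick firing probability
    return rng.random((ticks,) + frame.shape) < p

# A random 8x8 'camera frame' becomes a stream of spike ticks.
frame = np.random.default_rng(0).integers(0, 256, (8, 8), dtype=np.uint8)
spikes = encode_frame(frame)
print(spikes.shape)   # (16, 8, 8)
print(spikes.mean())  # roughly the average brightness, as a spike rate

Whether an interrupt-driven streaming path like you describe exists would depend on the board's I/O, but conceptually this encoder is the piece that stands in for a CUDA-style kernel launch.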
There is a company called BrainChip that recently signed a contract with NASA to send a neuromorphic chip into space. The chip is called Akida, and its main function is to be embedded in sensors on rovers and the like to make those sensors inherently smart instead of reliant on a central computer.
Great video! Happy to discover a new topic I hadn't heard of before, especially one that's already at the product stage. Just realized I haven't watched your past two videos; I'll do that now.
YAWN! This has been done TO DEATH already. It's just an asynchronous dataflow architecture. Check out GreenArrays, etc. It's also the basic structure of the GOOD FPGAs and CPLDs. IBM has been known to do this -- spin up a project just to hype and troll to the extreme for marketing purposes. Anybody seen Watson these days?
I've always wanted to get into A.I./robotics. I'm 27 now and don't want to stay in food manufacturing because it's a dead-end job; A.I. could well replace my job by 2030, and I don't doubt that. The McKinsey report said that 800 million jobs would be lost by the start of that decade to advances in automation, robotics, and A.I., but that 100 million new jobs would emerge from A.I. and data science by 2022. I started learning C for robotics and Python for A.I. this month.
@@bassplayer807 I started working with PCs back in 1986, and shortly thereafter A.I. was a thing; I believe it was a thing before that, too. ELIZA was the A.I. of the era on PCs, while Apple introduced the Lisa. A.I. and sci-fi go way back, but given the newer designs of CPUs and GPUs, A.I. is more practical these days than it was decades ago. C seems to have a serious future given its overall age, so you are wise to pick it up now; Python also has a solid future. Unfortunately for me, BASIC, Fortran, and a few others were quickly phased out while C just kept chugging along, and I never got into C as deeply as I should have.

Automation and/or A.I. will be the new slave labor of the future; it will cut costs dramatically and require less manual labor, but there will always be some form of human labor to maintain the robots, until they become fully aware and all bets are off... LOL... Too bad corporations turn excess profit into GREEDY PROFITS rather than reducing the product price. Greed is the worst drug the Earth has, and those controlled by it could not spend their wealth if they lived five lifetimes, yet GREED drives them to obtain MORE...

Take Big Oil, for example: did you know their profit margin is 85 trillion dollars per year? Big Oil could pay off the US debt and buy every American a $50,000 vehicle and still profit over 20 trillion dollars each year. Imagine what it would be like to get a new 50k vehicle for FREE every year; how nice would life be... And by every American I mean every man, woman, and child. NOPE, Big Oil HOARDS its profits rather than giving back to those who make it rich, but I digress... Cheers...
Congratulations! You are incredibly basic. You made the obvious comment, the one that plagues the comment section of almost every video about artificial intelligence.