MIT Embodied Intelligence
MIT Embodied Intelligence is a group of labs in the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) working in Machine Learning and Robotics. Here you will find videos about our research as well as talks by invited speakers in our seminar.

More information about MIT Embodied Intelligence can be found at ei.csail.mit.edu/

Information about accessibility can be found at accessibility.mit.edu/
Comments
@jazzvids 19 days ago
Thank you for this! Presentation starts at 19:33 :)
@the_engineer97 2 months ago
This is so intriguing. I am so inspired.
@LeoTX1 4 months ago
Thanks!
@herbertarnold6372 4 months ago
Promo-SM 😕
@LeoTX1 5 months ago
It's a good presentation. Very useful for me! Thanks a lot!
@jeremydy3340 5 months ago
Talk starts at 32:57
@Shintuku 5 months ago
Does this presentation correspond to a paper? It would be nice to have access to the slides/citations — very interesting stuff.
@jeremydy3340 6 months ago
Talk starts at 16:30
@hansbleuer3346 7 months ago
Interesting explanation
@CandidDate 8 months ago
I think a sense of humor in robotics would lead to clownish appeal.
@DhruvMetha 8 months ago
Starts at 15:25
@aennmatyasbarra-hunyor5506 8 months ago
Great one, thank you! I would like to be part of it. One day it will be possible.
@JosephHeck 9 months ago
Content actually starts at 17:30, and the speaker's audio starts at 18:30.
@araldjean-charles3924 11 months ago
For the initial conditions that work, has anybody looked at how much wiggle room you have? Is there an epsilon-neighborhood of the initial state you can safely start from, and how small is epsilon?
@cbasile22 1 year ago
Is there any formal course that covers multi-agent RL? I find it confusing thus far. Thanks!
@AngeloKrs878 1 year ago
1:07 subtitles: "my experience with drugs couldn't be better"
@dbp_patel_1994 1 year ago
😂
@bhaskartripathi 2 years ago
I was always confused by MA-MDPs. You made them look very simple. The mathematical notation was very concise and research-paper ready.
@keeperofthelight9681 2 years ago
How do you do convolutional LSTMs and other things in JAX? More tutorials please, sir Matthew Johnson.
@kshitijshekhar1144 2 years ago
Flax is a high-level neural-network library built on top of JAX; check out its documentation. It's a very new library, built for flexibility, and you can make a mark in it by making PRs.
@ImtithalSaeed 2 years ago
u and a confuse me
@georgemu7464 3 years ago
Very insightful
@devjaiswal1685 3 years ago
Thank you, sir.
@adamantinebipartite4732 3 years ago
Nazi.
@LB-fx1kx 3 years ago
Great work!
@iandanforth 3 years ago
Really enjoyed the presentation. The 'Puzzle' slide is problematic: all three have 'lots of wiring'; the camera just has smaller wires in a better package.
@syedshahid8316 1 year ago
I live in Karachi, Pakistan. I like your
@p.z.8355 3 years ago
How do you linearize the KG without getting into exponential complexity?
@p.z.8355 3 years ago
How do you combine self-supervised learning with declarative knowledge?
@pakistanbtsarmy2625 3 years ago
👌
@ImtithalSaeed 3 years ago
Which book can I refer to?
@f150bc 3 years ago
The Diehold Foundation, along with Suspicious Observers, is pushing a 12-thousand-year cycle of supernovae and magnetic reversals that brings a catastrophic event. Please debate them on their theories; they have tens of thousands of people following them. I fear the theory might be partially right. Find them on RU-vid under those names. Please look into this. Thanking you in advance, Carl.
@fredxu9826 3 years ago
This is a great talk. Personally, I haven't had the prerequisites for manifold learning, but the idea behind hybrid message passing is quite profound. Just wondering: if you have a Bayesian GNN where the prior encodes the linear assumption, would that be equivalent to the GNN + PGM model presented here? Or is there a limit to the expressiveness of a Bayesian prior?
@thanasisk 3 years ago
Great talk, thank you for uploading.
@DistortedV12 3 years ago
Amazing work
@wgharbieh 3 years ago
Talk starts at 6:00
@harrysaini7702 3 years ago
Can we get the PPT please?
@alinouruzi5371 3 years ago
good
@TheRcCrazyFan 4 years ago
Starts at 12:08
@AvindraGoolcharan 4 years ago
Starts around 7:03
@viktoriyat1815 4 years ago
This was amazing, thank you so much for uploading it!!
@Mefaso09 4 years ago
Starts at 11:20
@mitembodiedintelligence8675 2 years ago
Thank you! I have updated the video so that it starts playing from the very beginning! -Ge
@AndersonSilva-dg4mg 4 years ago
Thank you for sharing the information.
@dimwitquack 4 years ago
Excellent
@SaifUlIslam-di5xv 4 years ago
Here from Reddit. (Y)