Linear Algebra
Real Spectral Theorem
26:27
8 years ago
EM algorithm and missing data part 2
51:37
8 years ago
4 12 16
1:10:25
8 years ago
Comments
@beetsreliable 1 month ago
Amazing!
@Aylou07 7 months ago
The worst cameraman I have seen in my life
@amirkhazama7464 11 months ago
You are really great, sir! These are really awesome lectures!
@alikiaee1307 1 year ago
thanks for the great class
@LoganDunbar 1 year ago
32:28 "It's getting crazy right, everything is a vector space..." 😅 It's vector spaces all the way down!
@user-ko9dn2ve9e 1 year ago
1) Data sparseness and overdispersion: are they connected? 2) If we collapsed/combined some categories (rows, columns, cells) for the sake of parsimony, interpretability, and agreement with the χ² assumptions (minimum expected count >= 5 (3?)), minimization of overdispersion, etc., do any "stopping" rules exist, or may some be considered?
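On point (2), here is a minimal Python sketch of the expected-count rule of thumb the comment alludes to. The table, the threshold of 5, and the "merge the two sparsest rows" step are all illustrative assumptions, not an established stopping rule:

```python
# Rule-of-thumb check for the chi-squared assumption mentioned above:
# all expected cell counts E_ij = (row total * column total) / N should be
# at least 5. The table, threshold, and merge step are illustrative only.

def expected_counts(table):
    """Expected counts under independence for a 2-way contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    return [[r * c / n for c in col_totals] for r in row_totals]

def collapse_sparsest_rows(table):
    """One collapse step: merge the two rows with the smallest totals."""
    order = sorted(range(len(table)), key=lambda i: sum(table[i]))
    i, j = sorted(order[:2])
    merged = [a + b for a, b in zip(table[i], table[j])]
    return [row for k, row in enumerate(table) if k not in (i, j)] + [merged]

table = [[40, 35], [30, 25], [2, 1]]   # last row is very sparse
bad = [e for row in expected_counts(table) for e in row if e < 5]
collapsed = collapse_sparsest_rows(table)
bad2 = [e for row in expected_counts(collapsed) for e in row if e < 5]
print(len(bad), len(bad2))             # violations before and after: 2 0
```

Under these assumptions one natural "stop" rule is simply to stop collapsing once no expected count falls below the threshold, weighed against how much interpretability each merge costs.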
@user-ko9dn2ve9e 1 year ago
Maybe: Rdf = N - K? 😶
@Anonymous-cw4yd 2 years ago
First of all, thank you so (×2) much for uploading this lecture course on linear algebra. Time stamp: 37:20. I would prove uniqueness of the additive identity in the following way: let 0 and 0' be additive identities of V. By the definition of the additive identity, 0 = 0 + 0' = 0'. Thus 0 = 0'.
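The commenter's argument, written out as a short display (nothing used beyond the vector-space axioms):

```latex
% Uniqueness of the additive identity: suppose 0 and 0' are both
% additive identities of V. Then
\[
  0 \;=\; 0 + 0' \;=\; 0' ,
\]
% where the first equality uses that 0' is an identity (v + 0' = v with
% v = 0) and the second uses that 0 is an identity; hence 0 = 0'.
```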
@fatriantobong2097 2 years ago
omg... all in one lecture that liberates my understanding
@juanlemod 2 years ago
It's 2022. I'm struggling in upper-division Proof-Based Linear Algebra. I earned an A in lower-division Linear Algebra since it was computation-based for the most part, but writing proofs is a whole other beast. This comment may be years late, but this video has been very helpful. Thank you for sharing.
@LinearAlgebra 2 years ago
Very glad to hear it helped you! Proofs are indeed a new vista, and linear algebra is the introduction for many of us.
@reptilewithsadhumaneyes 2 years ago
I am endlessly grateful for you sharing these lectures. Thank you so much, this playlist has exposed a lot of my unknown unknowns that have been haunting me for years.
@debanwitahajra 3 years ago
Will W+Z not be W? 20:23
@monkeycarz 3 years ago
Foundations clearly communicated for a whole branch of statistics.
@coles5451 3 years ago
Professor reminds me of Elon Musk
@insoucyant 3 years ago
Thank you, Prof. It helped me tremendously.
@prashant0104 3 years ago
Thank you!
@aiavicii4243 3 years ago
Way too abstract and formalized; hard to understand.
@dsvdsv8466 3 years ago
He needs to work on his lecture skills. Of course he's apdo, so give him the benefit of the doubt... but still... if you can't do the job
@bizzaaach 3 years ago
Springer has made the book available online: link.springer.com/book/10.1007/978-3-319-11080-6
@jiaruchen2167 4 years ago
The video zooms in too much, so I couldn't see the whole blackboard.
@non_sense7441 4 years ago
This is great
@miguelangelhernandezgodine7801 4 years ago
I had problems understanding what adjoints are, but you explained it just perfectly. Thank you so much for these classes.
@bilalzahory7765 4 years ago
Good lecture, but the class doesn't seem to care lmao
@youcefhenka3352 4 years ago
In the proof, I think we have Null(f)^⊥ = closure[range(f^{*})]. Is that correct?
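A finite-dimensional sanity check of the identity in the comment; in finite dimensions range(f^{*}) is automatically closed, so the closure drops out. The matrix below is an arbitrary illustrative choice (rank 2 on R^3):

```python
import numpy as np

# Check Null(A)^perp = range(A^T) for a concrete matrix.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],    # = 2 * row 1, so rank(A) = 2
              [1.0, 0.0, 1.0]])

_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
row_basis = Vt[:rank]            # orthonormal basis of range(A^T)
null_basis = Vt[rank:]           # orthonormal basis of Null(A)

assert np.allclose(A @ null_basis.T, 0)          # really the null space
assert np.allclose(null_basis @ row_basis.T, 0)  # Null(A) is orthogonal to range(A^T)
# Dimensions: rank + nullity = 3, so range(A^T) fills out all of
# Null(A)^perp; the two subspaces coincide.
print(rank, null_basis.shape[0])                 # 2 1
```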
@vikdfr 4 years ago
Lecture Notebooks - github.com/jasonmorton/504notebooks
@chrisdc3643 4 years ago
I think there is another proof of the Riesz theorem (showing that a given function is an isomorphism of same-dimension spaces), which is a little more direct. Thanks a lot for your video; the definition of the adjoint is very clear!
@mattbroerman 4 years ago
Words for the ages, 30:10: "There is no such thing as missing data. Just a misspecified model." I haven't seen this topic presented this way; I really liked it. Any pointers for reading up more on it presented this way?
@kamalpreetrakhra8071 4 years ago
I have a question. Is there a precedent for taking a random sample of one category of the dependent variable so as to have proportions similar to the second, for a three-category dependent variable? My category proportions are 0.77, 0.20, and 0.027. Is there any other way to model the three-category dependent variable for these proportions?
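One common, if lossy, option for the imbalance described in the comment is to downsample the majority category. A minimal sketch with made-up labels roughly matching the stated 0.77/0.20/0.027 proportions:

```python
import random

# Downsample the majority category of a 3-category outcome so its count
# matches the second-largest category. Labels and counts are illustrative.
random.seed(0)
labels = ["A"] * 770 + ["B"] * 200 + ["C"] * 27

counts = {c: labels.count(c) for c in set(labels)}
majority = max(counts, key=counts.get)            # "A"
target = sorted(counts.values())[-2]              # second-largest count: 200

kept = random.sample([x for x in labels if x == majority], target)
balanced = kept + [x for x in labels if x != majority]

print({c: balanced.count(c) for c in ("A", "B", "C")})
```

Downsampling throws away information; weighting observations, or fitting a multinomial model on the full data with the proportions left intact, is often preferable.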
@ahmedhakimalex5514 4 years ago
Dear professor, can you tell us the text you use in this course?
@loltrysten 4 years ago
Linear Algebra Done Right - Axler
@sudeshkumar-hu7fj 4 years ago
Nice
@caio868 4 years ago
I tried other videos on YouTube and the audio works fine, but this set of videos has sound only on the right side of the headphones. Is there a problem with the audio?
@alexhixson556 4 years ago
Hi, are there any good problem sets/exams that go with these lectures?
@davidk7212 4 years ago
In the first 10 minutes this video answered all the questions I was left with after reading the first two sections of chapter 5 in Axler. Thank you.
@jamestourkistas764 4 years ago
Awesome lecture. Thanks.
@jamesclerkmaxwell676 4 years ago
Thank you for this lecture
@jamesclerkmaxwell676 4 years ago
This lecture is gold
@jamesclerkmaxwell676 4 years ago
Great lecture. What book are you using for these lectures ?
@ninnymonger 3 years ago
Axler's Linear Algebra Done Right. They seem to be referencing the second edition.
@azzaea 5 years ago
This is very simple, yet amazingly put together. Thank you!
@frank256256 5 years ago
Hi, thank you for the great lectures. At 23:00 we do not need to assume u_1 is different from zero, since we already made the assumption that the u's are linearly independent. Amazing teacher! I hope you can do lectures on more algebra topics, or any other advanced math topics. Subscribed to this channel :)
@lucasm2214 5 years ago
So it's called the rank–nullity theorem, but you still use range/image instead of rank to define "dim(range(V))"
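Since rank is by definition dim(range(T)), the two phrasings in the comment coincide; a quick numeric check on an illustrative 3×4 matrix:

```python
import numpy as np

# Numeric illustration of rank-nullity: dim(range(A)) + dim(Null(A)) = n,
# where "rank" is just another name for dim(range(A)).
A = np.array([[1.0, 0.0, 2.0, 1.0],
              [0.0, 1.0, 1.0, 1.0],
              [1.0, 1.0, 3.0, 2.0]])   # row 3 = row 1 + row 2, so rank 2

rank = np.linalg.matrix_rank(A)        # dim(range(A)) = 2

# Null-space basis from the SVD: the trailing right-singular vectors.
_, _, Vt = np.linalg.svd(A)
null_basis = Vt[rank:]                 # shape (2, 4)
assert np.allclose(A @ null_basis.T, 0)

print(rank + null_basis.shape[0], A.shape[1])   # 4 4
```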
@fortuneinc.4480 6 years ago
On point
@FariborzGhavamian 6 years ago
This was very clear. Thank you!
@ginocastillo2385 6 years ago
Where is he? If possible, tell me so I can look for more related information. I find his classes really impressive, but to start I need a source with information that is easier to process. Thank you.
@LinearAlgebra 6 years ago
Thanks! This is Jason Morton, at Penn State lecturing.
@kleytonnascimento429 6 years ago
What's the name of the professor? Does he have other courses online?
@kristenchou 6 years ago
Sensing some uncanny resemblance to Mr. Musk. :D
@AjeetKumar-ln1nq 6 years ago
Could anyone provide the notes of the lectures?
@AjeetKumar-ln1nq 6 years ago
Please provide the notes of all the lectures.
@oli.s.5550 6 years ago
His pronunciation of words was a bit sharp (or maybe it's the low quality of my headphones), but it's so much better at 1.5x speed :D Try 1.5x speed!
@2false637 4 years ago
2X speed babyyy
@PedroRibeiro-zs5go 6 years ago
Thanks, the lecture was awesome, really enjoyed it!!
@LinearAlgebra 6 years ago
Thank you, glad to hear it!
@spectralsequence6028 6 years ago
Are you using linear algebra done right by Sheldon Axler in these lectures?
@davidk7212 5 years ago
I don't think so. The notation he's using is different, and the topics aren't ordered in the same way. In any case, these videos are a great supplement to Axler. The Algebra 1M videos put up by Technion are also a great resource to use with it.