
Graph Node Embedding Algorithms (Stanford - Fall 2019) 

Machine Learning TV
37K subscribers
68K views

In this video, Jure Leskovec explains a group of recent node embedding algorithms, including Word2vec, DeepWalk, NBNE, random walks, and GraphSAGE. Amazing class!

Published: 8 Sep 2024

Comments: 42
@sasankv9919 · 4 years ago
Watched it for the third time and now everything makes sense.
@i2005year · 3 years ago
15:30 Basics of deep learning for graphs · 51:00 Graph Convolutional Networks · 1:02:07 Graph Attention Networks (GAT) · 1:13:57 Practical tips and demos
@sm_xiii · 4 years ago
Prof. Leskovec covered a lot of material in 1.5 hr! It was very engaging because of his energy and teaching style.
@jayantpriyadarshi9266 · 4 years ago
Thank you for this lecture. It really changed my view about GCNs.
@sanjaygalami · 3 years ago
What's the major point that struck you? Let others know, if it's convenient for you. Thanks!
@ernesttaf · 4 years ago
Great, Sir. Congratulations on your outstanding teaching abilities. It really changed my life and my view on graph networks. Thank you very much, Professor.
@TheAnna1101 · 4 years ago
Awesome video. Please share more on this topic!
@Commonsenseisrare · 11 months ago
Amazing lecture on GNNs.
@fredconcklin1094 · 2 years ago
Classes so fun. The death here is different than the death in Computer Vision due to NSA death.
@vgreddysaragada · 1 year ago
Great work!
@gautamrajit225 · 4 years ago
Hello. These lectures are very interesting. Would it be possible to share the GitHub repositories so that I can get a better understanding of the code involved in the implementation of these concepts?
@znb5873 · 3 years ago
Thank you so much for making this lecture publicly available. I have a question, is it possible to apply node embedding to dynamic graphs (temporal)? Are there any specific methods/algorithms to follow? Thanks in advance for your answer.
@Olivia-wu4ve · 4 years ago
Awesome! Thanks for sharing. Will the hands-on session be posted?
@MingshanJia · 4 years ago
Wanna learn the whole series...
@wwemara · 4 years ago
ru-vid.com/group/PL-Y8zK4dwCrQyASidb2mjj_itW2-YYx6-
@eyupunlu2944 · 4 years ago
I think it is this one: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-YrhBZUtgG4E.html
@EOh-ew2qf · 1 year ago
43:40 I have a question about this slide. How can you generalize to a new node when the model learns by aggregating neighborhoods and the new node doesn't have a neighborhood yet?
@AdityaPatilR · 3 years ago
Deeper networks will not always be more powerful, as you may lose vector features in translation. And due to the additional weight matrices, the neural network will be desensitized to the feature input. The number of hidden layers should not be greater than the input dimension.
@ShobhitSharmaMTAI · 3 years ago
My question at 31:00: what if the previous-layer embedding of the same node is not multiplied by Bk, as in Bk·hv(k-1)? What would be the impact on the embedding?
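The term in question is the self-loop part of the GCN-style update from the lecture, h_v^(k) = σ(W_k · mean of neighbor embeddings + B_k · h_v^(k-1)). A minimal NumPy sketch (toy dimensions and random weights, purely illustrative) shows what dropping B_k does: the node's own previous embedding no longer enters the update, so the result depends only on the neighborhood.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # embedding dimension (toy value)

# previous-layer embeddings: node v and its two neighbors
h_v = rng.normal(size=d)
neighbors = np.stack([rng.normal(size=d), rng.normal(size=d)])

W_k = rng.normal(size=(d, d))  # weight for the aggregated neighbor message
B_k = rng.normal(size=(d, d))  # weight for the node's own previous embedding

def relu(x):
    return np.maximum(x, 0)

neighbor_mean = neighbors.mean(axis=0)

# full update: neighbor term plus self term
h_v_next = relu(W_k @ neighbor_mean + B_k @ h_v)

# without B_k, the node's own previous embedding is ignored entirely
h_v_no_self = relu(W_k @ neighbor_mean)
```

Without the B_k term, two nodes with identical neighborhoods would receive identical embeddings regardless of their own previous state.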
@alvin5424 · 4 years ago
Any plans to publish lectures 17, 18 and 19?
@MachineLearningTV · 4 years ago
Yep! Soon we will upload new lectures!
@kanishkmair2920 · 4 years ago
In GCN, we get a single output. In GraphSAGE, you concatenate to keep the info separate. So at each step the output H^k will have 2 outputs, won't it? If not, how are they aggregated and still kept separate?
@paulojhonny4364 · 4 years ago
Kanishk Mair, hi. I didn't understand either. Did you find anything about it?
@kanishkmair2920 · 4 years ago
I tried to work with it in PyTorch Geometric (SAGEConv). Not sure how it works, but looking at its source code might help.
@sm_xiii · 4 years ago
I think the concatenated output is the embedding of the target node. And it depends on the downstream task to further process it, by passing it through more layers, before having the final output.
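To make the thread above concrete: a GraphSAGE layer concatenates the node's own embedding with the aggregated neighbor embedding into one 2d-dimensional vector, then applies a single weight matrix, so each layer still emits one embedding per node, not two. A toy NumPy sketch with illustrative dimensions:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 3  # input embedding dimension (toy value)

h_v = rng.normal(size=d)         # the node's own previous-layer embedding
neigh = rng.normal(size=(5, d))  # embeddings of 5 sampled neighbors

# aggregate the neighborhood (mean aggregator; pool/LSTM are alternatives)
agg = neigh.mean(axis=0)

# concatenation keeps self and neighborhood information in separate slots
concat = np.concatenate([h_v, agg])  # shape (2 * d,)

# a single weight matrix maps the 2d-vector back to d dimensions,
# so the layer outputs one embedding per node
W = rng.normal(size=(d, 2 * d))
h_v_next = np.maximum(W @ concat, 0)  # ReLU
```

The "separateness" lives only inside the concatenated vector: the learned weight matrix sees the self and neighbor halves in distinct columns, but the layer's output is a single vector.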
@eugeniomarinelli1104 · 3 years ago
Where do I find the slides for this lecture?
@ramin5665 · 1 year ago
Can you share the hands-on link?
@deweihu1003 · 3 years ago
On behalf of people from a remote eastern country: niubi!!!!
@phillipneal8194 · 4 years ago
How do you aggregate dissimilar features? For example, sex, temperature, and education level for each node?
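One common approach to the mixed-feature question above: one-hot encode each categorical attribute, rescale numeric attributes to a comparable range, and concatenate everything into a single numeric vector per node before the first GNN layer. A hypothetical sketch (the attribute names, category levels, and scaling here are made up for illustration):

```python
import numpy as np

# hypothetical raw attributes for one node
node = {"sex": "F", "temperature": 37.2, "education": "MSc"}

sex_levels = ["F", "M"]
edu_levels = ["HS", "BSc", "MSc", "PhD"]

def one_hot(value, levels):
    """Encode a categorical value as a one-hot vector over known levels."""
    vec = np.zeros(len(levels))
    vec[levels.index(value)] = 1.0
    return vec

# rescale the numeric feature (illustrative baseline and range)
temp_scaled = (node["temperature"] - 36.0) / 2.0

# one flat feature vector per node: the GNN's aggregators (mean, sum, ...)
# then operate on these homogeneous numeric vectors
features = np.concatenate([
    one_hot(node["sex"], sex_levels),
    [temp_scaled],
    one_hot(node["education"], edu_levels),
])
```

After this encoding, "dissimilar" features are just different coordinates of one vector, and the first layer's weight matrix learns how to weight each coordinate.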
@baharehnajafi9568 · 4 years ago
Hi, where can I find the next lectures of him?
@MachineLearningTV · 4 years ago
We will upload them soon
@MrSajjadathar · 4 years ago
Sir, can you please share Tuesday's lecture?
@MachineLearningTV · 4 years ago
The past Tuesday?
@MrSajjadathar · 4 years ago
@MachineLearningTV yes, and please share the link where you posted all the graph representation learning lectures. I will be thankful.
@MachineLearningTV · 4 years ago
It is available now. Check the new video
@kognitiva · 3 years ago
ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-7JELX6DiUxQ.html "what we would like to do is here input the graph and over here good predictions will come" Yes, that is exactly it! xD
@user-je6nw3ow5z · 4 years ago
Where can I get slides?
@ducpham9991 · 4 years ago
You can find it here: web.stanford.edu/class/cs224w/
@jcorona4755 · 1 year ago
They pay so that it looks like he has more followers. In fact, you pay $10 pesos per video.