
Lecture 6 | Training Neural Networks I 

Stanford University School of Engineering

In Lecture 6 we discuss many practical issues for training modern neural networks. We discuss different activation functions, the importance of data preprocessing and weight initialization, and batch normalization; we also cover some strategies for monitoring the learning process and choosing hyperparameters.
Keywords: Activation functions, data preprocessing, weight initialization, batch normalization, hyperparameter search
Slides: cs231n.stanford.edu/slides/201...
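Two of the topics listed above, data preprocessing and weight initialization, can be sketched in a few lines of NumPy. This is an illustrative example only, not code from the lecture: it zero-centers a (synthetic) dataset and uses He-style initialization (std = sqrt(2 / fan_in)), the scaling the lecture recommends for ReLU networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic dataset standing in for real inputs (1000 samples, 64 features).
X = rng.normal(loc=5.0, scale=2.0, size=(1000, 64))

# Data preprocessing: zero-center each feature by subtracting its mean.
X_centered = X - X.mean(axis=0)

# Weight initialization: He scaling keeps activation variance roughly
# constant across layers when the nonlinearity is ReLU.
fan_in, fan_out = 64, 128
W = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

# One layer's forward pass with a ReLU nonlinearity.
h = np.maximum(0.0, X_centered @ W)
```

After centering, each feature's mean is (numerically) zero, and the weight standard deviation is close to sqrt(2/64) ≈ 0.177 rather than an arbitrary small constant, which helps gradients neither vanish nor explode early in training.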
--------------------------------------------------------------------------------------
Convolutional Neural Networks for Visual Recognition
Instructors:
Fei-Fei Li: vision.stanford.edu/feifeili/
Justin Johnson: cs.stanford.edu/people/jcjohns/
Serena Yeung: ai.stanford.edu/~syyeung/
Computer Vision has become ubiquitous in our society, with applications in search, image understanding, apps, mapping, medicine, drones, and self-driving cars. Core to many of these applications are visual recognition tasks such as image classification, localization and detection. Recent developments in neural network (aka “deep learning”) approaches have greatly advanced the performance of these state-of-the-art visual recognition systems. This lecture collection is a deep dive into details of the deep learning architectures with a focus on learning end-to-end models for these tasks, particularly image classification. From this lecture collection, students will learn to implement, train and debug their own neural networks and gain a detailed understanding of cutting-edge research in computer vision.
Website:
cs231n.stanford.edu/
For additional learning opportunities please visit:
online.stanford.edu/

Published: 8 Aug 2024