Tutorial 97 - Deep Learning terminology explained - Batch size, iterations and epochs 

ZEISS arivis
Code associated with these tutorials can be downloaded from here: github.com/bnsreenu/python_fo...
The batch size defines the number of samples that propagate through the network before the model parameters are updated.
Each batch of samples goes through one full forward and backward propagation.
Example:
Total training samples (images) = 3000
batch_size = 32
epochs = 500
Then…
32 samples will be taken at a time to train the network.
To go through all 3000 samples takes 3000/32 ≈ 94 iterations = 1 epoch (rounded up, since the final batch is smaller).
This process continues 500 times (epochs).
You may be limited to small batch sizes based on your system hardware (RAM + GPU).
Smaller batches mean each gradient-descent step may be less accurate, so the algorithm may take longer to converge.
On the other hand, it has been observed that very large batches can significantly degrade the quality of the model, as measured by its ability to generalize.
Batch size of 32 or 64 is a good starting point.
Summary:
Larger batch sizes make faster progress per epoch (fewer weight updates), but don't always converge in fewer epochs.
Smaller batch sizes are slower per epoch but can converge in fewer epochs, because they perform more (noisier) weight updates.
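The arithmetic above can be sketched in a few lines of Python (a minimal illustration using the numbers from the example; `math.ceil` rounds up to account for the final, smaller batch):

```python
import math

total_samples = 3000  # total training images (from the example above)
batch_size = 32
epochs = 500

# One epoch = one full pass over all samples.
# The last batch holds only 3000 % 32 = 24 samples, so we round up.
iterations_per_epoch = math.ceil(total_samples / batch_size)  # 94
total_iterations = iterations_per_epoch * epochs              # 47000

print(f"{iterations_per_epoch} iterations per epoch, "
      f"{total_iterations} iterations in total")
```

So the model parameters are updated 94 times per epoch, for 47,000 updates over the full 500-epoch run.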

Science

Published: 4 Apr 2021

Comments: 20
@mohammadnoman4818 · 6 months ago
Thanks. Great explanation of batch size, iterations and epochs... problem solved
@raminghorbani2073 · 1 year ago
Thanks a lot...It was very helpful for me
@ruslanjabrayilov7363 · 2 years ago
Great tutorials. I had trouble understanding several concepts in deep learning and although I have watched/read many online resources, your videos are by far the best ones. So thank you for the effort!
@ZEISS_arivis · 2 years ago
Glad it was helpful!
@anizobaobinna9238 · 3 years ago
Great work.. I can now finally understand the difference between Batch size and Epoch!!
@ZEISS_arivis · 3 years ago
Great to hear!
@LuciaSilva-ek3qr · 2 years ago
Excellent and clear explanation. Tkx.
@tilkesh · 2 years ago
Thank you very much.
@surflaweb · 3 years ago
Finally I could understand this topic. Thanks Bro, keep it up. 💯👋👋👋🇵🇪
@ZEISS_arivis · 3 years ago
Glad to hear that
@salmaachaq44 · 2 years ago
In the summary you said that smaller batch sizes can converge faster, but earlier you mentioned that they take longer to converge... could you clarify please? Thanks.
@fetamedia788 · 2 years ago
Can the batch size be calculated?
@1998manja · 2 years ago
Hello, may I ask about my final year project? I have 1320 images that need to be classified into 5 classes. Is a mini-batch size of 30 or 50 better for deep learning training? I am using a laptop with a single CPU, and currently I train for 10 epochs. I still don't understand the concept of mini-batch size and how to calculate it.
@ZEISS_arivis · 2 years ago
Please watch this video on Batch Size. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-OSY7hWADMZk.html Hopefully, it fills the gap in your knowledge about batch size.
@khanmahmuna · 2 years ago
I am doing an ANPR project. I have 346 training images and 84 test images. What batch size and how many epochs should I choose? I'm still confused.
@akberalikhan4659 · 2 years ago
Generally, for smaller datasets (
@ricardonunes6949 · 2 years ago
So… the weights will be updated only after each batch?
@ZEISS_arivis · 2 years ago
Yes, in mini-batch gradient descent the weights are updated after each batch during training. You may find this useful: datascience.stackexchange.com/questions/27421/when-are-weights-updated-in-cnn
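To illustrate that answer, here is a minimal NumPy sketch of mini-batch gradient descent on made-up linear-regression data (the data, learning rate, and loop structure are assumptions for this example, not from the video); note that the weight `w` changes once per batch, not once per sample:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 3x + noise (hypothetical, for illustration only)
X = rng.normal(size=(3000, 1))
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=3000)

w = 0.0          # single weight to learn; should approach 3.0
lr = 0.1         # learning rate
batch_size = 32

for epoch in range(5):
    idx = rng.permutation(len(X))            # reshuffle every epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        xb, yb = X[batch, 0], y[batch]
        # gradient of mean squared error w.r.t. w, on this batch only
        grad = 2.0 * np.mean((w * xb - yb) * xb)
        w -= lr * grad   # one weight update per batch, not per sample
```

With 3000 samples and a batch size of 32, each epoch performs 94 weight updates; with batch gradient descent (batch size = full dataset) there would be just one update per epoch.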
@saxamrathod9138 · 3 years ago
Hi bro how are u bro
@ZEISS_arivis · 3 years ago
All good