
Machine Learning | The Vapnik-Chervonenkis Dimension 

RANJI RAJ
54K subscribers · 53K views

In Vapnik-Chervonenkis theory, the Vapnik-Chervonenkis (VC) dimension is a measure of the capacity (complexity, expressive power, richness, or flexibility) of a space of functions that can be learned by a statistical classification algorithm. #MachineLearning #VCDimension
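As a concrete illustration of the definition above (my addition, not from the video): a hypothesis class *shatters* a point set if it can realize every possible +/- labeling of it, and the VC dimension is the size of the largest shatterable set. A minimal, standard-library-only Python sketch for the 2-D halfplane (line) classifiers discussed in the video, using a separating-axis test for linear separability:

```python
# Brute-force shattering check for 2-D halfplane (line) classifiers.
# Two label groups are linearly separable iff their convex hulls are
# disjoint; we test that with a separating-axis check over all pairwise
# direction vectors and their perpendiculars.
from itertools import product

def separable(pos, neg):
    """Can a straight line put `pos` strictly on one side and `neg` on the other?"""
    if not pos or not neg:
        return True                       # one side empty: trivially separable
    pts = pos + neg
    axes = []
    for (x1, y1), (x2, y2) in product(pts, pts):
        dx, dy = x2 - x1, y2 - y1
        if (dx, dy) != (0, 0):
            axes.append((dx, dy))         # direction between two points
            axes.append((-dy, dx))        # and its perpendicular
    for ax, ay in axes:
        p = [ax * x + ay * y for x, y in pos]
        n = [ax * x + ay * y for x, y in neg]
        if max(p) < min(n) or max(n) < min(p):
            return True                   # projections don't overlap: separated
    return False

def shattered(points):
    """True iff every +/- labeling of `points` is linearly separable."""
    return all(
        separable([p for p, y in zip(points, labels) if y == 1],
                  [p for p, y in zip(points, labels) if y == -1])
        for labels in product([-1, 1], repeat=len(points))
    )

triangle = [(0, 0), (1, 0), (0, 1)]   # three points in general position
collinear = [(0, 0), (1, 0), (2, 0)]  # three points on one line
print(shattered(triangle))   # True: so the VC dimension of lines is >= 3
print(shattered(collinear))  # False: the +,-,+ labeling is unreachable
```

The triangle shows that *some* set of 3 points is shattered, while no set of 4 points ever is, which is why the VC dimension of lines in the plane is exactly 3; the collinear case failing (as several comments below discuss) does not lower it.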
Machine Learning 👉 • Machine Learning
Artificial Intelligence 👉 • Artificial Intelligenc...
Cloud Computing 👉 • Cloud Computing Tutorials
Wireless Technology 👉 • Wireless Technology Tu...
Data Mining 👉 • Data Mining & Business...
Simulation Modeling 👉 • Simulation Modeling Tu...
Big Data 👉 • Big Data Analytics
Blockchain 👉 • Blockchain Technology
IOT 👉 • Internet Of Things
Follow me on Instagram 👉 / adhyapakh
Visit my Profile 👉 / reng99
Support my work on Patreon 👉 / ranjiraj

Published: 15 Oct 2024

Comments: 27
@RanjiRaj18 · 3 years ago
For notes👉 github.com/ranjiGT/ML-latex-amendments
@purplemelodies9482 · 4 years ago
At around 10:12 you said we can't classify A, B, C, D using a circle... but we can put point C inside the circle and the remaining points outside it.
@gauravdas5002 · 7 months ago
The reason is that the circle cannot be used in 2 dimensions; the circle is used with 3 dimensions. You might ask why it cannot be used in 2 dimensions: because the textbook says so. If anyone has the time or interest to find out why, please reply to this thread; it would be helpful.
@icewater2762 · 7 months ago
@@gauravdas5002 I think we can't use a circle, because the points B, C, D are collinear in the diagram shown in the video.
@sg04f · 3 months ago
Thanks. Really good video. I had difficulty understanding this for years. Your 5 minutes helped immensely
@RahulSingh-up8jo · 1 year ago
At 10:27 you say that the VC dimension of 2-dimensional space is 3, but at the end (11:22) you give an example saying three collinear points cannot be shattered. So how is the VC dimension 3?
@oscura15 · 1 year ago
Thank you so much! In the first 4 minutes of your video I understood more than in my lecture class.
@RanjiRaj18 · 1 year ago
Welcome, Laura. Glad it helped!
@Marimenezesg · 1 year ago
I was reading the Vapnik article and I just couldn't understand this concept. I wish I had come across your video sooner. Great explanation!
@RanjiRaj18 · 1 year ago
You're most welcome
@SAN-te3rp · 4 years ago
Sir, please make a video on probably approximately correct (PAC) learning 🙏
@bdurgaramprasad4165 · 1 year ago
Sir, can't we just draw a circle around the negative point in the collinear example you provided?
@crazytech3550 · 2 years ago
Thanks for the video, sir, for giving the best explanation 👍
@sg04f · 3 months ago
Thanks!
@RanjiRaj18 · 3 months ago
Welcome!
@baderal-hamdan2265 · 1 year ago
In the last case, the collinear one, what would the VC dimension be?
@shloksuman8164 · 6 months ago
The collinear points can't be shattered by the line hypothesis class, so that set doesn't witness the VC dimension; it only takes some set of 3 points that can be shattered.
@palavracomentada · 4 years ago
Hello Ranji, I'm studying the Vapnik-Chervonenkis dimension as a complexity measure; do you have any videos about that?
@seyeeet8063 · 4 years ago
What does "shatter" mean here?
@RanjiRaj18 · 4 years ago
Spread of points
@seyeeet8063 · 4 years ago
@@RanjiRaj18 it does not make sense
@Marimenezesg · 1 year ago
@@seyeeet8063 It took me a long time to understand the concept because of this word. And that is how Vapnik explains it... I personally think it was a terrible choice lol
@sibinsam8050 · 4 years ago
It's the maximum number of points that can be shattered by the function class you define.
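That definition, "the largest number of points the class can shatter", can be computed by brute force for a toy hypothesis class. A sketch of my own (1-D interval classifiers, not a class from the video) that checks every labeling of every subset:

```python
# Brute-force VC-dimension computation for 1-D "interval" classifiers:
# h_[a,b](x) = +1 if a <= x <= b, else -1.
from itertools import combinations, product

def interval_realizes(points, labels):
    """Can some interval [a, b] produce exactly these +/- labels?"""
    pos = [x for x, y in zip(points, labels) if y == 1]
    if not pos:
        return True                       # a degenerate/empty interval labels all -1
    a, b = min(pos), max(pos)             # tightest interval covering the positives
    return all(not (a <= x <= b)          # it must not swallow any negative point
               for x, y in zip(points, labels) if y == -1)

def shatters(points):
    """True iff intervals realize all 2^n labelings of `points`."""
    return all(interval_realizes(points, labels)
               for labels in product([-1, 1], repeat=len(points)))

def vc_dimension(domain, max_d=4):
    """Largest n such that SOME n-point subset of `domain` is shattered."""
    best = 0
    for n in range(1, max_d + 1):
        if any(shatters(list(s)) for s in combinations(domain, n)):
            best = n
    return best

print(vc_dimension(range(10)))  # 2: intervals shatter pairs, never triples
```

Any pair is shattered, but for three points x1 < x2 < x3 the labeling +,-,+ forces the interval to cover x2, so the VC dimension of intervals is exactly 2. The same enumeration idea, with a different realizability test, underlies the line-in-the-plane argument from the video.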
@teegnas · 4 years ago
Hello Ranji, please try to make videos on multidimensional scaling & Sammon's mapping. The ones you have covered are linear dimensionality-reduction techniques, but it would be great if you made videos on some of the non-linear techniques.
@RanjiRaj18 · 4 years ago
I have some topics lined up; after those I shall consider topics under non-linear techniques. Thanks for your attention.
@teegnas · 4 years ago
@@RanjiRaj18 thanks for your prompt reply ... sure you can go ahead according to your decided playlist ... I would love to watch them!
@sanchitbhalla1176 · 3 years ago
What are you even trying to say? 😅