Nice video, good explanations! For one-vs-all: what has to be done if more than one classifier returns "+"? What exactly is meant by "highest confidence" in this context, and how do you compute it?
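A minimal sketch of the usual answer (my own example, not from the video): each binary classifier exposes a real-valued decision score (e.g. w·x + b, or a probability) rather than just a hard "+"/"-", and ties are broken by taking the class whose classifier reports the largest score. The scores below are hypothetical stand-ins for trained classifiers.

```python
# One-vs-all tie-breaking sketch. Hypothetical decision scores for one
# input where BOTH the "red" and "blue" one-vs-rest classifiers said "+"
# (positive score = "+", negative score = "-"):
scores = {"red": 1.3, "blue": 0.4, "yellow": -2.1}

# "Highest confidence" = argmax over the per-class scores.
predicted = max(scores, key=scores.get)
print(predicted)  # -> red
```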
Really clear, thanks. Just one doubt about the All-Pairs strategy: taking for instance the yellow class (related to X1 or X4), how is it voted for? It should be the negative class for both the h1 and h6 classifiers, so I'm trying to understand how the yellow class can receive votes. Can you help me? Thanks in advance
I THINK (stress on the thinking part) that is exactly how it is voted for: a "-" from h1 and h6 counts as a vote for yellow (each pairwise classifier votes for its positive class on "+" and for its negative class on "-"), and the classifiers not involving yellow contribute nothing to it, so the algorithm places the entry in the yellow class.
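If it helps, here is a toy sketch of that vote count (my own made-up example with three classes, not the video's setup): each pairwise classifier is keyed by (positive_class, negative_class); a "+" output is a vote for the first class and a "-" output a vote for the second, so negative outputs can absolutely elect a class.

```python
from collections import Counter

# Hypothetical pairwise classifier outputs on one example.
outputs = {
    ("red", "blue"): "+",     # vote: red
    ("red", "yellow"): "-",   # vote: yellow (negative output!)
    ("blue", "yellow"): "-",  # vote: yellow (negative output!)
}

# Tally: '+' votes for the positive class, '-' for the negative class.
votes = Counter(pos if out == "+" else neg
                for (pos, neg), out in outputs.items())

winner = votes.most_common(1)[0][0]
print(winner)  # -> yellow (elected purely via "-" outputs)
```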
Logistic regression is inherently a binary classifier (it actually predicts probabilities, which are then thresholded into 0 or 1, or -1 or +1), NOT a multi-class one as mentioned in the video... right?
"Classic" logistic regression is indeed usually just binary (have another video on that): ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-9BcxOiwE4Ds.html But you can have a vector of outputs which corresponds to the probability of each class.
@@JordanBoydGraber Actually, the binary 0-vs-1 problem is a regression problem in that sense, since the model predicts not 0 or 1 but the probabilities of those classes... Thank you :)