It's correct. The ranks r1 and r2 are 50 and 150 respectively. Since frequency is inversely proportional to rank, we can say roughly f1 = 3f2 (because r2 = 3r1).
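A quick sanity check of the ratio argument, assuming the simplest form of Zipf's law, f(r) = C / r (the constant C below is a made-up corpus value, only the ratio matters):

```python
C = 30000.0           # hypothetical constant; cancels out in the ratio
r1, r2 = 50, 150      # the two ranks from the question
f1, f2 = C / r1, C / r2

print(r2 // r1)       # r2 = 3 * r1
print(f1 / f2)        # so f1 / f2 = r2 / r1 = 3, i.e. f1 = 3 * f2
```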
I don't want to disrespect you, but I don't understand your way of teaching. I have watched 18 lectures so far and haven't understood a single one of them. In my opinion, you are the worst teacher I have met.
My doubt here is that the question says that for any given word, we are given its noun and verb values. But he takes the value for a given noun as not matching, which is simply not true. Can anyone please answer my doubt?
Imagine she is an Australian cricketer, you are bowling at 130 km/h, and you bowl a bouncer. It is at the height of her head, so she might just bend her head to avoid the ball. In this case, you can say, "I made her duck." "Duck" in this context refers to the act of lowering the head.
This course is so bad. The lecturer is not even trying to explain the basics; he just continues at his own pace. The material is good, but the audio quality and this teacher's teaching skills are terrible.
Worst teacher I have ever seen in my life. He doesn't even know English properly, and his vocabulary is worse. These kinds of professors should be removed from IITs immediately. They are polluting the teaching process.
In electrical engineering, statistical computing and bioinformatics, the Baum-Welch algorithm is a special case of the EM algorithm used to find the unknown parameters of a hidden Markov model (HMM). It makes use of the forward-backward algorithm to compute the statistics for the expectation step.
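To make "the forward-backward algorithm computes the statistics for the expectation step" concrete, here is a minimal sketch of that E-step on a toy two-state HMM. All numbers are made up for illustration; the posteriors `gamma` are the expected state-occupancy counts that the M-step of Baum-Welch would then use to re-estimate the parameters:

```python
import numpy as np

A = np.array([[0.7, 0.3],    # transition probabilities A[i, j] = P(j | i)
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],    # emission probabilities B[state, symbol]
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])    # initial state distribution
obs = [0, 1, 0]              # an observed symbol sequence

T, N = len(obs), len(pi)
alpha = np.zeros((T, N))     # forward probabilities
beta = np.zeros((T, N))      # backward probabilities

alpha[0] = pi * B[:, obs[0]]
for t in range(1, T):
    alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

beta[-1] = 1.0
for t in range(T - 2, -1, -1):
    beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

# gamma[t, i] = P(state i at time t | obs): the E-step statistics.
gamma = alpha * beta
gamma /= gamma.sum(axis=1, keepdims=True)
print(gamma)
```

A full Baum-Welch iteration would also compute pairwise transition posteriors and then update A, B, and pi from these expected counts.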
A Viterbi decoder uses the Viterbi algorithm for decoding a bitstream that has been encoded using a convolutional code or trellis code. There are other algorithms for decoding a convolutionally encoded stream (for example, the Fano algorithm).
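The core of the Viterbi algorithm is the same max-product dynamic program whether it decodes a convolutional code's trellis or an HMM. Here is a hedged toy-HMM version (real convolutional decoding builds the trellis from the encoder's shift register instead; all parameters below are made up):

```python
import numpy as np

A = np.array([[0.7, 0.3],           # transition probabilities
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],           # emission probabilities B[state, symbol]
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])
obs = [0, 0, 1]

T, N = len(obs), len(pi)
delta = np.zeros((T, N))            # best path probability ending in each state
back = np.zeros((T, N), dtype=int)  # backpointers for the traceback

delta[0] = pi * B[:, obs[0]]
for t in range(1, T):
    scores = delta[t - 1][:, None] * A   # scores[i, j]: come from i, go to j
    back[t] = scores.argmax(axis=0)
    delta[t] = scores.max(axis=0) * B[:, obs[t]]

# Trace back the most likely hidden state sequence.
path = [int(delta[-1].argmax())]
for t in range(T - 1, 0, -1):
    path.append(int(back[t][path[-1]]))
path.reverse()
print(path)  # -> [0, 0, 1]
```

Unlike the forward algorithm (which sums over paths), Viterbi keeps only the single best predecessor at each step, which is what makes the traceback possible.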
A Hidden Markov Model (HMM) is a statistical model that is also widely used in machine learning. It can be used to describe the evolution of observable events that depend on internal factors which are not directly observable.
A Hidden Markov Model (HMM) is a statistical Markov model in which the system being modeled is assumed to be a Markov process with unobservable ("hidden") states. As part of the definition, an HMM requires that there be an observable process whose outcomes are "influenced" by the outcomes of the hidden process in a known way.
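A tiny simulation of that definition: a hidden Markov chain of weather states, and an observable process (umbrella sightings) whose outcome at each step depends only on the current hidden state. All probabilities are illustrative, not from the lecture:

```python
import random

random.seed(0)
states = ["Rainy", "Sunny"]                        # hidden states
trans = {"Rainy": [0.7, 0.3], "Sunny": [0.4, 0.6]} # P(next state | state)
emit = {"Rainy": [0.8, 0.2], "Sunny": [0.1, 0.9]}  # P(symbol | state)
symbols = ["umbrella", "no umbrella"]

x = random.choices(states, weights=[0.5, 0.5])[0]
hidden, observed = [], []
for _ in range(5):
    hidden.append(x)
    # the observation depends only on the current hidden state
    observed.append(random.choices(symbols, weights=emit[x])[0])
    x = random.choices(states, weights=trans[x])[0]

print(hidden)    # not visible in a real application
print(observed)  # what we actually get to see
```

Inference algorithms like forward-backward and Viterbi go the other way: from `observed` back to beliefs about `hidden`.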
Naïve Bayes makes a naive assumption of conditional independence for every feature, which means that the algorithm expects the features to be independent, which is not always the case. Logistic regression is a linear classification method that learns the probability of a sample belonging to a certain class.
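A minimal illustration of that independence assumption: Naive Bayes scores a class by multiplying per-feature likelihoods, P(x | c) ≈ Π_i P(x_i | c), times the class prior. The spam-filter numbers below are toy values, not estimated from data:

```python
p_word_given_class = {
    "spam": {"free": 0.6, "meeting": 0.1},
    "ham":  {"free": 0.1, "meeting": 0.5},
}
prior = {"spam": 0.4, "ham": 0.6}

doc = ["free", "free", "meeting"]
scores = {}
for c in prior:
    score = prior[c]
    for w in doc:                       # independence: just multiply
        score *= p_word_given_class[c][w]
    scores[c] = score

best = max(scores, key=scores.get)
print(scores, best)  # -> spam: 0.0144, ham: 0.003, best = "spam"
```

Logistic regression, by contrast, would learn one weight per word directly from labeled data and make no independence claim about the features.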