I like indices!!! 1:49 | Gathering Data 2:21 | Preparing Data 4:03 | Model Selection 4:30 | Training 6:46 | Evaluation 7:24 | Parameter Tuning 8:55 | Prediction
Bah I never even finished the first step when I tried to replicate this. I got back from the store with the beer and wine and everything just went downhill from there.
You've done a great job here with the explanation of the processes of building a ML model. So clear, easy to understand and quite helpful to even someone without prior knowledge of ML. 👏
It is a great lecture and explains the topic very clearly and simply. I will follow all the videos because, compared to other programs and books, these are the clearest videos I've seen so far.
For people who are looking to get into deep learning: take a look at TensorFlow. It is a library for Python and it makes designing and training a neural net very easy. I am 13 and even I have made a speech-recognition algorithm for my AI assistant (much like Google Home).
Bike Vids, say all you want. At the end of the day, I make a good wage with it. Your hate isn't going to do anything :) P.S. Python isn't the only language I know; I also know C#, JavaScript, and Ruby.
*Quick summary:*
- Machine learning is all about seeing some examples of input-output pairs and then being able to predict the output for new inputs.
- Basically, you feed a bunch of examples to a machine, and the machine will start to learn the defining characteristics of your examples.
- Therefore, it is extremely important that you feed it good examples! Generally, the more examples the better, but you also want your examples to contain the distinguishing features.
- Once you gather some good examples (with distinguishing features), you generally clean them up, plot them, do some statistical analysis, etc.
- Then you choose one of the many different machine learning models (e.g. linear, neural network, etc.). Each has its pros and cons. Depending on your examples and your time constraints, you will pick one of these models.
- You will then tune some parameters of the model (again, how you do this depends on your examples and time constraints).

Hope that was helpful! Thanks for the awesome video :)
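The input-output idea in the summary above can be sketched in a few lines of plain Python. This is a hypothetical toy example (the data values and the `predict` helper are invented for illustration, not from the video): "learn" a threshold from labeled examples, then predict labels for new inputs.

```python
# Toy illustration of learning from input-output pairs.
# Hypothetical data: (alcohol_percent, label) pairs, label 0 = beer, 1 = wine.
examples = [(5.0, 0), (4.5, 0), (6.0, 0), (12.5, 1), (13.0, 1), (11.5, 1)]

# "Training": place a decision threshold midway between the class means.
beer_mean = sum(x for x, y in examples if y == 0) / sum(1 for _, y in examples if y == 0)
wine_mean = sum(x for x, y in examples if y == 1) / sum(1 for _, y in examples if y == 1)
threshold = (beer_mean + wine_mean) / 2

def predict(alcohol_percent):
    """Predict 1 (wine) if above the learned threshold, else 0 (beer)."""
    return 1 if alcohol_percent > threshold else 0
```

Note how the quality of the examples directly determines where the threshold lands, which is exactly the "feed it good examples" point above.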
I am using this same learning technique to learn about machine learning: watching many videos about it, then seeing what I can understand from similar ideas talked about by different people.
Very good explanation and elaboration. I like this kind of demo where the topics are elaborated directly, unlike other video tutorials that are difficult to understand, partly because of the presenter's accent.
For this use case, I think a chemometrics approach is best. It would be nice to relate images and spectral signatures and have those in the training, test, and validation datasets. This would of course mean working not just with tabulated data but with a fusion of images, spectral data, and lab measurement data.
Wow: input, model, and output. If the output is acceptable, fine; if not, feed back to obtain the right answer. Nicely explained... great to visit this channel.
1. DATA COLLECTION/GATHERING:
+ Collect features, e.g. alcohol concentration (hydrometer) and color (spectrometer).
+ High quantity and quality of data needed.
2. DATA PREPARATION:
+ Randomization.
+ Visualizations.
+ Data split: training + testing/evaluation.
3. CHOOSING A MODEL:
+ Among the many available in the community today, e.g. TensorFlow.
4. TRAINING THE MODEL:
+ Example: y = m*x + b. The only values I can adjust/train are m and b.
+ In machine learning, there are many m's since there are many features.
+ These m's are collected into a matrix referred to as w (weights).
+ The b's are organized into another matrix referred to as b (biases).
+ After training once and getting a prediction, adjust the weights w and biases b.
5. EVALUATION:
+ Test the model against data that has never been used for training.
+ Representative of how the model would perform in the real world.
+ A common split ratio: 80% training and 20% evaluation.
6. PARAMETER TUNING:
+ Examples of such parameters: the number of epochs (the number of passes over the entire training dataset the learning algorithm has completed) and the learning rate (how far we shift the line y = m*x + b in each step).
+ These parameters are referred to as HYPERPARAMETERS.
+ Tuning is more of an art than a science, i.e. an experimental process depending on the specifics of my dataset, model, and training process.
7. PREDICTION:
+ Doing something useful, for example, in this case answering the question of whether it's beer or wine.
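The training step in point 4, adjusting m and b with a given learning rate over a number of epochs, can be sketched with plain gradient descent on mean squared error. This is a minimal hypothetical example (the data and hyperparameter values are invented), not the video's actual code:

```python
# Gradient-descent sketch for fitting y = m*x + b (step 4 above).
# Hypothetical 1-feature data that exactly follows y = 2x + 1.
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]

m, b = 0.0, 0.0        # weight and bias start untrained
learning_rate = 0.05   # hyperparameter (step 6): how far to shift per step
epochs = 2000          # hyperparameter (step 6): passes over the full dataset

for _ in range(epochs):
    # Gradients of mean squared error with respect to m and b.
    grad_m = sum(2 * (m * x + b - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * (m * x + b - y) for x, y in data) / len(data)
    m -= learning_rate * grad_m
    b -= learning_rate * grad_b
```

With these settings, m and b converge close to the true values 2 and 1; a learning rate that is too large would make the loop diverge instead, which is why tuning it (step 6) matters.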
Very interesting. How would you handle situations where datapoints from two different categories overlap? A white wine that is close in colour and alcohol content to a white ale? Also, the model you describe is a linear split between the categories. But is that always the case?
Great pace, but the lack of accuracy may lead a newbie into big confusion: 1) the shape of b is not correct; 2) you illustrate linear regression while this is a logistic regression case; and 3) we choose model parameters using a validation dataset before the model evaluation on the test dataset, not after.
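The third correction above, tuning on a validation set before a final test-set evaluation, implies a three-way split rather than the two-way split in the video. A minimal sketch in plain Python (the 60/20/20 ratio and the integer stand-in data are assumptions for illustration):

```python
import random

# Hypothetical three-way split: fit weights on train, tune hyperparameters
# on validation, and touch test only once for the final evaluation.
random.seed(0)
dataset = list(range(100))   # stand-in for 100 labeled examples
random.shuffle(dataset)      # randomization, as in the data-preparation step

n = len(dataset)
train = dataset[: int(0.6 * n)]                    # 60% for fitting weights
validation = dataset[int(0.6 * n): int(0.8 * n)]   # 20% for tuning
test = dataset[int(0.8 * n):]                      # 20% held out for evaluation
```

Keeping the test set untouched until the very end is what makes the final score representative of real-world performance.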
Seven steps of machine learning: 1. Gathering Data; 2. Preparing that Data; 3. Choosing a Model; 4. Training; 5. Evaluation; 6. Hyperparameter Tuning; 7. Prediction. In my previous jobs, the data were normally gathered already. I needed to clean the data, link tables, and choose a model ...
Super good and informative video! But I would suggest taking off the glasses or doing something about the reflection on them. It's a bit annoying. Other than that, perfect pace of information and good presentation.
The best way to learn machine learning is to study the basic math for ML, multivariate calculus, linear algebra, mathematical statistics, etc., and jump into the well-known graduate-level textbooks such as ESL or PRML. Then start some data analysis projects with reliable teammates and apply what you have learned to the data.
Great presentation. Clear, concise, tidy conceptualization. Well done! Just a very generalised AI/business question that persistently defies a rigorous answer: short/medium-term growth and gain, at the expense of deep, widespread longer-term sociological issues? Anyone? I agree the genie is here; there is no going back. And a specific question: where to, once AI presents better presentations than Mr Guo? Narrow AI has its set of issues, but it has nothing on the next iteration. Intelligence is the ONLY differentiator that keeps Homo sapiens the apex predator... ever wondered where this leads? An intelligence race? Against logic chips? What could possibly go wrong?
:O Good video! It has been a great summary for only ten minutes. Thanks, I will share it with those friends who ask how neural networks work without the technical details.