Hey, just be confident and stop saying sorry in case people didn't understand. You did great and you are doing great, so be confident and go into more detail. Also, your intro is very loud, please look into it 😊
Hey Shraavya! Please make a video on these topics. College: IIIT Allahabad. "Dear All, your C2 evaluation is scheduled on Tuesday, 8th November, during your class time of 9:00 AM to 11:00 AM. The syllabus includes: KNN, Perceptron, Dimensionality Reduction Techniques (PCA, MDS, Isomap, LLE, t-SNE, UMAP), MLE, Naïve Bayes Classifier, Decision Trees, Random Forest, Bagging, Boosting (AdaBoost, XGBoost), and any other topic discussed/assigned in class. All the best, Dr. Muneendra Ojha"
Explain Rosenblatt's perceptron model. How can a set of data be classified using a simple perceptron? Using a simple perceptron with weights w0, w1, and w2 as −1, 2, and 1, respectively, classify the data points (3, 4); (5, 2); (1, −3); (−8, −3); (−3, 0).
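Not an official solution, but a minimal Python sketch to check the classification, assuming the usual step rule: output 1 if the weighted sum w0 + w1*x1 + w2*x2 is greater than 0, else 0.

```python
# Perceptron with the weights given in the question: w0 = -1, w1 = 2, w2 = 1.
# Assumed step rule: output 1 if w0 + w1*x1 + w2*x2 > 0, else 0.
w0, w1, w2 = -1, 2, 1
points = [(3, 4), (5, 2), (1, -3), (-8, -3), (-3, 0)]

for x1, x2 in points:
    s = w0 + w1 * x1 + w2 * x2   # weighted sum plus bias
    label = 1 if s > 0 else 0    # step (threshold) activation
    print(f"({x1}, {x2}): sum = {s}, class = {label}")
```

With these weights, (3, 4) and (5, 2) land in class 1 (sums 9 and 11), while (1, −3), (−8, −3), and (−3, 0) land in class 0 (sums −2, −20, and −7).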
Hi ma'am, you are explaining everything very well. I am an engineering student; we have a subject called AIML in the third year of the mechanical branch. I study at SNJB College of Engineering, Chandwad, under SPPU University. The AIML theory exam is on 29 June. Please explain the topics briefly; it is a fully theoretical, IT-based subject and very hard. I request you to please make short videos.
Ma'am, please make videos according to the JNTUA syllabus of ML. I saw all your videos and they're awesome; I can easily learn things from them. Please make videos for all 5 units, and please reply, ma'am.
Ma'am, can you please make videos on Introduction to Data Science (IDS) according to the JNTUH R18 syllabus? Your videos are very helpful. Please make them as soon as possible because my exams are near.
Hello ma'am, I like your way of teaching, so can you please make more videos on my syllabus?

SECTION-A. Basics: Biological Neuron, idea of computational units, McCulloch-Pitts unit and thresholding logic, Linear Perceptron, Perceptron Learning Algorithm, linear separability, convergence theorem for the Perceptron Learning Algorithm. Feedforward Networks: Multilayer Perceptron (MLP), Gradient Descent, Backpropagation, Empirical Risk Minimization, regularization, autoencoders. Implementing an MLP with Keras, fine-tuning neural network hyperparameters.

SECTION-B. Deep Neural Networks: difficulty of training deep neural networks, vanishing/exploding gradient problems. Better training of neural networks: reusing pre-trained layers, transfer learning with Keras, unsupervised pre-training, pre-training on an auxiliary task. Faster optimizers: Momentum Optimization, Nesterov Accelerated Gradient, AdaGrad, RMSProp, Adam and Nadam optimization, learning rate scheduling. Avoiding overfitting through regularization: L1 and L2 regularization, Dropout, MC Dropout, Max-Norm Regularization. Newer optimization methods for neural networks (AdaGrad, AdaDelta, RMSProp, Adam, NAG), second-order methods for training, the saddle point problem in neural networks, regularization methods (dropout, DropConnect, batch normalization).

SECTION-C. Custom models and training with TensorFlow, loading and preprocessing data with TensorFlow. Recurrent Neural Networks: backpropagation through time, Long Short-Term Memory, Gated Recurrent Units, bidirectional LSTMs, bidirectional RNNs. Convolutional Neural Networks: deep computer vision using CNNs, convolutional layer, pooling layer, CNN architectures (LeNet, AlexNet, GoogLeNet, VGGNet, ResNet, Xception, SENet), object detection.

SECTION-D. Generative models: Restricted Boltzmann Machines (RBMs), introduction to MCMC and Gibbs sampling, gradient computations in RBMs, Deep Boltzmann Machines.
Ma'am, one doubt: in this unit we have feedforward neural networks (single-layer and multilayer), and the perceptron topic also has single-layer and multilayer versions. Are both the same, ma'am? Please reply, the exam is near.
Can you please make a video on maximum likelihood estimation, bias and variance, the Bayes estimator, parametric classification, regression, and model selection procedures? It would be a great help. Our exams start from July 10th, so if you could do it before July 6th it would be very helpful 🙏🙏🙏
It seems you are missing the comparison with zero. A perceptron uses thresholding to determine the output: if the weighted sum is greater than 0, the output is 1; otherwise the output is 0.
4.3. Consider two perceptrons defined by the threshold expression w0 + w1x1 + w2x2 > 0. Perceptron A has weight values w0 = 1, w1 = 2, w2 = 1, and perceptron B has weight values w0 = 0, w1 = 2, w2 = 1. True or false? Perceptron A is more general than perceptron B. ("More general than" is defined in Chapter 2.) Ma'am, can you please explain this by Sunday?
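Not the instructor's answer, just my own reasoning plus a Python sanity check: whenever B fires (2x1 + x2 > 0), A's sum is exactly 1 larger (1 + 2x1 + x2 > 1 > 0), so A fires too. So every instance B classifies positive, A also classifies positive, which matches the "more general than" definition, and there are points (e.g. x1 = 0, x2 = −0.5) where A fires but B doesn't, so A is strictly more general. The answer should be True.

```python
import itertools

# Threshold rule from the question: a perceptron fires (outputs 1)
# iff w0 + w1*x1 + w2*x2 > 0.
def fires(w, x1, x2):
    w0, w1, w2 = w
    return w0 + w1 * x1 + w2 * x2 > 0

A = (1, 2, 1)  # perceptron A's weights
B = (0, 2, 1)  # perceptron B's weights

# Grid check (not a proof; the algebra above is the argument):
# every input B classifies positive, A must classify positive too.
grid = [v / 2 for v in range(-10, 11)]
assert all(fires(A, x1, x2)
           for x1, x2 in itertools.product(grid, grid) if fires(B, x1, x2))

# A witness where A fires but B does not, so A is *strictly* more general:
# A: 1 + 2*0 + (-0.5) = 0.5 > 0, but B: 0 + 2*0 + (-0.5) = -0.5 <= 0.
assert fires(A, 0, -0.5) and not fires(B, 0, -0.5)
print("True: A is more general than B")
```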