I use a graphics tablet similar to the ones from WACOM; the brand is called Gaomon. On the software side I use:
- Xournal++ for writing the notes
- OBS for recording screen and audio
- Flowblade for cutting the videos
- VS Code for the coding-heavy videos like this one: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-ISZwydaKZNY.html
- plus smaller tools like GIMP for photo editing, Notion for collecting video ideas, etc.

I work entirely on (Arch) Linux. All the files (handwritten notes and source code) are uploaded to the GitHub repo of the channel: github.com/Ceyron/machine-learning-and-simulation
⚙ My Gear: (Below are affiliate links to Amazon. If you decide to purchase the product or something else on Amazon through this link, I earn a small commission.)
- 🎙 Microphone: Blue Yeti: amzn.to/3NU7OAs
- ⌨ Logitech TKL Mechanical Keyboard: amzn.to/3JhEtwp
- 🎨 Gaomon Drawing Tablet (similar to a WACOM tablet, but cheaper; works flawlessly under Linux): amzn.to/37katmf
- 🔌 Laptop Charger: amzn.to/3ja0imP
- 💻 My Laptop (generally I like the Dell XPS series): amzn.to/38xrABL
- 📱 My Phone: Fairphone 4 (I love the sustainability and repairability aspect of it): amzn.to/3Jr4ZmV

If I had to purchase these items again, I would probably change the following:
- 🎙 Rode NT: amzn.to/3NUIGtw
- 💻 Framework Laptop (I do not get a commission here, but I love the vision of Framework. It will definitely be my next ultrabook): frame.work

As an Amazon Associate I earn from qualifying purchases.
I have watched a lot of ML videos on the internet, especially about probabilistic AI, and I have to say that even though you have a small audience, you are probably the best YouTuber out there making videos about that. Keep up your work; it is a blessing for more and more graduate students haha!
Thanks a lot, Mukunthan. :) I really appreciate your support. The channel is still small, I hope to reach a wider audience in the future. Feel free to share it with peers & friends. :)
@@MachineLearningSimulation Will do my best to recommend it to my friend group. I expect channels like this to get recognised... This channel motivates beginners and experienced data science people to play with probability and statistics.
Thanks a lot :). Much appreciated. I must say, I do not have much experience with PyTorch. You are probably referring to the Pyro library. Unfortunately, I could not teach that with the necessary confidence. Though, I believe that many ideas should translate easily from TFP to any other probabilistic framework.
At 8:00, you state that the random variable H is distributed according to the piecewise function you wrote. However, isn't this notation slightly imprecise? Writing "H ~" does not explicitly indicate that the distribution is conditioned on the value of W. So rather it would be "H | W ~" or "p(H | W) = ".
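One unambiguous way to typeset the conditioning explicitly (the concrete weather values and the distribution names below are placeholders, not taken from the video):

```latex
% Conditional distribution of happiness H given weather W.
% D_{\text{sunny}} and D_{\text{rainy}} stand for whatever
% distributions the piecewise definition assigns to each case.
H \mid W = w \;\sim\;
\begin{cases}
  D_{\text{sunny}} & \text{if } w = \text{sunny}, \\
  D_{\text{rainy}} & \text{if } w = \text{rainy}.
\end{cases}
```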
Awesome video! What if the weather were to have 3 categories (bad, average, good) and the happiness were to be a continuous number whose mean and standard deviation were dependent on the choice of weather?
That's a great question. 😊 Indeed, these DGMs are extremely flexible. You should be able to model this in TensorFlow Probability as well. A common application in which a discrete distribution over multiple categories (also called a "categorical distribution") influences the mean and standard deviation of a normal distribution is the Gaussian Mixture Model. I cover them later in the series on probabilistic machine learning. Do you have a specific question regarding your proposed DGM for the weather?
@@MachineLearningSimulation Really, really grateful for your response! I'll have to check out those videos! And yes, I do. Given one of the three weather categories, there is a mean and std that describe the amount of money spent online. If the weather is bad, a website gets an average of 700 dollars of business with a std of 50 dollars. If the weather is average, the website gets an average of 500 dollars with a std of 40. If the weather is good, the website gets only 200 dollars with a std of 30. Finally, the probabilities of the weather being good, average, or bad sum to 1: 0.5 good, 0.4 average, 0.1 bad. It would be awesome to see this worked out.
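For what it's worth, this generative model can be sketched in a few lines of plain Python (the function and variable names are my own; the channel's videos use TensorFlow Probability, where `tfd.Categorical` and `tfd.Normal` would play the same roles):

```python
import random

# Categorical prior over the weather: P(good)=0.5, P(average)=0.4, P(bad)=0.1
WEATHER_PROBS = {"good": 0.5, "average": 0.4, "bad": 0.1}

# Conditional Normal over money spent: (mean, std) per weather category
MONEY_PARAMS = {"good": (200.0, 30.0), "average": (500.0, 40.0), "bad": (700.0, 50.0)}

def sample_day(rng=random):
    """Draw one (weather, money) pair: first the category, then the
    Normal whose parameters depend on that category."""
    weather = rng.choices(list(WEATHER_PROBS), weights=list(WEATHER_PROBS.values()))[0]
    mean, std = MONEY_PARAMS[weather]
    money = rng.gauss(mean, std)
    return weather, money

# The marginal mean of money is the probability-weighted mixture of the
# per-category means: 0.5*200 + 0.4*500 + 0.1*700 = 370
samples = [sample_day()[1] for _ in range(100_000)]
print(sum(samples) / len(samples))  # close to 370
```

Sampling the category first and then the conditional Normal is exactly the ancestral-sampling order of the DGM; marginalizing out the weather gives the Gaussian mixture.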
That sounds a lot like the mentioned Gaussian Mixture Model. Then maybe check out this video from the channel: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-atDp5bkzej4.html Let me know if that helped 😊
@@MachineLearningSimulation thank you 🙏. Also, would these be considered formal machine learning approaches? This is a new field for me. Thanks! Really really great content. A gem.
You're welcome 😊 I'm happy I could help. Yes, I would consider them "formal machine learning approaches". If you want more details and more mathematical derivations, then I would recommend either the book "Pattern Recognition and Machine Learning" by Christopher Bishop or "Machine Learning: A Probabilistic Perspective" by Kevin Murphy. But be aware that they are mathematically quite tough. I try to keep the difficulty lower throughout the videos. For context: Gaussian Mixture Models are commonly used to cluster data points (that is also how they are implemented, for instance, in scikit-learn), but they are equally valid generative models, i.e., machine learning models that can produce new (previously unseen) data observations.
Great question! Yes, there are multiple options: For instance, check out this video: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-l2f6Ic6SeqE.html on how to use the automatic differentiation features of TensorFlow Probability together with gradient-descent optimization to perform a Maximum Likelihood Estimate. Depending on the distributions you choose, you may also be able to find analytical expressions for their parameters. Let me know if that helped :)
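The idea behind that video can be sketched without any framework: maximize the Normal log-likelihood by gradient ascent, with the gradients written out by hand here (TensorFlow Probability would obtain them via automatic differentiation instead; all names and hyperparameters below are illustrative):

```python
import math
import random

random.seed(0)
# Synthetic data from a Normal with true mean 3 and std 2
data = [random.gauss(3.0, 2.0) for _ in range(1_000)]

def fit_normal_mle(data, lr=0.05, steps=3000):
    """Gradient ascent on the average Normal log-likelihood.
    grad_mu and grad_sigma are the hand-derived partial derivatives
    of (1/n) * sum_i log N(x_i | mu, sigma)."""
    mu, sigma = 0.0, 1.0
    n = len(data)
    for _ in range(steps):
        grad_mu = sum(x - mu for x in data) / (n * sigma**2)
        grad_sigma = sum((x - mu) ** 2 - sigma**2 for x in data) / (n * sigma**3)
        mu += lr * grad_mu
        sigma += lr * grad_sigma
    return mu, sigma

mu_hat, sigma_hat = fit_normal_mle(data)

# Closed-form MLE for comparison: sample mean and (biased) sample std
mean = sum(data) / len(data)
std = math.sqrt(sum((x - mean) ** 2 for x in data) / len(data))
```

For the Normal the closed-form answer exists, which makes it a good sanity check: the gradient-ascent estimates converge to the sample mean and the (biased) sample standard deviation. For models without analytical solutions, the same loop applies unchanged, which is the appeal of the autodiff approach.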