
I Trained a Neural Network in Minecraft! 

William Yao

In this video, I show a neural network that I fully built and trained inside of Minecraft!
0:00 Demo
3:51 Introduction to the Theory
7:54 Implementing Forward Pass
11:37 Implementing Training
14:42 Training Timelapse
16:41 Testing
The coolest parts of the video are the Training Timelapse and Testing sections, so for a shorter viewing experience, jump ahead to those! In addition, there are much better resources for neural networks online, so please treat all the theory sections in this video as helpful context rather than rigorous explanations.
Datapack repo: github.com/williamyao27/Neura... (Minecraft 1.20.6)
Mattbatwing's Neural Network: • I Made an AI with just... (my inspiration!!)
Vijayasaradhi Indurthi's article on MNIST PCA: / visualising-mnist-data... (I used one of his diagrams)
Music:
"No Good Layabout", "Subwoofer Lullaby", "Vibing Over Venus", "Bossa Antigua", "Lobby Time", "Dispersion Relation", and "Awesome Call"
Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0
creativecommons.org/licenses/b...
"Subwoofer Lullaby"
C418, cover by Astrophysics ( • Someday I'll See You A... )
Some additional notes that I couldn't cover in the video:
- The loss function includes a weight regularization term that was not shown in the video for simplicity (a typical form is written out after these notes).
- I used minibatching during the training process with batch sizes of 10. Updating the parameters on batches of data, rather than on each data point, is generally important in machine learning, but it's extra important for my model because it helps reduce the cumulative effect of rounding to the nearest integer. Intuitively, rounding the gradient once per batch loses less precision than rounding it on every point (see the rounding sketch after these notes).
- When using a scale factor of 10^4, the maximum true value that can be represented in my system is on the order of 10^5 (since 32-bit ints top out around 2 x 10^9), which is more than sufficient for any MNIST network. However, since the scale factor gets squared during multiplication, the intermediary value stored during a multiplication is scaled by 10^8. Thus, if I chose even a slightly higher scale factor of 10^5, that intermediary would very likely exceed the ceiling. Ultimately, the bottleneck for the scale factor comes down to how you implement multiplication, and there are definitely smarter ways than what I chose (see the fixed-point multiplication sketch after these notes). Anyway, this was all a bit too much to explain at 9:44.
- You might notice that my description of how I implemented exponentiation does not exactly line up with what is shown on screen at 11:02. Before breaking the exponent into 0.1, 0.01, 0.001, and 0.0001, I first subtract ln(2) ≈ 0.6931 from the exponent as many times as possible, and for each subtraction I eventually multiply the final result by 2. TL;DR: this lets my implementation handle a greater range of exponents, because multiplying by a small integer like 2 stays much further below the scoreboard's 32-bit limit than multiplying by a fixed-point float like e^1 ≈ 2.7183 (a rough sketch of the trick follows these notes).
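
For reference, one common way to write a cross-entropy loss with an L2 weight-regularization term looks like the following; the exact penalty and coefficient λ used in the datapack are assumptions on my part, not something confirmed in the video:

```latex
% Assumed form: batch-averaged cross-entropy plus an L2 penalty on the weights.
\mathcal{L} \;=\; -\frac{1}{N}\sum_{i=1}^{N}\sum_{k} y_{ik}\,\log\hat{y}_{ik}
\;+\; \frac{\lambda}{2}\sum_{l}\lVert W_l \rVert_2^2
```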
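Here is a rough illustration of the minibatch rounding point, written in Python rather than scoreboard commands; the 10^4 scale factor is from the video, but the toy gradient values are made up for the sketch:

```python
# Toy illustration (not the datapack itself): accumulating the gradient over a
# batch and rounding once loses less precision than rounding each per-example
# gradient to the nearest integer on the fixed-point grid.
SCALE = 10_000  # the 10^4 scale factor from the video

def to_fixed(x: float) -> int:
    """Round a real value to the nearest representable fixed-point integer."""
    return round(x * SCALE)

# Hypothetical per-example gradients for a single weight (batch of 10).
grads = [0.00003, 0.00004, 0.00002, 0.00004, 0.00003,
         0.00004, 0.00003, 0.00002, 0.00004, 0.00003]

rounded_each = sum(to_fixed(g) for g in grads)  # every term rounds to 0 -> 0
rounded_once = to_fixed(sum(grads))             # 0.00032 * 10^4 rounds to 3

print(rounded_each, rounded_once)               # 0 3
```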
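And a minimal sketch of the fixed-point multiplication constraint, assuming the same 10^4 scale factor and a signed 32-bit scoreboard limit; the helper name fmul is mine, not from the datapack:

```python
SCALE = 10_000          # 10^4: a real value x is stored as round(x * SCALE)
INT32_MAX = 2**31 - 1   # Minecraft scoreboard values are signed 32-bit ints

def fmul(a: int, b: int) -> int:
    """Multiply two fixed-point numbers.

    The raw product a * b carries SCALE**2 = 10^8, so it is divided by SCALE
    once to return to the 10^4 grid. That raw product is the overflow risk:
    with SCALE = 10^5 it would carry 10^10, which already exceeds the 32-bit
    ceiling for values near 1.0.
    """
    raw = a * b                      # scaled by 10^8; must stay under INT32_MAX
    assert abs(raw) <= INT32_MAX, "intermediary overflowed the scoreboard"
    return raw // SCALE              # back to the 10^4 grid

# Example: 1.5 * 2.25 = 3.375
print(fmul(15_000, 22_500))          # -> 33750, i.e. 3.375 * 10^4
```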
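Finally, a sketch of the exponentiation trick from the last note, using ordinary floats instead of scaled scoreboard ints; the constants are the ones mentioned above, but the structure is my guess at the idea rather than the actual datapack functions:

```python
import math

# Precomputed multipliers for each decimal place of the reduced exponent
# (in the datapack these would live as 10^4-scaled integers).
STEP_FACTORS = {0.1: math.e ** 0.1,
                0.01: math.e ** 0.01,
                0.001: math.e ** 0.001,
                0.0001: math.e ** 0.0001}
LN2 = 0.6931

def exp_approx(x: float) -> float:
    """Approximate e^x for x >= 0 by range reduction.

    First pull out as many factors of ln(2) as possible (each one becomes a
    final multiplication by the small integer 2), then consume the remainder
    in steps of 0.1, 0.01, 0.001, and 0.0001.
    """
    doublings = 0
    while x >= LN2:
        x -= LN2
        doublings += 1

    result = 1.0
    for step, factor in STEP_FACTORS.items():
        while x >= step:
            x -= step
            result *= factor

    return result * (2 ** doublings)

print(exp_approx(3.0), math.exp(3.0))   # close, up to the ~10^-4 truncation of the exponent
```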
Special thanks to all my friends who supported me in this idea from start to finish!

Category: Games

Published: 29 Jul 2024
