Edgar Programmator
Cubic interpolation between 2D points
2:18
6 months ago
The perceptron neuron: the simplest AI
8:14
6 months ago
Solve any equation using gradient descent
9:05
7 months ago
A simple algorithm for 2D Voronoi diagrams
3:27
11 months ago
How to Tell if a Point Lies on a Line (Segment)
2:16
11 months ago
How do I rotate a 2D point?
2:05
1 year ago
Limit
9:29
1 year ago
Infinito
0:38
1 year ago
Point in polygon (Python3)
2:29
1 year ago
Calculus in Python
5:18
1 year ago
Convolution
6:19
1 year ago
The Recursive Square Function
3:37
1 year ago
Merge Sort - Recursive Procedure
1:42
2 years ago
Merge Sort example
0:54
2 years ago
Angles from 3 points in computer
2:59
3 years ago
Comments
@timelessMotivationchannel 18 days ago
For a second I thought my iPad was possessed
@AliKamel2004 22 days ago
Thanks very much 🇮🇶 🥰
@cedrsc 27 days ago
I've found a better way that also gives you the distance between the point and the line. First, check that the segment is longer than the distances between each end and the point (i.e. the projection of the point onto the segment's line falls inside the segment). Then compute the area of the triangle formed by the two ends of the segment and the point; this is easily done with Heron's formula (check Wikipedia for details). Then just double the area and divide it by the segment's length, and you have the distance between the point and the line. You can now check whether the point is near enough, in your context, to be considered on the line.
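A minimal Python sketch of the approach described in this comment; the projection check is implemented here with the law of cosines, and the function name is illustrative:

import math

def point_to_segment_distance(p, a, b):
    """Distance from point p to segment ab via the triangle-area method:
    if the projection of p falls inside the segment, the distance equals
    2 * area(a, b, p) / |ab|, with the area taken from Heron's formula."""
    ab, ap, bp = math.dist(a, b), math.dist(a, p), math.dist(b, p)
    if ab == 0:
        return ap  # degenerate segment
    # Projection check (law of cosines): if either angle at the segment's
    # ends is obtuse, the nearest point is an endpoint.
    if ap ** 2 > ab ** 2 + bp ** 2 or bp ** 2 > ab ** 2 + ap ** 2:
        return min(ap, bp)
    s = (ab + ap + bp) / 2                      # semi-perimeter
    area = math.sqrt(max(s * (s - ab) * (s - ap) * (s - bp), 0.0))
    return 2 * area / ab                        # height of the triangle over ab

# Example: distance from (1, 2) to the segment (0, 0)-(4, 0) is 2.
print(point_to_segment_distance((1, 2), (0, 0), (4, 0)))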
@rushikeshdeshmukh2034 1 month ago
Your video is very effective. Can you please tell me which tool you use to draw the diagrams and animate?
@CuongPhan-pt5ff 1 month ago
This is what I am looking for, very easy to understand. Thanks for sharing
@starplatinum3305 1 month ago
Please do a 3D version maybe? Or a quaternion video?
@starplatinum3305 1 month ago
Amazing
@jcaceres149 2 months ago
However, this algorithm is not optimal in the worst case, and it does not deal with unbounded Voronoi cells
@potatomo9609 2 months ago
What's with all the jump scares? 😭
@uncleole503 3 months ago
this is very different from Fortune's algorithm
@unveil7762 3 months ago
This is very cool thank you
@trumpgaming5998 4 months ago
Okay, but why don't you explain why this method sometimes doesn't work for particular degrees, depending on the function?
@trumpgaming5998 4 months ago
For instance, if you wanted to minimize cos(x) = c1, where c1 is a constant, gradient descent one way or another yields c1 = 0, but the constant term in the Taylor expansion of cos(x) is 1, since cos(x) = 1 - x^2/2 + ... This means you have to include at least the second term for this to work, or an even higher degree depending on the function used instead of cos(x) in the example.
@ritwickjha3954 4 months ago
When the ray cast from the point crosses a vertex, that single intersection is counted twice (because two edges are defined to contain that point), which will give wrong answers.
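One common way to avoid that double count is the half-open edge rule, where each edge counts its lower endpoint but not its upper one, so a ray through a shared vertex is counted once. A minimal Python sketch of that rule (not the code from the video):

def point_in_polygon(point, polygon):
    """Ray-casting test with the half-open rule: an edge is considered only
    if its endpoints lie on opposite sides of the ray's height, which keeps
    a shared vertex from being counted twice."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Example with a unit square:
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(point_in_polygon((0.5, 0.5), square))  # True
print(point_in_polygon((1.5, 0.5), square))  # False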
@stevencrews5796 4 months ago
Thanks so much for this! I needed to find centroids of irregular polygons for a Matter.js project and your explanation and code examples got me up and running quickly.
@shihyuehjan3835 4 months ago
Thank you so much for the video!
@NeoZondix 4 months ago
You're Chopping it
@gutzimmumdo4910 4 months ago
What's the time complexity of this algorithm?
@tedlorance6968 4 months ago
Out of curiosity, is there a known or best-guess optimal or near-optimal value for the padding in the algorithm? Perhaps related to the mean distance between the sites?
@matinfazel8240 5 months ago
very helpful :))
@aleksandrstukalov 5 months ago
Is there any research paper that you took this algorithm from?
@EdgarProgrammator 5 months ago
No, I couldn't find an easy, step-by-step algorithm for building Voronoi diagrams (unlike Delaunay triangulation algorithms, which are easy to find). That's why I created this video.
@Kewargs 3 months ago
@EdgarProgrammator What about the Fortune sweep algorithm?
@zzz_oi 5 months ago
this channel is art
@EdgarProgrammator 5 months ago
Thank you
@jamesgaither1899 5 months ago
Edgar who is that guy? XD
@richardmarch3750 5 months ago
this is exactly how math should be ngl
@GustavoCesarMoura 5 months ago
lmao the jumpscare
@ignSpoilz 5 months ago
Omg the nun face why 😭😭
@EdgarProgrammator 5 months ago
idk 😐
@aleksandrstukalov 5 months ago
This is an awesome explanation of the algorithm! Thank you for sharing such helpful content! ❤❤❤
@toddkfisher 6 months ago
Would a sixth degree polynomial in x be referred to as "x hexed"? Really like the video.
@LEGEND_SPRYZEN 6 months ago
We are taught this in high school class 12.
@korigamik 6 months ago
Bro this is cool. Can you share the source code for the animations in this video?
@stuart_360 6 months ago
Oh, it's good, but I thought I would be able to apply it in my exams lol
@rosettaroberts8053 6 months ago
The second example would have been solved better by linear regression.
@beautyofmath6821 6 months ago
Beautiful and very well-made video. I personally loved the old TV vibe, not to disregard the instructive and nicely explained method of gradient descent. Subscribed.
@bernardoolisan1010 6 months ago
Why square the function? Do we always need to square the function to solve it via gradient descent?
@nguyenthanhvinh5942 6 months ago
Gradient descent finds an optimal (minimum) point of a function f(x), not a solution of f(x) = 0. However, the optimum of any f(x) is exactly a solution of f'(x) = 0 (the derivative of f(x)). So, if your function has only one variable, you can find the solution of f(x) = 0 by treating f(x) itself as the derivative term and proceeding from there. If your function has more than one variable, you can't make that substitution, because only one function is given and you don't know which variable it is the derivative with respect to (as mentioned above, with one variable, f(x) plays the role of a derivative in x when you use gradient descent to find the solution). So the answer is to use the least-squares approach shown in the video: the square f^2 of the function always has a minimum. If the minimum's value is 0, it is the solution; if not, gradient descent still finds the minimum, but it is not a solution.
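A minimal Python sketch of the idea discussed in these comments: to look for a root of f, run gradient descent on g(x) = f(x)^2. The example function, step size, and iteration count are illustrative choices, not values from the video:

def solve_by_gradient_descent(f, x0, lr=0.01, steps=5000, h=1e-6):
    """Minimize g(x) = f(x)**2 with gradient descent, using a central-difference
    numerical derivative. If the minimum reached is (close to) zero, x is an
    approximate root of f."""
    g = lambda t: f(t) ** 2
    x = x0
    for _ in range(steps):
        grad = (g(x + h) - g(x - h)) / (2 * h)  # numerical derivative of g
        x -= lr * grad
    return x

# Example: a root of x**3 + x - 1 = 0 (the real root is near 0.6823).
f = lambda x: x ** 3 + x - 1
root = solve_by_gradient_descent(f, x0=1.0)
print(root, f(root))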
@jadeblades 6 months ago
genuinely curious why you put that in the intro
@jamesgaither1899 6 months ago
Where did you get the idea for the intro? It's kind of hilarious and terrifying and I love it.
@devrus265 6 months ago
The video was helpful
@roberthuff3122 6 months ago
Panache defined.
@MrBrassmonkey12345 6 months ago
Alan Watts?
@AhmAsaduzzaman 6 months ago
Yes, solving the equation x^5 + x = y for x in terms of y is much more complex than solving quadratic equations because there is no general formula for polynomials of degree five or higher, due to the Abel-Ruffini theorem. This means that, in general, we can't express the solutions in terms of radicals as we can for quadratics, cubics, and quartics. However, we can still find solutions numerically or graphically. Numerical methods such as Newton's method can be used to approximate the roots of this equation for specific values of y. If we're interested in a symbolic approach, we would typically use a computer algebra system (CAS) to manipulate the equation and find solutions.
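A short Python sketch of the numerical route mentioned above: Newton's method applied to f(x) = x^5 + x - y for a specific y. The starting guess and tolerance are arbitrary choices:

def solve_quintic(y, x0=0.0, tol=1e-12, max_iter=100):
    """Solve x**5 + x = y for x using Newton's method."""
    x = x0
    for _ in range(max_iter):
        f = x ** 5 + x - y
        df = 5 * x ** 4 + 1  # derivative; always >= 1, so never zero
        step = f / df
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: x**5 + x = 34 has the solution x = 2 (32 + 2 = 34).
x = solve_quintic(34)
print(x, x ** 5 + x)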
@AhmAsaduzzaman 6 months ago
AWESOME video! Thanks! Trying to put some basic understanding on this: "We seek a cubic polynomial approximation (ax^3 + bx^2 + cx + d) to cosine on the interval [0, π]."

Let's say you want to represent the cosine function, which is a bit wavy and complex, with a much simpler formula: a cubic polynomial. This polynomial is a smooth curve described by the equation ax^3 + bx^2 + cx + d, where a, b, c, and d are specific numbers (coefficients) that determine the shape of the curve.

Now, why would we want to do this? Cosine is a trigonometric function that's fundamental in fields like physics and engineering, but it can be computationally intensive to evaluate repeatedly. A cubic polynomial, on the other hand, is much simpler to work with and can be computed very quickly. So we're on a mission to find the best possible cubic polynomial that behaves as much like the cosine function as possible on the interval from 0 to π (from the peak of the cosine wave at 0 down to its trough at π).

To find the perfect a, b, c, and d that make our cubic polynomial a doppelgänger for cosine, we use a method that involves a bit of mathematical magic called "least squares approximation". This method finds the best fit by ensuring that, on average, the vertical distance between the cosine curve and our cubic polynomial is as small as possible. Imagine stretching a bunch of tiny springs from the polynomial to the cosine curve: least squares finds the polynomial that would stretch those springs the least.

Once we have our cleverly crafted polynomial, we can use it to estimate cosine values quickly and efficiently. The beauty of this approach is that the approximation will be incredibly close to the real thing, making it a nifty shortcut for complex calculations.
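A short Python sketch of that least-squares fit, sampling cos on [0, π] and fitting a cubic with NumPy; the sample count is an arbitrary choice and this is not the video's exact code:

import numpy as np

# Sample cos(x) on [0, pi] and fit a*x**3 + b*x**2 + c*x + d in the
# least-squares sense.
xs = np.linspace(0, np.pi, 1000)
ys = np.cos(xs)
a, b, c, d = np.polyfit(xs, ys, 3)  # coefficients, highest degree first

print("a, b, c, d =", a, b, c, d)

# Compare the cubic against the real cosine at a few points.
approx = lambda x: a * x**3 + b * x**2 + c * x + d
for x in (0.0, np.pi / 4, np.pi / 2, np.pi):
    print(f"x = {x:.3f}  cos(x) = {np.cos(x):+.4f}  cubic = {approx(x):+.4f}")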
@sang459 6 months ago
Elegant
@newmoodclown 6 months ago
I thought my screen had gotten dusty, but it's a unique style. Nice!
@ananthakrishnank3208 6 months ago
Thank you for the video!! Took some time to grasp the second example. No surprise. This gradient descent optimization is at the heart of machine learning.
@mourensun7775 6 months ago
I'd like to know how you made the animations in this video.
@hallooww 6 months ago
What text-to-speech do you use?
@markzuckerbread1865 6 months ago
Awesome vid, instant sub
@darkseid856 6 months ago
what is that intro bruh
@zacvh 6 months ago
Bro, this video is so fire. I get so annoyed by the voices in the videos my school actually makes you watch, and this is a huge step up from that. It actually makes this seem like top-secret information, like you're debriefing the first nuclear tests or something.
@KP-ty9yl 6 months ago
Excellent explanation, immediately subscribed 😁