Name: Daniel Tubbenhauer, or simply Dani. Pronouns: they/them.
This channel is a place to store all my math-related videos; visual and diagrammatic in nature, of course. Please check out the various playlists for some order behind the chaos of my videos.
All slides are available on www.dtubbenhauer.com/youtube.html
I apologize for all the nonsense that I said so far, and all that I am going to say...
Feel free to contact me if you have comments or suggestions: Write to me via dtubbenhauer@gmail.com, or anything that works for you.
The Artin braid group on n strands is the fundamental group of the unordered configuration space of R^2, that is, B_n = π_1(UConf(R^2)). Following A^1-homotopy theory, a path given by the interval y: [0,1] -> X can be replaced with a path given by the affine line w: A^1 -> X. The spectrum of a polynomial ring k[t] is the affine line A^1_k = Spec(k[t]). Now take more than one strand (= path), i.e. n > 1, in the Artin braid group B_n. Do ramifications of primes p correspond to crossings of strands in B_n?

A corollary of Alexander's theorem would be that the closures of such braids are knots or links. This follows the analogy, proposed by Mazur, that primes should correspond to knots; see "Thoughts about primes and knots" (2021) by Mazur. At least geometrically, I have seen ramification points for algebraically closed fields depicted like crossings of strands of an Artin braid. I suspect these crossings do correspond, because the universal knots/links described by Thurston are branched covers/ramifications: a knot or link L is universal if every closed oriented 3-manifold can be represented as a covering of S^3 branched over L. Similarly, branched covers of Riemann surfaces geometrically describe ramification of ideals of a field k; in this case, for the set of prime ideals p of k, which by definition is the spectrum Spec(k).
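For reference, the identification B_n = π_1(UConf(R^2)) is equivalent to Artin's classical presentation by generators and relations, where σ_i is the positive crossing of strands i and i+1 (the kind of crossing the ramification picture is meant to match):

```latex
B_n = \left\langle \sigma_1, \dots, \sigma_{n-1} \;\middle|\;
\begin{array}{l}
\sigma_i \sigma_{i+1} \sigma_i = \sigma_{i+1} \sigma_i \sigma_{i+1} \quad (1 \le i \le n-2),\\[2pt]
\sigma_i \sigma_j = \sigma_j \sigma_i \quad (|i-j| \ge 2)
\end{array}
\right\rangle
```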
Haha, I also like sets, no question. But if you want to sell apples, then you need to talk badly about oranges 🤣 Just kidding. Set theory was clearly very successful - shout out!
Thank you for this series, it is a great supplement for my regular Algebraic Theory course. You manage to somehow give a very nice intuition to a rather advanced and abstract field of mathematics. Also, a very helpful aspect for me was focusing for a moment on the most crucial (normal?) examples, as we in mathematics are usually very careful not to lose any pathological case - and at the beginning it's easy to get lost in them. Thank you again and greetings from the University of Vienna :)
The problem with considering the two cases as duals is that the dual of a vertex is not an edge, but the areas between the edges. If the described process of replacing vertices with edges were a duality, then applying the same algorithm to the dual should recover the original, which it doesn't.
Thanks for the comment, I am glad that you liked the video ☺ Haha, I am not sure what an “example” is supposed to look like (as you require AoC) 😂 But here we go: Consider R as a Q-vector space and choose a basis B, using AoC. Then any choice of values B→R extends to a Q-linear map R→R. Most such maps are not continuous. I hope that helps!
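A sketch of one such discontinuous map (the specific basis elements are my choice; any Hamel basis containing 1 and √2 works, and its existence is exactly where AoC enters):

```latex
% Pick a Hamel basis B of \mathbb{R} over \mathbb{Q} with 1, \sqrt{2} \in B.
% Define f on the basis and extend \mathbb{Q}-linearly:
f(1) = 1, \qquad f(b) = 0 \ \text{ for all } b \in B \setminus \{1\}.
% Then f(q) = q for every q \in \mathbb{Q}, but f(\sqrt{2}) = 0.
% If f were continuous, then f(\sqrt{2}) = \lim_{q \to \sqrt{2},\, q \in \mathbb{Q}} f(q) = \sqrt{2},
% a contradiction. So f is \mathbb{Q}-linear (in particular additive) but not continuous.
```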
Nice little video... Can you please include Topos Theory in this playlist? A nice 15-minute video will do, provided you won't be including it in your Alg. Geom. playlist. Thanks for your consideration and keep up the great work!
Your explanation is a bit off. The statement that every even number greater than 2 can be expressed as the sum of two primes is the (strong) Goldbach conjecture. The related statement that every odd number greater than 5 can be expressed as the sum of three primes is the weak Goldbach conjecture.
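The strong conjecture is easy to check by brute force for small even numbers; a minimal sketch (function names are mine):

```python
def is_prime(n):
    """Trial-division primality test; fine for small n."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def goldbach_pair(n):
    """Return a pair of primes summing to the even number n > 2, or None."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return (p, n - p)
    return None

# Strong Goldbach holds for every even number in this small range.
assert all(goldbach_pair(n) for n in range(4, 1001, 2))
print(goldbach_pair(28))  # (5, 23)
```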
I’m not sure I followed how we can make a finite open cover with affine varieties. From my understanding, an open cover is a family of open sets. Affine varieties are exactly the closed sets in the Zariski topology (is that correct?). So a finite family of affine varieties is a family of closed sets. I understand that sets can be simultaneously open and closed… I’m just not sure which space the open cover is coming from.
Good question. An affine variety is open with respect to itself, so it is covered by itself. (You are correct that such a variety is closed with respect to the ambient space, but we can consider it with respect to itself.) In general, you want to use the so-called distinguished open subsets. These work as follows: take a polynomial function f on the variety, and take the set of points v where f does not vanish, i.e. f(v) ≠ 0. These sets are open and “very large”. Does that make sense?
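A toy computation of a distinguished open set, over the finite field F_7 so that the points can be listed exhaustively (the field and the polynomial are my choice, not from the video):

```python
# D(f) = { points v of the affine line over F_7 with f(v) != 0 }
p = 7
f = lambda t: (t * t - 1) % p   # f(t) = t^2 - 1, vanishing exactly at t = 1 and t = 6
D_f = [t for t in range(p) if f(t) != 0]
print(D_f)  # [0, 2, 3, 4, 5] -- "very large": only the two roots of f are missing
```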
_Math has the Reals_ Double or nothing? Mathematicians: let's go! _Math now has the Complex_ Double or nothing? Mathematicians: let's go!! _Math now has Quaternions_ Double or nothing? Mathematicians: let's go!!! _Math now has Octonions_ Double or nothing? Mathematicians: let's go!!!! _Sedenions entered the chat_ Oh Shi- go back!
@@VisualMath It's a shame you didn't include the famous Green-Tao theorem about arbitrarily long arithmetic progressions in the sequence of prime numbers... perhaps for another video? I believe this was one of the reasons Tao won the Fields Medal.
When I look at the big YouTube channels and their questionable quality, then I'd rather stay small 🤣 Haha, just kidding. I am glad that you like the channel, your feedback is really appreciated ☺
That map [t] -> [t,t^-1] looks familiar. Maybe this is the difference between the multiplicative group G_m and the additive group G_a ([t] -> [t]). Although over fields of characteristic 0 they are equivalent. See 2.2.2 and 2.2.3 of “Complex cobordism and algebraic topology” (2007) by Morava. It also reminds me of inverting a Lefschetz motive for some reason. I think Emerton’s answer to “Why does one invert G_m in the construction of motivic stable homotopy?” on MathOverflow at least gets at some of what I was after, with the quote: “… I believe that inverting G_m is same thing as inverting the Lefschetz motive”.
Yes, that should be the difference between the multiplicative group and the additive group. But they are not isomorphic (equivalent) - not sure what you mean with that 🤔
@@VisualMath Sorry, I think I was wrongly conflating the multiplicative group with the multiplicative group law and the additive group with the additive group law.
@@Jaylooker Ah, no worries. I get confused all the time. The way I remember that they are not the same is via the coordinate rings (polynomials versus Laurent polynomials) 😀
@@VisualMath Good point. That clarifies things. I think the same maps appear again with Laurent polynomials having rings R[t, t^-1] and polynomials having rings R[t]. These are still related. The localization of a commutative ring S away from an element s ∈ S is a universal way to invert s. One example is the localization of the polynomial ring Z[t] at t, which gives the Laurent polynomial ring Z[t,t^-1]; that provides one map. The other map is the identity of a polynomial ring. Localization also applies to categories, and this localization of categories is what I had in mind when inverting the Lefschetz motive.
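The universal property in symbols (this is the standard definition, not specific to this thread):

```latex
% Localization S[s^{-1}] of a commutative ring S at s \in S, with
% canonical map \iota\colon S \to S[s^{-1}]:
% for every ring map \varphi\colon S \to T with \varphi(s) invertible in T,
% there is a unique \tilde{\varphi}\colon S[s^{-1}] \to T such that
\varphi = \tilde{\varphi} \circ \iota.
% Example: localizing \mathbb{Z}[t] at t gives
% \mathbb{Z}[t][t^{-1}] = \mathbb{Z}[t, t^{-1}].
```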
Fascinating. But what I'm puzzled about is this: as a covariant functor, Hom(-, X): C^op → Set should preserve the composition of C^op (because it reverses the composition of C). So, to be precise, should the C^op on the left side of the slides be C?
Hmm, excellent question. A typical “sign” error? I have no idea 🤣 Let me still try: first, a contravariant functor F from C to D is a functor from C^op to D. This way we can get rid of the “contra” and focus on usual functors. Then it seems to be en.wikipedia.org/wiki/Yoneda_lemma#Contravariant_version Maybe what is confusing is that I tried not to mention contravariant functors?
@@VisualMath Hmm, I may get it. Studying functors via the functor category, an object F^op: C^op → D^op must be the same as F: C → D in some sense. So if we define F: C^op → D, is it just the same as F^op: C → D^op? Since arrows are more significant than objects, covariant functors may just provide a "reference" for contravariant functors, and which one is co- and which is contra- makes no sense in isolation. Well, I also agree with the idea of not mentioning contra-. From a learner's view, maybe describing both C and C^op simultaneously is better (given the use of different symbols for C and C^op in the previous video)? Then every contravariant functor may just be constructed with the aid of the functor C → C^op. I think this may be helpful.
@@M0n1carK Yes, exactly. At one point we have to face a choice whether we prefer, say for groups, f(ab)=f(a)f(b) over f(ab)=f(b)f(a). I feel the first is nicer 😅 Whatever is then studied in CT should then be an extension of "familiar" constructions, hence I like to ignore contravariant functors 😀
@@VisualMath I am waiting to see the video about schemes 🙂 It is hard for me to understand the concept, its uses, and the differences between algebraic varieties and schemes.
Aha! I just understood why you need rad(J). Which might explain why I took the numerical analysis/optimization qualifier instead of algebra in the distant past (40 years ago?). Cramming all summer for real/complex and num/opt was stressful, of course, but I'm pretty sure I wouldn't have passed algebra.😮
I am glad that the radical now makes sense 😀 I can feel you: sometimes it takes me years to understand something. That is why talking with people is so important 🙂
Curiously enough, Michael Penn just posted an algebraic geometry video today where he says he isn't able to wrap his mind around the concept of sheaves
@@VisualMath thank you for your kind words. It’s incredible that you still keep replying to fans/supporters/students. I want to thank you again for sharing your knowledge and enthusiasm, and for all the effort you put into this channel. I took AT last semester but I’m sure I’ll come back to this playlist later on
Great video. But what I have learned about "solvable" groups just requires the quotients to be abelian, not prime cyclic (the latter is called "supersolvable"). What confuses me is: why do we have the definition "solvable" in addition to "supersolvable"? It seems sufficient to just define "supersolvable" and then solve the problem of solutions in radicals. And yet what I have learned goes through "solvable" groups... Is it just a generalization to some extent?
The example to keep in mind is the alternating group A4: it is solvable but not supersolvable, as the Klein four-group Z/2Z x Z/2Z appears (and that one is not cyclic). That the alternating group A4 (or rather the symmetric group S4) is solvable is the reason why there is a formula for the roots of polynomials of degree 4. Thus, the notion of supersolvability is not enough for polynomials, and that is why we need the generalization to solvable groups. I hope that makes some sense 😀
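The claim about A4 can be checked by computer; a quick sketch using SymPy's permutation groups (assuming SymPy is available):

```python
from sympy.combinatorics.named_groups import AlternatingGroup

G = AlternatingGroup(4)   # A4, order 12
print(G.is_solvable)      # True

# Derived series A4 > V4 > {e}: the middle group is the Klein four-group,
# which is abelian but not cyclic, so A4 is solvable but not supersolvable.
print([H.order() for H in G.derived_series()])  # [12, 4, 1]
```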
@@VisualMath Really helpful! It reminds me of my mistake: for a supersolvable group, each Gi must additionally be a normal subgroup of G (which I had carelessly ignored). Moreover, it also reminds me that when a normal series is refined to a composition series, the factors must be prime cyclic. It is equivalent and goes well! Sorry for my mistake and thanks for your help! 😄
Hello again my friend. Just randomly stumbled across this video, and wanted to ask you to do a video on Sphere packing (in arbitrary dimensions), which is apparently a growing and blossoming field these days. Also, I wish to collect some ideas of it for work in particle theory on the physical side of things. In any case, some insight from you on this area would be useful and certainly entertaining. Thank you for all your work and contributions to mathematics education to a broader audience. Also, I love your new Algebraic Geometry series! Can't wait for more!
Thanks for checking in, it's always good to have you here ☺ Sphere packing is certainly fun. Last time I checked, not that much was known (for the nonregular or lattice case), but you are correct that the field is growing very fast. I will have another look. I enjoy doing the AG series - thanks for the suggestion!
This is beyond great, unlike other videos that are not straight to the main idea! Would you like to make some videos about rings of differential operators, particularly with polynomial coefficients? It is highly related to Gröbner Bases (and of course, Weyl Algebra). I am currently studying it for my thesis. Thank you! Also, I have already hit that subscribe and like button ;)
Thanks for watching 😀 I guess you are studying some form of algebraic geometry? At the moment I am not planning anything on the Weyl algebra, but we will see what the future holds.
Great video. But when I saw the end of the video, I had doubts about the set S: shouldn't S contain those s with s(a) ≠ 0 (otherwise [1] cannot be contained in S)? Sorry to bother.
I noticed the pattern: F is an isomorphism if there exists a G: D -> C where GF equals id_C and FG equals id_D, and F is an equivalence if there exists a G: D -> C where GF is isomorphic to id_C and FG is isomorphic to id_D. Is there an even weaker notion, with G: D -> C where GF is equivalent to id_C and FG is equivalent to id_D? And if such a weaker notion exists, are there infinitely many such notions, each weaker than the last?
Hmm, that is an interesting question. In the usual categorical setting, I have never seen the notion of “equivalence of functors”. However, when you go to higher categories, then there are many more notions of “equal”, so you should get the infinite hierarchy if you go to higher categories. Maybe these two links help? mathoverflow.net/questions/402558/does-there-exist-a-definition-of-equivalence-of-functors mathoverflow.net/questions/7666/lax-functors-and-equivalence-of-bicategories?rq=1
@7:00 Functors are not vanilla arrows. They must be arrows between arrows *_and_* between objects, otherwise they make no sense. So in CAT you cannot ignore the objects. That's why you cannot get an element-free definition for a _full functor._ So Category Theory is definitely not "just about the arrows". It is only that an _emphasis_ is on the arrows.
It depends where to put the emphasis 😂 My take is that the objects do not matter. Not in the sense that you do not need them, but rather that you should not care about them 😀
@5:20 oh man, what a downer. I really like your series and relaxed delivery, but Mathematica™? Seriously? That prices out a lot of poor kids (and myself). Can't you bend a little to redo interactives in SAGE or Maxima or similar. In Jupyter you can use Sympy and Galgebra (the pypi library, not the gui Geogebra, although the latter is useful too) combined with Plotly. You have to support free-libre software dude. So much of the world runs on free-libre, we all should give back by refusing proprietary software. (I do realize the irony of posting this on youtube.)
Well, nobody is perfect 😅 and every subscription model (free or paid or in between like YouTube 😁) has advantages. Even Python has some advantages 🤣 Anyway, thanks for the additional references, those might indeed be useful for someone.