Help me keep making videos: paypal.me/mlbakermath
Class field theory seminar, 2015-01-31, MC 5403
Speaker: Michael Baker
Video courtesy of Amanda Chan
csc.uwaterloo.ca/~mlbaker/w15
This is a great lecture. Thank you very much. I just got fascinated by category theory and have browsed a number of videos, but this is my fave so far. Excellent exposition. Please do more if possible!
I laughed so hard at 7:28 when you said "well, now is the part of the talk where I basically list every mathematical object I know to you." Yep, that's categories.
A tour de force! A course on category theory in two hours. Amazing job. Just one small point: it's perhaps worth pointing out that in the category Set, the morphisms are not the functions between sets under the usual definition, i.e. sets of ordered pairs of elements such that if (a, b) and (a, c) are in the set, then b = c. This is because although the domain is well defined for such a set of ordered pairs, the codomain is not. Maps in Set are instead ordered pairs consisting of a function together with its codomain, i.e. objects of the form (f, B) where range(f) is contained in B. We can say whether or not f is monomorphic without knowing B, but having B enables us to say whether or not a map is epimorphic (whether range(f) actually equals B), and therefore to bring the whole apparatus of duality into force. Also, the equation (fg)h = f(gh) requires the codomains to be specified for it to work with set functions.
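To make the distinction concrete, here is a small Python sketch (my own illustration, not from the lecture or the comment) representing a Set-morphism as a graph of ordered pairs together with an explicit codomain, so that "epi" becomes decidable while "mono" never needed the codomain:

```python
# A Set-morphism carries its codomain explicitly: (graph, codomain).
# The graph alone determines the domain, but not the codomain.

def is_function(graph):
    """Ordered-pairs condition: (a, b), (a, c) in graph => b == c."""
    seen = {}
    for a, b in graph:
        if a in seen and seen[a] != b:
            return False
        seen[a] = b
    return True

def is_mono(graph):
    """Injectivity is decidable from the graph alone."""
    values = [b for _, b in graph]
    return len(values) == len(set(values))

def is_epi(graph, codomain):
    """Surjectivity needs the codomain: range(f) must equal it."""
    return {b for _, b in graph} == set(codomain)

f = {(1, 'a'), (2, 'b')}               # graph of f
assert is_function(f) and is_mono(f)
assert is_epi(f, {'a', 'b'})           # epi onto {a, b} ...
assert not is_epi(f, {'a', 'b', 'c'})  # ... but not onto {a, b, c}
```

The same graph is epi with respect to one choice of codomain and not another, which is exactly the point of the comment.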
Thanks. The data of a category, however, allows one to determine the source and target of any morphism. So there is nothing wrong with defining Set by saying that its objects are sets, and the morphisms between two objects (i.e. sets) A and B are precisely the functions f : A -> B. For me, and most others, the codomain of a function is part of its data anyway. There's no point in scaring people for no reason.
Thanks for your comment, mlbaker, whoever you may be. You prompted me to do a little research on all this, or at least to roust around in some of my old textbooks. I am grateful for that, as it cleared a few things up for me. I am guessing, by the way, that this may be a generational thing. I was taught foundations by JH Conway a fair old time ago (yes! I have sat at the feet of the sage...), and in those days functions were sets of ordered pairs defined as above. If students are now taught that functions are something else, well, fine and dandy. But there is a bit of history to all this. We didn't get there all in a moment. If you need a reference, by the way, look up Cohn's Universal Algebra, still regarded as a classic (and which has a lot of overlap with category theory). He defines FUNCTIONS on page 4 in terms of ordered pairs only, and a MAPPING on page 12 in essentially the (f, B) format that I used above. Stoll, Introduction to Set Theory and Logic, page 34, defines a function as a set of ordered pairs such that if (x, y) and (x, z) are members of f, then y = z. No mention of a codomain. I would also refer you to Saunders Mac Lane's excellent "Mathematics: Form and Function". On page 131 he writes: "this convention (that a function carries with it a specified codomain) is usually not made in elementary mathematics...". The book was published in 1986, so maybe he would not feel it necessary to make the comment today, if you are right. If... The Wikipedia entry on function (mathematics) states: "In mathematics, a function[1] is a relation between a set of inputs and a set of permissible outputs with the property that each input is related to exactly one output." No codomain mentioned.
Three paragraphs later, the entry does a smart about-face: "In modern mathematics,[3] a function is defined by its set of inputs, called the domain; a set containing the set of outputs, and possibly additional elements, as members, called its codomain; and the set of all input-output pairs, called its graph." OK, so we do need codomains after all. I just wonder how many math majors realize the significance of all this, and why graphs alone are not enough. I think people need to be made aware of the reasons behind the definitions, otherwise their understanding will be limited. Yes, even if it involves "scaring" them (what a strange word to use!) and certainly it would not be for no reason. Hope this makes the point of my comment clearer? I do recommend MacLane's book by the way. The master (who virtually invented category theory) is always worth reading. Best wishes, Rory
Just one last comment on this. In the category of sets, even such an excellent teacher as Steve Awodey is liable to confuse mapping (with domain and codomain specified) with function (ie the graph of a mapping, the set of ordered pairs). I refer you to page 4 of his Category Theory (2nd edition). He writes: "by the way, this is, of course, what it means for two functions to be equal: for every argument, they have the same value". I rest my case.
Although I don't doubt you're correct, amusingly enough for 99% of mathematicians this subtlety will never be relevant (as evinced by many people I've spoken to agreeing that the codomain should be part of the data of a function). Also, I've probably never met anyone who doesn't use "function" and "mapping" synonymously. ¯\_(ツ)_/¯
Then you and your colleagues are all much more enlightened than I was at your age. Evidently it's a generational thing as I thought, and people now have a better mathematical education than their parents, like they have better teeth.
So, explaining what I understood so far in terms I know: a category is a directed reflexive transitive multigraph. For any two arrows with a common middle point, you have to be able to give the "transitively merged" (or "combined") arrow. This combined arrow must equal one of the two arrows whenever the other is 'the' reflexive (i.e. identity) arrow. Furthermore, taking three arrows 1, 2, 3 sharing two middle points, you can combine them pairwise in either order, ((12)3) or (1(23)), and always get the same arrow.
+Paul Frischknecht Yes: every object in a category has an identity arrow, and arrow composition is associative. We must consider a category as a directed graph, since it is important to know the domain and codomain of arrows (see: Wikipedia, function composition). Functors between two categories are graph homomorphisms. If we reverse the direction of the arrows in a category C we get the dual category C^{op}. We can define covariant and contravariant functors via this duality. A covariant functor preserves the orientation of the arrows (and thus the overall structure of our category), e.g. F: C -> D induces Hom(c, c') -> Hom(Fc, Fc'), where Fc, Fc' are objects in the category D. A covariant functor G: C^{op} -> D is actually the same as a contravariant functor H: C -> D, so contravariance reverses the orientation of arrows in our category. Thus, functors are structure-preserving (in either a covariant or contravariant way) and generalize graph homomorphisms. We can also discuss natural transformations between functors in exactly the same way. Consider a graph (category) whose nodes (objects) are all the possible functors F_n between two categories C and D, and whose edges (arrows) are natural transformations. This holds axiomatically as a category, since any two natural transformations i: F_1 -> F_2 and j: F_2 -> F_3 can be composed to j * i = k: F_1 -> F_3. Identity is also obvious.
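The "directed multigraph plus composition" picture can be made executable. Here is a minimal sketch (arrow names and the three-object category are my own invention, purely illustrative) of a finite category with an explicit composition table, checking the identity and associativity laws:

```python
# A tiny finite category as a directed multigraph with a composition
# table. Objects: A, B, C. Non-identity arrows: f: A -> B, g: B -> C,
# and their composite gf: A -> C.

arrows = {  # name -> (source, target)
    'id_A': ('A', 'A'), 'id_B': ('B', 'B'), 'id_C': ('C', 'C'),
    'f': ('A', 'B'), 'g': ('B', 'C'), 'gf': ('A', 'C'),
}

def compose(q, p):
    """q after p; only defined when target(p) == source(q)."""
    assert arrows[p][1] == arrows[q][0], "arrows not composable"
    if p.startswith('id_'):   # identity law on the right
        return q
    if q.startswith('id_'):   # identity law on the left
        return p
    return {('g', 'f'): 'gf'}[(q, p)]  # the one nontrivial composite

# Identity laws:
assert compose('f', 'id_A') == 'f' and compose('id_B', 'f') == 'f'
# Associativity for the composable triple (g, id_B, f):
assert compose('g', compose('id_B', 'f')) == compose(compose('g', 'id_B'), 'f')
assert compose('g', 'f') == 'gf'
```

Note that `compose(q, p)` means "q after p", which matters: the pair is only composable when the target of p matches the source of q.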
About minute 12:30: are you sure that the integers are an initial object in the category of rings? Yes, you can send 1 to the multiplicative identity, but you can also send it to zero, which is still a homomorphism, giving you two different morphisms.
@@litsky hmmmm yeah I'm pretty sure that's not true at all. They take 1 to an idempotent element. For instance, you can map the integers, Z, to Z+Z (here + is the direct sum) in several ways: you can take 1 to (1,0), (0,1), (0,0) or, your favourite, (1,1). There is also the possibility that you were somehow taught a completely different definition of ring homomorphism, which of course we could have already figured out had you given a more explanatory response. I hope you weren't taught mathematics by having you repeat after the teacher! Cheers
@@joaocandeias7093 You misunderstood me. They take 1 to 1 by *definition*. It is literally an assumption we make. Check the Wikipedia article - my convention is the mainstream one, not yours. The concept you have in mind is something different, which mathematicians have taken to calling a "rng".
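The disagreement here is purely one of conventions, and it can be checked by computation. Under the non-unital ("rng") convention, an additive map out of Z sending 1 to some element e is multiplicative exactly when e is idempotent (e*e = e); the unital convention used in the video forces e = 1. A quick sketch (the choice of Z/6 is mine, just a small example) enumerating the candidates:

```python
# Under the rng convention, 1 may go to any idempotent of the target.
# Enumerate the idempotents of Z/6: there are four candidates, whereas
# the unital convention pins down e = 1 and makes Z initial in Ring.

n = 6
idempotents = [e for e in range(n) if (e * e) % n == e]
print(idempotents)  # -> [0, 1, 3, 4]

# Sanity check that e = 3 really gives a multiplicative map a -> a*e:
e = 3
for a in range(10):
    for b in range(10):
        assert (a * e * b * e) % n == ((a * b) * e) % n
```

This mirrors the Z -> Z+Z example above: (1,0), (0,1), (0,0), (1,1) are exactly the idempotents of the direct sum.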
I might not come back to see any replies, but what applications does category theory have to computer science? I've heard that they're connected but idk how
Check Bartosz Milewski's talks for the connections with computer science specifically, or look up the Curry-Howard-Lambek correspondence, or learn about the Haskell programming language
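One concrete instance of the CS connection (my own illustration, in Python rather than Haskell for brevity): type constructors like "list" behave as functors, where `fmap` lifts a function a -> b to lists [a] -> [b] and must obey the functor laws, preserving identity and composition.

```python
# The list functor: fmap lifts a function over a list.

def fmap(f, xs):
    return [f(x) for x in xs]

identity = lambda x: x
f = lambda x: x + 1
g = lambda x: x * 2

xs = [1, 2, 3]

# Functor law 1: fmap(id) == id
assert fmap(identity, xs) == xs
# Functor law 2: fmap(g . f) == fmap(g) . fmap(f)
assert fmap(lambda x: g(f(x)), xs) == fmap(g, fmap(f, xs))
```

In Haskell this is literally the `Functor` type class, and the Curry-Howard-Lambek correspondence extends the picture: types are objects, programs are arrows.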
@@hywelgriffiths5747 Gotta love it when several months of learning topology makes up* just one of the preliminaries of big boy CS. But thanks for the recommendations.
You were giving an example of a functor at about 43:00. I don't think I understand what the example is because I don't see how you could map an arbitrary group to some finite dimensional complex vector space. I remember you said that something only applied to compact lie groups, and that's why I asked.
No, there I was just saying that a functor from G (viewed as a category) to the category Vect_C^fin of all finite-dimensional complex vector spaces amounts merely to a representation of G, i.e. to a group homomorphism G->GL(V) for some finite-dimensional complex vector space V. Here GL(V) is the group of all invertible linear operators on V. This all makes sense for any group G. The facts I stated about being able to *decompose* any such representation into a direct sum of irreducible subrepresentations, however, require more assumptions (e.g. that G be finite, or a compact Lie group).
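A minimal executable illustration of this correspondence (my own sketch, not from the lecture): the simplest nontrivial case is a 1-dimensional representation, where GL(V) is just the nonzero complex numbers. Sending k in Z/4 to i^k is a functor from the one-object category Z/4 to Vect_C^fin, i.e. a group homomorphism:

```python
import cmath

# The 1-dimensional representation of Z/4 sending k to i^k, an
# invertible linear operator on C. Functoriality is exactly the
# homomorphism property rho(a + b) = rho(a) * rho(b).

n = 4
def rho(k):
    return cmath.exp(2j * cmath.pi * k / n)

for a in range(n):
    for b in range(n):
        assert abs(rho((a + b) % n) - rho(a) * rho(b)) < 1e-12

# The identity of the group goes to the identity operator:
assert abs(rho(0) - 1) < 1e-12
```

Decomposing higher-dimensional representations into direct sums of irreducibles is where the extra hypotheses (G finite, or a compact Lie group) come in, as the reply says.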
Just one thing to note on the definition of the composition map at around 4:00: I think you have reversed Hom(c, c') and Hom(c', c''). "Hom(c, c') after Hom(c', c'')" is not the same as "Hom(c', c'') after Hom(c, c')". You would do better to replace "Hom(c, c') after Hom(c', c'')" with "Hom(c', c'') after Hom(c, c')".
***** There are a handful of books on category theory pitched at around this level (and higher -- and by that I don't necessarily mean "higher category theory" :). Are there any texts besides Mac Lane that are worth mentioning here, particularly for self-study? Any personal favourites? Cheers, thanks.
supermanifold Personally I quite like Steve Awodey's book "Category Theory", though that is (he admits) based on Mac Lane's "Categories for the Working Mathematician". There is also a set of 4 lectures on YouTube by Awodey which are quite good, and go through the things in this lecture in more detail (though still a bit rushed...). I find it better to read the same stuff multiple times by different authors to get a feel for what's going on. "The Joy of Cats" is also not bad, and available online for free too.
He comes across as someone who put in an effort to teach the way he himself would have wanted to be taught. Very smart guy, logical and structured. I like the way he keeps talking while he erases the blackboard.
Please don't make such remarks that CT cannot solve important problems in maths. You just have to look at the tremendous impact that CT has had on algebraic geometry, and at the work of Lawvere among others. Also, one cannot decide in advance which problems a theory may eventually solve. If CT is just a language, then so is the whole of maths, for that matter...
Richard Arline I'm fully aware of the developments you're referring to. Although they would hardly have been possible without CT, it is absurd to claim that category theory alone was responsible for the development of algebraic geometry. My point is that there is no free lunch. You cannot expect to derive profound geometric truths just by sitting down and studying structural properties of some category. At some point along the line you are going to have to actually work with the material itself, rather than merely the formal structure.
mlbaker I think category theory can be studied for its own sake as much as group or ring theory, or even topology, can. There comes a time when a subject becomes mature enough to pose nontrivial questions internally. I suppose we can say group theory by itself never really "solved any important problems in maths" either. But once you show something is a group, you get a whole theory of results for free. A simple example in category theory is that localizations of categories have nice properties with regard to limit and colimit constructions. Therefore, once you show sheafification is a localization construction, you get all the nice limit and colimit constructions for free. We should probably avoid saying things like "category theory is just a language" or "it can't solve problems". Great intro to the subject otherwise!
The univalent foundations of mathematics is a tremendous result in pure math. With category theory we can redefine our entire topos in terms of many things, and ZFC becomes a category. We do model theory this way! Categories are responsible for the development of the more modern type theories such as Dependent Type Theory and Homotopy Type Theory, all incredibly relevant in computer science and constructive mathematics. You could say that seL4 (a formally verified microkernel) is an important problem in computer systems design which could not be solved without the underlying category theory that went into the type systems of the Haskell programming language and the Isabelle/HOL interactive theorem prover. Without these type systems we wouldn't have automated theorem provers at all. This brings to mind the famous 4 color theorem. Vladimir Voevodsky got the Fields Medal for a long and complicated proof which contained a bug. He proved a theorem which was false and got the maximum award for mathematics for it. In 2003 a counterexample to his theorem was published, and it took him almost 10 years to realize his proof was in fact incorrect. This led him to work on univalent foundations and constructive mathematics. Without results and unresults like this, the intuitionist rigor of constructive mathematics would not be gaining so much relevance.
Hmm, I think I saw the video you are talking about, where Voevodsky talks about the mistakes that motivated him to move towards HoTT/UF. But I thought the mistake discovered (by Carlos Simpson?) in 2003 was related to something else (his early work with Kapranov on Grothendieck's homotopy hypothesis/correspondence?), not anything directly related to his Fields Medal work on motivic cohomology and the Weil conjectures. But I don't know any of this first hand, so I may have confused some things. It is good to know, though, that Voevodsky is working on the computer verification of his motivic cohomology work. The techniques for modeling definitions and proofs, and the proof tactics he will need to specify, may be useful for moving other areas of math forward. It may even have a wider impact. I was struck by some remarks he made that encoding basic algebra for HoTT was straightforward until he came to fields, where there are three or four different ways to do it. Each way may be useful, I think; maybe the different granularity of using univalent foundations, and the discipline of striving towards constructivity (rather than classical logic and set theory for foundations), may provide genuine insight by revealing distinctions that are collapsed together in older foundations. New ways of generalizing from those finer distinctions, probably by using category-theoretic tools, could rewrite certain areas of math and applications.
Having digested all this (and what a wonderful, if semi-manic, presentation by the way), I find myself left with a perverse craving to see the presenter kitted out in ostentatiously handcrafted elven cosplay. #latenitemath #wizard? TYVM!