This is Alexander Farrugia's and Giorgio Grigolo's submission to the second 3blue1brown Summer of Math Exposition. #some2 #mathematics #combinators #logic Music: Icelandic Arpeggios - DivKid
As far as displaying large combinator systems goes, it would probably be easier to appreciate their structure if drawn as a tree as opposed to with tons of nested parentheses.
@@Godfather-qr6ej Sorry, yes. "To Mock a Mockingbird" is a 1985 book by Raymond Smullyan. However, my intention was to point to the 1996 essay by David Keenan called "To Dissect a Mockingbird", which is partly based on Smullyan's book. I hope you can find it now. I have already edited my original comment to fix this mistake.
@@jackozeehakkjuz Yes, I found "To Dissect a Mockingbird". Can I ask you a question? Where are combinators useful? I've heard about them from people in lambda calculus as a way to make a tiny Turing-complete machine, and from APL users who use combinators in some practical way I don't know about.
The single combinator from which S and K can be defined goes back well before Barker. I was a student of Corrado Böhm (of Böhm trees in lambda calculus fame), and he taught us about the single combinator in 1985... This video reminded me of his fantastic lectures on combinatory logic, thank you.
Very nice presentation, thank you. A suggestion for future videos: when you show some kind of process, e.g. the SKIBC rewrite/translation, it would be really nice if all previous steps were shown along with the current step, e.g. with an additive/accumulative animation. That way, I could compare steps n and n+1 which would help me understand what the rules mean and how to apply them.
@@xactxx What is your opinion of concatenative calculus? I tried to understand the contribution but could not. It just seems like a style change and an implementation detail: adding a fictitious data structure like a stack to create a context for the program. For a calculus that is supposed to be context-free, like combinatory logic, concatenative calculus proposes a big ol' context underpinning it.
Nice. I had this book by R. Smullyan that explained this with birds. It was cool. This is a very good explanation of the subject. Trying to find out how to get [OR]xy and [AND]xy. And maybe others, but with either of those you get NOR or NAND, and those are universal.
Yes, 'To Mock a Mockingbird and Other Logic Puzzles'. :) We tried to keep the same letters that Smullyan used. The only exception is the N combinator, which Smullyan simply leaves as KI. Otherwise all the other combinators have the same letter as in Smullyan's book. Regarding AND and OR, do you want a hint? :)
@@xactxx I think it would be something like [First]xy[True]y for OR and [First]xyy[False] for AND. I think. And yes, that's the book I was talking about! :)
@@msolec2000 Here's a hint. [And]xy = xy[false]. Now try [Or] yourself. :) Also, try to 'compile' this [And] definition to obtain a combinator expression for it.
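If it helps to experiment with the hint, here's a minimal Python sketch using curried lambdas as combinators (the names TRUE, FALSE and AND are just illustrative labels for K, KI and the [And] definition from the hint; [Or] is left as the exercise):

```python
# Booleans as selector combinators: TRUE picks its first argument, FALSE its second.
K = lambda x: lambda y: x        # TRUE is K
KI = lambda x: lambda y: y       # FALSE is KI (K applied to I)
TRUE, FALSE = K, KI

# The hint [And]xy = xy[False], read with left association:
# apply x to y, then apply the result to FALSE.
AND = lambda x: lambda y: x(y)(FALSE)
```

If x is TRUE, it selects y (so the result depends on y); if x is FALSE, it selects the trailing FALSE, short-circuiting the conjunction.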
Effectively, combinatory logic with only the iota operator shifts most of the "work" to the order of evaluation and its manipulation (for a single rewrite rule). To get other combinators from iota, you force nested lazy evaluation of self-application on the right... It's amazing to think what can be expressed mostly with judicious application of parentheses (ask any LISP/Scheme programmer 😄).
Excellent! Do you have something new in the works? In one of the comments you mention "To Mock a Mockingbird", which is the classic book about combinators, but your presentation seems more modern and focused on the application to computation. It would be nice to know what references you used. It would also be nice to see how this compares to the lambda calculus, or how combinators and lambda calculus are equivalent. Also, the claim is that any computation can be done using combinatory logic, but the proof is not obvious (although the result is believable as yet one more example of the Church–Turing thesis). The fact that one can achieve Turing completeness with such simple systems makes one wonder whether this has something to do with the basic structures of life. Perhaps the laws of physics build simple systems like combinators, which evolve into what we call life...
Since you only need one symbol (iota) coupled with two grouping symbols (parentheses), you can represent a program using instructions that only take two bits each, with one value left over. Then, if you want to support side effects, do a UTF-8-inspired "if the chunk is 11, then load another chunk to extend the code" for whatever side effects you want to add to your architecture/CPU. Though maybe we need a proper padding instruction (usually nop or int 3).
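A quick sketch of that idea in Python. Everything here is hypothetical: the symbol-to-bits mapping is my own choice, and I use the spare 0b11 chunk as padding (the comment above suggests it could instead be an escape for side-effect extensions):

```python
# Hypothetical 2-bit encoding for iota programs ('i' stands for iota).
# The mapping is an assumption for illustration; 0b11 is used here as padding,
# though it could be reserved as a UTF-8-style escape for extension chunks.
CODES = {'i': 0b00, '(': 0b01, ')': 0b10}
DECODE = {v: k for k, v in CODES.items()}

def encode(program: str) -> bytes:
    """Pack each symbol into 2 bits, 4 symbols per byte, padded with 0b11."""
    chunks = [CODES[c] for c in program]
    while len(chunks) % 4:
        chunks.append(0b11)  # padding chunk, analogous to a nop
    out = bytearray()
    for j in range(0, len(chunks), 4):
        a, b, c, d = chunks[j:j + 4]
        out.append(a << 6 | b << 4 | c << 2 | d)
    return bytes(out)

def decode(data: bytes) -> str:
    """Unpack 2-bit chunks back into symbols, skipping padding."""
    symbols = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            chunk = (byte >> shift) & 0b11
            if chunk != 0b11:
                symbols.append(DECODE[chunk])
    return ''.join(symbols)
```

A 7-symbol program like `(i(ii))` fits in just 2 bytes under this scheme.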
@antoniolewis1016 Same opinion here... It would have been nice if they had mentioned that combinators associate to the left by default, so KIxy is shorthand for ((KI)x)y.
Hahaha... This brings up an important point that is often left out of these types of results: many things are Turing complete, but computer languages are not only about producing computations; they're also about making it easy for humans to express those computations. Combinators are interesting for what they say about the nature of formal languages, but they are probably not going to replace any real-world programming language any time soon (functional programming is valid and useful, but even pure functional languages go beyond implementing combinators).
The real challenge: can you make a proper combinator (a combinator that could be written as a function of N arguments returning a pure juxtaposition of those arguments) that can compute everything? I'm pretty sure it's impossible. It has been proven that there exists a proper combinator X such that X and I = SKK together can compute everything, but one truly universal proper combinator has yet to be found.
At least the first 14 minutes don't explain how to verify that this stuff works. For example, at 1:51 the video says KIxy reduces to y. Can I verify that? Or is it just by definition? Why isn't it K(Ix)y -> Kxy -> x? At 3:00, "we have written a program": what does it do? "This is how combinatory logic avoids variables altogether": we write x and y everywhere; aren't those variables? "Our program is simply a string of combinators": well, doesn't Ix reduce to x? Isn't the program just "x" then? Summary: in my opinion this video is very... very confusing.
Not an expert in this either, but it's a mix of both. The first few rules tell us how the notation works, kind of like the PEMDAS of combinatory logic. So if you have KIxy, that is a signal to evaluate K(I, x) first, then apply the result to y: K(I, x) -> I, then I(y), since I takes one argument.
Agree. I don't get how KIxy reduces to Iy. It would help if the precedence/order were shown. My initial understanding was: KIxy -> K(Ix, y) -> K(x, y) -> y. But apparently it should be Iy?
One would first need to represent integers (for example, as a pair of naturals with appropriately defined operations), then move on to define rationals as pairs of integers. Then reals could be defined using infinite lists. This video did not define lists (we didn't need them for our Fibonacci program), but lists may also be defined in terms of combinatory logic.
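For the curious, the pairs mentioned above are themselves definable with combinators. Here's a small Python sketch (the names PAIR, FST and SND are illustrative; the selectors are just K and KI again):

```python
# A pair stores x and y by handing them to whatever selector f you pass in.
PAIR = lambda x: lambda y: lambda f: f(x)(y)
FST = lambda p: p(lambda x: lambda y: x)   # select the first component  (K)
SND = lambda p: p(lambda x: lambda y: y)   # select the second component (KI)

# An integer could then be modeled as a pair of naturals (a, b) read as a - b.
```
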
@@theunknown4834 Writing this in Python seems like a nice challenge. If you want a language that can evaluate these combinators with less hassle (and built-in lazy evaluation!), I would highly recommend Haskell.
Unfortunate naming for C's parameters, btw; better to use Cfxy -> fyx. Otherwise the second argument looks like it's expected to be used as a function (f, g, and h are the usual names for functions, whereas x, y, and z are for values; I know functions are values, it's about their intended usage, not their type), which is not the case.
@@user-tk2jy8xr8b fair enough. The reason why we used fgx is for consistency with the variables used for B and S. Essentially, B and C are the two halves of S, and using the same variables for S, B and C drives home this point.
It is because the other combinators can be expressed in terms of S and K alone, for example: I = SKK (1), B = S(KS)K, C = S(BBS)(KK), where C can be written using (1) as C = S((S(KS)K)(S(KS)K)S)(KK), and so on...
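These identities are easy to check mechanically. A quick Python sketch, with curried lambdas standing in for combinators (note that under left association C needs explicit parentheses around the final KK, i.e. C = S(BBS)(KK)):

```python
# The two primitives.
S = lambda x: lambda y: lambda z: x(z)(y(z))
K = lambda x: lambda y: x

# Derived combinators, built only from S and K.
I = S(K)(K)              # Ix    -> x
B = S(K(S))(K)           # Bfgx  -> f(g(x))
C = S(B(B)(S))(K(K))     # Cfxy  -> fyx
```
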
@DeclanMBrennan The iota combinator together with the machinery to interpret it is Turing complete (this is like saying that you need not only a program but also a compiler that runs the program)... That sounds mind-blowing, but then you realize that there are a bunch of things that are Turing complete... I believe that, so far, the smallest known Turing complete machine is Wolfram's 2-state 3-symbol Turing machine. This is interesting because it hints that we can get this type of computation from very simple systems, which leads to how intelligence and life are plausible (the argument is that something as simple as the Wolfram 2-state 3-symbol machine is simple enough that it could have arisen by chance, without any intelligent designer).
@@academyofuselessideas I was using "Turing complete" in the sense of a "Turing complete language", but that's a fair point. Thanks for mentioning the Wolfram TM; looking forward to reading up on it. I believe Conway's Game of Life may be Turing complete as well, and somebody with way too much free time built hardware within Game of Life to execute its own Game of Life, a cool example of a simulation within a simulation.
@@academyofuselessideas I just had a quick look at your wonderful channel which I didn't know existed. I feel like a kid in a mathematical sweet shop. 🙂
@@DeclanMBrennan Game of Life inside Game of Life sounds pretty cool... The observation that there are many simple Turing complete machines is also part of Wolfram's argument that all of physics is automata (kind of like: what we observe is some sort of emergent phenomenon caused by very tiny automata)... I like the philosophy behind some of those ideas, but I don't know enough about them to give an informed opinion.
@epgui Part of the confusion arises because they didn't explain how combinators associate... So it is not clear whether one should think of KIxy as (((KI)x)y) or as K(I(x(y))). The way you associate things matters in this context... When working with combinators, the convention is to associate to the left, so Kxy -> x would more pedantically be written as ((Kx)y) -> x... Based on that observation, KIxy = (((KI)x)y) -> Iy -> y... A fun book about this is "To Mock a Mockingbird" by Smullyan; they mention that book in one of the comments... Working with combinators takes some practice.
In simple words, the first symbol is always applied first (lazy evaluation). And since K takes the two objects in front of it and becomes the first one, ignoring anything else (KIx -> I), we get KIxy -> Iy -> y.
@@Bobby_101 In other words, everything is implicitly left-associative, and symbols representing functions are treated the same way as symbols representing values? That seems a bit weird syntactically, but if that's the rule then that makes sense.
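Exactly; this left-association reading is easy to check with curried Python lambdas standing in for the combinators (a sketch, not how the video's authors implement anything):

```python
K = lambda x: lambda y: x   # Kxy -> x
I = lambda x: x             # Ix  -> x

# KIxy parses as ((K I) x) y, never as K (I x) y:
step1 = K(I)      # K I:   a function that ignores its next argument
step2 = step1(3)  # K I x: the x is thrown away, leaving I
step3 = step2(4)  # I y:   reduces to y
```
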
@samytamim2603 Awww... That happens with some math explanations... In my experience, it is best not to think too much about the math that makes you feel dumb, and instead focus on the math that you enjoy and understand... Even professional mathematicians have no idea what other mathematicians in a different field are doing... Combinators can be fun, but they are somewhat esoteric anyway... Just do what you find fun!
You really only need S and K, which correspond to the two THEN axioms of intuitionistic logic. There is a straightforward way to turn any lambda into a combination of just these two, exploiting the special case I = SKK. (Note that the other axioms of intuitionistic logic are just definitions of AND, OR, and FALSE.) If you want to know how, just keep replacing expressions like this:
λx. f g → S (λx. f) (λx. g)
λx. x → SKK
λx. y → K y
where x, y are variables and f, g are expressions.
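The three replacement rules above can be turned into a tiny translator. A hedged Python sketch follows; the term representation is my own choice (strings 'S' and 'K' are constants, other strings are variables, and a 2-tuple (f, a) is the application of f to a):

```python
def free_in(x, t):
    """Does variable x occur in term t?"""
    if isinstance(t, tuple):
        return free_in(x, t[0]) or free_in(x, t[1])
    return t == x

def abstract(x, t):
    """Translate (lambda x. t) into an S/K-only term via the three rules."""
    if t == x:
        return (('S', 'K'), 'K')                    # lambda x. x   -> SKK
    if not free_in(x, t):
        return ('K', t)                             # lambda x. y   -> K y
    f, a = t
    return (('S', abstract(x, f)), abstract(x, a))  # lambda x. f g -> S (lambda x.f) (lambda x.g)

# A tiny evaluator mapping closed S/K terms onto Python functions.
S = lambda a: lambda b: lambda c: a(c)(b(c))
K = lambda a: lambda b: a

def run(t):
    if t == 'S':
        return S
    if t == 'K':
        return K
    return run(t[0])(run(t[1]))
```

For example, translating λx.λy.x this way yields a term that behaves exactly like K when evaluated.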