I was enamored with Tanenbaum's thesis until I saw this wonderful clip! I was wondering: if it was that easy, why did Linux work and Minix didn't? Who wants to increase the lines of code in their project by millions and cause more kernel bugs when they can avoid that? Thank you, David!
The automation fears are silly. They come from people not understanding value. Human value. Goods will lose value; services gain value. Services, whether graphic design, eating out, shopping, or playing, really all boil down to the things we want and don't want. Do you want to do math? Or be a trashman? We want those things done but don't want to do them ourselves, so we are willing to trade value. Encyclopedia salesmen, paper boys, meter readers... digital tech made them obsolete. Doesn't matter. People will always want stuff done and will always trade to get it. Automation doesn't kill jobs; it just frees us to do other things, like entertainment, travel, games...
Yeah... here's an idea. Ask the kernel to set up some memory that a producer can write to and a consumer can read from. The producer and consumer processes can work out some mechanism so the consumer won't read until the producer is done writing, and the producer doesn't overwrite until the consumer is done reading. I think I will call this IPC mechanism something ironically funny, or maybe just "internal datagram protocol"... perhaps 9p?
I don't go into this in this lecture since it's focused on the OS aspects, but I don't think anything I do say is inconsistent with the history here (if you do think something in the lecture is actually incorrect, feel free to explain what it is). The most definitive source on how and why C was developed (and the desire for a rudimentary type system is certainly part of the story) is Dennis Ritchie's article: www.bell-labs.com/usr/dmr/www/chist.html
Maybe, but whatever impact AI eventually has, it will be hard to attribute it to any of those companies in the way that one can attribute the transistor and information theory to Bell Labs. Development of AI has been a long-term effort involving thousands of people at many hundreds of different institutions, and the underlying breakthroughs that are still driving much of the work today are (arguably) more attributable to Bell Labs (but not 1944-1949!) than they are to Google/Amazon/Microsoft/Meta.
Since pi is irrational, its expansion neither terminates nor repeats in any base. Is pi squared, or pi to the power of pi (or pi to the pi to the pi to the pi), then computable to arbitrary precision?
Yes, these would all be computable. One way to think about this is to consider how you would compute them using the infinite series that converges to pi. If you can manipulate that series using the rules of algebra to produce a series that converges to pi^pi, that would be a way (not efficient, but that doesn't matter) to compute pi^pi with arbitrary precision.
Well, of course it would @@DavidEvans, so my question essentially was: can you actually manipulate that series using the rules of algebra to produce a series that converges to pi^pi? I would have no clue how to do that.
@@jimsteinmanfan80 you don't need to find a power series specifically for pi^pi, even though you could. Just observe that there's a series for pi, so I can find pi to whatever accuracy I want. Whatever I find is a rational number (a finite, hence terminating, decimal). I can raise that to the power of itself, no problem, and that gives me pi^pi to some measure of precision. For whatever precision I want in pi^pi, there exists a precision to which I need to calculate pi in order to guarantee it. Therefore it just is computable.
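The "compute pi to enough digits, then raise it to itself" argument can be tried out directly with Python's arbitrary-precision `decimal` module. The `pi` helper below is adapted from the recipe in the `decimal` documentation; the extra guard digits stand in (crudely) for the "precision I need to calculate pi to" step of the argument:

```python
from decimal import Decimal, getcontext

def pi(prec):
    """Approximate pi to `prec` significant digits (series recipe
    adapted from the decimal module documentation)."""
    getcontext().prec = prec + 2          # guard digits while summing
    lasts, t, s = 0, Decimal(3), Decimal(3)
    n, na, d, da = 1, 0, 0, 24
    while s != lasts:                     # iterate until digits stabilize
        lasts = s
        n, na = n + na, na + 8
        d, da = d + da, da + 32
        t = (t * n) / d
        s += t
    getcontext().prec = prec
    return +s                             # unary + rounds back to `prec`

p = pi(50)
pi_to_the_pi = p ** p   # Decimal pow accepts non-integer exponents
print(pi_to_the_pi)     # ≈ 36.4621596...
```

Asking for more digits is just a matter of passing a larger `prec`, which is exactly the "computable to arbitrary precision" claim.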
@@DavidEvans This does raise an interesting question though. Pi is defined as the ratio between the circumference and the diameter of a circle. That's a proper and unambiguous definition that we know can only describe one number. Now, I can construct multiple infinite series that converge to pi, but none of them are immediately obvious from the definition of pi alone. Couldn't it be that some other number has a similar definition, but that there exists no finite algorithm that allows us to compute it to any desired degree of accuracy? Pi "could have been" that number, so to speak: a number where the only way to get more precision would be to physically make a large enough circle and use state-of-the-art equipment to measure the ratio between the diameter and the circumference. We are "lucky" pi isn't like that, but couldn't there be numbers that are?
That's about the depth I would have written to if I had cared to, @@dekippiesip, but I thought it fairly meaningless, since without a whole lot of thinking I have no idea what that required precision actually is; at that point it is not an algorithm, just a hunch that one probably exists. Of course it gets much harder if you take pi to the pi to the pi to the pi to the pi, even though I suspect that is true also. The computation needed to get just one decimal would be absolutely enormous.
6:50 I chuckled at this understatement! Saying "there must be some real numbers that are uncomputable" is on par with saying "there must be some real numbers that are nonintegral." The analogy with interesting numbers is nice, but I like to point out that the computable numbers, just like the algebraics, the rationals, and the integers, are infinitesimal in quantity compared to the reals. Waxing mildly philosophical, I like to point out that the "real" numbers are badly, badly misnamed, as the overwhelming majority of them cannot be experienced in any real sense.
Yup. The probability that a randomly chosen "real number" is non-computable is 1. Belief in the Emperor's New Clothes could be called a kind of experiencing, though?
This is the best YouTube video on the definitions of binary relations in general, and of functions in particular, that I have ever seen: both rigorous and very clearly explained. So many textbooks and videos fail to make clear that specifying a relation requires specifying three things: a domain D, a codomain C, and a graph G which is some subset of the Cartesian product DxC. Many sources effectively define a relation as just a set of ordered pairs, but since that set would be a subset of infinitely many Cartesian products, it is not possible to determine whether a "relation" so defined is reflexive, because that requires specifying a domain. The definition of function used here, which doesn't require functions to be total relations, is also preferable to and more flexible than the narrower definition that many textbooks use, which requires "functions" to have an output for every input.
I have an idea of combining all of the characteristics of a microkernel with those of an exokernel for an operating system I'm planning to make in the future. It's going to be more of an Atari TOS/GEMDOS-like one.
It's not depressing. Unix philosophy is to keep things simple and modular. And this is exactly what led to its remarkable success. While Multics died under its own weight.
Stack Overflow: "Do not confuse the term modular kernel to be anything but monolithic. Some monolithic kernels can be compiled to be modular (e.g. Linux); what matters is that the module is inserted into and runs from the same space that handles core functionality (kernel space)." Although it's hybrid, most of it can be considered monolithic, in the sense that the core part is monolithic.
7:21 This is wrong. In fact you can very easily describe how to construct uncomputable numbers. One example is Chaitin's constant, which is the probability that a random Turing machine (given some encoding) halts. Since the halting problem is not computable, this probability isn't either; however, we can prove that such a constant must exist. If I recall correctly, my professor gave us another, very counterintuitive example: 0.10110111011110111110... is not computable. I believe the proof in essence said that this number would require a TM with infinitely many states, but I don't remember well enough. Please let me know if someone knows the name of this number.
I think you are misunderstanding what it means for a number to be computable. Numbers can exist and be precisely described (in a non-constructive way) without being computable, but if the description is constructive (that is, it says how to mechanically construct the number), then it corresponds to an algorithm that can be described by a TM, so it must be computable. Chaitin's constant is not computable: the way you and others describe it does not give a method of actually constructing it, even though it precisely describes the number. The number you've written out is computable. Assuming that the "..." means repeat the pattern forever, we can easily construct a machine that does that (for any number that can be written this way), so the number must be computable. In this case, only a few states are needed: you just need to use part of the tape as a counter, write out that number of ones followed by a 0, and increase the counter.
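To make the "counter plus a run of ones" construction concrete, here is a short Python sketch (standing in for the TM's tape and counter) that emits as many digits of 0.10110111011110111110... as you ask for, which is exactly what computability of the number requires:

```python
def digits(n):
    """First n digits after the point of 0.10110111011110111110...,
    i.e. blocks of 1, 2, 3, ... ones, each followed by a single 0."""
    out = []
    block = 1                     # current run length of ones (the "counter")
    while len(out) < n:
        out.extend([1] * block)   # write `block` ones
        out.append(0)             # then the separating zero
        block += 1                # next block is one longer
    return out[:n]

print("".join(map(str, digits(20))))   # 10110111011110111110
```

The first 20 digits reproduce the number as written in the comment above, and any precision is reachable by increasing `n`.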
Well, I disagreed with the statement "if it can be described, that makes it computable," and judging by your answer you seem to agree. Of course, if it can't be described algorithmically it's uncomputable, but that's by definition. But it is a common misconception that TMs are the limit of what is describable. TMs have limits that we understand quite well; for example, it's easy to describe constructions with infinitely many states, and TMs can't have that. Regarding the number I posted: your intuition was mine as well, but when I tried to write it out I realized that TMs cannot simply read counters if they grow infinitely in size. Again, I can't remember a proof of uncomputability, so I'm very happy to stand corrected.
@@iamtheguitar I don't think I actually say that - at least looking at the part of the transcript that is closest to this (starting at 7:52: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-Cuf4zkYwGlQ.html), I say that if we can describe a number in a "straightforward way" then we know how to compute it. I agree this is a bit ambiguous, and maybe there are ways to describe Chaitin's constant that some would think are straightforward; it is important to point out that what it means to "describe a number" is not well defined. Regarding your claimed uncomputable number, there is no limit to how high a TM can count - they have an infinite tape. The definition of computable is precise, and any number you can write out with a repeating pattern is most definitely computable: it can be output to arbitrary precision, by generating any number of digits of the number.
Your number is irrational and transcendental and computable, like pi and e, but unlike sqrt(2) and the golden ratio, which are irrational but algebraic. There is no polynomial with rational coefficients that has your number (or pi, or e) as a root. But it is perfectly computable, because there is an algorithm that can approximate it in finitely many steps to any arbitrary precision you like. If I told you to write the first n digits, you would know how to do it; it doesn't matter if n is 10, 1000, or a googolplex. It may take millions of years to complete the task, but the number of steps is finite.
No, not necessarily. Picokernel is synonymous with nanokernel and microkernel nowadays, for the most part. An exokernel, on the other hand, is supposed to allow much closer access to the hardware and its resources from user space, by forcing fewer abstractions on a software developer than more conventional operating systems do. This lets the developer make as many decisions about hardware abstractions as they see fit, and should also allow bypassing calls into the kernel to perform operations, greatly increasing performance. The kernel is only there to ensure protection and multiplexing of resources. An exokernel should offer a similar level of security to a microkernel, since you still have all of the important parts in user space while the OS kernel remains very minimal, often smaller than a normal microkernel. The difference is that an exokernel offers higher performance than a normal microkernel because of all the stuff I mentioned. Basically, it allows you to design applications that run closer to bare metal from user space, unlike conventional operating systems.
@14:48 Worth mentioning that Avie Tevanian went on to NeXT, where he built the Mach-based NeXTSTEP, and then later to Apple, which of course built macOS based on NeXTSTEP. Both of those OSes are hybrid, correct?
Yes, "hybrid" is usually a murky term, but I think it applies pretty clearly here, where NeXTSTEP was using the Mach microkernel as a component, but inside a BSD kernel.
Maybe that's what marketing folks like to call it today since "inexpensive" is not popular in marketing-speak - but not according to the people who invented it! See the original paper by Patterson, Gibson, and Katz: www.cs.cmu.edu/~garth/RAIDpaper/Patterson88.pdf
0:57 Yes, it's a one-way hash: just try every combination and you'll find it. A better way is to hash all data in the database multiple times, in an order the hackers can't know, LOL: md5(sha256(sha512("12345"))), secured as hell. The result looks like MD5, so that's what the hacker tries to crack, BUT good luck with it: the MD5 input is a long string from SHA-256. Nice rainbow table. Even if, magically, a million years later, he finds it by going through all the combos from 1 to FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF, he's still only at the MD5 layer, LOL. In between hashings you can also reverse the order and split in the middle; that's too many combos, the hacker can't know. Therefore it is impossible to crack, even if the password were "1". And the username SHOULD BE ENCRYPTED, like every piece of data in the database, so the hacker gets no plaintext, only hashes that have been hashed multiple times with different methods. Even better if each user gets a random series of hashing combos, so hackers can't generate their own hashes and look for a matching combo; it won't work, since every user has a random series. BEST METHOD EVER! A leaked database becomes useless and hacking becomes a waste of time.
You don't need an OS to code; code is just a way to give instructions to a machine, and an OS is a fancy way of making your life easier. With harsh, long, low-level code, you can then make an OS that will make everything easier for you.
The fundamental problem is how much weight backwards compatibility has. Linux was only successful because it allowed you to run code that already existed on POSIX-based OSes. It ate BSD's lunch and replaced it by being less restrictive than OSes with a genuine Unix heritage. The biggest impediment to the adoption of Unix was that licensing costs were absurdly expensive. Otherwise, many of the 68000-based architectures were ideal for Unix and could have taken us out of the DOS dark ages a decade sooner.
Yes! This is a very good point: it is still not known whether the general graph isomorphism problem (Cook's problem #2 and Levin's problem #5) is NP-complete. The subgraph isomorphism problem (Cook's problem #1) is known to be NP-complete. (All of these problems are within class NP, though.)
It looks like the monolithic kernel is still the only solution. I am sorry. Rust does not work either. Do not be targets. cve.mitre.org/cgi-bin/cvekey.cgi?keyword=rust