I love how in one video he said "if someone wants to make a shirt with this equation on it, i would wear it" and now he is wearing the shirt with the equation on it XD
Finite state machines are everywhere, from stand-alone applications, like this vending machine example, down to the internals of processors and other complex logic chips. The only "memory" they need is a register holding the current state. And that "register" can be implemented with multiplexers (in which the next state depends on the mux's current state [outputs] combined with inputs from the system being controlled, plus pre-programmed input values determined at design time), or logic gates that compute the next state on the fly.
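The register-plus-next-state-logic idea above can be sketched in software too. Here is a minimal sketch (my own, not from the video) of the 25p vending machine as a single state variable plus transition logic; the names `TARGET` and `run_machine` are my inventions:

```python
TARGET = 25  # price of a ticket, as in the video's 25p example

def run_machine(coins):
    """Feed a sequence of 5p/10p coins; accept if they sum to exactly 25.

    The only "memory" is the single state variable; rejecting the whole
    sequence on overpayment is a simplification of the real machine,
    which would just refuse the coin."""
    state = 0
    for coin in coins:
        if coin not in (5, 10):
            return False          # unknown coin: no transition exists
        if state + coin > TARGET:
            return False          # overpayment: no transition exists
        state += coin
    return state == TARGET        # accept only in the final state
```

For example, `run_machine([5, 10, 10])` is accepted, while `[10, 10, 10]` is rejected at the third coin.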
+Bob Lake But originally, in older vending machines, etc.- the "state" would just be the current configuration of all of the levers, toggles, et al, that have been changed due to the different types of coins being dropped in.
+Amr ElAdawy Well around here parking meters might give you as much as an hour per quarter ($.25), but they don't print tickets (obviously; they're meters). You'll only get a ticket when paying to park in a lot or garage, which is much more expensive. Badly maintained parks on the fringes of the city might only charge $3/hr, but generally you're looking at $5-25 for the first hour. In Chicago and New York, it can easily be twice that.
I remember getting onto an elevator, with people in it already. I pressed the button for my floor and the guy already in the elevator said that it's a grandpa-elevator. This left me with a puzzled look on my face. The guy continued, "no memory". Meaning that the elevator would not remember me pressing that button and I'd need to press it again when it next stopped. And this happened in Finland where nobody does smalltalk.
+Massimiliano Tron Compiler theory is fairly complicated in general. However, there are tools that are used to generate compilers for simple languages - YACC (Yet Another Compiler-Compiler), and Lex (a tool that generates a lexical analyzer, which is what the professor was describing at the end).
+Massimiliano Tron The simplest technique is probably something called Recursive Descent. It only works with certain languages, but you can use it to hand-build (without any tools other than a compiler and an editor) a parser for a fairly simple language such as Pascal. You can find lots of information about Recursive Descent on the web. Like all parsers, you need to start with the formal definition of the language, written in a formal language such as EBNF (which you should also be able to find on the web). Where it gets really exciting is that EBNF itself is formally defined and pretty simple. So simple that you can easily write a parser for it, which outputs program source code to parse whatever language the EBNF is describing. This is one way that you can make a basic "compiler compiler".
+Massimiliano Tron You can use whatever existing language you like; the end result is machine code anyway, and that is what your language must be capable of becoming. The hard part ultimately is making a compiler that can translate reliably and quickly; that is where most if not all homebrew stuff fails.
It would be so simple to make a machine that handles overpaying. Just add 3 extra states: 30, 35, 40.
- If 30, it dispenses a 5p coin if available, and moves to the 25 state.
- If 35, it checks the weight of the 10p coin storage. Not empty? Dispense 10p and move to the 25 state. Empty? Dispense 5p if available and move to the 30 state.
- If 40, it dispenses a 5p coin if available, and moves to the 35 state.
This way, it refunds exact change if possible; otherwise, it refunds as much as it can without going over.
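The refund scheme described above can be sketched directly in code. This is my own illustration (the function name and the two availability flags are my assumptions; the flags stand in for the "weight of the coin storage" checks and are treated as constant during one refund):

```python
def refund(state, ten_p_available, five_p_available):
    """Walk down from an overpaid state (30/35/40) toward 25,
    dispensing coins per the rules above.
    Returns (coins_dispensed, final_state)."""
    dispensed = []
    while state > 25:
        if state == 35 and ten_p_available:
            dispensed.append(10)      # state 35 prefers a single 10p refund
            state -= 10
        elif five_p_available:
            dispensed.append(5)       # states 30 and 40 step down by 5p
            state -= 5
        else:
            break                     # can't refund more without going over
    return dispensed, state
```

So `refund(40, True, True)` dispenses a 5p then a 10p and lands on 25, and with an empty 10p chamber `refund(35, False, True)` falls back to two 5p coins.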
+I'm Very Angry It's Not Butter You're thinking too much about the actual implementation, and that's not really the point here. You would just add those states to the diagram and name the transitions like -5, -10. The fact that the machine is unable to return a 10p coin is irrelevant in this case.
Actually you don't even need the 25 state. You can make the output depend not only on the state but also on the input. If the state is 20 and you add 10p, just add logic that says: 20 and 10p --> print ticket, return 5p, go to state 0.
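What this comment describes is a Mealy-style machine, where outputs are attached to transitions rather than states. A minimal sketch of one such transition function (my own illustration; the `step` name and the output strings are assumptions):

```python
def step(state, coin):
    """One Mealy-style transition for the 25p machine.
    Returns (new_state, outputs); outputs depend on state AND input."""
    total = state + coin
    if total == 25:
        return 0, ["ticket"]            # exact payment: print and reset
    if state == 20 and coin == 10:
        return 0, ["ticket", "5p"]      # overpay by 5p: print and give change
    if total < 25:
        return total, []                # keep accumulating, no output
    return state, [coin]                # otherwise reject the coin outright
```

Note that the 25 state never appears: `step(20, 10)` prints the ticket and returns the 5p change in one transition.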
***** I don't think the machine requires any memory to know what coins are available to dispense. Separating the coins does not require a computer; it can be done entirely mechanically. Vending machines have had such non-computerized systems for decades. The machine does not need to keep track of the exact number of coins in each chamber; it just needs to know whether each chamber is "empty" or "not empty". To that end, a non-computerized hanging scale is attached to each chamber. If the chamber is at its default weight (i.e. empty), then the levers within the scale complete a circuit which sends an electric signal indicating that the chamber is empty. However, if the chamber is any heavier than its default weight (i.e. it contains coins), the circuit is broken, and there is no electric signal.
The very first component in a compiler is a lexical analyzer, which is perhaps a fancy name for an implementation of a finite state automaton. It receives a stream of characters (probably encoded in some form to save space and computation time, for reasons I cannot immediately explain) and, just by moving from one state to another, it knows how to detect the appropriate token. It also associates each token with a lexeme, which is the value of the detected portion of the character stream (for example, if the token is a number, then its lexeme might be the value of the number). Those tokens are then parsed by a syntactic analyzer (or parser).

The power of lexical analyzers, as said in the video, is that they don't use any extra memory and they go over their input exactly once (that is to say, the memory complexity of lexical analyzers is O(1) and the time complexity is O(n)). They can be implemented as simply as a two-dimensional array indexed by state and input character, with a special flag for each accepting state (unlike conventional FSAs, lexical analyzers can have more than a single type of accepting state, one type per token type).

I personally find the simplicity behind them, and compiler front ends generally, to be incredible. Compilers are made of components, each extremely efficient and incredibly simple for the job it is doing. Yet together they perform a very important and complex task.
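The transition-table idea in this comment can be shown with a toy table-driven recognizer. This is a sketch of my own (the names `TRANSITIONS`, `ACCEPTING`, and `classify` are made up for illustration) with two token types, identifiers and numbers, each tied to its own accepting state:

```python
def char_class(c):
    """Collapse characters into the classes the table is indexed by."""
    if c.isalpha():
        return "letter"
    if c.isdigit():
        return "digit"
    return "other"

# States: 0 = start, 1 = in-identifier, 2 = in-number.
TRANSITIONS = {
    (0, "letter"): 1, (1, "letter"): 1, (1, "digit"): 1,
    (0, "digit"): 2,  (2, "digit"): 2,
}
ACCEPTING = {1: "IDENT", 2: "NUMBER"}  # each accepting state tags a token type

def classify(lexeme):
    """Run the automaton over one lexeme; return its token type or None."""
    state = 0
    for c in lexeme:
        state = TRANSITIONS.get((state, char_class(c)))
        if state is None:
            return None               # no transition: not a valid token
    return ACCEPTING.get(state)
```

For example, `classify("x42")` lands in the identifier state, `classify("123")` in the number state, and `classify("9ab")` falls off the table at the letter.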
You're not saying anything new. He never said that it isn't memory, but rather that devices with this ability don't need (to *have*) memory, because they *are* memory. TO HAVE != TO BE. Memory doesn't need memory; it already is memory. E.g. a simple computer code parser as he described doesn't need a variable to store the parser's state: the parser just *is* in some state. A couple of _while_ and _if then else_ arrows is all such a parser needs to be in this or that state.
I didn't like the statement that the finite automaton is a memoryless system. I argue that the system's ability to retain its present state is itself a demonstration that the system has memory.
Thank you so much Professor Brailsford, I love this topic... And I have to admit, at least for me you Sir are the best at explaining something! The enthusiasm is intense, it's like a grandfather telling old stories near a campfire..I just love it.
Fun fact: In PHP your variable names can consist of only numbers.
$1 = "Hello World"; // Syntax error: Unexpected '1' (T_LNUMBER)
${1} = "Hello World"; // works just fine
Which is funky, since the PHP docs explicitly say "A valid variable name starts with a letter or underscore, followed by any number of letters, numbers, or underscores". It seems like that rule is only enforced by the parser and can be circumvented easily anyway! PHP is also a big mess though, so stuff like that is to be expected.
Felds Liscia Haha, that's amazingly awful. It looks like anything that PHP can coerce to any string works. ${""}, ${[3]}, ${new SplTempFileObject()}, knock yourself out! :D This language never fails to surprise me.
Just to point out that a state machine like that is itself memory. It knows what state it is in and maintains that state until it gets a new valid input.
This brings me back to my Models of Computation course days at uni...and my Foundations of Computer Science course...and my Programming Language Design course...actually, guys, these things are everywhere in computer science XD
jakejakeboom Huh. I don't remember discussing FSAs in my embedded systems course...but that was only one semester years ago, so I could have just forgotten XD
+IceMetalPunk in embedded control engineering we just call them state machines. Very simple to define complex behaviour which is fast to implement and easy to get right with predictable behaviour. Gives you a lot of performance with little risk of bugs. And starting with a state diagram like that shown means it's easy to get your head around the function and make changes when needed.
Are they useful for computer scientists? Meaning, did you use them in your job? It'd be ironic if you've never had to use them, while I, an engineer who knows nothing about all of this, am back to learning them because I need them in my job.
Can I hope for a video on the Banker's algorithm for deadlock avoidance during resource allocation by the operating system? I know there are a lot of videos on this here on YouTube, but I love watching Computerphile videos.
@@yuzan3607 they mean a syntax parser can be modelled as a finite state machine. The state machine reads all characters from the syntax input in order. Each character triggers a state transition. If the machine reaches the final state, the syntax is valid; otherwise it's not.
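The read-characters-in-order, accept-in-final-state process described above, sketched for a tiny piece of syntax (a signed integer literal). This is my own toy example; the names `DELTA` and `is_valid_int` are assumptions:

```python
# States: 0 = start, 1 = seen a minus sign, 2 = in digits (accepting).
DELTA = {
    (0, "-"): 1,
    (0, "digit"): 2, (1, "digit"): 2, (2, "digit"): 2,
}

def is_valid_int(text):
    """Accept strings like "-42" or "7"; reject anything else."""
    state = 0
    for c in text:
        key = (state, "digit" if c.isdigit() else c)
        if key not in DELTA:
            return False          # no transition defined: syntax error
        state = DELTA[key]
    return state == 2             # valid only if we end in the accepting state
```

Note how "-" alone is rejected: the machine reads it fine but does not end in the accepting state, which is exactly the "reaches the final state" condition above.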
Yes, the missing transition from 5 to 25 is the one that caused me a headache, but I'll just call it a feature :D Seeing the speck in your neighbour's eye while not noticing the log in your own.
Am I the only one who finds it extremely frustrating that he doesn't use the actual names of the concepts he's discussing?? How are viewers supposed to look into this more if they don't know that the concepts he's talking about are Regular Expressions and Context-Free Grammars?
I beg to differ. I think using those terminologies would distract the normal viewer from the simplicity of the state machine he is trying to present. The only people who would be satisfied with that are the ones who already know a lot about these topics.
I think there is a mistake here: the vending machine has no temporary holding; the coin drops straight into the coin box. Thus a register state is not applicable.
The state machine is wrong; you should add states which will result in the machine giving change... For example, you can give 3 "10p" coins and end up with 30p, and the machine in this state will give you 5p change and go to the finish state. This is really basic...
Depends on how you look at it: those computers still have a memory, even if mechanical, because they know how much money you put in. It does not know how it got there or how much it has in total, but when the buyer begins putting in coins there is a temporary memory. It's really like a variable, actually, because the computer does not have to know anything other than that numbers are adding up, and at 25 (as an example) it will do something, then it will reset, regardless of what the result is.

The other thing is, since you mentioned vending machines: either way you build it (mechanical or digital), the machine will get the message of what you want, will wait for the proper amount of money, and will know what actions to perform so it can dispense the product. I'm probably complicating things, but either way it's still a form of memory, whether or not it can be altered by the user or the device itself. The only thing with these is that this is rather a "mechanical approach"...?
Vending machines really have been around a long time: the first recorded one was built by Heron of Alexandria, who lived around 10-70 AD, and simply consisted of a machine with a coin slot that would dispense a fixed amount of holy water at a temple.
Thanks much for this video. I used a finite state machine when coding a library for sending SMTP mail back in 1994 -- it was for an uncommon programming language. It sure made the coding very easy, and you could visualize the code by looking at the graph. The graph was easy to draw because it made sense, and the coding took like 15 minutes because I had the finite state machine. Thanks for refreshing my memory, because I forgot how to even draw a finite state machine after not using them for so many years :)
Any actual computer with finite memory is a finite state machine. If we live in a finite universe, then all actual computers that will ever exist will be finite state machines.
In computer science “finite state machine” does not mean “a machine with a finite number of states”. It refers to a particular kind of machine called a DFA. DFAs can only accept or reject input (as seen in the video), while most computers can also write to memory... and then read in what they previously wrote. It is a much more expressive and dynamic process than what happens in DFAs. The theoretical model of computation that best corresponds to the machines we use in the real world is a finite Turing machine or “linear bounded automaton”.
No memory in principle, but every state machine implementation I've used (Harel state machines usually) requires lots of memory to handle transitions, guards, timers, the states themselves, entry/exit actions, etc. Qt, Rhapsody, QP, Sparx, etc. are all examples of useful state machine implementations that require memory. This is the disconnect between the abstract field of CS and the people who actually do this stuff to make useful software.
He says the vending machine has no memory and doesn't need any, but that's not true. It has memory and uses it to keep track of what the total amount paid is. So it's one piece of memory that only stores the total amount paid thus far, but it's still memory.
Everything in computing is ultimately an FSA; the processor itself is an FSA. This video is misleading, as it implies that FSAs are some obscure corner case rather than what they are: the fundamental concept from which all digital computers are built.
You claimed that this machine has no memory, but that it does remember the "state" it's in. But we could read the entirety of a Turing tape as a number, and call that number a state, claiming then by your argument that the Turing machine has no memory, but only "state". This machine has a finite number of states, which strictly speaking excludes the infinite-tape Turing machine, but any physical (and thus finite) realization falls back into this mess. It sounds like there's no difference between a zero-memory machine with state and a finite-memory machine. Am I missing something?
5:32 - I think that most machines, even in the early days, use the width of the coins. I think there are a couple of Numberphile videos about this, working towards solids of constant width.
Finite state machines are used all the time in programming. To, well, keep track of states. Not to mention regular expressions which I was a bit disappointed to learn this video didn't mention.
Most countries follow the 1, 2, 5 pattern with their coins and notes. Australia has coins of 5c, 10c, 20c, 50c, $1, and $2. We got rid of the 1c and 2c coins back in 1992, and the 5c coin has gotten to the point where the government really should retire it, too.
I don't understand. The FSA has a memory of one word, in this case up to 25 and worth say, 5 bits. What if its "single word" of memory was worth a trillion bits? And, it had a ridiculously large but technically finite graph of all the operations it could perform on that single word (or number) composed of a trillion bits? (In other words, couldn't you create a Turing complete machine with an absurdly complicated FSA?) As an example, he says you can't have the machine remember which coins led to 15, but adding a few bits to the number you could store that using a bit of math (mangling the number but it would still be extractable by a simple modulo).
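The trick mentioned at the end of this comment, packing extra information into the state number and extracting it with a modulo, can be made concrete. A sketch under my own assumptions (the helpers `pack`/`unpack` are hypothetical; it exploits the fact that all running totals in the 25p machine are below 30):

```python
def pack(total, tens):
    """Encode (running total, count of 10p coins seen) as one state number."""
    return total + 30 * tens      # totals stay below 30, so multiples of 30 are free

def unpack(state):
    """Recover both pieces of information from the single state number."""
    return state % 30, state // 30
```

So the state 45 encodes "total 15, reached via one 10p coin", distinguishing it from the state 15 reached by three 5p coins, exactly the distinction the plain machine cannot make. The catch is that this multiplies the number of states rather than escaping the finite-state model.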
Hey, if you're like me and like to write short programs in BATCH (which some could argue is not a programming language, but I genuinely don't care), variables have to be surrounded by percent signs (%), meaning you can have any symbol or combination of symbols excluding the percent sign. This also means spaces and leading numbers; you can even name a variable 999 should you want to, just by typing %999%. That's why I like using batch.
Why can't the programming language be considered a kind of memory? For example: there has to be a part that says "if it starts with a letter then it's a variable", and the language itself is holding that information. Don't know about vending/parking machines though, since I guess they don't use a programming language.
This video just gave me the idea to name my variables to start with an underscore and the rest be only digits. People who might read my code will _hate_ me for that ;-) And rightfully so, if I really ever should do it. Hm, I have to give that some further thought I guess.
The application to practical computer science is not checking variable names specifically. It’s REGULAR EXPRESSIONS in general. Every regular expression corresponds to a deterministic finite automaton.
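The correspondence claimed above can be demonstrated for the video's variable-name rule: here the same language is recognized by a regex and by a hand-built two-state automaton, and the two always agree. This is my own sketch (the name `dfa_match` is an assumption; real regex engines construct such automata automatically):

```python
import re

# The variable-name rule: a letter or underscore, then letters/digits/underscores.
PATTERN = re.compile(r"[A-Za-z_][A-Za-z0-9_]*")

def dfa_match(s):
    """Hand-built DFA for the same language: states "start" -> "rest" (accepting)."""
    state = "start"
    for c in s:
        if state == "start":
            if c.isalpha() or c == "_":
                state = "rest"            # valid first character
            else:
                return False              # no transition from the start state
        else:                             # state == "rest"
            if not (c.isalnum() or c == "_"):
                return False              # invalid continuation character
    return state == "rest"                # accept only if at least one char seen
```

(One caveat of the sketch: `isalpha`/`isalnum` also accept non-ASCII letters, so the two only agree on ASCII input.)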
We had to make a program in our C++ class that acted like a vending machine. I clearly wasn't that good, as it confused the hell out of me. I got help from asking online; once I got pointed in the right direction I finally understood what to do. That was back in 1999. I think this diagram would have been helpful.
I don't think it would have helped. Literally implementing the diagram would produce incomprehensible code that would require this diagram to understand.
Where do you draw the line between "state" and "memory"? If an FSM can be in two possible states, is that not the same as one bit of memory? If you have a computer with 2GB of memory, is it not the same as an FSM with 2^(8*1024^3*2) possible states?
Oh my, 25 cents?!?! Here in Canada, street-side parking is 2 dollars an hour! And most parking garages, like downtown Toronto or Oshawa, charge 4-5 dollars... and then the hospital in Oshawa, Ontario charges 4 dollars for, I think, 2 hours of parking. It's an automated system, so the longer you stay, the larger the bill... Yay free health care, boo ambulance bills and paid hospital parking.
In most cases, the parking meters here in Sweden will never reject an overpay. If you overpay, there are two possible things that can happen.

Either it will act as if nothing happened, i.e. it will allow you to pay for as much time as you want. HOWEVER, if you pay for, let's say, 4 hours on a 2-hour-max park, then the person who is checking the parking spaces will check the "issue time" on the ticket (valid from), and if that is more than 2 hours ago, you will get a fine for overstaying.

The second case is that the meters are programmed with the max-stay time, and if you overpay, they "swallow" the overpayment and issue a ticket valid exactly for the permitted period. So for example, if a max 2-hour park costs 25 SEK for 2 hours, and you put in 3 pieces of 10 SEK equaling 30 SEK, then it will accept the 30 SEK payment and print a ticket valid only for 2 hours. The ticket will instead adjust the price: it will not say "12.50 SEK per hour" but "15 SEK per hour", so all the numbers on the ticket match up.
Unfortunately, it's the accountants and sales force that want to know everything coming into a food/drink vending machine (coin float/stock) and everything that comes out of it (i.e. stock and change given). Lately I only work on vending machines that have more memory than some of my early Windows computers. Oh, and they're all GSM-equipped too! Those mechanical and 'brainless' vending machines are only used by very small owner-operators who don't mind manual stuff, as they only have a few machines in the market. We (my company) have thousands.
📺💬 Automation Some of the vending machine is built with a target dictionary like a word phase in the language model, sample twenty-five is ten five five five or five five five ... 5 times and you scan the sum up and create exchange rates. There is some vending machine has the problem when it limits by the language model and it can exceed the dictionary input. 🧸💬 The learning input series you need to mask the input and prepared data, masking is only scaled it by precision. 🐑💬 The 63 rules of the state machine is recursively working, with the number input over the rules it combined two possible rules or more together to create the target sequences as in the example it is not a bubble sorts algorithm and only finite state when the bubble sort listing every time it has new conditions. 📺💬 There is more benefits 🐑💬 Controls and security does not condition but it is stated as you unlocked a car door when it cannot detect your key, it can fix the ambiguous state with the correct order and be ready for the next command instant. 🐑💬 For example, it is precision working when using conditions you need to create many rules but for some machines, you can reset and make it ready with specific value loads without creating new rules. More example is a radio transmission when there is a problem you turn it off and wait then turn it on can negotiate communication again, or they are turning you to frequency and some radio does not update the screen. 📺💬 Tell Jirayu that is capability 🥺💬 Yes, and there are more but cipher text with a state machine is possible as the Markov model but I see you are explaining about encryption, decryption, algorithm, state machine, etc .
Every time I watch this I understand more and more. Like 5,5,5,5,5 is one of the acceptable strings/words in the language of the machine, i.e. it will allow the transition to the end state.
I do not quite agree... You will need more than one register... You need one register for the accumulator (the sum of previous coins, the finite state) and also one register for the current coin being analyzed to be accepted/rejected... Liked the video, thanks!