Watching these in 2022 is so cool. They show a tool in this video that converts a screen layout into the COBOL Data Division statements to generate that screen. I created a program around 1986 that did the same thing. I think it was written in Turbo Pascal. I called it Taurus. I know nobody cares... it's just so fun reminiscing about this stuff.
Gary and Stuart were both excellent hosts. You can tell Gary was a brilliant man, yet never talked down or tried to belittle any of his guests. Excellent show
Trivia:
- Who is the officially recognized (by the IEEE) inventor of operating systems for personal computers (the first OS for microprocessors)?
- Who is the officially recognized (by the IEEE) inventor of the BIOS (and thus the open architecture at the basis of the IBM PC)?
- Who is the officially recognized (by the IEEE) inventor of the first programming language for microprocessors?
- Who actually produced the first GUI (windows) for IBM PC computers?
- Who produced the first multitasking OS for the IBM PC? ...etc... etc...
The answers:
1) Gary Kildall: the CP/M operating system, which was later reverse-engineered and ended up being sold as MS-DOS.
2) Gary Kildall: he invented the BIOS, which was the basis of the architecture of the original IBM PC (which really was a modified S-100 bus computer).
3) Gary Kildall: the first language for microprocessors was Kildall's PL/M.
4) Surprisingly, Gary Kildall again: the GEM GUI for the IBM PC came out before Windows 1.0. In fact it could be argued that Windows was a copy of GEM, not the Macintosh, as GEM ran on top of CP/M like Windows ran on top of MS-DOS (a CP/M clone).
5) Gary Kildall: Gary's Digital Research came out with MP/M, a fully preemptive multitasking OS, probably about 10 years before Microsoft had a multitasking OS.
And this is just the start of Gary's pioneering list... I have to laugh at myself when I think, "what was going on in Gary's head, seeing all these guests come on while the show mostly ignored his achievements?"
The legendary Gary Kildall in full flow in this episode, wonderful to hear him talk. Such a shame he is not here to witness his deserved credit as one of the pioneers of all we enjoy today.
@@hansneusidler7988 Well, he tried to have his cake and eat it too. He wanted everyone who wanted CP/M to have it, but he also wanted to charge too much for it. Had he known that computers were going to explode, he could have dropped CP/M's price and stayed in the ring, but he didn't.
As I understand it, the FILL command was written in assembly. But as a Forth programmer, you can then just use the FILL command, no assembly language at all. When you use any kind of high-level language, your commands get translated to machine code too, so it's not much of a catch. The great thing about Forth is that if you do know assembly, you can create your own high-level commands and expand the Forth language to your own requirements.
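The idea in this comment - a primitive like FILL coded low-level once, then composed into new high-level words - can be sketched in a few lines of Python (this is a toy illustration of the Forth model, not real Forth; all names and the memory layout are invented for the example):

```python
# Toy sketch of the Forth model described above: FILL is a "primitive"
# (standing in for the assembly-coded version), and new words are
# composed out of existing words plus literals.

memory = [9] * 16          # pretend machine memory, pre-filled with 9s
stack = []                 # the Forth data stack

def fill():
    """Primitive word: ( addr len byte -- ) fill len cells with byte."""
    byte, length, addr = stack.pop(), stack.pop(), stack.pop()
    for i in range(addr, addr + length):
        memory[i] = byte

words = {"FILL": fill}     # the dictionary of known words

def define(name, *body):
    """Compose a new high-level word from existing words and literals."""
    def run():
        for item in body:
            if isinstance(item, int):
                stack.append(item)     # literal: push onto the stack
            else:
                words[item]()          # word: execute it
    words[name] = run

# ERASE is classically defined as "0 FILL": blank a region of memory.
define("ERASE", 0, "FILL")

stack.extend([2, 4])       # addr=2, len=4
words["ERASE"]()           # runs as if you had typed: 2 4 0 FILL
print(memory)              # cells 2..5 are now 0, the rest still 9
```

The point being: once FILL exists, the programmer works entirely at the level of ERASE and other composed words, never touching the low-level implementation again.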
She alluded to this in the intro - that you can toss in assembly code where it’s needed to give the power of low level programming to a higher level language. C does the same, and it’s vital to its applicability I think.
@@ArumesYT Yep! The OP seems to miss the point of "FORTH". It's Not at all a "gotcha". Hell, people have included assembly language routines in friggin' GW-BASIC!
I think it is a good catch. Similarly, you can program in C and then optimize some functions by calling into assembly. If the whole program were in Forth, that demo would run slowly.
Friggin love Gary. FORTH CEO: See how fast our flood fill is? Gary: mmhmm, mmhmm, now is this all in FORTH or is some of it in assembly? FORTH CEO: Well, I mean the speedup is all from assembly optimization, mostly the exact part I'm trying to show off...
Those are most likely buckling spring keyboards, they're not common today (although surely you can still purchase one ofc), but back in the day they were essentially the norm.
It seems the early shows were educational in nature - brilliant. They then moved on to the hardware and grew with their viewers. Excellent show - a true classic.
Then in the mid-to-late 90s, (sadly) moving with the times, it became less about actual computing and programming and more about multimedia stuff, which made the show shit and eventually killed it.
@@rooneye And in my opinion the host (not Gary) would rush people to talk about their wares in like 30 seconds before he moved on to the next person. Just gave me anxiety watching it! These earlier ones I'm seeing are vastly superior.
We can run Apple ][ BASIC, Sinclair BASIC, TRS-80 BASIC, TI-BASIC, Commodore BASIC and IBM PC BASIC programs on one PC (all at once if we want to) thanks to emulators and VMs! What cool shit we have today! Would have been mind-blowing in 1994, let alone 1984!
This was just a couple of years before C++ (and later other object-oriented programming languages) had its big breakthrough. In fact, C++ was released in 1985, a year after this episode aired (though it took a few more years to come into common use).
And today, almost nobody knows COBOL, but we need COBOL programmers again to update the old mainframes many states' unemployment systems still run on, overwhelmed with unprecedented demand caused by the Coronavirus.
You still have those old ones? Jesus, it seems in some regards the US is just a third-world country. Shouldn't be surprised by the "president" you have. I still had COBOL in university 20 years ago. Even then it was already old and obsolete.
I think many banks still use many COBOL programs. The reason is simple: they didn't dare change something that was doing its job and doing it correctly. Now it's time to upgrade, and COBOL programmers can make a fortune if they come back out of retirement.
+Mariusz Pluciński C is still a very low-level language that requires a LOT of typing and teaching the computer every nuance of the algorithm. I'd like to see a language I can simply talk to in plain English, like Star Trek. C is a keyboard text language; you don't even need a mouse. Heck, I use Vim most of the time.
+AirScholar You can think of the voice activated Star Trek computer as basically an advanced version of Google. Not really suitable for programming, unless your notion of "programming" is just to find something from the Internet and copy it.
thought2007 Wrong... A problem can be asked of the computer and it will dynamically create the algorithm to solve it, if it can. It isn't just looking stuff up like Google.
@@livesimplyandhumbly I agree here. I did a lot of programming as a kid, but at some point it just gets boring, tedious, too repetitive. It's like being an architect and having to build your own designs brick by brick. But if you switch to high level languages to speed up that process, you sacrifice a lot of computing power. So it would be great if the computer itself could figure out an efficient algorithm to fit your programming needs. We're getting there, we all know the famous example of C compilers writing better assembly than assembly programmers, but we still have a long way to go as well.
And some of the languages mentioned were quite old by then: FORTRAN 1957, COBOL 1959 and BASIC 1964! Amazingly, 50 and 60 years later, some form of all three still exists. In fact, with DOSBox you can get the MS versions to run on a phone!
I wish today's computer lessons were this detailed and clear. Nowadays, in my opinion, everyone explaining or teaching computer tech focuses too much on dumbing things down and taking shortcuts, and we miss why and how things really work.
Hell yes. We have insane access to total information and our society is stupider than ever. It's really a case of zero attention span and loud hip hop creeping into everything. ADHD society.
Yikes. Please don't try to learn about computing from these. The Computer Chronicles episodes are a *very* high-level first pass at the topic, and the reason that they seem so clear is that they simplify things to an extreme degree in order to fit everything into the half-hour infotainment format.
@@HardCase1911 Blaming "loud hip hop" for a lack of intellectual curiosity is probably top 10 for me in terms of most boomer things I've ever read in my life
Man, this show looks ancient and yet, I was 16 when this aired. Weird idea. I went from Logo and Turtle to Pascal, to Basic, 6510 assembler, C++, ASP, Javascript, Perl, ASP.NET, Java, Python, C#, Objective C, and Swift. A programming language is just a tool and those tools have come a long way..
Me, born in 1989, in India: Logo, to GW-BASIC, to Java (BlueJ editor from Monash University) in class 10, to C++ in classes 11/12, to C in the engineering first-year basic computing paper, to machine/assembly-level code in the third year of engineering in the microprocessor paper, to HTML, CSS, JavaScript, Ajax, JSP and SQL in my first job. 😅 And now I write code in Python for AI/ML.
BASIC was included in literally every home computer sold in the 1980s, and it also acted as the user interface and operating system to the whole computer. Basically everybody knew at least some BASIC statements...
Yes. "C" has been around since 1972! More Amazing than that: "Fortran" is STILL around. Fortran has been with us since 1957!!! COBOL is also still around, and it's from 1960!
@@josephf.2787 Yep! "BASIC" is still around too! I can't really name a high (or "mid") level language that's actually "dead". There may be a bit of "assembly" languages that are no longer used because the HARDWARE is no longer in use, But every language HIGHER than that has code running somewhere.
Somehow I think that this COBOL guy doesn't really know what he's talking about. Granted, I don't know COBOL either, but I'm not on TV. Gary Kildall had such a direct way of asking very pointed technical questions that it makes all the gaps show. Love this show.
Photoshop v1.0 for Mac was programmed in Pascal; the source code of this version was published years ago and can be downloaded legally. Here is the link: computerhistory.org/blog/adobe-photoshop-source-code/
I have worked with a number of programming languages on my Atari and Windows/Pentium computers. The languages that I am most familiar with are Action, Assembler/Editor, BASIC, COBOL, Logo, Pascal & Pilot.
hehe BUT if you learn COBOL today you can make SHIT LOADS of cash. Because banks still use the fuck out of it. Pretty much all transactions in banks and ATMs use it. The old programmers who've retired are making fuck tonnes of cash coming back to do EASY (to them) small jobs for loads of money because no one is learning COBOL no more so there aren't any programmers around.
09:15 “Data-handling” and “file-manipulating” firmly built around the concepts of fixed-length fields and records, and ISAM files. These days “data-handling” and “file-manipulation” are more descriptive of what you might do in Perl or Python. They mainly involve free-format lines and text-heavy data, where you have to recognize patterns and delimiters rather than count columns.
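The contrast this comment draws can be shown in a few lines of Python: a COBOL-style fixed-width record is sliced by column position, while a modern delimited line is split on separators. (The field layout here is invented for the example.)

```python
# Fixed-width record, COBOL style: you count columns.
# Invented layout: cols 1-10 surname, 11-20 given name, 21-27 amount.
fixed = "SMITH     JOHN      0001500"
surname = fixed[0:10].rstrip()
given   = fixed[10:20].rstrip()
amount  = int(fixed[20:27])       # zero-padded, implied fixed length

# Delimited record, Perl/Python style: you find the separators.
delimited = "SMITH,JOHN,1500"
fields = delimited.split(",")

print(surname, given, amount)     # SMITH JOHN 1500
print(fields)                     # ['SMITH', 'JOHN', '1500']
```

The fixed-width form is rigid but trivially fast to process on a machine that knows the layout; the delimited form is self-describing enough to survive fields of varying length, which is why text-heavy modern data leans on it.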
Meanwhile, high-level programming requires years of education and writing extremely old-fashioned, complex syntax, while the opposite should have happened... A child should actually be able to do it. I can understand that top-notch stuff such as AI is something different, but come on... even the simplest change for a user in integrated software requires a highly specialized developer and tons of time (cost). This should have been solved years ago.
5:37 OMG, who does that guy look like? He looks so familiar! He reminds me of someone so much, but I can't quite put my finger on it. Lol, he totally choked at the end and didn't show the finish of his presentation because he messed something up 🤣 "finger trouble". EDIT: Michael Sheen! That's who he reminds me of.
Back then, even early MS-DOS wasn't programmed in C. Microsoft used assembler in the early days. The compilers weren't that sophisticated and available memory was low. But Microsoft has used C extensively for WinNT.
No he didn't. In her intro she said it incorporates aspects of machine language. He was highlighting a feature to the audience in that you can also use low level language in Forth.
"undue partiality or attachment to a group or place to which one belongs or has belonged" It's a less common meaning of the word, but it's in the dictionary.
Back then, the typical look of computer engineers and programmers (lol) was almost always bald, with thick reading glasses and a seriously classic suit... I've worked as a programmer since '95, and when I was a COBOL programmer about half of us were bald... me included... ㅠㅠ I suspect the electromagnetic radiation from the computers wasn't perfectly shielded in those days... The really funny thing is that the married senior guys almost all had daughters... daughters... nothing but daughters... Back then, if you couldn't have a son, there was even talk of bringing in a surrogate; now it's a strange world where doting daughter-dads get more respect... lol. When debugging COBOL programs, if 400-500 syntax errors popped up, plenty of guys took days to clear them... If you fixed them in under 10 minutes, everyone hailed you as the "COBOL god"... Logic errors are hard to catch, but syntax errors are nothing, like playing a game... silly guys... What is everyone doing now? All you programmers who shivered even in midsummer under the mighty climate-control units of the computer room... are you all doing well? I miss how enchanted I was fiddling with a Macintosh Classic back in the day... I miss the old days~ (a programmer in his 50s)
LOGO... I wish my teachers did *know* how to programme in it so that I wouldn't fail my computer exams for 3 damn years. Thank you, for you have inspired my interest in computer programming by not teaching me how to do so! See what kind of monster you have converted me into: sum←{C←⍕√⍵+0v ⋄ (⍎1↑C) + +/{⍎⍵}¨99↑2↓⍕√⍵+0v} ⋄ +/∊{sum ⍵}¨N~(N←⍳100)*2
16:37 “Business applications” which nowadays involve access to SQL databases, for example. Which means having good string facilities for dynamically generating SQL statements. Which COBOL is lousy at.
But one of the key points about COBOL was its portability between completely different machines from completely different manufacturers. So all that goes out the window once SQL comes in?
@@lawrencedoliveiro9104 But that's only true up to the standard. Vendors would happily add their own extensions if they thought it would give them a competitive advantage.
COBOL was already decades old when SQL became a thing. The applications that were built in COBOL and used its native, standard data access -- multiple flat files with fixed record lengths -- didn't suddenly stop working when SQL came out. Matter of fact, SQL wasn't really a business *requirement* until the 90s and those applications needed to be maintained in the 90s and beyond because you can't just "lift and shift" a COBOL application into an SQL database and expect everything to work. COBOL is lousy at many things, but it's excellent within its niche, which is still a place modern solutions based on e.g. SQL and Java can't touch because of the environment it runs in.
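The "dynamically generating SQL statements" point from earlier in this thread is easy to sketch in Python with the built-in sqlite3 module - exactly the kind of everyday string manipulation the comment says COBOL is lousy at. (The table, column names and data here are invented for the example.)

```python
# Sketch of dynamic SQL generation using Python's built-in sqlite3.
# Statement text is assembled from strings at runtime; the value is
# passed as a bound parameter.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [(1, 500), (2, 1500), (3, 2500)])

# The column name is spliced into the statement dynamically;
# the threshold goes in as a ? parameter to avoid injection.
column, threshold = "balance", 1000
query = f"SELECT id FROM accounts WHERE {column} > ? ORDER BY id"
ids = [row[0] for row in conn.execute(query, (threshold,))]
print(ids)                        # [2, 3]
```

In a language with good string facilities this is a few lines; doing the same statement assembly in classic COBOL's fixed-format PICTURE world is notoriously clumsy, which is part of why embedded-SQL preprocessors grew up around it.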
Not much has changed. People still invent new languages everyday and there are too many to choose from. It’s probably worse today really. We have factions these days
Wow... the 3 levels of language and the examples were totally off, even for back then. Machine language and assembler could be considered the same level, the lowest. Then structured languages like C, Pascal, Fortran, etc. Lastly, macro (and today, scripting) languages.
Cheifet: "Hey, Gary, next week's show is on programming languages. Why don't you demonstrate CP/M a little." Gary: "I think I'd rather talk about logo." Cheifet: "Come on, CP/M, that's your baby. Why don't you..." Gary: "I SAID I'LL TALK ABOUT LOGO, OK???OK!!!!"
It's still in use today! en.wikipedia.org/wiki/Forth_(programming_language) Quote: "Forth is used in the Open Firmware boot loader, in space applications such as the Philae spacecraft, and in other embedded systems which involve interaction with hardware." I really didn't expect that.