
"Stop Writing Dead Programs" by Jack Rusher (Strange Loop 2022) 

Strange Loop Conference · 82K subscribers
434K views

Most new programming languages are accidentally designed to be backwards compatible with punchcards. This talk argues that it would be better to focus on building new live programming environments that can help us solve the problems of the future.
Talk transcript and links: jackrusher.com/strange-loop-2...
Jack Rusher
Applied Science Studio
@jackrusher
Jack Rusher's long career as a computer scientist includes time at Bell Labs/AT&T Research and a number of successful startups. His current work focuses on the deep relationship between art and technology.
------- Sponsored by: -------
Stream is the # 1 Chat API for custom messaging apps. Activate your free 30-day trial to explore Stream Chat. gstrm.io/tsl

Science

Published: Jul 2, 2024

Comments: 621
@marinoceccotti9155 · 1 year ago
When you realize computer science has become a culture, with even an archaeology department...
@DogeMultiverse · 1 year ago
In 50 years, people will look back at our time and laugh the way we laugh at Fortran.
@TheAlison1456 · 1 year ago
professions are cultures
@almari3954 · 10 months ago
@DogeMultiverse No, totally disagree. We're still digging ourselves into a hole; we first need to get out of it. Watch "The Mess We're In" by Joe Armstrong.
@artdehls9100 · 1 year ago
I remember reading a story about a company that was trying to send a program to a customer in France. Trying, because every time they did, it would fail to run on the customer's hardware. Finally they sent someone with a case containing the program to sort things out. When he went through customs he dutifully declared the program as an imported product, whereupon the customs official pulled a few cards out as a required "sample" of the imported product. Oh joy.
@kurax1 · 1 year ago
You can't really know if the food tastes good until you have some. That's what went through the customs officer's mind, I think.
@PMA65537 · 1 year ago
I had a program to run remotely on lots of servers and something in the shell and terminal setup was eating a few of my characters. I added multiple lines of #### for a NOP slide to overcome that.
@totheknee · 1 year ago
This is awesome. They could put throwaway code onto a few cards, like some superfluous OOP or Rust checkout checker border patrol or whatever they call it, and the rest of the program could still run in France.
@christiansitzman5601 · 1 year ago
My blood started to boil just reading this.
@jesustyronechrist2330 · 1 year ago
Actually a kinda funny parallel to Docker and the whole "ship the whole machine the code works on" meme.
@pkphilips2 · 1 year ago
This guy speaks so fast, basically about 5 presentations' worth in the time for 1, but somehow he is completely understandable and keeps the attention of the audience!
@judaronen · 1 year ago
Watched it 2x… 😛
@ALaModePi · 1 year ago
I was almost through this entire lecture when I realized that all these issues sound like "when you're a hammer, everything looks like a nail." We were trained by a thousand editors and programming languages to approach problems in a particular way instead of asking, "What is the best tool to approach the type of problem I'm working on?" Thanks for showing some really good tools and challenging us to make tools that are equally good for working with certain types of problems and data sets.
@57thorns · 1 year ago
But it also triggers my silver-bullet detector. While I agree C++ is a bloody mess, you can still write reliable real-time programs in it. Of course, you can't use dynamic memory allocation (apart from the stack for function calls) and you have to be careful about which standard libraries you use. And C++ is a pain syntactically. I wonder how Python works in real-time systems with digital and analog inputs?
@BillClinton228 · 1 year ago
"The best tool for the job" largely depends on what the most senior programmer in the company is familiar with. It rarely has anything to do with tech and more to do with politics. These guys have usually been with the company since the beginning, and the executives know and trust them, so they have carte blanche to do as they please. If the senior thinks the best tool for the job is Cobol or Delphi, then that's exactly what will be used, as long as it delivers software that makes money for the company. Sorry to burst your tech-utopia bubble, but politics and profits are way more important than the "tools". If management agrees that the latest and greatest tech should be used to write good software, then that's what will happen; if they agree that the legacy code is working fine and doesn't need to be rewritten with the latest tools, then sorry for the 20-year-old junior intern, but you will need to learn the ancient tech to work there, and it will look terrible on your CV, but that's just how it is.
@ZahrDalsk · 1 year ago
@57thorns "And C++ is a pain syntactically." I love C++'s syntax, personally. It just feels natural and easy to understand.
@ridespirals · 1 year ago
I'm a big fan of "the right tool for the job". I hate when people try to force solutions into a system just to reduce the total number of systems/languages in use. My current company does that: everything in JavaScript, even when other frameworks or languages would be better.
@robertwilson3866 · 1 year ago
You can do what he talks about in the video really quickly by just asking ChatGPT.
@michaelgfotiades · 1 year ago
My first programming class used punched cards running FORTRAN on a Sperry/Rand UNIVAC computer (IBM 360 clone). As a consultant over the subsequent decades I would carry a little history kit to show the newbies - some punched cards, a coding pad (80 columns!), 9 track tape, 8" floppies, and a little bag of coal as a sample of what we had to keep shoveling into back of the computer to keep up a good head of steam. As my friend called it - "The age of iron programmers and wooden computers."
@r0cketplumber · 1 year ago
You had coal? We had to scavenge for firewood.
@phinhager6509 · 1 year ago
@Eleanor Bartle not in computer labs.
@AdrianBoyko · 1 year ago
My high school had Apple ][s and UCSD Pascal but the teacher didn’t want to learn a new language so we had to do Fortran on punched cards, instead. The cards would go to a university about 30 minutes away but the results took a week to come back.
@johnmoss4624 · 1 year ago
A week. Wow!
@JackRusher · 1 year ago
😱
@KaiHenningsen · 1 year ago
And then you learn that there was a FORTRAN available for the ][s UCSD system and weep. I once wrote a punched card Pascal program (for a uni course before terminals became available for those) by first developing in UCSD, then going to the card punch with the resultant listing. (I'm not sure, it might have been the 7 billionth implementation of Life.)
@TheAntoine191 · 1 year ago
@KaiHenningsen Also, people often hate on Fortran because they had to use the 78 version and its practices. Modern Fortran is OK in my opinion.
@username4699 · 1 year ago
@@TheAntoine191 I think deeming it "OK" is valid for those who still must maintain programs in it, but there are still too many leftover - or even new - oddities that prevent it from being used in the ways that C is still useful. Some of these being: if you want an array of pointers to some data type, you have to use a structure; the lack of a true way to define/typedef custom data types; the intense duplication and verbosity required when declaring methods on classes; the syntax for declaring subroutine/function arguments; and the lack of a literal syntax for nested data structures (like assigning a value to an array that exists as a field inside of a structure, all at once). However, other old, largely forgotten languages like Ada, Modula-2/3 and modern variants of Pascal (Free Pascal and Delphi), certainly do have many redeeming qualities and are still very usable to this day, sometimes more so than mainstream/popular solutions even, Ada being the biggest tragedy out of the ones mentioned, in my opinion.
@davedouglass438 · 1 year ago
One of my more unmistakable descents into IT Madness: At Conrail, I had to write out my COBOL programs on 14-inch green-and-white coding sheets, and send them over to the 029 experts in the Punchcard Department. Next day, when they'd dropped the code into my Shared Storage, it would contain so many errors that I had to spend an hour fixing it... So I took to typing my code directly into Shared Storage, using my handy-dandy SPF Editor... and was REPRIMANDED for wasting my Valuable Professional Computer-Programmer Time.
@MarcTompkins · 1 year ago
SPF Editor! Now, _THAT_ brings back memories.
@Tiddo1000 · 1 year ago
I think the biggest problem with all visual examples is that they work great for data-science or theoretical algorithms, but far less for your run-of-the-mill "corporate programming" such as (web)services. When building services, almost all of the programming is about creating a model of the real world, and not so much about visualizing and transforming data. All those examples of graphs, tables, flows etc. work really well for data-science (hence things like Jupyter are so popular there), but they don't generalize to domain modeling very well. I would absolutely love to have some sort of interactive and visual environment to build and maintain domain models, but I've yet to come across anything like that.
@ABuffSeagull · 1 year ago
I feel like Dark Lang is pretty close to what you're describing, and it seems really cool, but I'm not quite ready to have so little ownership of the tech stack
@lkedves · 1 year ago
Then it may please you that _informatics started with such tools,_ like the Sketchpad from Ivan Sutherland (but it's better to learn about it from Alan Kay because the original demos don't really explain the difference between "before" and "after") or the NLS from Douglas Engelbart (look up the Mother of All Demos, pay some attention to the date or the hint at the end that ARPANet "will start next year"...) Unfortunately, Engelbart's Augmenting Human Intellect Report is a very hard read, the whole field lost the point and the result is what we have today.
@alex987alex987 · 1 year ago
And not for lack of trying. I've watched or read pretty much this talk at least five times in the last 30 years.
@lkedves · 1 year ago
Results like: we have the ultimate communication infrastructure, but people feel no pain about
- limiting themselves to a single bit, "Like", and thinking that any number of likes can ever be worth a single statement;
- repeating the same statements picked up here and there without processing them, and pretending that this is the same as a dialog;
- ripping off and descoping the "Stop Drawing Dead Fish" lecture (Bret Victor, 2013) in 2022.
It's not about coding and punch cards but our very relationship with information systems (in machines, libraries, human communities and within our own brains). "Why do my eyes hurt? You have never used them before." (The Matrix, 1999)
@RaZziaN1 · 1 year ago
Domain modelling is a bunch of graphs: CQRS, DDD and so on. It's all just processes and workflows.
@ianglenn2821 · 1 year ago
From 18:00 to 19:35 is such a good sequence haha, I finally understand VI keybindings
@gargleblasta · 1 year ago
It was a revelation...
@eugenetswong · 1 year ago
Yeah, it makes VI look logical. When I first saw VI, I could never understand how people accomplished anything, but my boss [i.e.: my uncle] kept pressuring me to use it.
@tinkerwithstuff · 1 year ago
@@eugenetswong But the fact that a subculture of people is using, for decades, ~ IBM-compatible keyboards, with editor software that's totally mismatched to that, is kinda hilarious.
@monad_tcp · 1 year ago
@tinkerwithstuff It really is. I started learning computers when I was 8 on DOS 6.22, and edit.com just felt natural for the IBM PC keyboard. When I came to the Unix world, its editors always felt "wrong", anachronistic. Why can't I have edit.com? Every sane editor I ever used on PCs with Windows or OS/2 Warp was like that (and yes, I installed OS/2 Warp when I was 10). Linux/Unix always felt like going into the past, to a museum. Why would anyone ever want to use vi/vim? Emacs at least made sense: you call everything with Ctrl, like every modern keyboard shortcut in any GUI program such as QBasic, edit.com or Word. Then I found nano; well, that solves the problem. But the more I studied Unix/C, the more I felt like I was in a museum. Why must I program my supercomputer x86 from 2007 like a freaking PDP-11? Let's not even get started on how brain-damaged writing shell scripts is. I hate it. Why can't you Unix/Linux guys just use Perl or Python? And the peak of my Unix journey was autotools. No, I'd had enough: even CMake is better than that, even .bat and nmake. I'll never, ever use it; just reading the docs gives me headaches. Why do you need 3 abstraction levels of text generation? It's absurd; it's literally easier to write the commands manually (in nano) and ctrl-c/ctrl-v them to get the freaking binary. When I'm choosing libraries for C++, I avoid any that only provide a build script for autotools. Let's also ignore how most code carrying the "GNU" label is badly written from a 2010 perspective, and I've read a lot, A LOT, of C/C++ code. It's amateur code, not professional, by modern standards. It baffles me that people think it's good.
If it's from a GNU project, the code is basically a bodge. An example is screen: not only is the code really bad, the user interface of the tool is really, really bad too, like a circular saw plugged into an angle grinder hanging from the ceiling by its cable; no wonder you keep losing arms. And those horrible things are worshipped as if they were the holy grail of the Church of C, or should I say the Church of the PDP-11. I understand the historical importance of such things, but they are super anachronistic. It's like driving a Ford Model T day-to-day: it was good for its time, but I prefer my modern 2019 Peugeot. I wanted to do computing, not archaeology of old computing systems. That's what Unix always felt like. I like knowing it and experimenting with it, but I don't want to use it in my day-to-day job. But is there any other option?
@achtsekundenfurz7876 · 1 year ago
The one thing I don't get is his hate for "fixed width", though. Whenever I program in a new environment that uses proportional fonts, I switch to something fixed-width, because without it numbers don't line up any more. A 1 takes less screen space than a 2 without fixed width, and the code looks ugly. Even worse if you depend on whitespace, like Python...
@hlprmnky · 1 year ago
It isn’t every day I see a conference talk that reminds me why I want to work on being a better programmer. Thank you.
@kentbull · 9 months ago
agreed
@SamGarcia · 1 year ago
As a sort of self taught programmer, now I understand the purpose of notebooks. Thank you for that.
@pleonexia4772 · 1 year ago
Can you explain it for me, please?
@DanH-lx1jj · 1 year ago
@pleonexia4772 You load the large dataset once and edit/rerun the code on it over and over, instead of reloading the dataset every time you want to make a change to the code.
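A minimal sketch of that workflow in plain Python (the dataset and analysis function are made up for illustration; in a real notebook each commented "cell" would be an actual cell):

```python
import time

# "Cell 1": the expensive load runs once and stays in memory.
def load_dataset():
    time.sleep(0.1)  # stand-in for minutes of I/O
    return list(range(1_000_000))

data = load_dataset()

# "Cell 2": edit and rerun this freely; it reuses `data` without reloading.
def analyze(xs):
    return sum(xs) / len(xs)

print(analyze(data))  # iterate here, not on the load
```

The point is that the feedback loop shrinks to the cost of the analysis step, not the load step.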
@wumi2419 · 1 year ago
@DanH-lx1jj And then you still rerun everything if you ran the cells in the wrong order at some point.
@foobarbecue · 1 year ago
Make sure you also get familiar with breakpoint debugging and stepping through running code. Absolutely essential for a self-taught programmer in the "popular" languages.
@auntiecarol · 1 year ago
@pleonexia4772 Look up Don Knuth and literate programming. It's pretty common in Emacs circles to write executable code in blocks in org-mode (a kind of "markdown"), a precursor of these notebooks.
@Verrisin · 1 year ago
I would love this, but give me a language and IDE that properly complete symbols for me, are context-aware, are _interactive programming_ before I even write the code. That's why I like types. Kotlin, C#... they are helpful sooner. They catch nearly all typos; in fact, I always tab-complete, so I never have to worry about typos. I tried Elixir because the Erlang model is so great, and I made dumb mistakes right away (typos, wrong symbol, etc.), all costing lots of time to go back and fix, found only by running tests instead of before I even made them. An environment that lets me make mistakes is worse than one where I notice them ~live. Worse is only type checking (and no help) at compile time. Even worse is only getting errors at runtime, which sadly, for many reasons, is where I would end up when trying Clojure. A lot of things are a problem to do in the REPL; say I need to inspect the argument to some callback. In Kotlin I at least see the full type spec, and the IDE is helpful. In Clojure, I need to mock-trigger the callback, hope it roughly matches production, hope I can "suspend" inside the callback and hand-craft a reply, and that's even worse: how do I know what reply it wants? Reading docs is tedious. Filling out a clear type "template" provided by the IDE is really nice and simple in comparison.
@janisir4529 · 1 year ago
Lecturer: talks about debugging in fancy visualizations. Me: cries in performance.
@jonstarritt848 · 1 year ago
The history of the vi arrow keys and the home/~ connection blew my mind! Now it's time to go down the Unix history rabbit hole.
@curls6778 · 1 year ago
Now I wish there were a part 2 of this talk that goes into more detail about modern language options that tackle these issues. A lot of the options mentioned seem near impossible to set up in a dev environment, because the tooling is so outdated that I have to spend more time getting the environment to work than even thinking about programming in it. And there seem to be no options whatsoever when it comes to hard real-time applications like audio synthesis.
@sid6645 · 1 year ago
Yeah, it's a peek at the future, if people decide to pick it up. Hope it comes to fruition, because bringing coders closer to their code will only make it easier to see what actually goes on, past the abstraction of language syntax, semantics and language-specific quirks.
@JackRusher · 1 year ago
@Curls Check out this talk: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-yY1FSsUV-8c.html
@JamesGroom · 1 year ago
I still haven't used it myself but you might be interested in Sonic Pi
@hmd-unpo · 1 year ago
Supercollider is an advanced audio synthesis tool. Faust is a nice DSL for audio.
@thewafflemancer · 1 year ago
Yeah, SuperCollider, TidalCycles, Max/MSP and Pure Data are great examples of this.
@soppaism · 1 year ago
It's all spot on. Optimally, we would spend all of our time in solving the actual problem at hand, instead of spending most of it fighting the details that emerge from our choice of tools/solutions.
@JackRusher · 1 year ago
@11:06: rank polymorphism; I misspoke in the heat of the moment.
@metalpachuramon · 1 year ago
I never even stopped to think about it; now I have a name for it: introspection. Before studying the hard theory behind regular expressions, I never actually understood them and just resorted to copying one from Stack Overflow. After learning the theory, I still don't write them like punch cards; instead I like using websites where you can test them in place, see explanations and so on. Now I don't feel bad for wanting to attach a Java debugger to a live server, haha.
@brunodantasm · 1 year ago
Indeed, that's the same point that game devs John Carmack and Jon Blow make. The debugger is the most useful environment there is. Also note that regex is amusingly not the same thing as formal language theory's regular language. After I learned that I started to forgive myself for having a hard time with them. en.m.wikipedia.org/wiki/Regular_expression#Patterns_for_non-regular_languages
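A quick illustration of that gap, sketched in Python (the standard textbook example, not something from the talk): backreferences let a "regex" match the doubled-string language, which is provably not a regular language.

```python
import re

# ^(.+)\1$ matches any string that is some non-empty word repeated twice.
# {ww : w non-empty} is a classic non-regular language, so this pattern
# goes beyond what formal regular expressions can describe.
doubled = re.compile(r"^(.+)\1$")

print(bool(doubled.match("abcabc")))  # True: "abc" + "abc"
print(bool(doubled.match("abcabd")))  # False: the halves differ
```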
@Favmir · 1 year ago
Yeah, I open up the Regexr website every time I need to write a regex. It would be great if IDEs at least tried to help you with the visualization.
@Duconi · 1 year ago
Debugging in a live environment is very problematic. Just imagine the order process of a web shop: you debug it in execution, mess things up accidentally, and because you stopped the process it isn't executed, and other orders aren't coming through either. There is a much better way: write tests. I sometimes don't even try out my changes manually; they're tested, and if I had broken something the chances are high that some test would find it. Some testing frameworks even have watchers that execute the tests every time you save your file, so you immediately see whether your changes work. If you have proper tests, there isn't much in production that can cause a failure. So instead of debugging a live server I would rather set up the development process so that you find bugs before they reach live. That at least works really well for me.
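A tiny sketch of that workflow (the discount helper and its values are invented for illustration; with a watcher such as pytest-watch, the test would rerun on every save):

```python
# Hypothetical web-shop helper plus the test that guards it.
def apply_discount(total_cents: int, percent: int) -> int:
    """Return the discounted total in whole cents, rounding down."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return total_cents * (100 - percent) // 100

def test_apply_discount():
    assert apply_discount(10_000, 25) == 7_500
    assert apply_discount(999, 10) == 899  # floor, not round

test_apply_discount()  # a test runner would collect this automatically
print("ok")
```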
@soheil5710 · 1 year ago
@Duconi Nobody intentionally writes bugs. Prevention is good but not perfect. Don't you still need a minimally disruptive way to fix the live environment?
@BosonCollider · 1 year ago
@brunodantasm It depends on which kind of regex you are dealing with. Regexes from SQL or grep are real regexes. The ones in many scripting languages that use the Perl syntax are fake regexes and can be six orders of magnitude slower on hard inputs.
@netchkin · 1 year ago
This talk is so engaging it made me spontaneously clap along with the audience while watching it at home.
@TheMrKeksLp · 1 year ago
I have to say some things about this talk really irked me, like the implication that APL has superior syntax because, for this very specific use case, it happens to be quite readable and more terse than the alternatives. Most choices are a compromise one way or the other. Compiled languages might be "dead programs", but that's the cost you pay for function inlining, aggressive code optimization, clever register allocation, known static stack layout and so on. That's why compiled languages are fast and static, not slow and dynamic. It's all a trade-off. In fact, just yesterday I had an idea for code hot-reloading in Rust. One limitation that immediately came to mind is that every control flow that crosses the module border would have to use dynamic dispatch, mostly preventing any meaningful optimization between the two modules.
@Nesetalis · 1 year ago
Yeah, this exact exchange is what I was thinking about while listening to him. Compiling isn't a bad thing; it's an optimization. I use Python for rapid prototyping, for instance, but when I'm done playing and ready to do some work, I write my final version in C++, because it's fast. Yes, I've spent days compiling libraries before, but once they were compiled I didn't have to worry about them, didn't have to wait for my computer to chug and choke on parsing complex human-readable text. Computers are not humans; don't feed them human formats. This whole mentality is an offshoot of the "just throw more hardware at it" camp, one I find regrettable.
@jrdougan · 1 year ago
@Nesetalis The problem is that most languages don't have both an optimized and an unoptimized (introspectable) version. I want to be able to do both without changing language. I expect he does as well.
@duncanw9901 · 1 year ago
@jrdougan Then use Haskell 😈 (admittedly, GHCi is nowhere near Lisp levels of interactivity, but it's better than nothing).
@gamekiller0123 · 1 year ago
@jrdougan I don't think that would be enough for him. It seems like he wants introspection in production. I don't see how this is possible without major tradeoffs like globally turning off optimizations or annotating the things that can be introspected. In fact, it seems like he even wants the code to be modifiable at runtime (though not necessarily the production code).
@janekschleicher9661 · 1 year ago
@gamekiller0123 I mean, why not? Basically we're already doing it, just in a slow way. In bigger projects you usually don't just deploy over the previous version: you deploy, let it run through the staging/production pipeline, make it available first via an internal route for the programmers and the integration-testing pipeline, then canary it to a small share of users and monitor it; if nothing fails, you make it available to a significant share of users (routing to the new version while still keeping the old one); then, if monitoring shows nothing wrong, you make it the default, stop serving the previous version, and finally deploy again some time later to remove the deprecated functionality. The effect is that we change the runtime without really switching it off (if we regard the distributed environment being executed as one unit). But the whole process is slow (hours to see the first changes and days until everything is finished: very punch-card-like) and hard to debug and monitor (even with tools like distributed tracing or Kafka). Nothing would be wrong or scary if the programming model just allowed making these changes directly in the runtime (probably still keeping different versions), rather than doing it at the microservice level with container runtimes, routing services and complicated introspection tools. Doing what the language should do for us ends up requiring knowledge of Docker, Kubernetes, API gateways, Prometheus, DataDog, Kafka, a CI/CD pipeline, and many other things I've surely missed.
In the end, most companies are in high demand for DevOps engineers to optimize this process (the punch-card operators are back), because the complexity is too high to expect programmers to handle it while they're trying to solve a completely different problem (the business case).
@Woodside235 · 1 year ago
I dislike the Tweet towards the beginning about how programmers feel so good about learning something hard that they will oppose things that make it easier, for several reasons. Firstly, it could be used to automatically dismiss criticism of something new as the ravings of a malding old-timer. Secondly, it paints experienced programmers as smug ivory-tower know-it-alls. Thirdly, it implies that this behavior is unique to programmers. Do old-time programmers sometimes look down from their ivory towers and scoff at their lessers? Absolutely, and I am no fan of that either. But the Tweet taken at face value could lead someone with a new idea (or something they believe is a new idea) to be arrogant. The bit with the increasingly smaller ways to write an incremented array ignores the fact that the more you remove the semantics which more obtuse languages have, the less clear it is what the program is _actually doing_ beyond the high-level cliff notes. This can lead to extremely painful debug sessions, where the code you write is completely sound at a high level, but the syntactic sugar is obfuscating a deeper problem. Lower-level languages have more semantics than they really need, but the upshot is that they allow more transparency. They're often difficult for debugging average issues, but significantly easier for esoteric issues if you slow down and go line by line. Not to mention they make very specific optimizations easier as well. A lot of the ideas in this video have been tried and didn't stick around, not because of adherence to tradition, but because they simply were not as effective. Visual programming in particular: it has the same problem as high-level languages in that it's easy to capture the essence of the program, but not the details. Ideally you would have a visual representation side by side with the text-based semantics.
@LC-hd5dc · 1 year ago
TBH, C or even asm is still obfuscating stuff from you. I would say it's more a matter of knowing the hardware you're running on and knowing the quirks of the language and the compiler (which naturally takes years). Blaming the language is not entirely correct, IMO.
@DominikRoszkowski · 1 year ago
What I really enjoy about Dart is that, even though it's punch-card compatible, thanks to hot reload I usually need to compile the program only a couple of times a day, when I pick up something new. Most of the time code can be reloaded in real time with an incredibly short feedback loop. I still wish there were more features to help visualize the structure and relationships of code, but it's already so much better than most of the tools in the mobile ecosystem.
@erikjohnson9112 · 1 year ago
I've been chasing live-system programming for years. Dart provides a lot of what I am looking for, as does Python with hot reloading (see a project called Reloadium). One of the ideas for my own system (yet to be written) is a little similar to the last example in this video: there are nodes which represent your program, and "sparks" of execution, so you can see data flow through the system.
@MrSaemichlaus · 1 year ago
It's very hard to carve a statue with a can opener. Selecting the right tool is key to success. But then most people also have an employee mindset, they are not toolmakers. It's good to see what other methodology is out there in order to set the right expectations in the users of programming environments and languages.
@alanr4447a · 1 year ago
1:30 The 80-column "Hollerith" punch-card design is an advancement over the original same-number-of-rows (12) design, with what I think were just 27 columns (circular holes rather than narrow rectangles), created by Herman Hollerith himself for tabulating the 1890 U.S. census, decades before there were "computers".
@LemuriaGames · 1 year ago
And before that, the predecessors of punch cards were used to "program" weaving looms.
@DevineLuLinvega · 1 year ago
Terrific talk. I laughed, then I cried, then I was hopeful again. I won't turn to carpentry just yet. Thanks Jack.
@seismicdna · 1 year ago
Woah, cool to see you here, lol. Seems like some core tenets of the philosophy that underpins your work are well represented here.
@DevineLuLinvega · 1 year ago
@seismicdna I think we share a lot of similar ideas. I was fortunate to stay with Jack in Berlin a few years back, and to meet Szymon Kaliski too. I was sad to hear that Strange Loop is stopping after this year; I'd been dreaming of attending.
@JackRusher · 1 year ago
@DevineLuLinvega There will be one more next year. You should give a talk!
@calder-ty · 1 year ago
I'll need to watch again to digest further. Working with a data team as their engineer is both a blessing and a curse. I've seen some of the benefits of the interactivity that Jack talks about: particularly with data pipelines, sometimes the easiest way to debug is to pull open the notebook, run it until it breaks, and inspect. It's also easy for analysts with little programming experience to write things, get started and explore. It's a curse because it makes things so easy that I'm often tasked with fixing and maintaining a heap of poorly designed programs written by many more people than just me, with little to no consistency. Many of the perks Jack mentions are useful for scientists/analysts for whom programming is merely a means to the end of getting their analysis done. Not having to worry about types is nice if you just want it to work. As an engineer, working with typed systems means I _don't_ have to keep in "working memory", whenever I jump in to make a change down the line, whatever nuances of my interface I have dynamically programmed. Like I said, I will have to watch again to really understand.
@DevineLuLinvega 1 year ago
@@JackRusher I'd love to! I'll try to get in touch with the event's team.
@ssddblade 1 year ago
What a great talk, thanks Jack. I agree with most of what you said. I just don't know what to do about it. I think our industry as a whole is in a local maximum, and I don't know how to get out of it.
@esobrev 1 year ago
it’s up to us to create the solution.
@matju2 1 year ago
APL actually has a shortcut for making a list like 1 2 3 4, such that you can write the example program in only 4 characters: 1+ι4 (that's the greek iota letter) instead of 1+1 2 3 4
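For readers without APL keybindings, the same idiom can be sketched in plain Python (`iota` here is a stand-in helper written for illustration, not a real library function):

```python
# APL's 1+ι4: iota builds the list 1 2 3 4, and 1+ adds one
# to every element at once.
def iota(n):
    return list(range(1, n + 1))

result = [1 + x for x in iota(4)]
print(result)  # [2, 3, 4, 5]
```

The APL spelling is shorter precisely because the index generator and pointwise addition are each a single character.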
@thoperSought 1 year ago
I know a lot of people love APL, but it seems too terse to really be readable to me
@HoloTheDrunk 1 year ago
@@thoperSought APL is part of the "fun thought experiment but the next guy will just want to shoot himself while reading your code" languages. No sane person would use it for large software (or at least I hope so).
@NamasteProgramming 1 year ago
​@@thoperSought It is easy, all you need is a keyboard with 200 buttons
@abrudz 1 year ago
@@thoperSought What if your reading speed is reduced by 80% but the amount of code is only 10% of the alternative?
@EvincarOfAutumn 1 year ago
@@thoperSought The “expert-oriented” terseness of APL/J/K is scary at first, but it soon pays off, because the core languages are so tiny that you can become an expert surprisingly quickly. There are only ~5 syntax rules and ~30 symbols to learn, depending on how you count. Beyond that, basically all of the problem-solving skills are transferable to other languages, especially to APL alternatives like numpy/R/Julia/Excel.
@curls6778 1 year ago
In the multimedia programming world there are Pure Data and Max/MSP, which are very similar to his last examples and very commonly used by artists. This talk helped me understand why I keep coming back to those for projects where I have to iterate on ideas very quickly.
@matju2 1 year ago
Unfortunately, those two are a lot more stateful than the average non-visual language, because every function has been turned into some kind of object class in which, if there is more than one argument, every non-first argument is an instance variable that has to be set before sending the first argument. And if you ever want to set the first argument without running the function, or run the operation without setting the first argument, you have to use special cases like "set $1" and "bang", IF they happen to be supported by that given class. Then to manage all of this, you have to sprinkle a lot of [t b a] and [route stuff] objects and connect them with lines that quickly get hard to follow. The DSP subsystem (~) is the exception to this, but that's only because it has a fixed data rate, and when you try to control that subsystem at runtime you have to use the non-DSP objects I described above.
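The hot/cold-inlet pattern being described can be modeled roughly in plain Python (the class and method names are invented for illustration; this is not a real Pd API): a two-argument function becomes an object whose right inlet only stores state, while the left inlet both stores and triggers.

```python
class PlusObject:
    """Rough model of a Pd/Max [+] object: the right inlet is 'cold'."""

    def __init__(self, right=0):
        self.right = right

    def right_inlet(self, value):
        # Cold inlet: stores the operand, produces no output.
        self.right = value

    def left_inlet(self, value):
        # Hot inlet: stores the operand AND triggers the computation.
        self.left = value
        return self.left + self.right
```

Calling what used to be a plain two-argument function now takes two calls in the right order, which is exactly the extra statefulness the comment complains about.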
@ekted 1 year ago
It's not just the code itself that can have a lot of "this isn't part of the actual problem" problems. All of the "technical bureaucracy" (certificates, hosting, provisioning, deploying, releasing, building, source control, branches, pull requests, code reviews, unit/integration tests) contributes in a big way to stuff not part of the actual problem. In addition, "corporate bureaucracy" (development process, useless roles, incompetence, corruption) is a killer. At the end of the day, maybe 5% of your mental effort goes to solve the real problem, and the end result is ruined by the other 95%. Solving a problem with 5 lines of code versus 1000 lines just gets lost in all the other noise.
@christophersavignon4191 1 year ago
Imagine a craftsman complaining that one needs to know metalwork to craft woodworking tools. Or a soldier moaning that all those logistics officers are not contributing because they don't fight. You'd just laugh at them. Creating tools has always been an investment, spending effort on one task to make another task easier. Teamwork has always required coordination. IT is no exception. If you become able to multiply your workforce by 50 and spend 10% of that on the "actual problem", you have quintupled your progress. If you don't want to coordinate a team, your only other choice is to work solo. And while it sounds intriguing not to deal with 49 other lunatics and their code that conflicts with everything, including your sanity, it will really slow you down, more than team coordination ever could.
@Dyllon2012 1 year ago
I think your argument applies to just reducing LoC, but better abstractions can also eliminate certain types of mistakes. For example, a hash function builder reduces the chance that some hash function is written incorrectly and produces collisions.
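As a rough illustration of that kind of abstraction (all names here are hypothetical, not from the talk): a builder can stamp out polynomial rolling hashes so that the error-prone loop is written and debugged only once.

```python
def make_poly_hash(base, modulus):
    """Build a polynomial rolling hash; the loop is written once."""
    def poly_hash(s):
        h = 0
        for ch in s:
            h = (h * base + ord(ch)) % modulus
        return h
    return poly_hash

# Two independent hash functions from the same, already-debugged loop.
hash_a = make_poly_hash(base=131, modulus=(1 << 61) - 1)
hash_b = make_poly_hash(base=257, modulus=(1 << 61) - 1)
```

Any mistake would now live in one place rather than in every hand-copied hash function.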
@LC-hd5dc 1 year ago
docker's imperative config and uninspectable (possibly even malware-ridden?) root containers to me are already part of that legacy mentality, people just slap it in because everyone else is doing it, not because it gets the job done the best. imperative config and orchestration is the way to go to eliminate most of the issues you mentioned in "technical bureaucracy" as you call it. "corporate bureaucracy" is just capitalist problems. and incompetence has nothing to do with this discussion. neither of these will be solved with better programming tools.
@Sonsequence 1 year ago
Have you ever led a team where you were free to remove that technical bureaucracy? I'm leading one; I haven't removed it. For each of the items you list I asked how we could shrink the footprint, but removing it entirely would have gone badly. Certificates, hosting: be maximally unoriginal in cloud provider setup. Source control: yes. Have you tried scp on text files instead? Branches: trunk only, except for short-lived ones just to hold pull requests. Pull requests, code review: so much more powerful than mere quality assurance. But yes, very expensive, so it's always good to figure out when and where to skip.
@Sonsequence 1 year ago
@@LC-hd5dc I guess you meant to say declarative config is the solution?
@itsdavidmora 8 months ago
For those wondering, the title is likely a reference to Bret Victor's 2013 "Stop Drawing Dead Fish" talk.
@permartin5819 1 year ago
Getting the night's production jobs loaded (via punch cards) as quickly as possible was aided by the operators removing the rubber bands and lining up the "decks" on the counter. That is, until the night when the HALON system was accidentally triggered, sending the cards everywhere. It took quite a while to find cards stranded under equipment. Fortunately the strips on the sides of the cards helped. But it was a long, long night putting everything back together.
@rodschmidt8952 1 year ago
Suddenly I think... Was there a method for making backup cards? Sure, read the cards and punch them. But did anybody do this?
@HowardLewisShip 1 year ago
This was an outstanding talk; interesting but with good humor. I think I need to go take a peek at Clerk.
@newogame1 1 year ago
I see this often, but it usually falls apart when you approach higher levels of complexity. There are many graphical programming languages; you could even call Photoshop a programming language. The problem is there are tons of experiments, but none of them really create anything "new". They spend their time trying to copy functionality from C. Stop copying C in your GUI language.
@medetahmetson 1 year ago
Hmm, sounds like it's better to design this kind of programming language together with a UI/UX designer.
@nifftbatuff676 1 year ago
Yeah, this is my experience too. Graphical programming looks great only with simple, small problems. It is incredibly harder to use and a waste of time when you need to solve real-world complex problems.
@theq68 1 year ago
The issue with this kind of presentation is exactly that: it convinces management that the new shiny language is the solution to all the company's problems, but the sad reality is that complex problems are complex in any language, and learning the new shiny language takes longer than solving them. Creating tools in your language that solve your problems is the current solution.
@TimeLemur6 1 year ago
@@nifftbatuff676 Automate for Android comes to mind. Fantastic app, I use it for a bunch of stuff I can't be bothered to write Java for and browse through Google's API docs. But large programs are an absolute nightmare when everything is drag and drop.
@gunkulator1 1 year ago
Agree with this. You need the right tool for the job but a specialized graphical tool is really only good for solving problems that can be modeled graphically. I have wasted many hours with new tools that are supposed to bring about a new paradigm in how we program and in the end we always end up abandoning them because they never quite fit the problem at hand. The seemingly small gap between the cool demo example and what you actually need to accomplish ends up becoming an impassable chasm. In the end, tools are built by biased people who are thinking in terms of how to solve problems A, B and C but I'm stuck trying to solve problems X, Y, and Z or else a whole new class of problems, #, % and ^ that no one has ever considered before.
@chrishamilton1728 1 year ago
Dynamic, interpreted languages are better than statically typed, compiled ones? Now that is a hot take. Not a good take, but a hot one.
@fakt7814 1 year ago
They have the potential to be much better in some important aspects like debuggability and prototyping. But most scripting languages did not go very far beyond static languages in these aspects, which does not make very much sense. Why sacrifice performance and stability for practically nothing? That's why dynamic interpreted languages are often perceived as inferior to static ones. It's because most of them initially were either a replacement for shell scripting or developed to solve a very specific task (like JavaScript), and then accidentally grew bigger and became more significant. It's no wonder that the most advanced languages in this regard are Lisps, because they were designed as an AI research tool from the start.
@no-defun-allowed 1 year ago
Lisp I from 1960 called, it wants its compiler back.
@Crazy_Diamond_75 1 year ago
For understanding, debugging, and visualizing your program in real time? Yes, absolutely.
@xerzy 1 year ago
Two things: 1) let the compiler blow up on the dev rather than the program on the user (especially if you seek the lowest runtime overhead, or you ARE making the runtime) 2) you can start getting this future, today, with current languages, using Jupyter notebooks and the like (e.g. literate Haskell)
@trapfethen 1 year ago
Yeah, it might be interesting if we could develop a language that runs in a live runtime during development (for interactivity, visualization, etc.) but can compile for deployment. Because there are instances when interactivity just isn't necessary and the required abstraction and overhead is nothing but dead weight.
@stefanwullems 1 year ago
This absolutely blows my mind. I've been daydreaming about my ideal programming language for a while now, and it basically boiled down to interactive visuals in the way Leif made them, combined with a notebook view of your program like Clerk. I'm so excited to see other people have made these things already :D
@IARRCSim 1 year ago
I don't think those are properties of the programming language. Visualization and interactive visualization are features of a code editor or integrated development environment. Development tools for a lot of existing programming languages could do that if they just implemented those features. Those features would also be more useful for some languages than others. The features would be more difficult to implement for some than others too. The video makes it sound like the language and its development tools are completely tied together. If you're choosing a language to learn or use in a project, you might as well group the language and its tools together. If you're tempted to invent a new programming language because you want to use lots of visualization, the distinction is important. You can always make new tools and new features for an old language without changing the old language. Inventing a new language that no one uses doesn't help anyone else. Inventing tools for popular existing languages will much more likely cause others to benefit from your creation.
@LC-hd5dc 1 year ago
@@IARRCSim yeah, like sure all the ASM boilerplate is annoying, but people could write tools to automate that boilerplate as you're typing and fold it away for visual convenience. as an example. i'm sure someone's already done it and i just haven't really looked myself.
@kplgr 1 year ago
Woah, this was a refreshing talk! Thank you Jack Rusher, whoever you are, for sharing thoughts which I never knew I had as well - let alone that I agree with
@ianmclean9382 1 year ago
I realized my habit of printing out variables and whatever information is being calculated in what the speaker calls "dead languages" is exactly the point he's making. There need to be easier ways to observe the data and processes we write as they run.
@Taladar2003 1 year ago
On the other hand, printing out values is a lot more productive than that nonsense single-step debugging. Give me a printout of two runs of the program and a diff tool any time over stepping through it for hours, trying to remember what the debugger displayed 1000 steps ago in the last program execution.
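That "diff two runs" workflow can be sketched in a few lines of Python (the traced function is a made-up example): log one line of state per step, then diff the traces of two runs.

```python
import difflib

def traced_sum(xs):
    """Toy computation that logs its state at every step."""
    trace, total = [], 0
    for x in xs:
        total += x
        trace.append(f"x={x} total={total}")
    return trace

# Two runs with slightly different inputs...
run_a = traced_sum([1, 2, 3])
run_b = traced_sum([1, 2, 4])

# ...and the diff points straight at where they diverge.
for line in difflib.unified_diff(run_a, run_b, lineterm=""):
    print(line)
```

The matching prefix of the two traces collapses out of the diff, so only the step where the runs diverge needs attention.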
@flyLeonardofly 1 year ago
Incredible talk: I noticed the homage to Bret Victor's "Stop Drawing Dead Fish"!
@JackRusher 1 year ago
💯
@Waitwhat469 1 year ago
I have to admit, the idea of messing with the runtime as a sysadmin and security guy sounds nightmarish. Great tools in the dev env, but in production it seems like a system that limits checks and requires increased trust in the devs. Mind you, I'm in the sysadmin camp that holds that IaC's and CaC's greatest benefit is that you move AWAY from click-here-to-do-this administration and toward more formally tested and explicit ones.
@LC-hd5dc 1 year ago
finally some sense in these comments lol. i'm curious, what other options would you suggest for runtime introspection? usually what i've seen is slapping in logging statements everywhere, but i have to assume there's a better way
@Waitwhat469 7 months ago
Logging, metrics, and tracing are the only things I can think of, but it would be nice if you could clone a running container, stick it in a mock environment, and step through the process.
@rumisbadforyou9670 1 year ago
The only question I have is: "How do you mix the heavy optimizations of Rust/C++ with the powerful debugging and on-the-fly editing of Smalltalk?" If you have an answer, I'm willing to switch. From my experience, JIT-compiled code is always slower than AOT-compiled. (And "lol just get a more powerful PC" or "stop running server workloads on a 10 y.o. laptop" are not valid arguments.) If somebody has an example of performance-dependent software written in Smalltalk/Lisp-like languages, like ray tracing or video encoding, I'd like to take a look and compare it to more conventional solutions.
@D0Samp 1 year ago
Also, even if JIT comes close to native compilation (at least as long as the latter does not make use of profiling and other advanced optimizations) in either responsiveness or throughput, you typically pay for it in higher RAM usage, which is unfortunately the most limited resource in shared computing in multiple ways. Contemporary Java comes to mind there; even though on-the-fly editing is obviously not a thing there, I'm already grateful for a REPL.
@LC-hd5dc 1 year ago
how about this - JIT while you're working on the code, and then AOT when you want a prod release? i definitely don't agree with his suggestion that we want JIT in production.
@lanthas5744 1 year ago
As of Visual Studio 2022, you can use Hot Reload to change C++ applications while they're running. I'm actually quite surprised he didn't bring this up.
@pierremisse1046 1 year ago
One solution for combining the heavy optimizations of Rust/C++ with the capabilities of Smalltalk is to use twin software (or simulation). It works fairly well; recent Smalltalk distributions have worked using such an approach for more than two decades now. They code their VM in Smalltalk (OpenSmalltalk-VM/Pharo) and generate C code from it. There's also RPython, which does similar things. This approach is loved by some, hated by others. Is this an example you would consider performance-dependent software?
@rumisbadforyou9670 1 year ago
@@pierremisse1046 I guess I'll try Pharo after learning some Smalltalk. But from reading about it a little, it still sounds like it'll bring some runtime overhead that might be difficult for the compiler to optimize. But I'll give it a go. If the transpiled C is faster than native JS, I'd consider it a win for Pharo.
@Barxxo 1 year ago
Very interesting talk. Thank you.
@alanr4447a 1 year ago
At the high school I attended in the 1970s we used punch cards typed in a KEYpunch machine (not "card punch"), and we fed them into the card reader and took the lineprinter (much faster than a teletype, although that was also an option for program output - program LISTING was always via lineprinter) printouts ourselves, so not all setups were equally primitive. Also, the reader was able to read either actual punches or pencil marks, and we developed "code cards" to allow us to make code with pencil marks (called "mark sense") so we weren't limited to the bottleneck of one or two keypunch machines for everyone to use, and I myself wrote the program to generate punched cards from marked cards, used at the school for several years after I graduated.
@GordieGii 1 year ago
I have used an IBM 029 key-punch. When I was in high-school (about 1980) we used bubble cards, but the near-by university had key-punches so we would go there to type in long programs. We still had to send the card decks to the school board computer center (overnight), because we didn't have an account at the university.
@genericdeveloper3966 1 year ago
Those graphical representations may help some people, but they just seem like more work to interpret, as they are like a new language in themselves. They should be used only when they are the better alternative for comprehension for the average dev.
@LC-hd5dc 1 year ago
yeah as far as i can tell, most of them were just showing nesting levels... ultimately they seem more like teaching tools than daily programming tools.
@rv8891 1 year ago
Wouldn't that be because most devs use the 'traditional' code representation? In a world where programming is canonically done in brightly-colored balloons connected by lines, trying to put it in a single sequential file might be the "hard to interpret" option. I think there's something to be gained here using visual & spatial & interactive programming, although I have not yet seen a version that sparks joy. Maybe functions as code in a bubble, and jump points (function call, return, goto) as lines between bubbles? It would visualize program flow without giving up the details you need to actually program. IDK, but it's an interesting problem.
@Taladar2003 1 year ago
@@rv8891 The problem with graphical representations is that they are bad at abstraction and that they are hard to process by tools. Code is all about abstraction and tools to help you work with it.
@TheCablebill 1 year ago
The PL/I Checkout Compiler, under VM/CMS was my first use of a tool set that provided a powerful interactive programming (debugging) environment. The ability to alter code at runtime was a treat, generally only approximated by even today's debuggers. Progress seems to be a matter of small steps forward, interspersed with stumbling and rolling backward down the hill quite often.
@red-switch 1 year ago
I loved this talk, but I don't know why the speaker makes it sound as though typing is somehow a waste of time or insignificant. Most web devs use TypeScript or Babel because otherwise you wouldn't catch a lot of errors while writing the program. Type checking has nothing to do with making the programming experience interactive, and in fact would aid it.
@eric-seastrand 1 year ago
The algorithm sent me here. What a fascinating take on compiled/batch vs interactive programming.
@michaellatta 1 year ago
Smalltalk was one of the best live coding environments. You could change source of active stack frames. The issue was delivering the program to “production” on one of 100s of servers.
@sidneymonteiro3670 1 year ago
The issue is how to do product development with a team of developers on the same code base for testing and releases.
@edwardog 1 year ago
Would the team be working with a CI server?
@edwardog 1 year ago
Was it an issue of it being unclear how to address the server in question? I’m also curious how you feel it compares to today’s approach of using containers/images
@mikecole2837 1 year ago
The fact of the matter is that all of our hardware infrastructure expects the user to program in either ASM or C. Runtime environments are expensive and not available at the bare metal level without a ton of headaches. Lua is promising but it's written in C. I agree that modern hardware introduces many problems that don't have anything to do with solving the business problems that make us money. Maybe more people should become computer engineers and devise an ISA that allows for visual and runtime feedback natively.
@dickpiano1802 1 year ago
42:34 is "someone made MathWorks Stateflow at home". 29:07 is "Anaconda Spyder from AliExpress". 37:28 is Jupyter Notebooks 2.0. 38:37 is "Mom, can we have Berkeley Madonna at home?" The truth is that all of these tools have existed for decades, but people just wanna be hackers, and there's no changing that.
@Omnifarious0 1 year ago
Food for thought, though he glosses over why things like edit/compile/link cycles still exist. There are costs to things, and sometimes those costs aren't worth the benefit.
@JeremyAndersonBoise 1 year ago
I was on the edge of my seat yelling "yes!" more than I wanted to admit, until now. Inspiring presentation on multiple levels, Jack, thank you.
@LinkEX 1 year ago
Agreed. "Hard in a dumb way" is a concept that deserves broader awareness. Dealing with new problems created by the tool meant to solve the original problem is common. What ends up happening is that people either take false pride in their work, making those symptoms its focus rather than the cause, or an odd sense of envy leads them to force others to suffer through outdated and avoidable symptoms even when there's a better tool at hand.
@beautifulsmall 1 year ago
Recently, looking at fast.ai, I can see the live notebook method has huge benefits compared to my line-by-line PyCharm script. Fascinating coverage of first-principles programming. I will buy the Kandinsky "Point and Line to Plane". Data rabbits and Racket. I'm going to print and display the chemical elements for the tea room wall. There are so many stupid methods for doing simple things, but the complexity gives some people a warm feeling that propagates the fool's errand. Great talk.
@TenderBug 1 year ago
Fantastic! Most insightful talk I came across in a long time.
@amasonofnewreno1060 1 year ago
I find your view of things very interesting. I observe there is a lot of activity again in the "visual programming" space. However, while I do agree to some extent at least with the sentiment, I find that textual models are still gonna persist. I would love to offer a notebook interface to "business" people, so that they can simulate the system before bothering me (it would sure cut down on the feedback loop). But for the most part I think 70-80% of any codebase I've worked with is "data shenanigans", and while I do like for the textual data there to be visually adequate (formatted to offer a better view of the problem), I do not find it enticing to expose it. Another problem I find is that UIs are, and will likely always be, a very fuzzy, not well-defined problem. There is a reason why people resort to VS Code - it is a text editor. So you also have this counter-movement in devtools (counter to highly interactive/visual programming), returning to more primitive tools, as they often offer more stable foundations.
@jones1618 1 year ago
I agree about a better notebook-like system modeling tool for business people. As a developer, whenever I have to do any spreadsheet work, I'm struck by how immediate & fluid it is compared to "batch" programming ... but ... also clunky and inflexible to lay out or organize. I'd love to see a boxes-and-wires interface where boxes could be everything from single values to mini-spreadsheets, and other boxes could be UI controls or script/logic or graphical outputs, etc. Now that I think about it, I'm surprised Jack didn't mention Tableau, which provides a lot of the immediate & responsive interaction he wants to see in future IDEs.
@notoriouslycuriouswombat 1 year ago
What an insanely great talk
@magicponyrides 1 year ago
Man, I enjoyed this a lot more than I expected to.
@SikadaSorel 1 year ago
This was such a fantastic talk, both in the subjects covered and in the very charismatic presentation. Thanks for sharing your insights Jack!
@cesarromeroalbertos3839 1 year ago
I agree with the main concept of the talk; like, I'm always fighting with people over this stuff. That said, I'm a videogame programmer and usually work in Unity, so not much choice (even if I didn't, most games use C++; Unity is C#). The thing is, in game development we have tools that implement many of the things you describe. We can change variables at runtime, we can create different tools and graphs and stuff to see what's happening at runtime, visualize stuff, etc. Of course it's not exactly the same as the examples in the talk, and these things are implemented due to the nature of how a videogame works, rather than for a better programming experience. Just wanted to point out a curious case of how game engines get a bit closer to this idea for different reasons.
@naumazeredo6448 1 year ago
Most of his examples are about tools, not programming languages themselves. He presents the issues as programming-language issues, but in reality most of them are a lack of tooling around programming languages. Game engine editors (not game engines) are made exactly to address most of these issues. I agree with him that languages' ecosystems lack some basic tools, but these are also completely program specific. For games you will need a 4-float type to store colors; should the language know about this and have a way to visualize the colors in its own editor, even though the majority of developers might be using the same language to code CLI/daemon programs? Does keeping the state of a program make sense when you're shipping a game to players? It totally makes sense when you're developing, for fast iteration and debugging, but when you need to release and publish the game, you need to compile, disable hot reloading, disable debug asserts, etc., since the client (the player) won't need any of this and all of it adds a performance cost.
@homelessrobot 8 months ago
@@naumazeredo6448 It's because a lot of programming language communities (at the encouragement of their developers) think of these things as language issues, because they have yet to witness the beauty of a programming language getting out of a better tool's way and sitting on the sidelines for a play or two. If there is a job to be done in software development, it's something to do with a programming language, and specifically MY programming language.
@HrHaakon 8 months ago
Check out GOAL, the lisp that they made Jak and Daxter with and your mind will be blown.
@christiankrueger8048 1 year ago
Thank you!
@coderedart 1 year ago
I do agree that having runtime reflection is a great thing, so that we can look at the environment/state over time. But I hard disagree with most of the other points in this talk.
1. Comparing C/C++/Rust/Zig with Lisp/Clojure etc. is just plain wrong. Anyone can see that these languages are targeted at different use cases. They are manually memory-managed, low-level languages for precise control and peak performance, to extract everything out of the hardware - literally just a step above assembly.
2. This talk conveniently skips over things like garbage collection (and performance in general), except for a reference to a tweet about devs being okay with stop-the-world compile times but not stop-the-world garbage collection. Games or apps sensitive to latency (real-time music/video editing, trading, etc.) just cannot afford that garbage collection pause, no matter what. But devs can very much afford that compile time.
3. Saying Rust and other ML-family languages don't improve software is also debatable. Rust's type system turns runtime errors into compile-time errors, making software more reliable. Rust is in trial mode in the Linux kernel because it provides a proper safe type system that C doesn't.
Most of the talk is about debugging, viewing runtime state, and live interactive coding, which is more about the tooling surrounding the language than the language itself. We definitely need better tooling, and many projects shown in the video are good examples of what might be possible in the future. For anyone interested, I recommend watching the talk about the Dion format dion.systems/blog_0001_hms2020.html which also talks about editing syntax trees in a custom IDE instead of a text language with syntax. Rust has been getting decent tooling to improve developer experience: github.com/jakobhellermann/bevy-inspector-egui for example shows all the game state live AND allows you to modify it. There's github.com/tokio-rs/console for async server apps to provide a look into the runtime state. You can always add a scripting language to your application (like Lua) and query any state you want. There are other initiatives like LunarG's GfxReconstruct, which dumps all the Vulkan state so that the developer can reproduce the GUI/graphics state exactly on his machine by receiving the Vulkan dump from a user. People are working on lots of cool ideas to help with debugging state machines. Although I think a custom Rust-specific IDE will go a long, long way.
@IamusTheFox 1 year ago
Not a Rust guy, but Rust is a great example of how he missed the point of static typing. It's feedback at compile time. Runtime errors are errors caught by the end user if you are not careful.
@dixaba 1 year ago
All that "types slow devs down" talk sounds like dynamically typed languages are better. Maybe they are... until your fancy no-explicit-types JS or Python or whatever app crashes in the middle of its logic because, for example, you forgot to parse a string into a number. Languages with static types (even as limited as C's) just won't allow you to run such nonsense at all. Types are helpful for reliability; TypeScript, Python typing, etc. confirm this. Better to slow down 1 developer team than to have 1000 customers with "oh no, I made an oopsie" crashes.
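The failure mode described here is easy to reproduce in Python (the function below is a made-up example): forget one `int()` call on user input and the program doesn't stop you, it just computes nonsense.

```python
def total_price(qty, unit_price):
    # Nothing here says qty must be a number.
    return qty * unit_price

print(total_price(3, 3))    # 9, as intended
print(total_price("3", 3))  # "333": qty arrived as an unparsed string
```

A static type checker would reject the second call at compile time; the dynamic program runs it and ships the wrong total.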
@fat4eyes 1 year ago
Thank you. Way too many people who don't actually work on real systems completely ignore performance and maintainability and focus way too much on writing code 'quickly'.
@tajkris · a year ago
Ad 1. And how does being low-level and manually memory-managed for peak performance stop you from having nice things like runtime-modifiable code and introspection into a live system? Those are orthogonal concepts and they aren't mutually exclusive. The C++ approach is 'pay only for what you use', but there doesn't seem to be much to 'buy' when you actually would like to pay for those niceties. Ad 2. It's not that devs can afford the compile time, it's that they have to in some of the languages. E.g. you can run Haskell or OCaml in an interactive shell while developing, but compile to get better performance for release. JIT compilers do exist for various languages, so it's not like you cannot have a runtime-modifiable system that performs well. C# has a garbage collector, but you can use manual memory management to some extent when you really need to (high perf or interoperability with C/C++). It's an attitude problem: the designers of the language(s) decided that it's not of enough value. The point of this talk as I see it is to highlight the value of the presented things and get language designers to think about such use cases. Ad 3. This is only an improvement in an environment with forced compile/run cycles. You type something, launch the compiler (or your IDE launches it in the background), wait between 0.5s and 60 minutes for it to compile, and get an error about a wrong type. You fix it, compile again, run it, and spend between a second and a minute verifying that it works as expected (i.e. ruling out problems that weren't caught by the type system). Now compare it to: you type something while your program is running, you see clearly incorrect results on screen, and on top of that you get an error. You modify the code while the system is still running and you see correct results on screen. IMO the second workflow is much better and more efficient. Also, look at TypeScript or Python - you can rapidly prototype your code omitting the types, or add type annotations for additional safety.
TLDR: compiled/statically typed vs interpreted/dynamically typed - you could have both and achieve high efficiency in development as well as high performance in runtime, there's no need to limit yourself.
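The "you could have both" point is exactly what gradual typing delivers; here is a minimal Python sketch (function names are made up for illustration) showing that annotations can be added after the fact without changing runtime behavior:

```python
from typing import Sequence

# Untyped prototype: no ceremony, runs immediately.
def mean(xs):
    return sum(xs) / len(xs)

# The same function with annotations added later: identical at runtime,
# but a checker (mypy, pyright) can now reject mean("abc") before the
# program ever runs.
def mean_typed(xs: Sequence[float]) -> float:
    return sum(xs) / len(xs)

print(mean([1.0, 2.0, 3.0]), mean_typed([1.0, 2.0, 3.0]))  # both print 2.0
```

The prototype and the hardened version coexist in one codebase, so teams can prototype fast and tighten the types where reliability matters.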
@janisir4529 · a year ago
Those high level tools look so fragile, they'd never make back the time invested into them.
@defnlife1683 · a year ago
This is a wonderful talk and I think it underlines a lot of the weird things that non-programmers start finding in programming languages. While I was self-learning, I was originally drawn to Lisp because it was so different and because it does have some superpowers compared to other languages. It seems so wrong that later languages did not incorporate a lot of the stuff that was innovative in Lisp, even to this day. Erlang, Elixir, Common Lisp, Scheme, Clojure, and ClojureScript are all wonderful and make my life a lot easier as a self-taught dev. Elixir Livebook is wild
@verigone2677 · a year ago
Sooo... I work in the Energy industry, we just retired our last VAX in the last 18 months...though we still have a bunch of virtualized VAX for historic documentation. We also just replaced a real time system that had one of the very first mice ever made (it was actually a Trackball and it was MASSIVE).
@wazzo_gaming · a year ago
INCREDIBLE TALK!
@joebowbeer · a year ago
My stock has risen! Licklider taught me to teach and Sussman taught me to program, some four decades ago, and now they're both mentioned in this excellent talk. Speaking as someone who helped port Scheme (written in MacLisp) to a DEC timesharing system for the first Scheme course at MIT, I don't know why your LISP examples aren't written in Scheme. Harrumph
@JackRusher · a year ago
😹 I prefer Scheme myself! I used SBCL in these examples because it is a very strong programming environment (compiler, tooling, &c).
@nunzioturtulici9636 · a year ago
Really interesting talk!
@SaMusz73 · a year ago
Excellent !
@timothy8428 · a year ago
Building buggy approximations is my specialty.
@VladyYakovenko · a year ago
great talk
@5pp000 · a year ago
That's a line printer, not a Teletype. (And yes, I too wrote Fortran on punch cards.)
@thomas-hall · a year ago
A really strong start, we do a lot of dumb stuff for historical reasons, but the second half seems to totally ignore performance and honestly anything outside his own web realm. The reason programs start and run to completion is because that's what CPUs do. You can abstract that away, but now you're just using user level code and making it an undebuggable unmodifiable language feature. Sure functional languages look neat, but where are your allocations? How are they placed in memory? Are you going to be getting everything from cache or cold RAM?
@colinmaharaj · a year ago
17:17 that machine, yeah I did work on that when I started working in the early 90s, they were phased out within a year for faxes.
@d.jensen5153 · a year ago
Entertaining but deliberately simplistic. How desirable would Linux be if its kernel were written in Haskell?
@SgtMacska · a year ago
Eliminating an entire class of exploitable bugs? That would be amazing
@wumi2419 · a year ago
@@SgtMacska also probably eliminating multiple classes of machines due to performance
@wumi2419 · a year ago
@@SgtMacska Haskell definitely has its uses in low level application though. In relation to security, it's a lot easier to prove Haskell code and compiler are mathematically correct (which is a requirement for some security standards), proving therefore that runtime is secure, than proving the same for another language. In general Haskell's clear separation of pure parts is very good for security, as that's a large part of codebase where you have no side effects
@fakt7814 · a year ago
Performance-critical applications should be written in something like C or Rust (but not C++, f**k C++): when you know beforehand what you need to do, and optimization and fitting the code to the hardware are the main concerns, not modelling, getting insights about things, or experiments. The talk was mostly about development environments, and it doesn't make much sense for a kernel to be wrapped up in this notebook-like environment, because by definition the kernel runs on bare metal. But even there, OS devs can benefit by modeling OS kernel routines in a more interactive environment, using something like a VM, before going to the hardware directly. Well, they are already using VMs; developing a bare-metal program from scratch without a VM is an insane idea. What I'm talking about is not a traditional VM but a VM-like development tool that trades the VM's strictness for interactivity and debuggability. Of course, code produced in such an environment should be modified before going to production, if not rewritten entirely, but we're kind of doing that already, by first writing a working program and only then optimizing it.
@julians.2597 · 3 months ago
we should eschew the kettle, for how desirable is it to make chilli in a kettle?
@saurabhmehta7681 · 10 months ago
Amazing talk! Thank you for sharing it with the world. Lisp and Erlang are amazing innovations which are about to go obsolete, and that makes me sad
@vitasartemiev · a year ago
Every popular language without static types eventually gets static type support, but worse than if it got it in the start. Have you tried debugging 50TLOC+ python codebases without type annotations? It's infuriating. Type systems are a must. They don't need to be rigid or obtuse, but there has to be some mechanism for the programmer to know at a glance what to expect. Also "build buggy approximations first" is objectively wrong. Everybody knows that generally managers don't allocate time for bugfixes and refactoring. If you teach all programmers to write buggy approximations, you're gonna have to live with code that is 70% buggy approximations. Maybe he's talking about TDD like that, but it comes off wrong. Also I don't understand why he says debuggability is mutually exclusive with correctness - it's not... Yes, interactive code is cool, but correct, readable interactive code where all type-driven possibilities are evident at a glance is 10x cooler. Also Rust has a REPL mode. A lot of compiled languages do. Isn't that exactly what he wants? Also also what does he mean by debugging code in production? I really wish he'd elaborate on that.
@janisir4529 · a year ago
That's an issue with managers, and not coding methodology. Not that I agree much with what he says in this talk, but heard some horror stories of managers. And I suppose debugging in production means attaching a debugger to the currently working server or client on the customer's machine?
@YDV669 · a year ago
Debugging code in production is where you buy a system that promises it, because when a program crashes it just falls back to the interpreter prompt so you can look at all your variables and code; and then you write an entire point-of-sale system in said system and deploy to 100 stores, only to discover that you can't dial into the stores to connect to the crashed system because they have just one phone line and they need that for the credit card machines.
@daddy3118 · a year ago
Jupyterlab helps with visualizations. OEIS includes setting series to music.
@Sonsequence · a year ago
He's wronger than he is right. I'd love to be convinced, but I think most of these prescriptions would bring marginal improvement or go backwards. The better a visual abstraction is, the more specific it is to a certain problem and the more confusing for others. The more power you give a human operator to interactively respond on a production system, the more likely they are to rely on such an untenable practice. The one thing I'd like out of all this is the ability to step into prod and set a breakpoint with some conditions which doesn't halt the program but records some state so I can step through. EDIT: Reached the end of the video and thought his Clerk project and the idea of notebooks being part of production code is fairly nice and much more limited than the earlier hyperbole.
@Hyo9000 · a year ago
I think he conflates type safety with bad dynamic ergonomics. We can do better, and have both decent types and decent ergonomics
@ParrhesiaJoe · a year ago
That was very, very good.
@AG7SM · a year ago
I still have warm memories of being able to do commercial Smalltalk development. Nothing else has ever felt quite the same.
@georgiosdoumas2446 · a year ago
I do not know anything about Smalltalk, so what kind of applications were you producing? Who were the customers? What computers ran those applications? In what years? Why did Smalltalk not become popular for corporate/business applications like C, C++, Java, C#?
@pierremisse1046 · a year ago
Smalltalk still exists; Squeak, Pharo, and VisualWorks are just a few examples!
@paul66766 · a year ago
Reminds me of the Gary Bernhardt talk from Strange Loop 2012.
@viri1695 · a year ago
(re-posting, looks like my comment got flagged for having a link in it) Excellent talk! One thing: Zig *will* have live code reloading; there's already been a proof of concept working on Linux. It's just not at the top of the priority list, given the compiler should be correct first!
@Zen-rw2fz · a year ago
I usually do this to better understand computing. I don't even work as a developer of any sort, so I'm just doing this as a hobby, and it's fun to have these challenges
@porky1118 · a year ago
11:01 APL also immediately came to mind, even though I've never used it, nor J.
@diynevala · a year ago
Very interesting talk! Now I got to go back to my JS code, though.
@qu4ku · a year ago
mint quality.
@youcantata · a year ago
I have used the punch machine myself. I wrote a simple toy Pascal compiler in PL/I on an IBM 370 for a college course (the "dragon" book) assignment. Compile jobs were processed once a day at the college computer center. Now we've come a long way and are living in an age when AI can write computer code. What wonderful days to live in.
@JerryAsher · a year ago
We have to remember that the width of punch cards and the number of columns on a punch card go back to the width of a horse's butt in Rome.
@guzmanhernandez3409 · a year ago
There are some low-code platforms that somehow try to implement some of those ideas. I'm talking about Power Apps and their "fx" programming language. Obviously, it has some big limitations and is still a bit painful to work with coming from actual code... but it's quite nice being able to modify something "live"
@daddy3118 · a year ago
Jupyterlab might be a way to recognise that sometimes no one language does it right, and to support interacting runtimes of more than one language.
@janekschleicher9661 · a year ago
Besides spreadsheets, I think the other very popular implementation of live programs that can change (and be inspected) while running is relational database management systems. They are robust, easy to use and experiment with, very often hold most of the business logic, are easy to change, self-documenting, self-explaining (EXPLAIN and DESCRIBE, etc.), highly optimized (partly automatically, and you can give many hints to improve the optimization -> comparable to typing in programming), and secure. Indeed, the possible constraints are much better than in usual programming languages (with type systems and similar) in: expressibility, performance, durability, versioning, self-explanation, transactional behaviour and atomicity. They also enforce grants at different levels of detail, while in classic programming a programmer can very often see everything and change everything, or nothing, but not much in between (like yes: you can do experiments on the running system, but it's guaranteed that you can't change or break something if you have rights to select from other namespaces and create views and similar in your namespace, but no rights for altering or similar things, and you automatically get down-prioritized versus the production system when performance is in doubt). This liveness of the system might be one part of the story of why an incredible amount of business logic lives inside such databases and not in classical programming logic.
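As a small illustration of the constraint point, here is a hedged sketch using SQLite through Python's standard library (the table and column names are made up): the running database rejects invalid data the moment it arrives, with no separate compile step, much like a type system rejecting an ill-typed value.

```python
import sqlite3

# An in-memory database with one declared constraint on live data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (qty INTEGER CHECK (qty > 0))")

conn.execute("INSERT INTO orders VALUES (3)")        # accepted

try:
    conn.execute("INSERT INTO orders VALUES (-1)")   # rejected in-flight
except sqlite3.IntegrityError as err:
    print("constraint enforced:", err)
```

The invalid row never enters the table, so every later query can rely on the invariant, which is the "expressibility plus durability" combination the comment describes.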
@LC-hd5dc · a year ago
you're not entirely wrong, but i think the reason why people use it is because the abstractions are easier to grok at first glance. and the limitations of the languages and tools mean that you can't get completely lost. i still don't agree about the "liveness" of the system being key necessarily though. the whole "compile" vs "run" notion is just an abstraction; i'm not gonna get into the whole "compiled" sql statements topic, but what i will say is that you're going from "i'm waiting hours for code to compile" vs "i'm waiting hours for this statement to run" i don't really see much benefit there. the approachability benefits come from decent tooling imo (like having a small sample of data and seeing how it flows through your query), which programming tools can also implement.
@Sonsequence · a year ago
Interesting points there. I cut my teeth in a naive environment where all backend code was in the RDBMS server. It was very productive and carried a lot of the efficiency and elegance you note. But it was also cowboyish, error-prone and felt more like jazz improv than crafting a great symphony. When I then went and studied proper software engineering I ate up every sanity-restoring technique greedily.
@Desi-qw9fc · a year ago
R is also a one-liner: 1 + c(1, 2, 3, 4). Or even better: 1 + 1:4.
@artiefischel2579 · a year ago
Every time I work in R I feel like I'm back in the era of magtapes and getting your printout from the operator at the computer center. I reflexively look down to check my pocket for a pocket protector. ;-)
@julians.2597 · 3 months ago
it is also an array language at heart after all
@josephlunderville3195 · a year ago
I don't get the point. People have tried all the ideas presented here in various languages. If you don't understand the reasons behind standards, or why a mediocre standard that's actually standard is often more important to have than a "superior" one that doesn't develop consensus, you're missing a dominant part of the picture. For example, the reason TTYs are 80 columns wide is essentially because typewriters were. Typewriters weren't 80 columns wide because of computer memory limitations, they were that wide because of human factors -- that's a compromise width where long paragraphs are reasonably dense, and you also don't have too much trouble following where a broken line continues when you scan from right to left. Positioning that decision as just legacy is missing some rather important points that are central to the talk, which purports to be about human factors. I could start a similar discussion about why people do still use batch processing and slow build systems. There are a few good points in here, and if what you want is comedy snark I guess it's okay. But most of the questions raised have been well answered, and for people who have tried interactive programming and been forced to reject it because the tools just don't work for their problems, this talk is going to sound naive beyond belief. The presenter seems particularly ignorant of research into edit-and-continue, or workflows for people who work on systems larger than toys. The human factors and pragmatic considerations for a team of 10 working for 2 years are vastly different than someone working alone on data science problems for a couple months at a time. The one thing I'll give the presenter is that everyone should give the notebook paradigm a try for interactive programming.
@laalbujhakkar · a year ago
Hey i’m still using a tty terminal on an M2 Macbook faster than a Cray2 Supercomputer
@flyingsquirrel3271 · a year ago
There are a lot of gems in this talk and I like the really "zoomed out" perspective. But talking about all the "traditional" programming languages we use, I couldn't agree less with this statement: 27:24 "And I think it's not the best use of your time to proof theorems about your code that you're going to throw away anyway." Even though you might throw away the code, writing it obviously serves a purpose (otherwise you wouldn't write it). Usually the purpose is that you learn things about the problem you're trying to solve while writing and executing it, so you can then write better code that actually solves your problem after throwing away the first code. If this throwaway-code doesn't actually do what you were trying to express with the code you wrote, it is useless though. Or worse: You start debugging it, solving problems that are not related to your actual problem but just to the code that you're going to throw away anyway. "Proving theorems" that can be checked by a decently strong type system just makes it easier to write throwaway-code that actually helps you solve your problem instead of misleading you due to easily overlooked bugs in it.
@karltraunmuller7048 · a year ago
👏🏻👏🏻👏🏻 What a great, engaging, enjoyably unusual, smart talk.
@alexanderskladovski · a year ago
An epic talk
@jakykong · a year ago
My one real criticism of this talk is that there _is_ in fact value in being consistent over time. Change what needs to be changed, and the core thesis here (TL;DR always look for better ways to do things independently of tradition) is basically right, but sometimes arbitrary things that are historically contingent aren't bad things. The 80 column thing is a good example to me. It's true that we _can_ make longer lines now and sometimes that seems to have benefits, but the consistency of sticking to a fixed fairly narrow column width means highly matured toolsets work well with these, whether that's things like font sizes and monitor resolutions, indentation practices (esp. deeply nested stuff), or even just the human factor of being accustomed to it (which perpetuates since each new coder gets accustomed to what their predecessors were accustomed to by default) making it if not more comfortable, at least easier to mentally process. Maybe there is some idealized line width (or even a language design that doesn't rely on line widths for readability) that someone could cook up. And maybe, if that happens, there would be some gain from changing the 80 column tradition. But until then, there is value in sticking to that convention precisely _because_ it is conventional. Don't fix what ain't broke -- but definitely _do_ fix what _is_.
@jakykong · a year ago
Rather, let me clarify by addressing specifically the "visual cortex" thought. It's absolutely true that we should explore graphics and pictures and how they can be useful - but it's not by any means obvious to me that it's actually worth dismissing 80 columns for being antiquated until and unless graphical systems actually supplant conventional linear representations.
@draconicepic4124 · a year ago
I know that in his opinion live programming languages are appealing, but they aren't always practical. These types of languages have a great deal of overhead and aren't suitable for certain applications. The best example of this is operating systems. In this talk he bashes Rust a little, but the simple truth is that it was never made for this purpose. I know people want the "One Programming Language that rules them All!" so they don't have to learn multiple languages, but reality isn't so kind. Certain languages are simply better at some tasks than others.