Will we be writing Hare in 2099? (with Drew DeVault) 

Developer Voices · 22K subscribers · 17K views

Published: 7 Oct 2024

Comments: 159
@JanKanis · 7 months ago
Code generation and AST parsing/unparsing **is** metaprogramming! You just use tools outside of the compiler, which is just as bad as the C preprocessor (which also operates outside of the compiler).
@kuhluhOG · 8 days ago
tbf, code generators can make sense for certain things (protobuf, for example), but beyond cases like that I want to work on actual types as the compiler understands them
@Debrugger · 7 months ago
So glad to have found this channel, love what you're doing!
@yellowrose0910 · 10 months ago
Interviewer said "Lye-nicks" instead of "Lin-nucks". My man! Instant sub!
@edism · 7 months ago
Lol
@liquidmobius · 10 months ago
I randomly stumbled upon Hare casually browsing the OpenBSD repos about two weeks ago. Considering C has been around for 50 years and is still going strong, I think a 100-year lifespan is actually very achievable.
@vikingthedude · 8 months ago
They've got 50 years to mess it up
@maxrinehart4177 · 5 months ago
C will still run strong for the next 50 years too. Its best longevity secret is its community. Hare, with this mindset (supporting only non-proprietary OSes), would not last 50+ years.
@blarghblargh · 4 months ago
​​​​@@maxrinehart4177 proprietary OSs are going away in the next 20. They're not profitable anymore. Their epoch has come and gone. Hence MS announcing this year their plans to cram theirs full of ads and spyware. If they've gone full mask off, it's game over. They're done providing value, and are ready to raze the place and rake in profits until everyone leaves
@konstantinrebrov675 · 4 months ago
This is what I thought when he talked about a programming language lasting for the next 100 years. Did the creators of the C language intend it to be around for 50 years? I don't know them, but perhaps not. My idea is that a language will almost always still be around as long as there are computers; it persists by inertia. Whether it was the intention or not, I think most modern programming languages will still be around by the end of the 21st century, provided the electronic computer civilization does not collapse. But even in the Fallout world they still had computers.
@shaurz · 22 days ago
C will still be around in 50 years, I wonder if anyone will remember Hare?
@JH-pe3ro · 5 months ago
I was a passive Zig fan for a while (not really doing anything with it, but generally aligned with the goal of usurping C), but I've grown a little more interested in Hare's approach, because it's disinterested in self-extension. If you need to do extension, you really want a Lisp instead of a Fortran (using the Chuck Moore characterization that "there are only four languages: Fortran, Lisp, Forth, and APL"). But as soon as you make a Lisp, you make a jungle of Lisps, none of them quite alike. And this is borne out by everything long-lived that has tried to add some general-purpose extension: C macros, C++ templates, and of course the Common Lisp and Scheme landscape. So I'd rather define extension within a second language made to be a good code generator, e.g. Janet. The design problem of nailing down a systems programming language with good defaults is more prone to bikeshedding than pushing it onto extension, but it leads to a concise implementation that's easy to understand.
@poggybitz513 · 10 months ago
Thank you Kris for such amazing talks. This is the only no nonsense podcast on tech yt I watch.
@morthim · 9 months ago
you should shift to watch more no nonsense yt content.
@notoriouslycuriouswombat · 8 months ago
I have to give a sub, your work is phenomenal! Engaging questions, love the tone you set. I hope you really grow this channel!
@Alexander_Sannikov · 3 months ago
I honestly think at this point "how is it better than C?" question should be forbidden when talking about new languages. It's really not that hard to find something better than a dead horse in 2024. Yes, we know that errno is bad, we know that null terminated strings are bad, we know that C macros are bad. No new language competes against C, it competes against all other languages that compete against C. Tell me how this new language is better than zig, odin and rust.
@oconnor663 · 22 days ago
> We've got solutions for managing generic resources, like Python's `with` for example. And we've got solutions for managing memory, like Rust's borrow checker. But nobody's really tried to unify that.

I don't think the difference between `with` and what Rust is doing is really about the *kind* of resource being managed. It's more about the *scope*. The `with` keyword (like `using` in C# or `defer` in Go) is tied to a particular block of code. The resource in question always gets cleaned up at the end of that block. Rust's borrow checker is intimately related to its model of ownership and move semantics, and the lifetime of a resource can be more complicated than any one block. This makes the borrow checker suitable for managing all sorts of things: memory, iteration, mutability, files, locks, threads, etc. But of course Python would be unusable (any language would be unusable?) if every single object had to have a known scope.
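The ownership point above can be sketched in Rust (a toy illustration; `Resource` and the log are hypothetical): `Drop` gives `with`-style cleanup when the owner leaves a block, but because cleanup follows ownership, a value can also escape the block that created it.

```rust
use std::cell::RefCell;

// A resource that records when it is cleaned up, analogous to the
// __exit__ of a Python context manager.
struct Resource<'a> {
    name: &'static str,
    log: &'a RefCell<Vec<String>>,
}

impl Drop for Resource<'_> {
    fn drop(&mut self) {
        self.log.borrow_mut().push(format!("closed {}", self.name));
    }
}

fn run() -> Vec<String> {
    let log = RefCell::new(Vec::new());

    // `with`-style: cleanup is tied to this block.
    {
        let _scoped = Resource { name: "scoped", log: &log };
    } // dropped here, exactly like leaving a `with` block

    // Ownership-style: the value escapes the block it was created in,
    // so cleanup happens wherever its *final* owner releases it.
    let escaped = {
        let r = Resource { name: "escaped", log: &log };
        r // moved out; not dropped at the end of this block
    };
    log.borrow_mut().push("escaped still alive".to_string());
    drop(escaped);

    log.into_inner()
}

fn main() {
    println!("{:?}", run());
}
```

The two cases produce the same cleanup code but attach it to different things: a block in the first case, an owner in the second.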
@kenneth_romero · 7 months ago
recently saw tsoding messing with it. seems pretty interesting. maybe in a few years we'll see if the toolchain for it gets easier to use.
@makeitreality457 · 7 months ago
It seemed very straightforward, and very complete. But I had to compile everything from the latest sources. The distro package was a no-go.
@shaurz · 22 days ago
Linear types are interesting but pretty awkward to work with in practice since you have to thread the resource handle through every function call.
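That threading pattern looks roughly like this in Rust, whose move semantics approximate the linear style (a toy sketch; `FileHandle` and its helpers are hypothetical): every operation consumes the handle by value and hands it back.

```rust
// A handle threaded through every call: each operation consumes it
// by value and returns it, which is what makes linear style verbose.
struct FileHandle {
    bytes_written: usize,
}

fn write_line(mut h: FileHandle, line: &str) -> FileHandle {
    h.bytes_written += line.len() + 1; // pretend we wrote line + '\n'
    h
}

// The final consumer takes ownership for good.
fn close(h: FileHandle) -> usize {
    h.bytes_written
}

fn session() -> usize {
    let h = FileHandle { bytes_written: 0 };
    // Every call site has to hand the handle forward explicitly:
    let h = write_line(h, "hello");
    let h = write_line(h, "world");
    close(h)
    // Any use of `h` after close() is a compile error: it has been moved.
}

fn main() {
    println!("{} bytes", session());
}
```

The repeated `let h = ...` rebinding is exactly the "threading the handle through every function call" the comment describes.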
@edgeeffect · 6 months ago
The advantage of having some kind of macros in a language is that it gives you something similar to what you get with Zig's `comptime`... and this is really, really useful when you're working on very constrained systems, like microcontrollers. Got me thinking: what has Hare got for `comptime`?
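I won't claim what Hare's compile-time story is, but the kind of thing `comptime` buys on microcontrollers (tables precomputed into read-only data) can be sketched for comparison with Rust's `const fn`:

```rust
const TABLE_LEN: usize = 8;

// Evaluated entirely at compile time; nothing runs at startup.
const fn squares() -> [u32; TABLE_LEN] {
    let mut t = [0u32; TABLE_LEN];
    let mut i = 0;
    while i < TABLE_LEN {
        t[i] = (i * i) as u32;
        i += 1;
    }
    t
}

// Lands in read-only data (flash on an MCU), never computed at runtime.
static SQUARES: [u32; TABLE_LEN] = squares();

fn main() {
    println!("{:?}", SQUARES);
}
```

On a microcontroller the same pattern would typically hold sine tables, CRC tables, or pin maps, with zero startup cost.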
@digitalspecter · 2 months ago
Ahh, Hare-brained Scheme sent me 😆 (it would probably be a cool project too!)
@g0mf · 9 months ago
Thank you both for this interview!
@christofferlerno2633 · 9 months ago
C3 when?
@CristianMolina · 2 months ago
Love these interviews
@MiaChillfox · 22 days ago
hare looks interesting, I like the idea of the language not breaking things once it's 1.0. I don't really care if my code runs in 100 years, but currently with the programming languages I use, it's very tedious to add a new feature or fix a bug in a 2 year old project. It's half a day of updating the code or even longer setting up a VM with the old compiler and libraries before I can even start to make the changes I want to.
@toby9999 · 16 days ago
I've just imported a 25-year-old C++ program into MS VS 2022, and only a few changes were required around `for` loop scope. Took 10 mins. The code was originally developed on Windows 95 with MSVC 4.0. Needless to say, I was rather surprised that it was so easy.
@MiaChillfox · 16 days ago
@@toby9999 That's awesome.
@colinmaharaj · 8 months ago
16:00 So you should also implement massively parallel programming primitives in the language to 'talk' to GPUs using CUDA- or OpenCL-type dev. In less than 100 years, 16, 32 and 64 cores will be normal on the desktop (some of them non-performance cores), and there should be a primitive to 'talk' to those 32 cores on a CPU, or 1024 cores on a GPU, to aid in parallel development, without a major change in the syntax; maybe a toggle switch in the language. 53:50 liked and heading to the hills
@makeitreality457 · 7 months ago
Right now, parallel primitives aren't expected from a language, since the mindset is that libraries such as basic linear algebra subprograms (BLAS, clBLAS, cuBLAS...) take care of that. The problem might be interoperating with closed-source video drivers. But the new open-source drivers, and Vulkan, are making that avenue more accessible. Maybe a good direction as more such hardware becomes widely available and reachable directly through software.
@blarghblargh · 4 months ago
My computer is 3-ish years old, and it has 16 true cores, 32 hyper threaded. This reality isn't even 10 years out.
@kuhluhOG · 8 days ago
@@makeitreality457 Yes, right now. But if you create a language with the explicit goal of making it for the future, especially with the goal of it not changing anymore, you need to think (or rather guess) about how things are going to be at that point. And if you think that this is practically impossible to do and pretty much entirely luck-based, I guess you've also answered yourself how likely it is for such a language to succeed.
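As a baseline for the thread above, today's practice is indeed library-level threading rather than language primitives. A minimal Rust sketch (function name hypothetical) that fans a sum out across however many cores the machine reports:

```rust
use std::thread;

// Split a sum across the cores the OS reports; falls back to 1 thread.
fn parallel_sum(data: &[u64]) -> u64 {
    let workers = thread::available_parallelism().map(|n| n.get()).unwrap_or(1);
    let chunk_len = ((data.len() + workers - 1) / workers).max(1);
    thread::scope(|s| {
        data.chunks(chunk_len)
            .map(|part| s.spawn(move || part.iter().sum::<u64>()))
            .collect::<Vec<_>>() // spawn all workers before joining any
            .into_iter()
            .map(|h| h.join().unwrap())
            .sum()
    })
}

fn main() {
    let data: Vec<u64> = (1..=1000).collect();
    println!("{}", parallel_sum(&data));
}
```

`thread::scope` lets the workers borrow `data` directly; the comment's point stands that none of this is a *syntax-level* primitive, just standard library calls.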
@thegeniusfool · 14 days ago
I get a HolyC feeling here.
@kuhluhOG · 8 days ago
What baffles me about Hare is that multithreading is not supported at all, to the point where they explicitly state that you should not do it, that you are on your own, and that the stdlib could break if you do (since it's not threadsafe). Did they even look at how computer hardware evolved over the last ~~10~~ 5 years? Or at where hardware manufacturers say things are going in the next decade? If this weren't a systems programming language, fine; a higher-level language can get away without support for it (look at Python). But a systems programming language?
@smokkku · 8 months ago
You should complete the series with Vale language.
@DeveloperVoices · 8 months ago
Yes, definitely want to do Vale. (Though I suspect the series will never be complete. 😅)
@shaurz · 22 days ago
@@DeveloperVoices Have you done Odin yet?
@DeveloperVoices · 18 days ago
@@shaurz Sure have! That's over here: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-aKYdj0f1iQI.html
@anotherelvis · 3 months ago
Hare looks nice. It is not overly inventive, but it has nice defaults.
@TheHubra · 10 months ago
Very interesting
@replikvltyoutube3727 · 7 months ago
Who else came (pun intended) here from tsoding?
@mikumikudice · 7 months ago
people complaining Hare is primarily targeting real operating systems such as Linux instead of toy operating systems like Windows
@fiona9891 · 5 months ago
truly don't understand why someone would make such a thing, what a waste of time. a niche operating system made solely for playing videogames will never take off
@justinhale5693 · 3 months ago
Are linear types similar to what ATS uses?
@adammontgomery7980 · 10 months ago
I don't get why somebody can't just take C and strip out macros and header files, add some mechanism for generics (which also leads the way to result types), and revamp the type system (I don't like typing uint16_t etc.). I'm not a good programmer at all, and every language I learn feels like it's in my way. At least with C, I know I have to build everything from scratch, but it's such a simple language that I sort of know the path I have to take. I can appreciate the safety of Rust, but it feels like a wrestling match against the compiler (I always lose).
@poggybitz513 · 10 months ago
You should take a look at modern C; it's much better. Also, C is not a dead language: it is actively being developed, and features are added every year. It's also a common misconception that you have to do everything from scratch in C. I write C every day, and at no point do I do everything from scratch; I use a variety of libraries to get the job done.
@abowden556 · 10 months ago
Zig does some of this. By replacing macros with comptime code exec (kinda like Lisp) you get a lot of power (including generics), but instead of having to learn a new language for your language, it's still Zig. As for the type system, I'm not totally sure; it may or may not be to your liking. It solves at least two of your problems with C, though, so probably worth checking out in your case. Might learn it myself, actually.
@adammontgomery7980 · 10 months ago
@@abowden556 it does. I like zig a lot but it's not stable yet. There's not much information (that I can find) about how to write build.zig. I would like to be able to write zig for embedded but don't know what to do about linking. That's what I mean when I say I'm not a good programmer; what seem to be simple tasks put me at a standstill
@androth1502 · 9 months ago
take a look at C3
@_slier · 9 months ago
@@poggybitz513 How is it better? If you still need to create header files, it doesn't sound better at all.
@TurtleKwitty · 6 months ago
The real problem with older C programs is people dynamically linking glibc rather than statically linking it; that's usually where the 3-year-old programs fail. glibc, despite claiming to make C more stable, is the most unstable thing in the universe, but anything that was statically linked (or just makes syscalls directly) still runs perfectly fine.
@kuhluhOG · 8 days ago
And glibc is so not meant to be statically linked that it may break from just doing that (yes, I have seen it happen; no, I don't understand how). So I would recommend statically linking a different libc, like musl.
@evan_ca · 10 months ago
C macros are trash, agreed, but I'm a little confused that the solution is external code generating programs writing to source files. The extent to which this can even be considered a significant upgrade to doing codegen with C (which is already my personal go-to strategy for meta-programming in C) would seem to come down to the quality of internal ast manipulation tools. Assuming this is done well, you're a hop, skip, and a jump from something much closer to typed lisp-style macros. It sounds pretty good still, don't get me wrong, I just don't understand not going the extra mile to get compile-time superpowers. Will be checking out Hare either way.
@edgeeffect · 10 months ago
Yes... comptime is the feature that most attracted me to Zig, and procedural macros to Rust... running external codegen programs is how I do things with Kotlin and assembly language; can't say I find it that great.
@jonathanmarler5808 · 10 months ago
The disadvantage of any metaprogramming language feature is that it's always going to be more difficult to understand than the equivalent non-metaprogramming code. Code generation has the same problem, but with it you still get real, concrete code on the other side that you can look at/debug. An interesting example of this is the D programming language's "mixin" feature, which allows you to take any compile-time-known string and interpret it as code. Cool idea, but understanding mixin-based code is always more difficult than the equivalent static code. There was a proposal to take all mixins and write the generated code out to disk so the code could be inspected/debugged later, but at that point it's really just a glorified code generator, which all languages can already do without having to support advanced language features like mixin. This being said, metaprogramming isn't "bad" or strictly worse than code generation... it can be very nice/convenient to use. The point is that language designers should always compare any metaprogramming feature to the "code generator" solution and consider the tradeoffs. It's clear that Drew understands this equivalence and the tradeoffs.
@evan_ca · 10 months ago
​@@jonathanmarler5808 proper meta-programming is strictly superior to code generation, unless there's something preventing you from having some function that converts an internal ast to formatted code. It's not even close, since a meta-programming system like we're discussing has type information that you generally need to separately track yourself for code generation. There are tons of things in software development that are subjective or simply a matter of tradeoffs where the best answer is "it depends", but metaprogramming vs codegen is not one of them.
@jonathanmarler5808 · 10 months ago
@@evan_ca your take on this intrigues me. I think by your description, Dlang's mixin feature would not qualify as proper metaprogramming as it doesn't operate on the ast nor have knowledge of types (it's much more like code generation). I'm open to learn more if you can point me to resources to check out?
@krumbergify · 10 months ago
Adobe is pushing the language Val which also relies on linear types.
@OlivierDALET · 4 months ago
The OS with its capability model and messaging system is reminiscent of Google's Fuchsia
@19Draco96 · 10 months ago
I'd love having something like golang with sum types and better error handling or rust with implicit traits and GC so you never have to think about lifetimes. I really think the perfect language is somewhere between those two and would love to see some experimentation in that direction.
@pedrinhocrafte3216 · 9 months ago
Check out the V language; it might be a good fit for what you are describing
@etooamill9528 · 9 months ago
I've heard that OCaml is Rust with GC, but it's very far from the C procedural programming style
@oconnor663 · 22 days ago
> rust with implicit traits and GC

Totally. Could be super useful.

> so you never have to think about lifetimes

Aaa! The single thing that makes Rust Rust is the no-mutable-aliasing rule. Even if you have GC, you don't want to get rid of that rule, and that rule requires lifetimes. I'm summarizing a great article called "Notes on a Smaller Rust", but if I link to it YouTube will black-hole my comment :(
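The no-mutable-aliasing rule mentioned above fits in a few lines of Rust, independent of how memory is reclaimed: shared borrows may alias but freeze the value, and mutation requires exclusivity.

```rust
fn demo() -> Vec<i32> {
    let mut v = vec![1, 2, 3];

    // Any number of shared (&) borrows may coexist...
    let a = &v;
    let b = &v;
    assert_eq!(a.len() + b.len(), 6);
    // ...but mutating while they're live is rejected at compile time:
    // v.push(4); // ERROR if uncommented: `v` is borrowed

    // Once the shared borrows are no longer used, one exclusive
    // (&mut) borrow is allowed again.
    v.push(4);
    v
}

fn main() {
    println!("{:?}", demo());
}
```

A GC would decide *when* `v`'s memory is freed, but this shared-xor-mutable rule is about *who may touch it right now*, which is why it survives even in a garbage-collected design.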
@AtomSymbol · 6 months ago
... if I was implementing a microkernel, I would also put ideas similar to NaCl-like (Google Native Client) technologies directly into the microkernel.
@tapwater424 · 6 months ago
What makes NaCl different from other byte-code interpreters?
@FreeScience · 4 months ago
That does not seem like something that belongs in a microkernel. But using interpreters in a fundamental server could of course be viable.
@SaHaRaSquad · 10 months ago
I only looked a bit at Hare but I already like its concept a bit more than Zig. I think Zig's error handling design is its biggest mistake, only being able to return an error code without context makes it too limiting. Hare on the other hand seems to do this better while otherwise looking mainly like a cleaner C. Nonetheless I think both languages are promising.
@krumbergify · 10 months ago
Since Zig has no RAII and no GC it can’t support arbitrary error value types. It would be a shame to have to defer destroy() all errors unless you return them. Error handling in Rust is quite verbose compared to Zig unless you use ”anyhow” which puts errors on the heap and turns Rust into Go.
@thebatchicle3429 · 10 months ago
@@krumbergify Not really true. There's no reason why an error couldn't contain more context than a simple error code while not needing to be destroyed
@krumbergify · 10 months ago
@@thebatchicle3429 Agreed, and it could for example be nice to see where in a file a JSON parse error occurred. It would, however, require some discipline not to make error unions too big and not to put in pointers to things with a non-static lifetime. If you put two extra integers in your error type, that will grow the size of any instance of anyerror as well.
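A by-value error with context, as discussed in this thread, might look like the following Rust sketch (`ParseError` and `parse_digits` are hypothetical): no heap allocation, no destructor, but every `Result` pays for the largest variant, which is exactly the size trade-off noted above.

```rust
// An error that carries context by value: no heap, no cleanup needed.
#[derive(Debug, PartialEq)]
enum ParseError {
    UnexpectedChar { col: usize, found: char },
    Empty,
}

// Parse a base-10 number, reporting *where* parsing failed.
fn parse_digits(input: &str) -> Result<u64, ParseError> {
    if input.is_empty() {
        return Err(ParseError::Empty);
    }
    let mut n: u64 = 0;
    for (col, c) in input.chars().enumerate() {
        match c.to_digit(10) {
            Some(d) => n = n * 10 + u64::from(d),
            None => return Err(ParseError::UnexpectedChar { col, found: c }),
        }
    }
    Ok(n)
}

fn main() {
    println!("{:?}", parse_digits("12x4"));
}
```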
@makeitreality457 · 7 months ago
After playing around with it: Hare is terrific for Linux, but it lacks the cross-compile targets of Zig. And apparently you can also repurpose the Zig compiler to compile C and C++, run a complete alternative build system (zig build), and also run code.
@blarghblargh · 4 months ago
I am not sure the error handling in Zig is set in stone; I think there's still an open proposal they are considering. After having used Rust for a while, I agree that just having codes in Zig is a little sucky. Still better than exceptions, though. And I am still more bullish on Zig than on Rust: the cross-compiling, C/C++ migration, hot reloading and incremental compiling, and being able to switch out LLVM (eventually) are all killer features.
@DerekSeymour-f3b · 8 months ago
This guy might want to rethink calling his language hair.
@_slier · 9 months ago
But no Windows... it's like a delusional programming language. Try looking at Odin; it's the best low-level language out there before Jai comes out (will it ever?)
@133289ify · 9 months ago
A 100-year language... imagine still writing for-loops in a hundred years
@DeveloperVoices · 9 months ago
Sadly I can imagine we’ll still be writing COBOL in 100 years. 😅
@gnatinator · 22 days ago
sounds like a 100 year lang that'll take 100 years to finish.
@baxiry. · 7 months ago
24:24
@delibellus · 9 months ago
wait, he said "ed", not "ed"!
@happygofishing · 8 months ago
Hyprland jumpscare BOO!
@mari3434 · 6 months ago
Boo! I have no COC!
@happygofishing · 6 months ago
@@mari3434 Drew seethes daily because the person who made the best Wayland compositor is not him.
@mari3434 · 6 months ago
@@happygofishing vaxryGODS..... we won......
@rednibcoding3412 · 8 months ago
Hare is pretty much useless for serious projects because it only supports non-proprietary operating systems (both as host and as compile target). This means popular systems like Windows and macOS are not supported. I like the general idea and concept of Hare, but this no-proprietary-OS stance is holding it back. For Hare to gain any popularity, they will need to reconsider this design choice.
@codenameirvin1590 · 7 months ago
100%. I came across this in the docs and immediately closed the browser tab and disregarded the language. It is some sort of martyrdom that guarantees the failure of the language. I understand that their stance is that anyone can create a compiler based on the spec; however, that encourages the sort of fragmentation that C suffers from.
@JakobKenda · 6 months ago
so fork it
@codenameirvin1590 · 6 months ago
@@JakobKenda why would I? What incentive do I have? There are already other languages out there that are good and support all major platforms.
@pyrotek45 · 5 months ago
Windows and Mac suck, prolly won't be around for long without breaking changes or will get replaced by some new version like they have been. They're terrible targets for anyone who wants to make a project stand the test of time.
@codenameirvin1590 · 5 months ago
@@pyrotek45 that’s absolute and utter nonsense.
@markzhitomirski7921 · 10 months ago
Hmm ... [18:29] "try to find a binary that was built for Linux three years ago". Easy, take a statically built mysqldump 5.7.25 (for linux-glibc 2.12; GA 2019-01-21), it runs on a freshly updated Ubuntu 20.04 (glibc 2.31), and I have little doubts it runs on 22.04 or newer. That said I admit it ain't easy to build/maintain this crap, with which I dealt firsthand, so I'm all for throwing C out of supply chain. Long live Hare 1.0!
@toby9999 · 16 days ago
I like C as long as macros aren't overused, but I like C++ better. People just love to bash C (and C++) because it's seen as the cool thing to do by the 'in crowd'.
@theb1rd · 10 months ago
No preprocessor, no multithreading, no thanks
@MagnusNemo-xc5nx · 10 months ago
Odin
@edgeeffect · 10 months ago
Anything that considers Niklaus Wirth to be a design idol has to be worth a look!!! :) But, like Hare, no ARM32 and no MIPS32 support is a deal breaker for me. :(
@edgeeffect · 8 months ago
@@actualwafflesenjoyer if you like to mess around with PIC33 microcontrollers or old embedded devices and OpenWRT... yup!
@matthijshebly · 3 months ago
No Windows support => Dead in the water
@michelians1148 · 2 months ago
Hare has bad syntax and design because Drew can't write a good text parser. Everything is compromised to make his life easier.
@toby9999 · 16 days ago
This language sounds worse to me than Rust and Zig and a bunch of other C or C++ wannabes. Use-once semantics seem like a bunch of unnecessary overhead for no good reason.
@nanthilrodriguez · 9 months ago
Since there has not yet been a single low level language to provide me a macro system, I still will not be leaving C. What is it with kids and refusing to allow programmers control over the language, and not just the computer?
@DeveloperVoices · 9 months ago
Yeah, there's a lot of appeal (and certainly a lot of brain food) in having a language that lets you manipulate its own code. Buuuuut... I wouldn't hold C up as a great example of that. Its string-based macro system kinda sucks. IMO, Lisp had the right idea here: let people manipulate the language *in the language*, with AST transforms. It works so much better than string-munging. You might find Zig interesting. It's a C-style language with a macro system that works at the language level, rather than the source-string level. YMMV of course. 🙂
@nanthilrodriguez · 9 months ago
@@DeveloperVoices The issue for me is that there are whole universes of language design space left entirely unexplored because no one has even conceived that it is possible. APL, J, K, and all the Iversonian languages have proven the benefit of notation, not to mention the power that mathematical notation unlocks in the human mind. And yet, if I'm going to use a low-level language, I'm forced to use the most cumbersome and unwieldy notation ever conceived for problem solving. How many times must I jam out for-loop syntax before I get a language-level construct for implicit iteration? K iteration looks like this: fn'array
@wildebeest1454 · 9 months ago
Nim?
@SimGunther · 9 months ago
@@nanthilrodriguez Any way to create this mystical land of notation without it devolving into a write-only language like literally every Iversonian language? Maybe some examples of typical problems solved in C/other systems languages that can be better clarified with your preferred style of notation?
@mskaarupj · 8 months ago
@@nanthilrodriguez I agree, since last century several high level languages have ways to avoid loops, which clarifies code once you get used to reading it. Why don't new low level languages implement this? I also like mathematical notation, but this probably isn't that useful when writing operating systems, otherwise it would have been implemented in some of the low level languages.
@androth1502 · 10 months ago
Doesn't support Windows; it's not going to be the language of 100 years. It's not going to be anything at all.
@ulipink · 10 months ago
2123 will be the year of the linux desktop
@satinxs8 · 10 months ago
Honestly true, I think this is their biggest mistake. Sure, QBE is simpler than LLVM and all, but not supporting one of the major platforms will hurt adoption for sure.
@androth1502 · 10 months ago
@@satinxs8 It's always kind of sad when developers bring their personal ideologies into their work. Andrew is also hostile towards Windows, but at least he begrudgingly supports it. The most chill of the bunch is gingerBill, and we know Windows will be a first-class citizen because Odin is developed on a Windows machine.
@satinxs8 · 10 months ago
@androth1502 I don't mind people bringing ideology into their projects; it's their brainchild, after all. But don't honestly expect widespread adoption.
@donwinston · 10 months ago
Makes no sense not supporting windows
@donwinston · 10 months ago
Retro languages are not the future. The more features in a language, the better. C++ is a huge improvement over C. Java is a huge improvement over C++ (for business apps). Scala is an improvement over Java.
@thebatchicle3429 · 10 months ago
Literal web-developer-tier opinion
@vyyr · 9 months ago
I worry about your job security in the upcoming decade.
@donwinston · 9 months ago
@@vyyr Java and PHP and JavaScript will dominate for decades and decades. AI will take my job not some Go or Rust programmer.
@vyyr · 9 months ago
@@donwinston 1. That's exactly what I was suggesting, but "decades and decades" is a wild statement regarding code-monkey webdev jobs: one decade until early adopters, two decades until you're phased out, probably. 2. Both Go and Rust programmers are slowly taking over the backend market. 3. It's not about languages; it's about understanding the underlying concepts of programming and hardware.
@happygofishing · 8 months ago
Java is good, but it's too verbose, and everything being an "object" is an abomination. Go is a good example of a modern language.
10 months ago
What about Lurcher.
@KushLemon · 4 months ago
DeVault is also a curmudgeon: upon having issues in his crappy code pointed out, he gets uber-defensive and attacks people. Hahaha.