Why today's software sucks 

bitheap-tech
1.2K subscribers
6K views

Published: Sep 6, 2024

Comments: 109
@timmy7201 4 months ago
As a software engineer myself, I put full responsibility on: A) Sales, because they constantly sell things that are impossible to create in any realistic manner. B) Software architects, because they're too incompetent to code themselves. C) Management, because they believe everything A and B tell them... I just spent this workday writing a library that I know is going to cause a plethora of problems. Not because my code is bad, but because the whole architecture of the code base is the equivalent of a cesspool! But yeah, management made sure that the previous guy who complained too much isn't with us anymore! So everyone just shuts up and writes code the way they're told to...
@boxguytv 14 days ago
I honestly wish many of these large corporations would actually take the time to create and then launch apps/websites that are completely functional and not overly bloated with ads and random BS, just because someone needs to justify a job that really shouldn't need justification. You coder dudes are there to build, bug fix, and maintain your crap, not build junk to fulfill some stupid person's idea of productiveness.
@YK_data 4 months ago
I agree completely. Even the biggest tech companies create products with unbelievable logical errors. After hundreds of thousands of years of human history, we cannot even deliver scroll-consistent websites. Unbelievable. Today's developers are so addicted to frameworks, ORMs, packages, abstractions, and componentization that they end up making simple jobs much harder for the sake of their mottos, such as DRY.
@henson2k 4 months ago
I don't think developers have any say in what frameworks are used or what product gets delivered to customers.
@user-yp4qv2sv6r 4 months ago
DRY is a cost reducer, so your company doesn't need a programmer with deep knowledge, or one who can do many things. A biblically accurate example is Unity (not a very good piece of code), which can save a lot of money versus a handmade game engine, which needs really good people and tons of time.
@vitalyl1327 4 months ago
The greedy industry is trying to move faster than it can, resulting in hiring people who should have never been allowed anywhere close to any engineering (and I mean 99% of those who call themselves programmers these days).
@m13v2 4 months ago
martin fowler did a good thing in documenting patterns and practices. the issue is actually that people ain’t reading what he and others actually wrote. he’s also not happy, as are kent beck, et al., with what agile has become. the real issue is that the demand for software developers grew so fast that education and science couldn’t keep up and instead we now have something that alan kay called “pop culture”. the whole agile, ddd, clean code thing basically grew out of what alan kay was trying to do: computers which could be programmed by their users. then the internet and java hypes killed it.
@MilMike 4 months ago
Oh yeah! Agile is the worst. I started working for an agile company almost 2 years ago (still working there), and compared to all my non-agile years before, we have so many meetings (1-4h per day). For every small thing: a meeting. For a feature which I could do alone, without any distractions, in 2 weeks, we have already been working for 2 months now. We are over budget and the customer is pissed. Because everything needs a damn meeting. Agile should increase the quality of the project, but it is just increasing my blood pressure.
@srki22 3 months ago
That is what I was going through when working for Amazon. So inefficient.
@kerojey4442 4 months ago
Almost every app made by big companies gets worse with every update; devs are paid to undevelop their software.
@bitheap-tech 4 months ago
I have the same feeling! Why do you think this happens?
@kerojey4442 4 months ago
@@bitheap-tech 1) Very dogmatic practices (OOP) which bring unmeasurable advantages at the cost of performance and quality. 2) Big companies are only interested in how fast features are shipped. They don't care about the quality of such features. 3) Bad code in slow languages on top of decades of legacy code and abstraction. 4) No standards for quality software. People are seemingly okay with sites that load a welcome page in 10 seconds(!!!!!), same with some desktop apps (Visual Studio, VS Code).
@mrECisME 4 months ago
@@bitheap-tech Because they need to make work for themselves. Everything must always improve, even if it is perfectly fine, because otherwise you are out of a job. Also, there is an obsession with DevOps processes and automated testing... but not the actual testing that matters: user testing.
@timmy7201 4 months ago
As a software engineer myself, I can confirm that it's usually management who forces us to implement stupid changes... Those few developers who protest are usually removed from the company...
@wayfa13 4 months ago
this is what happens when you're only employed for the project sprint or to develop a product. The next devs have to figure out wth happened and try to fix things that were built wrong, but they probably don't have the time to do so, so they just throw their new stuff on top of older, broken stuff, and the flaky tower built on dolomite ground just keeps growing
@bozbrown4456 4 months ago
The problem of minimum viable product, and the obsession of managers who have never written production code with the latest and greatest buzzword or programming methodology, has also caused a lot of this; I have seen it play out over the past thirty years. Current AI systems can write your code for you with just a prompt, so it's only going to get worse, and in time it will only be the AI which understands the full stack. Glad I'm retiring soon.
@wilfridtaylor 4 months ago
Yeah, I see it all the time. Lots of projects with little direction get stood up, and everyone grabs the latest cool framework to reinvent the wheel yet again. Then we get to watch said projects fail, or worse, actually deliver a mess that we get stuck with for years.
@bitheap-tech 4 months ago
absolutely! AI will dumb devs down even more, which will result in even lower quality. We have even started to see how Copilot-written code affects code quality throughout GitHub projects: visualstudiomagazine.com/articles/2024/01/25/copilot-research.aspx
@wayfa13 4 months ago
" Current AI systems can write your code for you with just a prompt so it's only going to get worse" one big GIGO circlejerk incoming lol
@GamingHelp 3 months ago
"minimum viable product"
@KipIngram 4 months ago
The software industry is pathetic. But it's not really so much the "fault" of current programmers. The seeds of this situation were laid decades ago, and we've just continued to dig the hole deeper ever since. The original problem was that we let legacy software steer the development of new microprocessors. What we SHOULD have done is let the hardware guys figure out how to give us the best, most performant, most secure hardware possible, and then we do what we have to do to use it, even if that means re-writing some software. We should have kept the architectures SIMPLE, easy to understand, and then we'd have had a SHOT at keeping them SECURE. The second problem is over-reliance on libraries. Modern programmers don't really want to write software - they want to stitch together libraries, without understanding how those libraries actually work. There's just no way to keep the stuff secure when the programming team doesn't really understand the big picture. Ten million code monkeys is NOT better than ten thousand truly exceptional programmers who actually DO understand what they're doing. I agree with EVERYTHING you've said here.
@rocknowradio 4 months ago
I think the main problem was letting idiots run the ship. I mean, in any other industry you don't see a person running the show without knowing anything about it. But here I am, developing network scanners for managers who couldn't tell the difference between DHCP and a cup of coffee.
@KipIngram 4 months ago
@@rocknowradio I totally agree - I've always believed tech development projects should be run by someone who actually knows the tech. That's the one person you most need to understand the "big picture" and how everything fits together. And his job is to make sure that all the pieces developed by the younger, less experienced people on the team are actually going to "fit together" when the time comes. This "pure manager" thing, where you just have a guy who knows some agile lingo and is good at slapping sticky notes on the wall running your show, is a fail before you even start. And the other part of that leader's job is to also be teaching those younger guys WHY the decisions are made the way they are, so they can grow up to be leaders too. He or she should be watching and taking note of who shows promise, etc.
@Mirgeee 4 months ago
Absolutely fucking nailed it. I totally agree. If I were to make a video, I would just say the same thing less eloquently.
@philmarsh7723 4 months ago
The principal problem: Management. Management doesn't reward those who deliver.
@ErazerPT 4 months ago
While the points aren't wrong, you missed the real issue: ever-increasing complexity. As the number of "moving parts" ever increases, the "failure surface" also increases. Writing Hello World is trivial. Writing a PDF renderer is not. If you could count "discrete units of work", you'd find that the "bugs per unit of work" was similar. It's just that now there's a lot more of them. And sadly, as combinatorics can assure you, the number of (potential) interactions between discrete units rises exponentially, which in turn makes the "number of interactions that fail" skyrocket. So, what is the solution in an ever increasingly complex world? You patch a lot more, because we've gone past the point where it's humanly possible to explore every failure mode from the get-go...
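This combinatorial point can be illustrated with a few lines of Python (my own sketch, not code from the thread): even counting only pairwise interactions, the ways components can interact grow quadratically, far outpacing the part count itself.

```python
# Toy illustration (editor's sketch): how pairwise interactions between
# components outgrow the component count itself.
from math import comb

for n in (10, 100, 1000):
    pairs = comb(n, 2)  # n*(n-1)/2 possible pairwise interactions
    print(f"{n:>5} components -> {pairs:>7} pairwise interactions")
```

Going from 10 to 1000 components multiplies the parts by 100 but the pairwise interactions by more than 10,000, before even counting three-way interactions.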
@ccj2 4 months ago
I think this is a good point, but I also believe the issues stated in the video exacerbate it to where it gets as bad as it is now.
@m13v2 4 months ago
software development actually is all about managing complexity. not being able to do that is the issue.
@ErazerPT 4 months ago
@@m13v2 I'd say it's architecture not development. But notwithstanding, the more complex, the more unmanageable it becomes. And there's "complex you made and own" and "complex you use without control". DX for example has no shortage of barely documented "this should but won't". Add drivers that "should but don't" and your failure surface just ballooned. In a humorous way, given the complexity of modern software and middleware it uses and OS's it runs on and devices underneath that, it's more amazing that so many things work right than that a few work wrong...
@KipIngram 4 months ago
Yes - this is absolutely part of it. Decades ago we started to let the tail wag the dog, as the processor industry chased the ability to run legacy software as fast as possible. And what we've wound up with as a result is processor cores that are pretty much impossible for one person to totally understand, all the way down to the bottom. What we should have done is just swallowed the multi-core pill much earlier (we did eventually have to swallow it anyway), while keeping the core design as simple and straightforward as possible. It's gotten to the point now where more of the logic in a core is aimed at this "legacy acceleration" than at the basic fundamental computing. And that's why we get stuff like Spectre/Meltdown - the left hand doesn't understand what the right hand is doing, and there are unforeseen interactions. If we'd done that right, then today we'd have chips with hundreds or thousands of very simple cores, instead of a few cores that no one can understand, and the "basic structure" of the software landscape would be entirely different. The same problem occurs in the software side through the myriad libraries that programmers want to use, instead of just solving the problem that's in front of them with the most simple, direct code possible.
@m13v2 4 months ago
@@ErazerPT i absolutely despise the distinction between „development“ and „architecture“. yes, when things get complex you start to add structure to keep complexity manageable. you start to separate space and that’s architecture. „development“ without „architecture“… that’s just garbage dumping, instant legacy code.
@tamtrinh174 4 months ago
the more you work, the more you know about what you do, but less about everything else
@Asdayasman 4 months ago
Martin Fowler never having done anything useful is an interesting point. I remember when RMS got his feet out and started moisturising them in the middle of an interview, and, in awe of how little of a fuck he gave, I wondered what he'd done to make him so revered. Turns out he wrote GCC. Moisturise at will.
@chucksneedmoreland 4 months ago
ok lets be honest it was a bit more than moisturizing, he picked something off his foot and ate it
@Asdayasman 4 months ago
@@chucksneedmoreland Unfathomably based.
@chucksneedmoreland 4 months ago
@@Asdayasman he can do whatever he wants
@bitheap-tech 4 months ago
Damn, didn't know about that interview... Need to redo the video 😂
@Asdayasman 4 months ago
@@bitheap-tech Wait why? He wasn't mentioned in the video, you're fine.
@drone-reels 4 months ago
Great points! I would add the maintainability dimension to the quality part. It's more common than ever to see non-backwards-compatible software. It's very annoying to normalize such a low standard for delivering software.
@ratfuk9340 4 months ago
tl;dr 1. What you likely believe is low-level (i.e. "what lies beneath the abstraction") is actually another abstraction (e.g. C is a high-level language; it assumes flat memory and sequential execution), and your model of the computer when programming is a complete abstraction. 2. That model doesn't contain any unique "first principles" that are necessary for computation. 3. You're likely to just make it harder/impossible for the compiler to optimize your code if you start fucking with "low-level" stuff and micromanaging everything when that's unnecessary. 4. Humans are better at declarative (math-like) reasoning than imperative (Turing-machine-like) reasoning. Bit twiddling is hard and error prone, and it distracts from the actual problem, leading to logic errors being harder to spot amidst all the micromanagement. It's best to leave that stuff to machines, which are good at stuff like that. 5. Since we're all pretending anyway, we should pick the right level of abstraction to pretend at depending on the job. Imo, that usually requires a high-level declarative language. AGILE is cancer, but I disagree that the issue is higher-level languages. Don't get me wrong, I don't think the most popular JS framework at this micro-instant is the best thing ever; I'm approaching this from first principles. I do think thinking about bits in memory and pointers and such is not useful if you're not doing systems-level programming, quite the opposite: it's an error-prone distraction from the actual problem domain. First of all, C's (and even ASM's) model of the computer isn't really "low-level" unless you're actually programming for the PDP-11. But you're not, you're most likely programming for a different abstract machine, like a web browser.
Saying that ppl need to learn "how a computer really works" and they should program that way is pure LARP because that's not how a computer "really works", and you're likely just making it harder for the compiler to optimize your code for modern architectures. Yes, it's good to know C and ASM as a kind of "general education" for any programmer but that's as far as they go for general app development. C is a high-level language and it has many concepts that are pure abstractions, like the call stack (which isn't technically required by the spec but most implementations still have it). These are all useful abstractions for the systems programmer/language designers etc but not all of them (like the call stack) are useful for most programmers. They are also not universal principles for computing, they are often very practical and particular to our actual machines (or rather their predecessors) but unless you're working on the machine itself, you're better off just forgetting all about that stuff. What you really want is to program for an abstract machine on the right "level of abstraction" for whatever it is you're doing, and let the AM handle the hard parts since there's zero chance of you actually being able to do a better job on any non-trivial program. The part of the problem, in my opinion, is the whole imperative mindset that encourages us to pretend that there's a single core executing instructions, one by one, pulled from flat memory (modulo concurrent imperative programming but that's even worse!). And that's the best we can do really because you don't really want to reveal what's "beneath the abstraction", you want the OS and drivers (and actual HW below them) to deal with the messy stuff and pretend that you're programming for the simple PDP-11. 
Contrast that to declarative code which, in principle, could be parallelized automatically by the compiler (but in reality it's not this simple of course) because in declarative code, you don't care at all about the actual physical machine. Instead, you define a high-level description that compilers, unhindered by your silly ideas of what a computer is, can then turn into an efficient machine code for the actual physical machine (rather than your imagined PDP-11). Humans are bad at programming Turing machines to do complex things, we're a lot better at doing something akin to math, i.e. declarative stuff. In other words, most of us should be looking to lambda calculus for "low-level" first principles rather than actual real world machines. Leave that to the geniuses and compilers to handle.
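The imperative/declarative contrast drawn above can be made concrete with a small hypothetical example (editor's sketch, not the commenter's code), writing the same task both ways in Python:

```python
nums = [1, 2, 3, 4, 5, 6]

# Imperative: we micromanage the loop, the accumulator, and the order
# of operations, as if steering a single sequential core.
total = 0
for n in nums:
    if n % 2 == 0:
        total += n * n

# Declarative: we state *what* we want (sum of squares of the evens);
# how and in what order it is evaluated is left to the implementation.
total_decl = sum(n * n for n in nums if n % 2 == 0)

assert total == total_decl == 56
```

The declarative form leaves the evaluation strategy unspecified, which is exactly the freedom the comment argues a compiler needs in order to optimize or reorder the work.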
@noahw4623 4 months ago
You don't have to be a genius to do low level coding. Abstractions are good for repetitive tasks, compilers are generally good at optimization, but they can only do static analysis of the code. They do not understand your requirements, and so there are many situations where you'll want to optimize code for a particular situation, particularly when building large applications. Learning how computers work at the low and intermediate levels will make you a better programmer and will help you to debug your programs. You don't need to start out with x86 assembly, but you should get to a point where if you're given the CPU manual and a simple program you can map out the operations that program runs
@ratfuk9340 4 months ago
@@noahw4623 Yes, but when you say learning how computers work on a low level, you're probably talking about a very simplified model that itself is actually a fairly high level abstraction. It's still worth it as general education but it's not how computers work (similarly, I'd say it's good to know basics of theory of computation, a little lambda calculus, formal languages etc). I'd personally recommend getting comfortable with a RISC-V emulator and the classic "Computer Organization and Design" but the RISC-V version. But I claim it won't benefit you in practice unless you develop for RISC-V. And the same goes for x86 and anything else. Have you ever recalled how the processor datapath works when programming for the web? No, because the abstract machine isn't the CPU in this case, it's the browser. That also goes for higher level models like how C models the computer: it's only useful in practice if you write programs with C. Also, the cases where your hand optimized code is better than what a compiler can produce are extremely rare. Unless you're Linus Torvalds, I'd be extremely skeptical of any claims of going against the compiler (not that it's impossible tho). The benefit of declarative (especially functional) code is that it doesn't restrict the compiler as much, you give it more opportunities to optimize. It's true that currently declarative code generally runs at least a little slower than imperative code but I think that'll change in the future, at least for programs that benefit from parallelization. The real benefit is readability/maintainability/extensibility though. Source code isn't so much a common language between the computer and the programmer, but between programmers. Code should be optimized for other programmers and it's the compilers job to make it run efficiently, with obvious exceptions of course.
@noahw4623 4 months ago
@@ratfuk9340 Suppose I'm writing a python program for a ML task, and I have a large (several million element) list that I want to sort for some reason. What's better, sorting using the built-in sort algorithm, or writing a module in C++ to do it? C++ would be faster. Now, I could use the builtin sorting function for C++, but this is single threaded, and will be slow. The smarter approach would be to parallelize the sorting operation to speed it up, can your compiler do that without any additional insight? (Hint: it can't) Most of the code I write is backend, desktop, embedded, and server. While your arguments may be true for front end web development, it's not true for programming as a whole.
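The "split into chunks, sort each, then merge" structure behind the parallel sort described above can be sketched like this (my own simplified sketch, not the commenter's code; the chunks are sorted serially here for brevity, whereas a real speedup would hand each chunk to a separate process, e.g. via multiprocessing.Pool, since Python threads cannot parallelize a CPU-bound sort):

```python
import heapq
import random

def chunked_sort(data, n_chunks=4):
    """Sort each chunk independently (the parallelizable step), then
    k-way merge the sorted runs - the shape of a parallel merge sort."""
    size = (len(data) + n_chunks - 1) // n_chunks
    chunks = [sorted(data[i:i + size]) for i in range(0, len(data), size)]
    return list(heapq.merge(*chunks))

data = [random.randrange(1_000_000) for _ in range(10_000)]
assert chunked_sort(data) == sorted(data)
```

A compiler doing only static analysis cannot apply this restructuring on its own; it relies on the programmer knowing that the chunks can be sorted independently.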
@dungeonrobot 4 months ago
Just to prove your point: The “this video is playing in picture in picture mode” popup is rendering on top of this video
@bitheap-tech 4 months ago
absolutely, and it's Google ffs... almost unlimited money and huge talent in their workforce. And on top of that, they recently increased their pricing for premium
@vocassen 4 months ago
@@bitheap-tech I've had many longstanding bugs in youtube they just refuse to fix, some for over 6 years. Listen to a big music playlist (1000+ songs), for instance, and you'll find it can get stuck in the same loop of 4-8 songs. Last checked a year ago, but it's been a thing for so long I have no doubts it's still there.
@eotikurac 4 months ago
there's a saying that what takes one programmer a week to make will take two developers two weeks. it's very true, small teams are the way to go.
@artemkotelevych2523 4 months ago
With my 7+ years of experience I can agree with everything. Sometimes I just don't understand how some big companies like Facebook can create such shitty UX across all of their apps
@PetrGladkikh 4 months ago
I could explain that, but one of the biggest problems is that people do not listen.
@ErroneousTheory 4 months ago
I have been programming since 1984. You nailed some solid points. It all makes for unmanageable software with an uneven user experience - not to mention turning a creative role into metric-driven drudgery.
@wkdj2522 4 months ago
thanks for this. i'd love to see a follow up video discussing where I.T. was right around Y2K and how, at least at a few companies or on some Projects, there actually WAS effective development. For example, Red Cross applied CMMi level 5 and had very few bugs, was performant, and secure. Boeing used SDLC, same with NASA, financial firms, countless others. What happened? when did Use Cases get downgraded to the glorified post-it notes people call "user stories"? What would possess people to abandon teams such as Requirements Management and Configuration Management and adopt CI/CD? Oh sure, just go from code check-in straight to prod; what could go wrong there? SMH. When did people stop doing formal design reviews (PDR/CDR) and just let devs "sprint" and check in their crap whenever they deem it done? how the hell is a tester supposed to prepare for testing something if all they have is one of those jira post-it notes that says "gosh, when i click this button i'd sure like to see my billing info"? really? is that what we call a Requirement now? Is that what we're settling for instead of use cases? remember those? from the super famous UML purple book? In fact, what happened to Process in general? shops nowadays have almost zero Process, maybe just de-facto standards like using github, IDEs, docker images. and yes, these are fantastic improvements, but they should never be replacements for formal software Control. the log4j disaster would never, ever have happened back when CM was a thing -- they would never in a million years have even allowed some outside dependency into a build. It's just insanity. Look at us now, half our crap software is built on open source. this is not software engineering, it's the realization of mark zuckerberg's n00b idea: go fast and BREAK THINGS
@battalll 4 months ago
i completely agree with the video, but i wonder: why is software different compared to other engineering jobs? is it because: - it can be "self-taught" - bootcamps are pumping people into the sector - agile is not applicable to other engineering jobs - something else?
@bitheap-tech 4 months ago
Probably this, combined with companies' continuous need to get engineers on board in order to access more funding to finance their software projects.
@m13v2 4 months ago
a) software is basically literature. the other engineering disciplines have to cope with much more constraints. b) the demand for software developers has exploded over the last decades so that education couldn’t keep up. which increased the demand even more.
@stzi7691 4 months ago
Software engineering is a lot like golf. It is pretty difficult to get the technique right and be a good golfer, but it does not look like much. And because it does not look like much, everyone believes he/she can do it and should have a say in the topic.
@the_real_glabnurb 4 months ago
I think the biggest difference is that it's not physical. If a bridge collapses because the statics were not correctly calculated, or a car loses its wheels while driving, these are very dangerous and costly faults. On the other hand, if software crashes, no biggie: restart it, maybe file a bug report, and hopefully get a patched version soon. Of course, in more critical applications (i.e. aircraft control) software is held to the same standards as hardware, and the development and testing are rigorous.
@gruanger 4 months ago
Tell me you use Chase without telling me you use Chase ;)
@mrECisME 4 months ago
Western companies are outsourcing everything for half the price, mostly to India. And well, you get what you pay for...
@stzi7691 4 months ago
That is true. Most of them are not good at all. Another topic is mentality: most Indian developers take the job planning to stay for 1 to 2 years. If they do not manage to get into management in that time frame, they quit... And then you start over with another freshman. The very good engineers in that country mostly leave for the US or Europe anyway.
@PaulSebastianM 4 months ago
You are absolutely right. Everything you've said is in line with the latest research.
@mllenessmarie 4 months ago
I immediately subscribed to your channel. I agree with pretty much every point, especially the lack of deep and meaningful work - it's difficult to focus nowadays, and work at big firms involves a lot of pointless calls and administrative tasks. As for scrum - it's pure evil. I don't do programming, but I use code, and so far the worst-managed project of ours is the one that uses agile/scrum.
@ResZCreator 4 months ago
But sometimes people prefer elegant design to a user-friendly interface
@H0RIA 4 months ago
I agree with everything except the conclusion. Good software engineers will be valued more as time passes, because everyone will have the same quality problems plus a lack of deep understanding. To fix even "trivial" issues, they will need real developers. So, to be honest, I am rather happy with the current context 😊
@gameswizard3502 4 months ago
Excellent points! Especially when talking about games: companies are ripping their clients off and delivering sh*t quality
@bitheap-tech 4 months ago
Couldn't agree more
@official-phuh 4 months ago
I don't miss the price and plastic waste but I do miss the care and quality needed to put a game into a finality-sealing cartridge
@bitheap-tech 4 months ago
@@official-phuh absolutely! Not to mention that there have been recent cases where consumers bought games that are not accessible anymore because the platforms where they bought the games from closed or removed the respective games. Plastic waste has been replaced by energy-hungry datacenters that consume the amount of energy required to power entire cities and even countries.
@jfajfa5582 4 months ago
I'll add maintainability to the software quality definition... and there the "all in Python" crowd really hurts! v2 to v3 was a pain, but it doesn't end there: 3.6 to 3.10 is a mess too! And maybe worse, the way variables are "defined" makes any refactoring a nightmare. So please use Python only to replace shell scripts or little tools...
@official-phuh 4 months ago
0:53 this is why computers want to conquer humanity
@pharoah327 4 months ago
Ok, agile and scrum are NOT bad. Your example shows someone using it very wrong. You never let the customer know the number of story points, that's an internal calculation to help approximate workload, not a selling point for a price package. And I'm not sure at all what you mean by t-shirt sizes. Also, if you are doing an 8 hour sprint planning, that sprint should be 4 weeks. So that's 1 full work day of planning, then you get 27 calendar days to work. Doesn't seem so bad. Stand-ups are only 15 minutes normally, so in an 8 hour day, you still get 7hr 45mins of work. I agree that scrum of scrums are annoying but they normally don't last long and can be done once a week at most. In general, if you didn't have an over zealous management team that loves meetings, scrum can be very advantageous. Don't blame the tool if your organization is using it wrong.
@mllenessmarie 4 months ago
"agile and scrum are NOT bad." Ah yes, the classic "x is not bad, you're just using it wrong"~ The numbers you provided never work in reality; all planning takes so much more, same with stand-ups/daily calls. I'm not trying to sound mean, it's just that if a project management system is good, it should not need constant convincing that it is good.
@pharoah327 4 months ago
@@mllenessmarie it isn't "the classic", it's the truth. If you use a tool wrong, it doesn't work. How is that illogical? The numbers I gave come directly from a company I used to work for and they are accurate. We used scrum successfully and I never felt like I was in meetings more than I was working. I know it can work because I've seen it first hand, and I have many other examples. It's a fact that if you misunderstand how to do something, and you do it wrong because of that, then it doesn't work. Again how is that in any way illogical?
@pharoah327 4 months ago
I think what most likely is happening is that upper management reads an article one time about this thing called scrum, thought it was great, and decided to force it on the team without actually fully understanding how to use it properly. The team doesn't know either, so everyone tries to figure it out as they go, thinking they are doing it right. In reality, the scrum master should ensure stand-ups are kept short and sweet (ours were 15 minutes max and the prescribed small team size makes this possible), sprint planning should be no more than 1 to 2 hours per week in the sprint and scrum of scrums is just a quick recap with the heads of each team, happening at most once a week. We did ours at the end of the work day on Friday when everyone was pretty much mentally checked out anyway. Meetings are kept to just what needs to be talked about and those timings are easy to hit when the scrum master is on top of things and keeps everyone on track. Scrum works and you get a lot of work done. Now I've also worked for a company where they seem to love meetings and everyone feels as though work time is suffering because of it. Meetings drag on too long because no one tells people to wrap it up and there are 2 of these a day. This is not scrum, this is just poor time management from the company.
@mllenessmarie 4 months ago
@@pharoah327 It's not that the tool is used wrongly; the tool is faulty from the very beginning. Btw, google "scrum T-shirt sizing" (the thing that surprised you in your first comment) and see how common that silly notion is. Also, I'm genuinely happy that you guys managed to implement it successfully at your company (and that you have some other examples too), but that's anecdotal at best, as it's far more common to see scrum/agile being used incorrectly. So much so that seeing good agile/scrum is like seeing a rainbow: it happens, but it's a rare event. Also, I'm not fluent in project management, as I work in cybersecurity engineering. I don't need to be in order to know that someone is trying to sell me a methodology that is sh*t. Kanban boards? Those I know, and I respect them because they work great. Same with other tools used in project management, and with tracking updates. So I'm not completely against the idea; I'm just against things that don't work (story points and estimates that are always wrong, countless calls and no time for real work, PMs with silly ideas, etc.).
@ansidhe 4 months ago
@@pharoah327 Probably the first time I've read someone who groks it. Kudos 👍🏻 Although the sad fact is there are thousands out there who don't get any of it, and the execs are their kings… 🤦🏻‍♂️
@xsdash 4 months ago
I agree with you.
@andrewporter1868 4 months ago
I disagree fundamentally. What you have stated are mere symptoms. Observe that every new programming language is often trying to fix some problem from a previous programming language. For the most part, we might say that in some sense every new programming language is trying to fix C, but not software engineering. I would also like to state that I am not speaking merely from experience, but from the perspective of correct theoretical knowledge.

The issue is fundamentally philosophical, and primarily ontological: ontology is an essential part of design, whether you realize it or not, because in order to solve any problem you have to know what it is. Answering the question "What is a class?" is already an ontological question, as is "What is a graph?", "What is a car?", "What is an X?", whether we're talking about abstract data structures or concrete existences. A wrong answer to what these whatnesses are leads to bugs, which are a failure of implementation to conform to design in the best case, and an error in the design itself in the worst case. Designing is nothing more than comprehending and specifying essences or whatnesses, and our job as software engineers, therefore, is nothing but comprehending and specifying what things are, and then making those specifications actual by way of hardware.

Now remove correct philosophy from society and you get precisely the mess we are dealing with in, for instance, the software industry. It's the blind leading the blind, each thinking he has the true solution, when in reality he knows no better than the next guy, because sound and true philosophy has been forgotten: some guys a few centuries ago decided not to listen to reason, and it was more convenient to just cave in to such errors than to face reality. Thomism is the best philosophy out there. Consider reading courses such as "Ontology, a Class Manual in Fundamental Metaphysics" by Rev. Msgr. Paul J. Glenn, for starters. He has several other philosophy courses that should prove useful, and I'm sure they are available on The Internet Archive, just as this course is. Read it. Start making the connections between programming language elements and what they are, such as variables, structs, and classes. You'll thank me later. Getting into philosophy completely changed my whole perspective on our field.
@bitheap-tech 24 days ago
You don't agree because you're a philosopher yourself
@andrewporter1868 24 days ago
@@bitheap-tech There are two kinds of people in the world: those who originate thoughts, and those who merely believe the originators, perhaps through still other mere believers. Most people today are the latter; the former shrink day by day.
@bitheap-tech 24 days ago
@@andrewporter1868 A very simplistic view of a very complex environment.
@andrewporter1868 24 days ago
@@bitheap-tech It's nevertheless proven from first principles:
- Knowledge is known or believed by intellect or by faith (such as the word of others).
- Many today make claims without any premises, or at least without satisfactory premises.
- Therefore many today know or believe what they do because of the knowledge or faith of others.
- Finally, those who know or believe not of their own intellect necessarily get their knowledge and beliefs from others.
- Knowledge and belief cannot originate from what lacks the ability to know and believe of itself.
- Therefore, in a chain of those who know or believe by others, there are one or more originators who know and believe first, and who have communicated their knowledge and beliefs.

Quite simple really. The syllogism isn't taught in schools today. Now why might that be 🤔 Well, I'll refrain from giving an answer since I don't know for sure, but I will say that it is easy to manipulate a crowd when you tell them they are nothing but advanced monkeys and emphasize sense knowledge while dismissing theoretical knowledge. Theory and practice is merely the difference between what a thing is and how it is implemented: the ancient principle of hylomorphism, first practically studied and comprehended by Aristotle. The failure to recognize this is why it is an incredible deception to dismiss all theory as bunk and only trust what one can sense. This is none other than empiricism (idealism being its opposite), and it is an incredibly limiting and erroneous doctrine.
@matveyshishov 4 months ago
Incompetence.
@dschreck 4 months ago
FB has two apps so they can compete in the messenger space, not because the app is too complicated. Games are huge, multi-year, billion-dollar projects that get pushed out the door, not because they're too complicated to make (10 years as a game dev). Banks are complicated because they're built on top of mainframes from the 1980s and it's a highly regulated space. Look into the tech work Chase has done as an example of tech leadership pivoting an org. Martin Fowler, while I'm not a fan, has been a well-established author for longer than you've been an engineer. Coming from someone with 20+ years in software engineering: the rest of your rant is pretty mediocre and comes off amateurish, seemingly with little real-world org and software-scaling experience.
@bitheap-tech 4 months ago
I appreciate your input, but: 1. On iOS, FB didn't do the same, and they have significantly bigger competition there. 2. I don't understand your argument. Does this justify having 0-day patches? 3. Not true. The app I was talking about is written in JS, powered by a Node.js server. Of course payments still rely on mainframes, but that's the only reliable part of the app, not the feedback form I was referencing. 4. Well established in cult companies, you are correct! Although I find the refactoring snippets from his books valuable, they still refer to the famous Animal class examples from OOP, capturing nothing of real software that brings real value. Also, introducing endless debates about what a test name should be does not make you well established, IMHO. We're not (and should not be) epistemologists here; we are engineers.
@drone-reels 4 months ago
@dschreck I'm sorry, but you're way off on most of the points.
@dschreck 4 months ago
@@drone-reels No, I'm not.
@drone-reels 4 months ago
@@dschreck Hmm, lol, ok, if you say so 😂
@mrECisME 4 months ago
Modern video games and Facebook are both shit... what's your point here?
@user-dq7vo6dy7g 4 months ago
Bullshit. I haven't seen a blue screen in years. When using Windows ME, I saw one every week. How was security 15 years ago? You could elevate privileges by replacing screensaver.exe. PHP was basically a collection of vulnerabilities. Basically the only thing that has stayed the same is video games, which reduces this whole video to just a complaint about the video game market.
@it_is_ni 4 months ago
The steps you’re taking in your arguments are way too big. I don’t necessarily disagree with all of the things you’re saying, but it’s cobbled together very messily.
@bitheap-tech 4 months ago
How would you rearrange the arguments to have everything clearer?
@ElizaberthUndEugen 4 months ago
@@bitheap-tech How about not making polemic and bland statements, like "people that discuss TDD are worthless"? That is a useless and obviously false statement. How about not pretending abstraction is universally counterproductive? That is obvious nonsense. You need abstractions like memory management in some form. Do you know how incredibly labor-, money- and time-intensive it is to build large applications at a low abstraction level while doing manual memory management? This is a tradeoff between development feasibility and performance, and performance is not the be-all and end-all. Not reflecting on all these multi-faceted concerns makes you look very immature and leaves your argument riddled with omissions, falsehoods, and other fallacies. Also, there are very performant languages with a high level of abstraction from the hardware, such as Haskell. The problem there is not the abstraction level of the language but the intellectual laziness of average developers, who won't learn to work with this abstraction and type of language. Haskell is a perfect example of a great tradeoff between compiler-checked code that reaches into semantics in some areas and very high performance, not far behind C++ etc. And all of this by the power of abstraction.
@bitheap-tech 3 months ago
So apart from the TDD cult (and no one says the people are worthless, rather that the debates are), we seem to agree on the main point.
@ElizaberthUndEugen 3 months ago
@@bitheap-tech Who are you replying to? I see only your replies. I know I replied as well, but I do not see my reply anymore.
@bitheap-tech 3 months ago
@@ElizaberthUndEugen I was replying to you. Weird that your comment seems to have been removed (I certainly haven't removed it). Do you see how ironic this is? A software bug on a page that hosts a video about how buggy software is.
@jaywalks9918 4 months ago
DEI
@CripplingDuality 4 months ago
Always one of you twats around lol
@smiechu47 4 months ago
Why? Diversity hires.