So many channels posting political garbage and stuff, why would your channel that educates people on Linux and minimalist software get a strike :( this is really sad
@@thomas3754 A Pepe memes video, a video parodying low quality Indian tech channels "inciting violence", and a video on youtube-dl. (more details are available on the website)
Dev: "No, that's a bad idea, you should make a simple website" Employer: "Well, you're fired. Hey John, can you make a 70MB junk website?" John: "Yes, sir"
@@sergioblanco796 This sounds like a joke but just wait lmao. Some of us may live long enough to see it. Man... I hope soy devs are automated by that point
And why shouldn't they? If most people have good internet and the site provides functionality that can justify the size, I wouldn't want to downgrade the experience for everyone because of a few people who have shitty internet.
@@cc4405 >the site provides functionality that can justify the size yes, literal millions of bytes for a bit of text, images and animation is totally justified
devs in the 90s: "we fit a whole revolutionary 3D FPS in 100mb" devs in current year: "you need a gigabit internet connection, 32gb of ram, a 10tb ssd, an rtx2080 and a ryzen to display this text and a picture"
@Allan Herrera tbh i dont rly know what im talking about im a zoomer, to me 100megs is tiny, i was more thinking of games like half life or quake than doom
@PIRACY BRUH yeah it's cos of how much modern games are rushed. Not even just games, all things are rushed now. Companies wanna spend as little money as they can, and if it runs like shit, well, people just have to buy more powerful computers. Games are basically released in a half finished state: loaded with unused assets, no optimization, loads of duplicate files, uncompressed textures, all kinds a shit that the poor devs just aren't given the time to clean up. The moment the game's in a playable state it's released, even basic testing isn't done. shit's fucked up
I really hate how people put back stories to their recipes. Just give me the recipe and maybe a description of how it tastes. That's it. Fuck off with your personal story about how this dish changed your life when you went on vacation somewhere.
@@fnerXVI Wait until you hear people actually defending that dumb shit: www.washingtonpost.com/news/voraciously/wp/2020/03/30/mindy-kaling-complained-about-stories-in-online-recipes-and-the-food-bloggers-let-her-have-it/
These sites aren't built by web devs, but by "entrepreneurs", using a web builder, with a template built by a designer, based on advice from other "entrepreneurs".
Entrepreneurs back at it on their stupid bullshit. They're the bane of the modern world's existence. (also, self-proclaimed entrepreneurs, before the whole "hello, I'm an entrepreneur, and ", no one asked)
@@track_g35 When it comes to page load speed analytics scripts are nothing compared to unoptimised images and ads, which is the main issue in this video.
this is the best thing I have ever heard. It reminds me of the traffic lights they tested in a city in India: the more people honk their horns, the longer the traffic light stays red. There's a countdown timer you can see, and when the crossroad is too loud, the timer goes up, or something like that.
@@therealb888 Mumbai. It was a pilot project, probably only at one or a few crossings. Here is a video: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-rNOZoYrOHxk.html
Page load time is a factor in SEO (Search Engine Optimization). Google search is notorious for pushing fast, lean, and accessible pages over those which lack such qualities.
The worst is when the publisher of the recipe appends a 3 page introduction about why the recipe is so delicious and how it changed his life blah blah blah instead of just posting the ingredients and directions.
The reason for that is that copyright doesn't apply to recipes, but it does apply to prose. If a site just lists recipes, anyone could take that recipe and post it on their own site for a profit. I personally have no issue with that, but if you're posting recipes for ad/sponsorship profit you probably need something there that people can't just copy.
Chrome has its own problems, but there's a Chrome addon called "Recipe Filter" that automatically filters out just the recipe and instructions on those websites. It's a lifesaver. I don't care about how much your grandfather loved avocados from an avocado tree that grew outside of his childhood home and that that's what inspired you to write this recipe
It's because recipes aren't copyrightable but the 10 paragraph intros are, so if they don't wrap recipes in a blog post, they can't really monetize the recipe
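A side note on how add-ons like the "Recipe Filter" one mentioned above usually pull this off: most recipe blogs embed a machine-readable schema.org Recipe object as JSON-LD (Google wants it for rich search results), so an extension can skip the life story entirely and parse just that block. A minimal sketch, assuming the page uses a plain `<script type="application/ld+json">` tag; the helper name and sample markup are my own illustration:

```javascript
// Hypothetical sketch of a "recipe filter" core: find schema.org Recipe
// JSON-LD in a page and return it, ignoring all the surrounding prose.
function extractRecipe(html) {
  // Grab every JSON-LD script block in the document.
  const blocks = html.match(
    /<script type="application\/ld\+json">([\s\S]*?)<\/script>/g
  ) ?? [];
  for (const block of blocks) {
    // Strip the wrapping <script> tags, leaving only the JSON payload.
    const json = block.replace(/<\/?script[^>]*>/g, "");
    try {
      const data = JSON.parse(json);
      if (data["@type"] === "Recipe") return data; // the actual content
    } catch {
      // Ignore malformed JSON-LD blocks and keep looking.
    }
  }
  return null; // no structured recipe found
}
```

A real extension also has to handle `@graph` wrappers and arrays of objects, but the idea is the same: the recipe is already there in clean structured form; the 20 paragraphs are just packaging.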
@@Nougator >village there's your mistake. Luke lives literally in the middle of nowhere. Never been anywhere rural where you have to walk, idk, 5mi to your nearest neighbor? Luke's closest urban center is a village with less than 1000 people, tens of miles away. And most of the world still lives like that. Luke can take a car and drive to church or town etc, but yep, he doesn't even get phone signal in most of his house. I'm in a village and I get a comfy 15mbps, but I know people that need to ride a horse to get to town or walk a few hours. They get barely enough signal for WhatsApp. I'm Latin American tho. Americans have good roads, but America is very rural still. Everyone should go on a road trip around America and see how you can drive hundreds of miles of "empty" land, which really has a family every few miles if that. Americans tend to have acres upon acres of land.
Even high end gaming machines stutter when loading this kind of websites. I absolutely despise it. I'm a web developer, but my end users are sysadmins, so I'm not told to do all kinds of fancy things with the layout, clutter the thing with ads, trackers and externally loaded comment sections. I'm so happy that I don't have to deal with it. Mind you, most developers hate putting this in, as much as they hate stumbling upon it on the web. Customers will be customers, unfortunately.
I have one of those, and this site loaded really fast even with the cache disabled. Are you sure you understand what a gaming machine is?
@@laughingvampire7555 agree, idk what gaming machine would struggle to load them, but the fact that regular computers like the one Luke has already struggle proves that those websites are made badly
As someone who reads Latin, could you explain this. Google Translate: English to Latin - "Quarantine" = "Tempus valetudini spectandae praestitutum" Latin to English - "Tempus valetudini spectandae praestitutum" = "Time to observe their health plan" English to Latin - "Time to observe their health plan" = "Ut Iudaei utantur tempus salutis consilio" Latin to English - "A tempore salutis consilium ut Iudaei utantur" = _"Now from the time when the Jews will make use of the plan of salvation"_
Luke, this reminds me of something that happened a lot at work. Some of the boomers will go to a recipe website and then try to print it out and take it home so they can cook it. These webpages are so big they won't print properly, at least not on Windows computers. Instead, they fill up the print spooler memory and crash it. The user is then unable to print anything, and they send me a help request, since now nothing will print at all and Windows doesn't make it obvious that something has gone wrong. I've also seen times where Windows is able to handle the page, but the printer isn't, because a two-page document is too big for the several MB of onboard memory on the printer. The recipe websites are some of the absolute worst websites on the Internet.
i'm surprised. why no conversion to pdf before printing? EDIT: such dumb responses below that i actually need to explain the boomer setup. - web browser's print/save menus are disabled. - custom SAVE AS menu only offers "plain text" or "printable pdf". - a desktop link appears and a helpful message -- [ YOUR FILE WAS SAVED TO THE DESKTOP ] -- - now they can open their "printable" file and use the pdf viewer's print menu. guys, this is not rocket science. please wake up.
@@shaurz I guess 90-99% of clicks on these buttons are accidental; maybe 5 people in the world use them. And if a user wants to share something on his social media page, you can't possibly stop him, and the existence of these buttons makes no difference, because people just copy paste the url.
I know people that honestly can't copy-paste no matter how many times you try to teach them. But there is a correct way to implement sharing and a wrong way. YouTube has 1 fucking button and you choose the platform to share to or copy the link (which is totally ok since most of the users use it on their phones and the mobile app doesn't show the link anywhere else). All these bloated sites use 1 button per platform. WHY? Why do I need to see it on the side the whole fucking time? And I don't put the blame on the devs, they do what they are asked to do to get paid. The problem is the soyboys that own those crappy sites.
Those share buttons have little to do with clicking. They're trackers. If you forgot to log out of Facebook (or don't know how) Facebook and its advertisers know you've been to that page, without ever clicking.
I remember learning html as a kid in computer class and after making my first webpages I wondered wtf was wrong with all these web designers who made shittier webpages than me.
The main thing I never got was "Why is everything inside 5 or more DIVs?", along with those unique identifiers that make a URL 3 lines long... (And I'm leaving out those types of HTML5 ads that shift the content down and then, when the video loop they display ends, cause the thing you were actually reading to shift back up. Animated or not, I hate those the most.)
so what you are saying is "a bridge is just a couple of ropes and a few boards wtf is wrong with those engineers that use concrete and a lot of steel."
@@laughingvampire7555 No, what he's saying is "a bridge, while depending on its size for the complexity, is just ropes and boards then what's wrong with those engineers that use all this unnecessary stuff like sand and vines?"
Clients want this. I made a full website in 1MB and linked to a youtube video. This made it 8MB just because of the embed. So I swapped the embed for a thumbnail image and made the actual embed load on click. I got complaints because there were no useless animations and crap shining here and there. Even after I showed several good examples they didn't buy it and preferred a bloated Wordpress site.
Best are the slow sites blocked with a privacy policy wall; then, after accepting, the page fully reloads with a cookie banner; then you get popups that jump in your face while you are reading.
a ton of those privacy policy walls are just modals that you can delete with browser dev tools :DDD some of them are getting wise though, using encrypted javascript to delete the page contents :(
@@cherubin7th actually I'm using Pale Moon and it does have this extension that deletes all the cookies after closing the tab, and I can also allow some pages where I need the cookies to save a little data for an easy return. It's so cool and fast.
I remember being a web dev in the early 2000s working on a custom e-commerce site and hand optimizing HTML and images to keep the largest page under 100kb as we had set that as our hard limit.
Setting a hard limit is the only way to not make garbage these days. Back in the 80s and 90s, the hardware set your hard limit. Now you have to set them yourself, and soydevs don't. "Works on my machine, so it's fine."
@@zackinator1439 yeah, I’m a huge proponent of hard limits. With my Dev teams we purposely set all our dev containers and VMs to have a quarter or less of the resources of production and have size and performance checks on our CI pipelines. If you make things slower then it gets rejected.
@@BlakeHelms I am too. My images typically get saved at like 60 or 70% compression, because you need to start pixel-peeping to see the compression artifacts at those rates. (And besides that, it helps with the data limits which are typically still in effect on hosting providers, whereas they've been a thing of the past in The Netherlands, at least, for years now.) I like it when a website just "pops up" reasonably quickly, and this is true for things I make and for things I visit that were made by someone else.
Having worked as a web developer for the past 5 years for a big international food brand company, starting from the very beginning of the project, I can share my view.

It starts with some simple ideas, like having recipe pages. And we build something that represents that idea very well: title, some text, some images, some more text with steps, ingredients, etc. Usually designers get mostly involved at that point, so all feedback and changes come down to a redesign of the hero section, title styling, text and image positioning, etc. All reasonable at that point.

But then after some time new requests come along: a redesign to "refresh" the pages; things to work more dynamically, because "static is boring"; then SEO optimization comes in with all the quirks needed; scripts that load with the page to track all kinds of movements, and then not just one but several of those, just for tracking; integrations with all kinds of platforms to show videos of cooking recipes (text is no longer enough); then some more page restructuring to allow options like ingredient-list sharing using all kinds of 3rd party APIs; then you want a community around the recipes to register and log in, comment, like, share, and so one after the other platform jumps in with their scripts; some more scripts to have AAA accessibility due to some ridiculous laws in some countries, etc. And very soon you have a page that takes quite a bit of time to load due to all these scripts and dependencies. Ultimately the page content becomes slow and almost invisible behind all the bells and whistles.

The problem: most changes come not from actual need, but artificially, because someone is watching analytics and making decisions to get more, more and mooooreeee of whatever his goal is. Then people who only think in clicks and analytics drive development on that basis. Unfortunately this is a problem that applies widely in today's world. For me it is simple: demand for something will drive the development of its production.
When there's no demand for something, there's no production (or service). Creating the service or product and then pushing it to people by convincing them it's what they need is the wrong path, and usually leads to bloat in whatever sector it's dealing with. Just look at what Apple does in recent years, and others who follow it. I guess with the idea that we really wanted to pay more and not even get that silly charger. Yeah, we consumers wanted that, right?
Yeah, I'm a SoyDev. I have worked on websites that weigh more than 70mb uncompressed and hang even my dev machine on some screens.
I close my eyes in shame as I add the billion tracking scripts management tells me to insert.
I think of all that minified javascript burning on the CPU as the website loads and fills the console with exceptions.
I look at my reflection on my black terminal as npm installs all its junk for react-* packages and ask myself if this is what I want for my life.
I won't even blame management and the higher ups, I did this.
Where do I hand myself in for my punishment?
70mb sounds extreme, but I actually had that on one of the projects, that was compressed and minified too. Some clever dude decided that Scala (a JVM language) will work really great in the browser as well as server side, so he used ScalaJS for all frontend. Of course, Scala being a JVM language, you can imagine that runtime for it is pretty large as it is, but then you have to also "compile" it to javascript. Long story short, the site weighed like 70+ mb and that was just javascript. I think after I left they scrapped it all and started from scratch with saner choices.
People are specializing in UX design, web design, all of their time invested in making your navigation bloated and slow. I prefer pure html and some css.
People truly specializing in UX actually care about loading speed. It's a part of the user experience. Do not confuse UI with UX. They are tied to each other, but UX is a much wider term than layout and fireworks design.
@@paulmik9356 I'm not confusing things, I just missed a period between "UX" and "design". I think I mistyped, but I don't want to edit because there are replies to it. Btw I agree with you, but in practice, the more bloated a site is, the more people pay for it. It's like a meme.
My favorite "recipe sites" are ones with soccer moms telling me everything about their day for 20 paragraphs before finally getting to what temperature my pork chop should be grilled to.
I was actually thinking about this the other day. Internet speeds have increased significantly and yet, web pages take so damn long to load and perform like absolute ass. Really interesting video
@@Jimmy_The_Goat Bru, go ask an African American about that. We just have normal names. We Africans all see ourselves as separate from black people in America; we have different cultures and ideologies.
@@Jimmy_The_Goat Bru, I can only speak for Ethiopia (it's in east Africa, the only country in Africa to have never been colonized). We have our own language and normal names, and we don't call each other the n word. But you have freedom of speech and the ability to call anybody anything as long as you don't endanger or hurt them, so the choice is yours.
@@jenslindstrom2841 if their websites didn't look and work like crap, maybe people would pay to watch their content without all those ads and trackers.
@@papageorge4852 People don't tho... There are a bunch of recipe websites like the suggestion in the video running at a negative simply because people will not pay for internet content nowadays
Couldn't agree more. Seems like nowadays a site isn't worthy if it doesn't contain at least 5 social media buttons, a popup video, tons of "related resources" and "people also watched", dynamically loaded content that messes up the page as you scroll, a popup to join their newsletter, and, somewhere well hidden, the useful information. And that is without factoring in the ads...
To expand on this explanation:
- Simple html+pictures content does not generate revenue. It only consooms your cloud-hosting data cap, or servers + electricity cost.
- Marketing and product teams need information to take decisions. All that user tracking goes back to them, and can measure site impact, view times, what drives users away, etc.
- Most bandwidth use is just ads. And some ads just 'keep on loading' because they can cycle the ad banner indefinitely, making a new request each time.
In a site like the ones about recipes that you showed, writers will probably write on a site where there's already traffic, and where they can make money (even though that's not their main drive to post the recipe in the first place). So unless you want to pay out of your pocket for each visit you make, this is the next best alternative. You are maintaining the site by consooming ads. You are paying, not with money, but with bandwidth, time, and attention.
@@juzujuzu4555 sure, but if we don't do it someone else will. It's not like web or ads are going away any time soon, even if all devs boycott these decisions. To be clear, I'm not defending these "public" websites, such as recipe websites, blogs or news sites. They really go overboard with ads and other bloat. But I can see why they do it, it's basically the only way for them to make money.
@@Sergeeeek "Someone else will do it": the same applies to manufacturing and selling meth. It doesn't require a ton of people to change things, but people are just too afraid to stand up for their beliefs. I don't know what I would do if I was a web dev; I get why people are afraid to stand up, but if I were to do this shite I would be honest about why I do it, and it's because I couldn't jeopardize my income. Ads in their current form are a dying practice; how people will make money in the future is something I don't know. More and more people are using ad blockers, thus ads become less profitable, thus more ads, which makes more people use ad blockers... vicious cycle. Ads don't bother me nearly as much as bloated as phuck frameworks. One example: Patreon. It's such a simple site that it could have been done even in the 90s, but just the home view had 380kb of code, without any dynamic functionality, just framework crap that places about a dozen pictures in the correct places and formats text the correct way. These bloated as phuck frameworks are horrible. Trackers and spyware are horrible. If a site had top/bottom banner ads, or ads on the side, that wouldn't bother me as long as those ads are somewhat light. Ads that load fast and don't phuck up the experience of the site, I'm cool with.
But Luke, you don't understand, we added a 500kB JS framework to make our load times faster by only putting 1MB on the initial load and then loading the other 50MB later.
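Snark aside, deferred loading done honestly is a legitimate pattern: keep the initial bundle small and fetch a heavy module only when the user actually asks for it, once. A minimal sketch of the caching wrapper (the module and element names in the comments are hypothetical):

```javascript
// lazy(loader) wraps an expensive loader (e.g. a dynamic import()) so the
// work happens only on first use, and the result is cached afterwards.
function lazy(loader) {
  let cached = null;
  return () => (cached ??= loader()); // run once, then reuse
}

// In a browser you'd pair it with a dynamic import of a heavy module:
//   const getChart = lazy(() => import("./heavy-chart.js"));
//   button.onclick = async () => (await getChart()).render(chartDiv);
```

The difference from the comment's scenario is the trigger: loading on user intent saves bandwidth, while blindly streaming 50MB "later" just hides the bloat from the first paint metric.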
This is a great use-case for Firefox's reader mode. Most of the time it grabs just the text of the recipe/article and you don't have to see all the other junk. Combine this with aggressive ad blocking and then every site is usable again.
Totally approve of that video. I do have a fast connection, but I have a tendency to keep tabs open instead of bookmarking. All the CPU and RAM wasted by the running junk is quite impressive: leaving the browser on for a week will typically eat about 20GB of RAM.
This is more of a critique of corporate/ad focused blogs. I agree loading random ass scripts is a massive waste. Any web developer worth their shit knows the importance of getting rid of shit like this. Modern web frameworks are hyper focused on reducing bloat e.g. preact with 2kb bundle and svelte with 0b bundle. The real problem is corporate web, not web dev as a whole.
There aren't many real web devs. They're just setting up websites with wordpress and a bunch of plugins. They don't know how to optimize, and they're happy if it just works on high end computers.
Yep, especially when it comes to trackers, analytics, ads and other shit like that. I don't think there's a single dev out there who endorses these things, but management wants to look at pretty charts so you know.
@@senselessnothing explain the existence of Wikipedia's "anti-union violence" page, then. And you know, that one is just about the most direct application of force. However, I don't consider the "choice" between having to do something I disagree with (like bloating up websites) and starving to be a choice at all.
There are a number of dependencies these days for a simple website, and those dependencies have other dependencies. Powerful hardware has given rise to these incompetent hipster soydevs who probably don't know what they're doing.
I put most of the blame on the abundance of stupid and unnecessary JS frameworks. Javascript has plenty enough features to make a website without the bloated SPA bullshit. It's just jquery all over again but even harder to understand
@@Winnetou17 The question then becomes: can these people that write on top of such high level languages even code something mildly sophisticated in C? I wouldn't think so.
As a web dev (hobbyist), I completely agree. Most of these sites could weigh 250-500 KB tops INCLUDING IMAGES. I have absolutely no idea what they are doing. The constraints we had in memory/storage back in the day were nice in that you had to write software elegantly and carefully. Now it's like, why don't we add these 3 trackers and this unoptimized, completely unnecessary 15 MB image? Yeah, I don't see why not.
I think one of the primary problems is trendy JavaScript frontend frameworks (React, Vue, etc.). All of the trendy web frameworks do a LOT of work behind the scenes that the developer knows pretty much nothing about. These frameworks are optimized for developer time instead of the actual performance of the website (see: ReactJS rerenders and whatnot, which are almost impossible to program against in a lean way). Basically, companies are trying to build things cheap, not good (lol).

On top of that, because of the way "web development culture" has moved in the past couple of years, these web frameworks are basically the default for a lot of web dev projects, which I find pretty scary. No one just writes HTML, CSS, and Vanilla JS anymore. Nor do they even think that that is an option. Instead, they jump right into one of these frameworks where your HTML file has one '<div>' in the body, and all of the content of your page gets filled in dynamically via the framework's engine. Most people don't even know why they chose the framework, they just do it lol

The problem is basically: 1) No one knows how their computer works at all, so they just yeet anything in there, and 2) People do the trendiest thing because there are more jobs for "React, Vue, ~framework~" developers than there are for people who want to write a lean and mean website. I imagine this bubble will burst once things get too bad though. JavaScript is also horrible lmao but the rabbit hole is too deep rn. We'd need some support from the big boys at the top to fix that, and they don't want to, cause it don't bring those dollars.
And BTW, remember how cookies were a perfectly normal part of the web space? Everybody used websites just fine for a long time and nobody ever had to be asked to accept cookies (we knew they were there), until some snowflake freaked out and demanded to be asked to consent to them. Then suddenly, by law, it took a gazillion mouse clicks just to get a page to load. Why should the exception be made for cookies? It's a small part of web tech. What if we suddenly needed trigger warnings for every piece of tech in the puzzle, just because a Luddite got caught out of the loop and didn't understand all of the minutiae after sitting through the company training video?
@@h3HUg7Sp honestly it would be better if the page just asked you if you want them to store your data in any way: cookies on your hardware, or any other data on theirs. Then you could just opt out of all of it. Sadly that'll never happen :)
This reminds me of an article I once read where the reader proclaimed that the web pages you get when entering "firefox reader mode" is how the web should actually look
Hey Luke, Firefox has something called "Reader View" that takes away all the trash and leaves only the text and inline images, so you aren't forced to sit through all that shit. (Chromium had something like that but I think it was removed.)
Quick presentation of the tool for Chrome: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-_JCKc1CCOR0.html I'm looking for one for Librewolf and Brave.
we need to stop blaming web devs. sure there are some web devs that don't care about how much jank they load, but a bunch of them are just doing whatever the soy client is asking them to do. don't hate the player hate the game
yo some of our bosses are nazis but we still need weed money. we got a menu that has scrollbars on it ahaha 😂 you can only tell him he's dumb once or twice before you have to do what you are told regardless.
wordpress is famous for being full of bloat, and it powers a lot of the web's blogs / sites, so it's normal. But it's great because it's accessible even for non-devs. When I didn't know how to code, it was always my go-to for a fast website. Which CMS do you recommend nowadays? I recently tried Ghost CMS and I think it's great; I can also run it on a 1gb ram VPS easily, which is just what I needed.
WordPress does make publishing more accessible to non-devs, which is the problem. Marketing people plugging in 30 plugins they never use, using WYSIWYG page builders that add 20,000 line JS and CSS files... etc. etc.
You've got all these respectable fields in tech: systems programming, type theory, graphics programming, all kinds of stuff that's good. Then you got this pygmy soydev thing going on
@@phillycheesetake The ads are normally hosted on the ad networks. The web sites just want to make money, then use that money to advertise, to make more money, so they need even more adverts.
I'm a web developer and I don't like this spam/bloat/commercials approach in making sites. They are earning money with the bloat, it's the spam greed that makes them make this shit. It COULD be lightweight AND pretty.
I remember when I started out and all you needed was html CSS and some creativity. I remember those days adding JavaScript didn't hurt. But those were simpler times. I don't even recognize JavaScript anymore.
thanks, Mr. Boomer, for telling the soydevs to start thinking about optimization. We need more sites that are minimalistic. Ads are OK, but keep them simple, so a person with a normal internet connection can actually use the internet. Also, the worst ads I know of are the epilepsy-inducing ones.
I recall reading in old webdev books, 'scale down your images to be considerate of your visitor's bandwidth'. I figure those books have now been burned for heresy.
I feel the same way you do about most web pages. There's so much crap on them that even modern computers struggle if you have more than a couple of tabs open. Web pages generally tend to be poorly written and it comes at a cost of efficiency.
Fun fact: JavaScript was initially going to be a fairly different language more akin to Lisp (if memory serves me), and if it wasn't for the $500 mil Sun invested into Java marketing, we wouldn't have this whole mess to begin with. Eich had a pretty good idea about what he was doing, but it didn't matter because everybody wanted a piece of that Java fame. No wonder he eventually was like "k, screw you guys, I'm gonna make something decent at last".
@@nekoill look dude, I'm just looking for FOSS alternatives to JS. I'm trying to be a self taught programmer. BTW, are CoffeeScript and NodeJS variations of JS? And what are your thoughts about PHP 8.0???
Thanks for sharing this! I think I'm doing a pretty good job with my products now. It feels weird for me as a modern web dev looking at these websites; all these extra things feel so unnecessary and take away a lot of the UX. I consider optimization to be an important aspect. Especially where I live, the difference between 7MB websites and 100KB ones can be your whole internet budget.
The problem is that as internet speeds get faster and faster, 4 or 5 MB sounds like less and less to a soydev. On most internet connections these sites load fairly quick. Unless you live out in the country like Luke, your ISP's slowest package will load them just fine. My ISP's slowest is I think 25. It used to be 15. I actually got a free speed boost because my ISP dropped 15mb as a speed option; they had to make room for more bloat, I guess. At 15mb, bloated websites still loaded fine, and my family used all the bloat you could throw at a connection: Netflix, bloated sites, online gayming, and it wasn't that slow. It was actually a shock to me when I learned we only had 15mb; I wouldn't have thought that'd be enough. (Funnily enough, I think one of those Verizon clowns tried to tell my mom to get the 100mbps package instead of the 50mb when we decided to upgrade from the slowest speed because "it wouldn't be enough for a 4 person household". I of course told her that was the most retarded thing I'd ever heard. I now have 50mbps and it's plenty fast. Makes me wonder wtf people with gigabit connections think they need it for.)

It's basically the same issue as other software bloat: it runs 200x slower than it should and eats half your RAM, but the soydevs just point and go iT dIdNT TAkE tHaT LoOnG! wHo CARes iF iT uSEs 500 eXtrA mB oF RaM? DonT U HaVE giGaBYtes oF iT? It's why if you got a basic laptop designed for basic office work (document writing, spreadsheets, presentations, light web browsing) from 2007 and tried to use it today, you wouldn't even be able to (if you tried to use Windblows and MS Office). The OS would probably barely run. But it's doing the same tasks, just worse. And everyone just shrugs it off and says "Of course an old computer can't run a new OS." But why? What does Win10 do that's so much more intensive than 7 that's needed? The spying routines? Cortana? The new UI?
In this particular case, there is a plugin for Chrome and Firefox called RecipeCleaner. It does what you'd expect, and it'll even clean out the 1200-word recipe preamble about this stew reminding them of Italy or their mother.
This video speaks to me. I got right into Gatsby/React-based websites over the past 2 years and redeveloped all of my sites with it, but I'm now regretting the decision. The irony is that Gatsby is supposed to be a blazing-fast static site generator, but it actually creates ridiculous JS bundle sizes that are nearly impossible to trim down enough to please Google. So now I'm rethinking the whole thing, and might redevelop them all with 11ty and strip out the JS. I agree with you completely: the modern web is an over-engineered bloatfest.
Same here, even web development itself is becoming bloated. You have to use some sort of framework that loads a shitton of packages, leaving you with a 250MB node_modules folder for just the bare bones of the project. I miss the days when I started learning and was using HTML and CSS with vanilla JavaScript, and everything ran fast on my shitty PC.
The current trend in web development seems to be figuring out how best to create a web browser inside your web browser with (insert newest, most bloated JavaScript framework here), and then using it to load a Twitter clone.
Another good example is link prefetching. Since HTML5, a page can tell your browser to prefetch the HTML, CSS, JS, and images for its links before you ever click them. The W3C standards have bloat built in. There is no escape from the soydev's bloat.
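For the curious, this is what drives that behavior: `<link rel="prefetch">` tags in the page head. A minimal sketch (the function name and build-step framing are hypothetical, not from any real site generator) of emitting those tags for a list of URLs:

```javascript
// Generate <link rel="prefetch"> tags for a list of URLs.
// A page shipping these tags asks the browser to download the
// resources ahead of time, whether or not the user ever clicks.
function prefetchTags(urls) {
  return urls
    .map((url) => `<link rel="prefetch" href="${url}">`)
    .join("\n");
}

console.log(prefetchTags(["/next-page.html", "/hero.jpg"]));
```

Static site generators like Gatsby inject tags like these automatically for every in-page link, which is exactly the "bloat built in" being complained about.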
Watching this video is like receiving a vision of a utopian internet. But if you want to convince the soydevs, you have to make the ecological argument: smaller websites = fewer resources needed to host = less energy consumption = less global warming.
I'm a "web dev", but I've never shipped anything close to as terrible as these dumpster fires. I actually specialize in optimizing web bundles, though I've mostly worked on applications for embedded devices like POS terminals.
Even if you're using a modern library like React, the core library is around 5.3 KB uncompressed, and even less if you gzip it. You can make something modern, with modern features, and still have it be fast and responsive.
There was an attempt, quite some years back, to push an accessibility framework for websites, since there are plenty of common disabilities that most web devs don't even think to cater for. Needless to say, looking at the state of the modern web, that went absolutely nowhere. I still think about those issues sometimes. I'm aghast at the pages these things spit out: absolutely no care for semantic markup, no graceful fallback (many won't load at all without JS), and so on. It's an unholy mess, and I feel sorry for those who have to navigate the web with accessibility aids.
There's a network tab in DevTools that lets you see the exact size of all loaded assets - you don't need to upload the images to your own site in order to compare the size difference.
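The Network tab's footer already totals transferred bytes for you; as a rough sketch, that summation is just this (the asset list below is hypothetical, standing in for the "Size" column you'd read in DevTools):

```javascript
// Sum per-asset transfer sizes the way the Network tab's footer does.
function totalTransferred(assets) {
  const bytes = assets.reduce((sum, a) => sum + a.bytes, 0);
  return { bytes, megabytes: +(bytes / (1024 * 1024)).toFixed(2) };
}

// Hypothetical page weight: one HTML file, one JS bundle, one stylesheet.
const assets = [
  { name: "index.html", bytes: 12000 },
  { name: "bundle.js", bytes: 4800000 },
  { name: "styles.css", bytes: 250000 },
];
console.log(totalTransferred(assets)); // { bytes: 5062000, megabytes: 4.83 }
```

Note that DevTools can show both transfer size (compressed, on the wire) and resource size (decompressed), so pick the right column when comparing sites.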
I'm currently working on a couple of projects of my own that I plan to monetize through in-site ads. The site DOES allow you to disable trackers, and it even recognizes the DNT header, so it'll literally skip loading analytics modules, since my business logic doesn't depend on them :). The entire site with full content averages about 500 KB to 2 MB (depending on user uploads; they can upload images). So yeah, there are still web devs who care about performance, reliability, and users' trust and privacy.
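The DNT check described above can be sketched in a few lines; this is a minimal illustration, not that site's actual code, and the function name is made up:

```javascript
// Skip analytics entirely when the client sends "DNT: 1".
// Header names arrive lowercased in Node's http request objects.
function shouldLoadAnalytics(headers) {
  return headers["dnt"] !== "1";
}

console.log(shouldLoadAnalytics({ dnt: "1" })); // false: respect the opt-out
console.log(shouldLoadAnalytics({}));           // true: no preference sent
```

A server would consult this before injecting any analytics script tag into the rendered page, so opted-out users never download the module at all.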
lol, I just had a client who wanted a recipe site made. I made it simple and the first piece of feedback I received was that we needed more banner ads.
@Youdubham Yeah, new websites are loaded with shit. We can't have just a normal HTML/CSS website anymore. I mean, some JS is not bad, but every website I go to now is just so heavy with tracking and ads.
Used to work in webhosting, and I agree with this video 100%. People would come to us when their site pages weren't "loading fast enough", only for us to find they had all sorts of addons and plugins (people using WordPress, ugh) delaying page loads. WordPress itself is bloat, and people don't want to spend the time learning about the technologies they're using, resulting in these slow, bloated sites.
The worst part is that you could probably get all the nice design stuff working with just a TINY bit of CSS, which wouldn't slow down the load times at all.
I agree 100%, and I think this problem largely boils down to two areas. 1. Communication: web developers don't know how to speak marketing/sales and advocate for their position in terms the rest of the company cares about ("Hey, X feature will hurt our bounce rate, which will drive down engagement, aka we make less $$$"). 2. Apathy: failing at the above (or not even trying), web developers let companies wallow in the slop they've ordered, not really caring. I could go into more detail, but if I had to pick two things, it's these two major areas.
The web dev team I'm on spent several days trying to migrate our projects to Docker. Why? I don't know. None of us had a good understanding of Docker; it broke a load of things, and eventually we rolled the changes back. It was a complete waste of time that would only have added massive overhead because of how containers package dependencies. These soydevs push for these techniques because they think they're cool.