AdGuard is a free extension that, as far as I can tell, is on par with uBlock Origin, and it gets the job done 98% of the time.

It doesn't affect their income, so it doesn't matter.

How can we tell whether some JS is "bad"?

If everybody learned to voice-control their smartphones and receive 3x-speed audio feedback, battery life would shoot through the roof, since illuminating the screen is so costly and you'd never need to do it.

I do not define "the web" as certain popular browsers, CSS, JavaScript, etc. Stuff that is reliable and always works.

(Which will be a different HTML than the empty stub that "View Source" gives you.)

I did not cut every ad, but I don't often browse new websites, and I'm okay with some ads as long as they're not very bad. And for now it still works without JavaScript execution.

It was also easier to install it on my parents' computers than to convince them to change browsers.

The speed correlation with HTTP 0.9 and HTTP 1.0 is interesting.

As far as I know, all major frontend frameworks can render to a static HTML document. (A minimal sketch of this follows below.)

Skimming through the article didn't give a meaningful tl;dr: it's complicated and there is no prevalent answer.

Perhaps they should push back, but what leverage do the developers have?

While it is probably more the case that newer protocols are serving newer content, which is slower for myriad reasons, I find myself wondering whether there are interesting correlations in what's being served by the older protocols.

That is, treating a webpage like an application instead of a document.

Last night I had the misfortune of experiencing a Comcast outage.

You might want to proof-read your comments before posting them.

This study is flawed for a few reasons that I'd mentioned to the author when it was first published, but they ignored my comments.

Unlike Mickens, I cannot save the world, and I am not telling anyone else what to do or not to do, but I made the web fast for myself.

I'm sceptical of this statement.

So I just added a few URL filters to remove the most obnoxious tracking and ads, a few CSS edits on selected websites to remove popups, and some JS on YouTube to remove its ads.

Right, but then you conclude that it is just incompetent engineering.

I get what you're saying, but I don't think the war is one half versus another half.

I guess through webpack?

Duration between TTFB and when user events first get pumped? (A rough measurement sketch follows below.)

Even if Google didn't have a stake in targeted advertising, shared caches lead to easy identification; browsers have a duty to close off any avenue by which users can be tracked.

Firefox.
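Since the comment above asserts that all major frontend frameworks can render to a static HTML document, here is a minimal sketch of what that looks like with React on the server. The page component is made up for illustration; renderToStaticMarkup is a real react-dom API.

    // render-static.js: minimal sketch, assuming Node with react and
    // react-dom installed. Renders a component tree to plain HTML once,
    // e.g. at build time, instead of shipping a JS application.
    const React = require("react");
    const { renderToStaticMarkup } = require("react-dom/server");

    function Page() {
      return React.createElement(
        "main",
        null,
        React.createElement("h1", null, "Hello"),
        React.createElement("p", null, "This page ships no client-side JS.")
      );
    }

    // renderToStaticMarkup omits React's internal attributes, so the
    // output is a clean static document you can write to an .html file.
    process.stdout.write(renderToStaticMarkup(React.createElement(Page)));

Pipe the output into an .html file and serve it as a document; hydration is only needed if the page is genuinely interactive.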
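On the question about the duration between TTFB and when user events first get pumped: there is no standard metric for exactly that, but here is a rough sketch using the Navigation Timing and Event Timing APIs. "first-input" entries are not available in every browser, so treat this as an approximation.

    // Approximate "time from first byte until the first user event is
    // actually handled". Run in the page itself; this is a sketch, not
    // a standardized metric.
    const [nav] = performance.getEntriesByType("navigation");
    const ttfb = nav.responseStart; // ms since navigation start

    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        // processingStart: when the event handler actually began running.
        console.log("TTFB:", ttfb.toFixed(1), "ms");
        console.log("first input handled",
          (entry.processingStart - ttfb).toFixed(1), "ms after first byte");
      }
    }).observe({ type: "first-input", buffered: true });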
Sadly, the user trackers have messed it up for application error trackers too.

It misses the point.

I don't think that's the whole story.

Your site copies the format of Stripe, and they were around in 2010, pre-dating you by 4 years.

Tag managers are the scourge of performance.

Web servers, the network, and computers are plenty fast and still getting faster.

Or you can just use POP3 or IMAP with a real client and avoid the web crap altogether.

How is this going to solve the chief bandwidth problem on the internet, which is watching funny cat videos and perusing dank memes?

I find Firefox to be a great alternative to Safari.

To top it off, we under-cache, aggressively cache-bust, and ship unminified bundles with oodles of source maps because of "developer experience". I can't tell you how many times I see, to this day, a Fortune 500 company bundling an entire unminified JavaScript framework. All this results in bloated, unoptimized JavaScript bundles.

I never need JavaScript for those tasks.

Skipping a few steps, it is a trust problem.

We're saying it's true!

I run Wipr on Safari.

Tracking application errors generally does not need a lot of code.

It's a blog article, yet it has no date on it.

Vercel did a write-up on it here: https://vercel.com/blog/everything-about-react-server-compon...

A large part of the web is static documents, and they should be developed as such.

This analysis just assumes all websites are JS-application monsters and only differentiates between them.

Many sites are perceivably slow because of rendering work they're doing after domInteractive. (A sketch for surfacing that work follows below.)

I don't use uBlock.

But it is also interesting for app developers who have to use some kind of in-place DOM modifications.

There's the likes of Gatsby [0], which is generally well supported and pairs well with Netlify and a headless CMS such as Contentful.

> The reason I don't use uBlock is because I think that it's overkill for me to run thousands of filters for every website in the world.

Flash was fun for a while, with some nice animations and interactivity; then the JS themes took over. They were double-edged swords: they enabled us to do things we would previously have needed separate installable desktop applications for (email, PowerPoint, etc.).

The irony that Google has initiatives like AMP to speed up page loading, supposedly.

It's faster but more complicated.

I didn't say incompetent, but rather lazy and lower quality.

Needless to say, the speed went up.

Or how much "lossiness" did you introduce to get the 32%?

jQuery being more prevalent than Google Analytics is a surprise to me.

Bad software development slows the web down.

But the sites I'm visiting on a daily basis are, in fact, applications and not documents.

The sites we visit do.

https://ads-free.app/Ads-Free!%20Desktop/

Even without ads, third-party JS was loading twice as many bytes as our React and first-party code.

Blocklists based on the hosts file cover almost all ads. (A short example follows below.)

We're blocking everything regardless of context, and this doesn't allow websites to fix their application errors.

I've found the issues get fixed pretty quickly too.
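For the comment about rendering work after domInteractive, a hedged sketch of how to surface that work with the Long Tasks API (Chromium-only; a "long task" is any main-thread task over 50 ms):

    // Log long main-thread tasks that start after domInteractive,
    // i.e. the post-"interactive" work that makes pages feel slow.
    const nav = performance.getEntriesByType("navigation")[0];

    new PerformanceObserver((list) => {
      for (const task of list.getEntries()) {
        if (task.startTime > nav.domInteractive) {
          console.log(`long task: ${task.duration.toFixed(0)} ms, starting ` +
            `${(task.startTime - nav.domInteractive).toFixed(0)} ms after domInteractive`);
        }
      }
    }).observe({ type: "longtask", buffered: true });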
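And to make the hosts-file comment concrete: a hosts-based blocklist just resolves ad hosts to a dead address. The two entries below are an illustration (one well-known ad domain, one made-up one); real lists such as StevenBlack/hosts concatenate tens of thousands of these.

    # /etc/hosts (or C:\Windows\System32\drivers\etc\hosts)
    # Requests to these hosts never leave the machine.
    0.0.0.0 doubleclick.net
    0.0.0.0 ads.example.com

The upside over an extension is that it covers every browser and every app on the machine; the downside is no cosmetic filtering and no per-page context.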
There are so many of them.

I was refuting the claim "JavaScript slows the web down" by qualifying it as "bad JavaScript slows the web down".

Pretty much like enterprise software.

This often causes the browser to re-layout, re-render, and re-draw content. (A batching sketch that avoids this follows below.)

I was a big fan of the artwiz cure font and using XQuartz/X11 rootless alongside OS X's Aqua, lol.

I read this as: Google is willing to spend millions upon millions to move huge amounts of additional, unnecessary network traffic to make sure only they can reliably track most people across the whole of the web.

It requires a bunch of post-processing based on various events.

Security is part of the reason, but the bigger problem I find is tying uptime to somebody else whom I have no control over.

It really shows why big tech owns the keys to the internet.

Safari on iPhone uses a system-wide ad-block list, so there's no need for browser plugins.

Anyone can go on archive.org and verify that they changed their color scheme and header to ones very similar to CatchJS in Feb 2020, after CatchJS had used that look for 2 years.

In which case, the whole notion of "bad JS" is irrelevant; it's conceptually indistinguishable from "slow page", and hence adds nothing to the discussion (i.e., the definition is circular).

I was assuming that culling to the "top million" would skew things in favor of GA. Clearly it didn't, but I was surprised.

Bundling 20 dependencies and 10k lines of "compiled" JSX into one giant 5 MB uncachable blob that must be parsed in its entirety, which then fires off requests that must complete and be parsed, which then produces HTML and CSS that the browser must parse again before its rendering engine finally gets to lay things out and draw the first pixels: that is the problem. (A bundler-config sketch addressing this follows below.)

> because you're defining slow web sites as being badly developed.

I'll try and offer the more controversial point.

At the end of the day, we're all human; we will introduce bugs.

This kills lots of rude click/function hijacking that is done by many obnoxious web pages.

But I learned how to write Chrome extensions, and it turned out to be extremely easy to insert my own CSS and JS snippets into selected pages. (A minimal extension sketch follows below.)

Again, I have no relationship with the developer.

How about a standard, within the HTML spec, for declaring how heavy a site is (and other meta details) before it loads?

I had hoped at some point in the breakdown that they differentiated between WordPress and... everything else, I guess.

I see.

By checking if it's slow.

Of course bad software development slows the web down, because you're defining slow web sites as being badly developed.

(We've been around since 2014, so we pre-date them by 4 years.)

Personally, I don't (yet) see how an in-browser HTML parser could be much faster than createElement from, e.g., JavaScript.

I only load resources from one domain, I forgo graphics, and I do not use big, complex, graphical, "modern" web browsers to make HTTP requests.

I always ask myself if something can be static now, and if so, I make it static.

You should reconsider.
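On the re-layout/re-render/re-draw point: the standard mitigation for in-place DOM modification is batching, so layout is invalidated once rather than once per node. A small sketch with made-up data and a hypothetical container id:

    // Build nodes off-document in a DocumentFragment, then attach them
    // in one operation, so the browser re-layouts and repaints once
    // rather than after every appendChild.
    const items = [{ title: "first" }, { title: "second" }]; // stand-in data
    const list = document.querySelector("#results"); // hypothetical container

    const fragment = document.createDocumentFragment();
    for (const item of items) {
      const li = document.createElement("li");
      li.textContent = item.title;
      fragment.appendChild(li);
    }
    list.appendChild(fragment); // single layout invalidation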
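For the "one giant 5 MB uncachable blob" complaint, the fixes are old and boring. A sketch of the relevant webpack options (the option names are real; the config as a whole is illustrative, not a drop-in):

    // webpack.config.js
    module.exports = {
      mode: "production",      // enables minification (Terser) and prod defaults
      devtool: "source-map",   // external .map files, not inlined in the bundle
      output: {
        filename: "[name].[contenthash].js", // cache-bust only what changed
      },
      optimization: {
        // Split rarely-changing dependencies into their own cached chunk.
        splitChunks: { chunks: "all" },
      },
    };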
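And for the insert-my-own-CSS-and-JS approach, the extension really is tiny. A minimal sketch (Manifest V3; the file names, match pattern, and popup selector are placeholders):

    // manifest.json
    {
      "manifest_version": 3,
      "name": "my-page-fixes",
      "version": "1.0",
      "content_scripts": [{
        "matches": ["https://www.youtube.com/*"],
        "css": ["fixes.css"],
        "js": ["fixes.js"],
        "run_at": "document_idle"
      }]
    }

    // fixes.js: remove a hypothetical popup element on matched pages
    document.querySelectorAll(".annoying-popup").forEach((el) => el.remove());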
No idea; maybe getting a bunch of developers to make random refactorings, measuring the speed of each, using that to tell whether the JS is still "bad", and stopping once it's no longer slow and hence no longer "bad"?

When you've got a fleet of machines behind load balancers, you don't need things like a Host field to support vhosts, since it's one site per host. (See the request example below.)

And it's the one in that list that doesn't require JS.

It took minutes to load on my phone to find out if there was an outage.

I do not even use wget or curl (only in examples on HN).

Yes, JavaScript is necessary, but just like you shouldn't be using HTML for layout and CSS for content and graphics (although CSS-only art is impressive!), you shouldn't be using JavaScript for static content.

Yes, some of this can be boiled down to companies not prioritizing performance and product managers pushing for more features and tracking, but we've always had these constraints and tight deadlines and seemed to deliver just fine.

Likewise, Wikipedia is also an extremely popular site.

Probably not what you're after, but load the page in Chrome and open the developer tools.

Among websites that I routinely visit, only YouTube could be counted as a web application, but I'd argue that its main function could be simplified to a
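Circling back to the vhosts comment above: the Host header is the only thing separating the two requests below, and it is what lets many names share one IP. HTTP/1.0 got by without it, one reason those old requests were marginally smaller and simpler to serve.

    # HTTP/1.0: no Host header required, so one IP meant one site
    GET / HTTP/1.0

    # HTTP/1.1: Host is mandatory, enabling name-based virtual hosts
    GET / HTTP/1.1
    Host: example.com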