this post was submitted on 23 Apr 2024
331 points (97.4% liked)

Technology

[–] [email protected] 62 points 2 months ago (2 children)

Would love to see the same tests with an adblocker installed.

[–] [email protected] 12 points 2 months ago

It'd be fun to see the same M3 Max with an ad and tracker blocker, to see how much its top line improves.

[–] [email protected] 7 points 2 months ago
[–] [email protected] 57 points 2 months ago (1 children)

I feel that:

There are two attitudes on display here which I see in a lot of software folks. First, that CPU speed is infinite and one shouldn't worry about CPU optimization. And second, that gigantic speedups from hardware should be expected and the only reason hardware engineers wouldn't achieve them is due to spectacular incompetence, so the slow software should be blamed on hardware engineers, not software engineers.

[–] [email protected] 3 points 2 months ago

Hardware keeps getting exponentially faster and software keeps getting exponentially slower. The only people who seem to benefit from better hardware are lazy developers.

[–] [email protected] 46 points 2 months ago* (last edited 2 months ago) (3 children)

How can I check this?

I tried loading the (new) Reddit homepage, and based on the Network tab in Firefox with no prior cache, it transferred 19.20MB, compressed down to 15.29MB.
But that also includes any pictures shown.
Loading the lemmy.world homepage transferred 5.88MB, compressed down to 1.82MB.
old.reddit.com: 2.82MB, compressed down to 947kB. Quite a difference.
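
For a more repeatable check than eyeballing the Network tab, here's a minimal console sketch of my own (not the article's methodology) that totals what the Resource Timing API reports. Caveat: cross-origin resources report 0 bytes unless the server sends Timing-Allow-Origin.

```ts
// Run in the DevTools console after the page finishes loading.
const resources = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
const nav = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];

// transferSize = bytes over the wire (compressed); decodedBodySize = after decompression.
const wire = [...nav, ...resources].reduce((sum, e) => sum + e.transferSize, 0);
const decoded = [...nav, ...resources].reduce((sum, e) => sum + e.decodedBodySize, 0);

console.log(`transferred: ${(wire / 1e6).toFixed(2)} MB, decoded: ${(decoded / 1e6).toFixed(2)} MB`);
```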

Just for comparison, loading Eaglercraft 1.5.2, a fully functional Minecraft JavaScript clone complete with LAN multiplayer support, took 8.35MB.

But what exactly is this measuring?

[–] [email protected] 25 points 2 months ago

But what exactly is this measuring?

Hard to tell honestly.

phpBB and WordPress are website engines. The test doesn't take into account the content of the websites they serve, and more importantly the bloated advertising scripts that might be added on top of them.

Mastodon? What are we even talking about here? The content? The engine? Which instance?

So, while it's true that some websites are bloated and some are not, OP's post says absolutely nothing about it. Size means nothing when a single picture can easily outweigh a huge JavaScript file mining bitcoin. For the same reasons, loading times mean nothing either.

Memory usage, FPS, Cumulative Layout Shift, First Input Delay, Largest Contentful Paint, any data gathered from the Performance API: there are tons of efficient ways to measure a website's efficiency.
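
As an illustration (my own sketch, not from the article or OP's test), a couple of those metrics can be collected straight from the console with PerformanceObserver:

```ts
// Paste into the console at (or before) page load.
let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as any[]) {
    // Ignore layout shifts caused by recent user input.
    if (!entry.hadRecentInput) cls += entry.value;
  }
}).observe({ type: "layout-shift", buffered: true });

new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const last = entries[entries.length - 1]; // latest LCP candidate
  console.log(`LCP: ${Math.round(last.startTime)} ms`);
}).observe({ type: "largest-contentful-paint", buffered: true });

// CLS only settles once the page is hidden, so log it then.
document.addEventListener("visibilitychange", () => {
  if (document.visibilityState === "hidden") console.log(`CLS: ${cls.toFixed(3)}`);
});
```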

Finally, a website can fail to load for many reasons, the first of which can be a 504 Gateway Timeout, which is triggered by an arbitrary threshold on the server's side.

[–] [email protected] 2 points 2 months ago

efficiency?

[–] [email protected] 1 points 2 months ago

The author of the article used Chrome with CPU throttling set to 10x to compare an M3 against itself at 1/10th the CPU speed. I imagine you could check it that way!
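
If you'd rather script it than click through DevTools, here's a rough sketch of my own (assuming Puppeteer is installed; the blog itself just used the DevTools throttling setting) that drives the same knob:

```ts
import puppeteer from "puppeteer";

async function loadThrottled(url: string, rate: number): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Slow the renderer's CPU down by the given factor (10 = one tenth speed).
  await page.emulateCPUThrottling(rate);

  const start = Date.now();
  await page.goto(url, { waitUntil: "networkidle2" });
  console.log(`${url} loaded in ${Date.now() - start} ms at ${rate}x CPU throttling`);

  await browser.close();
}

loadThrottled("https://danluu.com", 10);
```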

[–] [email protected] 43 points 2 months ago (1 children)

Yesterday I tried to fill out a Google Forms sheet on my phone. It was literally unusable. The form didn't even load completely, and when I tried to check something it took about 5 seconds to respond.

[–] [email protected] 23 points 2 months ago (1 children)

I'm interested to see lemmy-ui on this list.

[–] [email protected] 0 points 2 months ago

Good question ... the devs definitely aim for efficiency in their choices. Their frontend framework, for instance, is niche, but (at least at the time they picked it) it has a small footprint and performs well (though many devs complain about the use of a niche framework).

[–] [email protected] 21 points 2 months ago (2 children)

What exactly is danluu.com?

[–] [email protected] 30 points 2 months ago

The blog this test is originally from

[–] [email protected] 18 points 2 months ago* (last edited 2 months ago) (1 children)

Bragging rights, in the form of a blog.

[–] [email protected] 4 points 2 months ago

Ugh, it's worse than I thought. The HTML on the front page is awful. It's not even vaguely valid, it uses a made up tag (d), and it runs over HTTP instead of HTTPS. It's just this person discarding any semblance of maintainability to pursue an extremely small wire size.

[–] [email protected] 10 points 2 months ago (2 children)

Hmmm. I'm interested in the difference between new and old WordPress.

[–] [email protected] 4 points 2 months ago (1 children)

I wonder if it's from when they switched to the Gutenberg editor, maybe?

[–] [email protected] 2 points 2 months ago

I'm mostly just curious, as I haven't done anything with WordPress in just shy of a decade.

[–] [email protected] 3 points 2 months ago

Unrelated, but WordPress supports ActivityPub, and yet I still haven't seen a WordPress blog on the fediverse.

Anti Commercial-AI license

[–] [email protected] 5 points 2 months ago (2 children)

Why is discourse so bad here?

[–] [email protected] 0 points 2 months ago (1 children)
[–] [email protected] 1 points 2 months ago (1 children)

Thought it would at least be better than Reddit.

[–] [email protected] 2 points 2 months ago

Smaller userbase, mostly taken from a specific subset of users. You tend to get extreme views amplified because of that, I think.

[–] [email protected] 4 points 2 months ago

Yet somehow the ads usually load first and just fine.

[–] [email protected] 4 points 2 months ago* (last edited 2 months ago)

In the case of Discourse, a hardware engineer is an embarrassment not deserving of a job if they can't hit 90% of the performance of an all-time-great performance team but, as a software engineer, delivering 3% the performance of a non-highly-optimized application like MyBB is no problem. In Knuth's case, hardware engineers gave programmers a 100x performance increase every decade for decades with little to no work on the part of programmers. The moment this slowed down and programmers had to adapt to take advantage of new hardware, hardware engineers were "all out of ideas", but learning a few "new" (1970s and 1980s era) ideas to take advantage of current hardware would be a waste of time.

You can really tell this guy is some hardware design engineer at NVIDIA who has absolutely no fucking clue about how real-world user-space programming works. Also, I like how 74% slowly kept getting inflated until it became 90%.

Like, this dude is trying to claim that fucking Donald Knuth himfuckingself cannot figure out some new computer hardware.

Multiple processors working in concert is not, and never has been, a cure-all. It's highly situational and generally not useful.

What's dumb is that, as a Systems Design Engineer at NVIDIA, Dan Luu should know that. After all, how has SLI been doing recently?

That said, yes, of course, web dev bloat is absolutely out of control, and slow websites absolutely have nothing to do with hardware or the network. The culprits are bad frameworks, horrific amounts of ads/trackers/bullshit, and honestly just a general lack of programming fundamentals in the web dev space. Might as well call them web technicians and really ruffle some feathers. :P

[–] [email protected] 0 points 2 months ago (2 children)

There are way too many confounding factors in these tests to say anything about CPU performance for web pages. My only real takeaway is that some of the tested devices suck for browsing the web. How much is the fault of bloated web pages and how much is the fault of the device? Who knows.

[–] [email protected] 24 points 2 months ago (1 children)

Read the original blog post. The slower devices are the devices that some parts of the world rely on because they can't afford anything better, which excludes them from the "modern" web.

[–] [email protected] 4 points 2 months ago (1 children)

I did. The author talks about both and associates one with the other. It really only considers two factors, web page size and CPU utilization, claims that CPU speed hasn't outpaced web page bloat, and then uses the data table to try to prove the point.

I'm not denying that low-end devices can have trouble browsing the web. I take issue with the claim that CPU performance hasn't scaled with web page bloat, because there are far more factors than just CPU performance and web page bloat in the tests, such as: everything else running on the device (OS, other apps, etc.), RAM speed and size, storage speed and size (hopefully that doesn't come into play, but you never know), network connectivity strength, etc.

It's not even close to an "all else equal" type of testing.

[–] [email protected] 1 points 2 months ago

Those so-called low-end devices are still technically fairly powerful computing devices, and they aren't even being used to do anything that ought to be very taxing. They're displaying what ought to basically be a text medium.

In my eyes the problem is squarely with the way the sites are designed (and their 967 partners that are interested in whatever you're clicking on).

[–] [email protected] 4 points 2 months ago

The linked article outlines in explicit detail how it is the fault of the websites and not the devices.