The 49MB web page (thatshubham.com)

by kermatt 375 comments 857 points

[−] PunchyHamster 62d ago
Our developers managed to run around 750MB per website open once.

They put in a ticket with ops saying the server was slow and asking us to take a look. So we looked: every single video on a page with a long video list pre-loaded part of itself. The only reason the site didn't run like shit for them is that the office had direct fiber to our datacenter a few blocks away.

We really shouldn't allow web developers more than 128kbit of connection speed, anything more and they just make nonsense out of it.

[−] vunderba 62d ago
PSA for those who aren’t aware: Chromium/Firefox-based browsers have a Network tab in the developer tools where you can dial down your bandwidth to simulate a slower 3G or 4G connection.

Combined with CPU throttling, it's a decent sanity check to see how well your site will perform on more modest setups.
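Throttling aside, a quick back-of-envelope calculation shows why this matters. A rough sketch, assuming a 400 kbit/s downlink (approximately the downlink of a "Slow 3G" devtools preset; the real presets also model latency per request, which is ignored here):

```python
# Rough estimate of transfer time for a page payload over a throttled link.
# 400 kbit/s approximates a "Slow 3G" devtools preset downlink (assumption).

def transfer_seconds(payload_bytes: int, downlink_kbps: int) -> float:
    """Seconds to move payload_bytes over a downlink_kbps connection."""
    return payload_bytes * 8 / (downlink_kbps * 1000)

page_mb = 49  # the article's 49MB page
seconds = transfer_seconds(page_mb * 1024 * 1024, 400)
print(f"{seconds:.0f} s")  # over 17 minutes just to move the bytes
```

That's the transfer time alone, before any parsing, decoding, or rendering.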

[−] KronisLV 62d ago
I once spent around an hour optimizing a feature because it felt slow - turns out that the slower simulated connection had just stayed enabled after a restart (can’t remember if it was just the browser or the OS, but I previously needed it and then later just forgot to turn it off). Good times, useful feature though!
[−] nicbou 62d ago
I still test mine on GPRS, because my website should work fine in the Berlin U-Bahn. I also spent a lot of time working from hotels and buses with bad internet, so I care about that stuff.

Developers really ought to test such things better.

[−] redman25 61d ago
CPU/network throttling needs to be set for the product manager and management - that's the only way you might see real change.

We have some egregious slowness in our app that only shows up for our largest customers in production but none of our organizations in development have that much data. I created a load testing organization and keep considering adding management to it so they implicitly get the idea that fixing the slowness is important.

[−] chrismorgan 62d ago
Peanuts! My wife’s workplace has an internal photo gallery page. If your device can cope with it and you wait long enough, it’ll load about 14GB of images (so far). In practice, it will crawl along badly and eventually just crash your browser (or more), especially if you’re on a phone.

The single-line change of adding loading=lazy to the image elements wouldn't fix everything, but it would make the page at least basically usable.
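To put a number on the 14GB: a sketch of the transfer time, assuming (hypothetically) a decent 50 Mbit/s connection and ignoring request overhead and decoding. Lazy loading wins precisely because it only fetches the handful of images actually on screen instead of all of this up front:

```python
# How long 14GB of gallery images takes just to arrive over an assumed
# 50 Mbit/s connection, ignoring request overhead and image decoding.

GALLERY_BYTES = 14 * 1024**3   # ~14GB of images
DOWNLINK_BPS = 50_000_000      # 50 Mbit/s, an assumed fast connection

seconds = GALLERY_BYTES * 8 / DOWNLINK_BPS
print(f"{seconds / 60:.0f} minutes")  # roughly 40 minutes of transfer
```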

[−] kevin_thibedeau 62d ago

> We really shouldn't allow web developers more than 128kbit

Marketing dept. too. They're the primary culprits in all the tracking scripts.

[−] Joel_Mckay 62d ago
If you want to see context-aware pre-fetching done right, go to mcmaster.com ...

There are good reasons to have a small, cheap development staging server: the rate-limited connection implicitly trains people on what not to include. =3

[−] anthk 62d ago
I used the text web (https://text.npr.org and the like) through Lynx. Also Usenet, Gopher, Gemini, some 16 kbps Opus streams - everything under 2.7 kbps when my phone's data plan was throttled and I was using it in tethering mode. Tons of sites did work, and gopher://magical.fish ran really fast.

Bitlbee saved (and still saves) my ass: tons of protocols are available via IRC using nearly nil data to connect. And you can connect with any IRC client from the early '90s onward.

Not just web developers. Electron lovers should be throttled with 2GB-of-RAM machines and an older Celeron/Core Duo box with an OpenGL 2.1-compatible video card. If the desktop 'app' is smooth on that machine, your project is ready.
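For scale, the 16 kbps Opus streams mentioned above are genuinely cheap. A quick data-budget sketch (constant bitrate assumed, container overhead ignored):

```python
# Data budget of a 16 kbit/s Opus stream: low-bitrate audio is cheap enough
# to stream even on a throttled plan. Assumes constant bitrate, no overhead.

BITRATE_BPS = 16_000  # 16 kbit/s

bytes_per_hour = BITRATE_BPS / 8 * 3600
print(f"{bytes_per_hour / 1e6:.1f} MB per hour")  # ~7 MB/hour of audio
```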

[−] hibikir 62d ago
You don't even need video for this: I once worked for a company that put up a carousel with everything in the product line, and every element pointed straight at the high-resolution photography assets - the kind that might be useful for full-page print media ads. 6000x4000 PNGs. It worked fine in the office, they said. Add another nice background that size, a few more on the sides as you scroll down...

I was asked to look at the site when it was already live, and some VP of the parent company decided to visit the site from their phone at home.
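The transfer cost isn't even the whole problem with print-resolution assets. A browser decodes each image into raw pixels, so a sketch of the memory cost per image (assuming 4-byte RGBA, which is typical for decoded bitmaps):

```python
# Why print-resolution assets hurt on phones: a browser decodes each image
# into raw RGBA pixels, so a 6000x4000 PNG costs far more RAM than its
# compressed file size suggests. Assumes 4 bytes per pixel (RGBA).

WIDTH, HEIGHT, BYTES_PER_PIXEL = 6000, 4000, 4

decoded = WIDTH * HEIGHT * BYTES_PER_PIXEL
print(f"{decoded / 2**20:.0f} MiB per decoded image")  # ~92 MiB each
```

A carousel of a dozen of these can eat over a gigabyte of decoded image memory, which is exactly where phones fall over.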

[−] ceejayoz 62d ago
Same for fancy computers. Dev on a fast one if you like, but test things out on a Chromebook.
[−] Gravityloss 61d ago
Should also periodically give designers small displays with low maximum contrast, and have them actually try to accomplish everyday tasks with the UX they designed.
[−] jacquesm 61d ago
Yes, and a machine that is at least two generations behind the latest. That will cut down on bloat significantly.
[−] CarlitosHighway 56d ago

> Our developers managed to run around 750MB per website open once.

This breaks my brain..."run around 750MB" "per website" "open once".

Do you mean your company had several websites (as in unique websites), and each had 750MB? Or the transferred data volume of each was 750MB? Or just the first page? Or the source code?

And do you mean "open once", as in "once upon a time", or if you open the website one time? Or did your developers open and shut websites, and the open ones had 750MB? Or one time when you entered your developer's office space, you saw they had opened several websites on their computers, random ones, and by coincidence you saw the network tabs and each was 750MB data transferred?

[−] socalgal2 62d ago
This is a general problem across lots of development: network, memory, GPU speed. The designer/engineer is on a modern Mac with 16-64 GB of RAM and fast internet. They never check how their code/design works on some low-end Intel UHD 630 or whatever. Lots of developers make 8-13-layer blob backgrounds that run at 60 or 120fps on their modern Mac but at 5-10fps on the average person's PC because of 15x overdraw.
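The overdraw figure above translates into fill-rate pretty directly. A back-of-envelope sketch, assuming a 1920x1080 display (the resolution and frame rate here are illustrative, not from the comment):

```python
# Back-of-envelope overdraw cost: with 15x overdraw at an assumed 1920x1080
# and 60fps, the GPU shades nearly 1.9 billion pixels per second for the
# layered background alone - a large slice of a low-end iGPU's fill rate.

PIXELS = 1920 * 1080
OVERDRAW = 15
FPS = 60

shaded_per_second = PIXELS * OVERDRAW * FPS
print(f"{shaded_per_second / 1e9:.2f} G pixels/s")
```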
[−] nitwit005 61d ago
There's essentially zero chance the developers get to make choices about the ads and ad tracking.

I wouldn't even guarantee it's developers adding it. I'm sure they have some sort of content management system for doing article and ad layout.

[−] SenHeng 60d ago
I built an internal dashboard once that displayed thumbnails of building plans. Actually no - I built the dashboard, then the new guy added the thumbnails, but anyway. We didn't actually have a thumbnail generator, because the sprint for that was skipped. Most of our users had only a couple of projects, so even if each image was several MB, it wasn't that huge of an issue for them. The internal dashboard, though, loaded a thousand projects per page.

I chewed through 7GB of data in about 30 minutes while working tethered to my phone.
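The arithmetic behind that bill checks out. A sketch, where the 7 MB per image is an assumed average chosen to match the ~7GB total, not a figure from the comment:

```python
# A thousand full-size plan images served in place of thumbnails adds up
# fast. The 7 MB per image is an assumed average, picked to match the total.

PROJECTS = 1000
AVG_IMAGE_MB = 7  # assumed average size of an un-thumbnailed plan image

total_gb = PROJECTS * AVG_IMAGE_MB / 1000
print(f"{total_gb:.0f} GB per page load")  # ~7GB, matching the tethering bill
```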

[−] littlecranky67 61d ago
Well, as long as the website has already fully loaded and is responsive, and the videos show a thumbnail/placeholder, you are not blocked by that. Preloading, even very aggressive pre-loading, is a thing nowadays. It is hostile to the user (because it burns network traffic they pay for), but project managers will often override objections to maximize gains from ad revenue.
[−] jakub_g 61d ago

> 422 network requests and 49 megabytes of data

Just FYI, this is generally how it works: it's not the developers who add it, but non-technical people.

Developers only add a single