I was looking at a production service we run that was using a few GBs of memory. When I add up all the actual data needed in a naive compact representation I end up with a few MBs. So much waste. That's before thinking of clever ways to compress, or de-duplicate or rearrange that data.
Back in the day getting the 16KB expansion pack for my 1KB RAM ZX81 was a big deal. And I also wrote code for PIC microcontrollers that have 768 bytes of program memory [and 25 bytes of RAM]. It's just so easy to not think about efficiency today, you write one line of code in a high level language and you blow away more bytes than these platforms had without doing anything useful.
Long ago, working for a retail store chain, I made an Excel DSL to encode business rules to update inventory spreadsheets. While coding I realized that their Excel template had a bunch of cells with whitespace in them on row 100000. This forced Excel to store the sparse matrix for the whole 0:100000 region, adding hundreds of KB for no reason. Multiplied by thousands of these files over their internal network. Out of curiosity I added empty-cell cleaning to my DSL, and I think I managed to fit the entire company's Excel file set on a small SD card (circa 2010).
Sure, if you don’t count safety features like memory management, crash handling, automatic bounds checks and encryption ciphers as anything useful.
I do completely agree that there is a lot of waste in modern software. But equally there is also a lot more that has to be included in modern software that wasn’t ever a concern in the 80s.
Networking stacks, safety checks, encryption stacks, etc all contribute massively to software “bloat”.
You can see how this quickly adds up if you write a “hello world” CLI in assembly and compare that to the equivalent in any modern language that imports all these features into its runtime.
And this is all before you take into account that modern graphics and audio is bitmap / PCM and running at resolutions literally orders of magnitude greater than anything supported by 80s micro computers.
Yes, but this doesn't prevent you from being mindful and selecting the right tools with smaller memory footprint while providing the features you need.
Go's "GC disadvantage" is turned on its head by developing "Zero Allocation" libraries which run blazingly fast with fixed memory footprints. Similarly, rolling your own high performance/efficient code where it matters can save tremendous amounts of memory where it matters.
Of course more features and safety nets will consume memory, but we don't have to waste it like there are no other things running on the system, no?
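A minimal sketch of what such a "zero allocation" pattern looks like in Go. The names here (`appendHex`, `buf`) are illustrative, not from any particular library: the idea is that callers pass in a preallocated buffer that gets reused, so the steady-state heap footprint is fixed and the GC has almost nothing to collect.

```go
package main

import "fmt"

const hexDigits = "0123456789abcdef"

// appendHex writes the hex encoding of src onto dst and returns the
// extended slice. As long as dst has enough capacity, no allocation
// happens on the hot path.
func appendHex(dst []byte, src []byte) []byte {
	for _, b := range src {
		dst = append(dst, hexDigits[b>>4], hexDigits[b&0x0f])
	}
	return dst
}

func main() {
	buf := make([]byte, 0, 64) // allocated once, reused for every message
	for _, msg := range [][]byte{[]byte("hi"), []byte("ok")} {
		buf = appendHex(buf[:0], msg) // buf[:0] reuses the backing array
		fmt.Println(string(buf))
	}
}
```

Libraries built this way keep their memory footprint flat under load, which is the "GC disadvantage turned on its head" being described.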
> And this is all before you take into account that modern graphics and audio is bitmap / PCM and running at resolutions literally orders of magnitude greater than anything supported by 80s micro computers.
This demo [0] is a 4kB executable. 4096 bytes. A single file. All assets, graphics, music and whatnot, and can run at high resolutions with real time rendering.
This is [1] 64kB and this [2] is 177kB. This game from the same group is 96kB with full 3D graphics [3].
Programming these days, in some realms, is a lot like shopping for food - some people just take the box off the shelf, don't bother with reading the ingredients, throw it in with some heat and fluid and serve it up as a 3-star meal.
Others carefully select the ingredients, construct the parts they don't already have, spend the time to get the temperatures and oxygenation aligned, and then sit down to a humble meal for one.
Not many programmers, these days, do code-reading like baddies, as they should.
However, kids, the more you do it the better you get at it, so there is simply no excuse for shipping someone else's bloat.
Do you know how many blunt pointers are lined up underneath your BigFatFancyFeature, holding it up?
You speak like a butthurt kiddo who doesn't like to be reminded that there were, actually, good practices before you came along and denigrated them with your own condescending drivel. OP is correct in pointing out that the bloat is all our fault. So fix it.
> Go's "GC disadvantage" is turned on its head by developing "Zero Allocation" libraries which run blazingly fast with fixed memory footprints. Similarly, rolling your own high performance/efficient code where it matters can save tremendous amounts of memory where it matters.
The savings there would be negligible (in modern terms) but the development cost would be significantly increased.
> Of course more features and safety nets will consume memory, but we don't have to waste it like there are no other things running on the system, no?
Safety nets are not a waste. They’re a necessary cost of working with modern requirements. For example, if your personal details were stolen via a MITM attack, I’m sure you’d be asking why that piece of software wasn’t encrypting your data.
The real waste in modern software is:
1. Electron: but we are back to the cost of hiring developers
2. Application theming. But few actual users would want to go back to plain Windows 95 style widgets (many, like myself, on HN wouldn’t mind, but we are a niche and not the norm).
> This demo [0] is a 4kB executable. 4096 bytes. A single file. All assets, graphics, music and whatnot, and can run at high resolutions with real time rendering.
You quoted where I said that modern resolutions are literally orders of magnitude greater and assets are stored as bitmaps / PCM, then totally ignored that point.
When you wrote audio data in the 80s, you effectively wrote midi files in machine code. Obviously it wasn’t literally midi, but you’d describe notes, envelopes etc. You’d very very rarely store that audio as a waveform because audio chips then simply don’t support a high enough bitrate to make that audio sound good (nor had the storage space to save it). Whereas these days, PCM (eg WAV, MP3, FLAC, etc) sound waaaay better than midi and are much easier for programmers to work with. But even a 2 second long 16bit mono PCM waveform is going to be more than 4KB.
And modern graphics aren’t limited to 2 colour sprites (more colours were achieved via palette swapping) at 8x8 pixels. Scale that up to 32bits (not colours, bits) and you’re increasing the colour depth by literally 32 times. And that’s before you scale again from 64 pixels to thousands of pixels.
You’re then talking exponential memory growth in all dimensions.
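The arithmetic behind those claims can be sketched out quickly. This is a rough back-of-the-envelope calculation; the figures assumed (CD-quality mono audio, 1080p at 32bpp) are typical modern values, not tied to any particular system:

```go
package main

import "fmt"

func main() {
	// 2 seconds of 16-bit mono PCM at 44.1 kHz:
	// seconds * samples/sec * bytes/sample
	pcm := 2 * 44100 * 2
	fmt.Println(pcm) // 176400 bytes -- already far beyond a 4KB demo

	// One 8x8 sprite at 1 bit per pixel (2 colours):
	sprite := 8 * 8 / 8 // 8 bytes

	// A 1920x1080 framebuffer at 32 bits per pixel:
	frame := 1920 * 1080 * 4 // 8294400 bytes, roughly 8 MB

	// Growth factor from one dimension's worth of assets to another:
	fmt.Println(frame / sprite) // over a million times more memory
}
```

Even these conservative numbers show why raw asset data dominates modern memory budgets in a way it never could on 80s hardware.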
I’ve written software for those 80s systems and modern systems too. And it’s simply ridiculous to compare graphics and audio of those systems to modern systems without taking into account the differences in resolution, colour depth, and audio bitrates.
Software 30 years ago was more amenable to theming. The more system widgets an application used, the more effectively theming worked, since the theme could swap them out.
Now, we have grudging dark-mode toggles that aren't consistent or universal, not even rising to the level of configurability you got with Windows 3.1 themes, let alone things like libXaw3d or libneXtaw where the fundamental widget-drawing code could be swapped out silently.
I get the impression that since about 2005, theming has been on the downturn. Windows XP and OSX both were very close to having first class, user-facing theming systems, but both sort of chickened out at the last minute, and ever since, we've seen less and less control every release.
I think what you're describing as "theming" is more "custom UI". It used to be reserved for games, where stock Windows widgets broke immersion in a medieval fantasy strategy simulator and you were legally obliged to make the cursor a gauntlet or sword. But Electron said to the entire world "go to town, burn the system Human Interface Guidelines and make a branded nightmare!" when your application is a smart-bulb controller or a text editor that could perfectly well fit with native widgets.
We are talking about software development not user configuration. So “theming” here clearly refers specifically to the applications shipping non-standard UIs.
This also isn’t a trend that Electron started. Software has been shipping with bespoke UIs for nearly as long as UI toolkits have been a thing.
> The savings there would be negligible (in modern terms)
A word of praise for Go: it is pretty performant while using very little memory. I inherited a few Django apps, and each thread just grows to 1GB. Running something like Celery quickly eats up all memory and starts thrashing. My Go replacements idle at around 20MB, and are a lot faster. It really works.
> The savings there would be negligible (in modern terms) but the development cost would be significantly increased.
...and this effort and small savings here and there is what brings the massive savings at the end of the day. Electron is what "4KB here and there won't hurt", "JS is a very dynamic language so we can move fast", and "time to market is king, software is cheap, network is reliable, YOLO!" banged together. It's a big "Leeroy Jenkins!" move in the worst possible sense, making users pay everyday with resources and lost productivity to save a developer a couple of hours at most.
Users are not cattle to milk, they and their time/resources also deserve respect. Electron is doing none of that.
> You quoted where i said that modern resolutions are literally orders of magnitude greater and assets stored in bitmaps / PCM then totally ignored that point.
Did you watch or run any of these demos? Some (if not all) of them scale to 4K and all of them have more than two colors. All are hardware accelerated, too.
> And modern graphics aren’t limited to 2 colour sprites (more colours were achieved via palette swapping) at 8x8 pixels. Scale that up to 32bits (not colours, bits) and you’re increasing the colour depth by literally 32 times. And that’s before you scale again from 64 pixels to thousands of pixels.
Sorry to say, but I know what graphics and high performance programming entail. I've had two friends develop their own engines, and I manage HPC systems. I know how much memory matrices need, because everything is matrices after some point.
> Safety nets are not a waste.
I didn't say they are waste. That quote is out of context. Quoting my comment's first paragraph, which directly supports the part you quoted: "Yes, but this doesn't prevent you from being mindful and selecting the right tools with smaller memory footprint while providing the features you need."
So, what I argue is, you don't have to bring in everything and the kitchen sink if all you need is a knife and a cutting board. Bring in the countertop and some steel gloves to prevent cutting yourself.
> I’ve written software for those 80s systems and modern systems too. And it’s simply ridiculous to Compare graphics and audio of those systems to modern systems without taking into account the differences in resolution, colour depth, and audio bitrates.
Me too. I also record music and work on high performance code. While they are not moving much, I take photos and work on them too, so I know what happens under the hood.
I agree. I even said Electron was one piece of bloat I didn’t agree with in my comment. So it wasn’t factored into the calculations I was presenting to you.
> Did you watch or ran any of these demos? Some (if not all) of them scale to 4K and all of them have more than two colors.
You mean the ones you added after I replied?
> I didn't say they are waste. That quote is out of context.
Every part of your comment was quoted in my comment. Bar the stuff you added after I commented.
> Had two friends develop their own engines
I have friends who are doctors but that doesn’t mean I should be giving out medical advice ;)
> Just watch the demos. It's worth your time.
I’m familiar with the demo scene. I know what’s possible with a lot of effort. But writing cool effects for the demo scene is very different to writing software for a business which has to offset developer costs against software sales and delivery deadlines.
I’m also not advocating that software should be written in Electron. My point was modern software, even without Electron, is still going to be orders of magnitude larger in size and for the reasons I outlined.
I made no edits after your comment appeared. Yes, I did make edits, but your reply was not visible to me while I made them. Sometimes HN delays replies, and you're accusing me of something I didn't do. That's not nice.
> writing cool effects for the demo scene is very different to writing software for a business which has to offset developer costs against software sales and delivery deadlines.
The point is not "cool effects" and "infinite time" though. If we continue talking about farbrausch, they are not a bunch of nerds who pump out raw assembly for effects. They have their own framework, libraries and whatnot, not dissimilar to business software development. So their code is not that different from a business software package.
For the size, while you can't fit a whole business software package to 64kB, you don't need to choose the biggest and most inefficient library "just because". Spending a couple of hours more, you might find a better library/tool which might allow you to create a much better software package, after all.
Again, for the third time: while safety nets and other doodads make software packages bigger, cargo culting and worshipping deadlines and ROI over the product itself contribute more to software bloat. That's my point.
Oh I overlooked this gem:
> I have friends who are doctors but that doesn’t mean I should be giving out medical advice ;)
Yet we designed some part of that thing together, and I had the pleasure of fighting with GPU drivers alongside them, trying to understand what the driver was doing while it ignored our requests.
IOW, yep, I didn't write one, but I was neck deep in both of them, for years.
> I did no edits after your comment has appeared. Yep, I did edits, but your reply was not visible to me while I did these.
Which isn’t the same thing as what I said.
I’m not suggesting you did it maliciously, but the fact remains they were added afterwards so it’s understandable I missed them.
> Yet, we designed some part of that thing together, and I had the pleasure of fighting with GPU drivers with them trying to understand what it's trying to do while neglecting our requests from it.
That is quite a bit different from your original comment though. This would imply you also worked on game engines and it wasn’t just your friends.
I'm not following the scene for the last couple of years, but I doubt that. On the other hand, there are other very capable people doing very interesting things.
That C64 demo doing sprite wizardry and 8088MPH come to my mind. The latter one, as you most probably know, can't be emulated since it (ab)uses the hardware directly. :D
As a trivia: After watching .the .product, I declared "if a computer can do this with a 64kB binary, and people can make a computer do this, I can do this", and high performance/efficient programming became my passion.
From any mundane utility to something performance sensitive, that demo is my north star. The code I write shall be as small, performant and efficient as possible while cutting no corners. This doesn't mean everything is written in assembly, but utmost care is given to how something I wrote works and feels while it's running.
I would also add internationalization. There were multi-language games back in the day, but the overhead of producing different versions for different markets was extremely high. Unicode has .. not quite trivialized this, but certainly made a lot of things possible that weren't.
Much respect to the people who've managed to retrofit it: there are guerrilla-translated versions of some Japanese-only games.
> this is all before you take into account that modern graphics and audio is bitmap / PCM and running at resolutions literally orders of magnitude greater
Yes, people underestimate how much this contributes, especially to runtime memory usage.
Back in the day people had BASIC, and some machines had Forth, and it was like
print "Hello world"
or
." Hello world " / .( Hello world )
for Forth.
By comparison, given how they optimized the games for 8- and 16-bit machines, I should have been able to compile Cataclysm DDA:BN on my potato netbook, and yet it needs GIGABYTES of RAM to compile. It's crazy that you need swap for something that required far less RAM 15 years ago for the same features.
If the game were reimplemented in Golang it wouldn't feel many times slower. But no, we are suffering the worst from both sides of the coin: something that should have been replaced by Inferno (from the Plan 9 people, the C and Unix creators, and now Golang, their cousin), with horrible compile times, horrible and incompatible ABIs, featuritis, crazy template syntax and, if you are lucky, memory safety.
Meanwhile I wish the forked Inferno/Purgatorio got a seamless (no virtual desktops) mode so you could fire up the application in a VM integrated with the host window manager (a la Java) and that's it. Limbo+Tk+SQLite would have been incredible for CRUD/RAD software once the GUI was polished up a little, with sticky menus like Tcl/Tk and the like. In the end, if you know Golang you could learn Limbo's syntax (same channels too) with ease.
I implemented a system recently that is a drop-in replacement for a component of ours. The old one used 250GB of memory, the new one uses 6GB, and they are exactly the same from the outside.
Bad code is bad code and poor choices are poor choices, but I think it's often pretty fair to judge things harshly on resource usage.
>Sure, if you don’t count safety features like memory management, crash handling, automatic bounds checks and encryption cyphers; as anything useful.
>Networking stacks, safety checks, encryption stacks, etc all contribute massively to software “bloat”.
They had most of this stuff in the 1980s, and even earlier really. Not on the little 8-bit microcomputer that cost $299 that you might have had as a kid, but it certainly existed on large time-sharing systems used in universities, industry and government. And those systems had only a tiny fraction of the memory that a typical x86-64 laptop has now.
We had safer programming languages fitting into 640 KB with MS-DOS, which is why I keep repeating youngsters don't get how much they can do with an ESP32.
The BASIC 10Liner competition wants you to know that there is a growing movement of hackers who recognize the bloat and see, with crystal clarity, where things kind of went wrong ...
".. and time and again it leads to amazingly elegant, clever, and sometimes delightfully crazy solutions. Over the past 14 editions, more than 1,000 BASIC 10Liners have been created — each one a small experiment, a puzzle, or a piece of digital creativity .."
I grew up with and absolutely adore The Last Ninja series. I'm not going to comment on the size thing because it's so trite.
Instead - here's [0] Ben Daglish (on flute) performing "Wastelands" together with the Norwegian C64/Amiga tribute band FastLoaders. He unfortunately passed away in 2018, just 52 years old.
If that tickled your fancy, here's [1] a full concert with them where they perform all songs from The Last Ninja.
> isometric on the C64 with such an amazing level of detail - simply gorgeous
Or a convincing representation of that. A lot of old tricks mean that the games are doing less than you think they are, and are better understood when you stop asking “how do they do that” and start asking “how are they convincing my brain that is what they are doing”.
Look at how little RAM the original Elite ran in on a BBC Model B, with some swapping of code on disk⁰. 32KB, less the 7.75KB taken by the game's custom screen mode² and a little more reserved for other things¹. I saw breathless reviews at the time, and have seen similar nostalgic reviews more recently, talking about “8 whole galaxies!” when the game could easily have had far more than that and was at one point going to. They cut it down not for technical reasons but because having more didn't feel usefully more fun and might actually put people off. The galaxies were created by a clever little procedural generator, so adding more would have only added a couple of bytes each (to hold the seed and maybe other params for the generator).
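The seed-plus-generator idea above can be sketched in a few lines of Go. To be clear, this is NOT Elite's actual algorithm (which used a Fibonacci-style sequence over a 6-byte seed); it's a toy linear congruential generator, with made-up syllables, just to illustrate that a whole galaxy of deterministic content costs only the bytes of its seed:

```go
package main

import "fmt"

// galaxy holds the entire "database": 4 bytes of seed.
type galaxy struct{ seed uint32 }

// next steps a simple LCG (constants from Numerical Recipes).
func (g *galaxy) next() uint32 {
	g.seed = g.seed*1664525 + 1013904223
	return g.seed
}

// systemName derives a pronounceable name for system n. It is always
// the same for the same seed, so nothing is ever stored per system --
// the content is regenerated on demand instead of being saved.
func systemName(seed uint32, n int) string {
	g := galaxy{seed: seed + uint32(n)}
	pairs := []string{"ti", "la", "ve", "or", "qu", "ra", "ge", "an"}
	name := ""
	for i := 0; i < 3; i++ {
		name += pairs[g.next()%uint32(len(pairs))]
	}
	return name
}

func main() {
	// Three systems from one 4-byte seed; change the seed and you
	// get a whole different "galaxy" for free.
	for n := 0; n < 3; n++ {
		fmt.Println(systemName(0x5a4a, n))
	}
}
```

The same trick underlies much of the demoscene: store the recipe, not the result.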
Another great example of not quite doing what it looks like the game is doing is the apparently live-drawn 3D view in the game Sentinel on a number of 8-bit platforms.
--------
[0] There were two blocks of code that were swapped in as you entered or left a space station: one for while docked and one for while in-flight. Also, the ship blueprints were not all in memory at the same time, and a different set was loaded as you jumped from one system to another.
[1] The CPU call stack (technically up to a quarter KB, though the game code only needed less than half of that), scratch space on page zero mostly used for game variables, but some of which was used by things like the disk controller ROM and sound generator, etc.
[2] Normal screen modes close to that consumed 10KB. Screen memory consumption on the BBC Master Enhanced version was doubled, as it was tweaked to use double the bit depths (4bpp for the control panel and 2bpp for the exterior, instead of 2bpp and 1bpp respectively).
If we're talking about fitting a quart into a pint pot, it would be remiss not to mention Elite fitting into a BBC Model B (32KB), and the excellent code archaeology of it and its variants by Mark Moxon here: https://www.bbcelite.com/
We lost something in the bloat, folks. It's time to turn around and take another look at the past, or at least re-adjust the rearview mirror to actually look at the road and not one's makeup ..
Some Pokémon Crystal ROMs pack a huge amount of gaming in very few MB. Z80-ish ASM, KB's of RAM.
The Z-machine games, ditto. A few KB, and an impressive simulated environment will run even on 8-bit machines running a virtual machine. Of course z3 games have fewer features for parsing and object interaction than z8 games, but anything from a 16-bit machine up (nothing fancy today, a DOS PC would count) will run z8 games and get pretty complex text adventures. Compare Tristam Island or the first Zork I-III to Spiritwrak, where a subway is simulated, or Anchorhead.
And you can code the games with Inform6 and Inform6lib on maybe a 286 or 386 with DOS and any text editor. Check the Inform Beginner's Guide and DM4.pdf.
And not just DOS, Windows, Linux, BSD, Macs... even Android under Termux. And the games will run either Frotz for Termux or Lectrote, or Fabularium. Under iOS, too.
NetHack/Slash'EM weighs MBs and has tons of replayability. Written in C. It will even run on a 68020 System 7 based Mac... emulated under 9front with a 720 CPU as the host. It flies on a 486 CPU and up.
Meanwhile, Cataclysm DDA uses C++ and needs a huge chunk of RAM and a fast CPU to compile it today. Some high-end Pentium 4 with 512MB of RAM will run it well enough, but you need to cross-compile it.
If I had the skills I would rewrite (no AI/LLMs please) CDDA:BN in Golang. The compile times would plummet and the CPU usage would be nearly the same. Of course the GC would shine here, pruning tons of unused data from generated worlds.
Pretty much every 8-bit computer game of 1987 or earlier (before the 128KB machines became popular) was < 40KB? The Spectrum and Commodore combined probably had a library in excess of 50,000 games.
Most games back then were small. A C64 only had 64K and most games didn't use all of it. An Atari 800 had at most 48K; it wasn't until the 1200 that it went up. Both systems had cartridge-based games, many of which were 8K.
Honestly though, I don't read much into the sizes. Sure, they were small games with lots of game play, for some definition of game play. I enjoyed them immensely. But it's hard to go back to just a few colors, low-res graphics, often no way to save, etc. For me at least, the modern affordances mean something. Of course I don't need every game to look like Horizon Zero Dawn. A Short Hike was great. It's also 400MB (according to Steam).
It's not just that programs had small images. The noteworthy thing is that they weren't small due to externalizing their dependencies. They relied on no third party code other than a few meagre services of the operating system. They did their own sound and graphics down to the pixel level.
We can't compare a 40 KB image today to a 40 KB image from 1980-something if the contemporary one relies on 100 MB of external cruft, like a rich programming language runtime (fetched and installed separately) and packages.
I remember this game, the way it drew itself on each screen, the nice graphics. Growing up with games on Atari, Commodore, Amstrad, and Spectrum, was a lot of fun.
By comparison, COD Modern Warfare 3 is 6,000,000 times larger at 240GB. Imagine telling that to someone in 1987.
I shipped a browser game that was 8KB. Okay, plus 30 million lines of Chromium ;)
Most of my games are roughly in that range though. I think my MMO was 32KB, and it had a sound effects generator and speech synth in it. (Jsfxr and SAM)
I built it in a few days for a game jam.
I'm not trying to brag, I'm trying to say this stuff is easy if you actually care. Just look at JS13K. Every game there is 13KB or below, and there's some real masterpieces there. (My game was just squares, but I've seen games with whole custom animation systems in them.)
Once you learn how, it's pretty easy. But you'll never learn if you don't care.
You have to care because there's nothing forcing you. Arguably The Last Ninja would have been a lot more than 40KB if there weren't the hardware limitations of the time.
They weren't trying to make it 40KB, they were just trying to make a game.
In my case, I enjoy the challenge! (Also I like it when things load instantly :)
I think I'll make a PS1 game next. I was inspired by this guy who made a Minecraft clone for Playstation:
How times have changed. My best-selling program "Apple Writer", for the Apple II, ran in eight kilobytes. It was written entirely in 6502 assembly language.
A few years ago, I decompiled a good part of the PC version of Might & Magic 1 for fun. According to Wikipedia, it had been released in 1986, although I don't know whether that refers to the PC version or to the original Apple II version.
It is quite a big game: the main executable is 117KB, plus around 50 overlay files of 1.5KB each for the different dungeons and cities, plus the graphics files. I guess it was even too big for the average PC hardware at that time, or it was a limitation inherited from the original Apple II version: when you want to cast a spell you have to enter the number of the spell from the manual, maybe because there was not enough memory to fit the names of the 94 spells into RAM. Apart from that, the limited graphics and the lack of sound, the internal ruleset is very complete. You have all kinds of spells and objects, capabilities, an aging mechanism, shops, etc. The usual stuff that you also see in today's RPGs.
The modern uninstall.exe that came with it (I bought the game on GOG) was 1.3MB big.
Around the time DirectX came around and the first games requiring it appeared, which in my memory coincided with hard drives getting way bigger and the first games being delivered on a CD instead of floppies, I was appalled to see literal BMPs being written to disk during installation. This was the same time when cracked games were being distributed via BBS at a fraction of the original size, with custom installers which decompressed MP3s back to the original WAV files. I asked the same questions then: why WAV, why BMP, why the bloat? With time I learned the answer: disk space is cheap, memory and CPU cycles are not. If you can afford to save yourself the decoding step, you just do it; your players will love it. You work with the constraints you have, and when they loosen up, your possibilities expand too.
That's just incredible. People used to be so much better at programming, or at least great programmers had an easier time getting funded. Most of what I see today is exceptionally low quality and just getting worse with time.
A lot of trial and error. I've built graphical tools with GD in PHP; the difficult part for me was that the coordinates were inverted.
I only knew how to draw lines and pixels, but I got the job done.
My game YOYOZO is 39 KB and was listed as one of the "Best Games of 2023" alongside Mario & Zelda & Baldur's Gate 3. So it's still possible to do this sort of thing if you care enough and have the right constraints! If you don't have those constraints, simply impose some on yourself.
https://news.ycombinator.com/item?id=38372936
Wow, that search/interact mechanic is obnoxious; you can see the player fumbling it every time, despite knowing exactly where the item they're trying to collect is.
Some comments here sound like the ones I hear from car "enthusiasts" praising old engines for being simple to run and easy to fix, then complaining about modern engines being too complicated and how we should return to the "good old days", all that without taking into account the decades of progress since then.
Want to prove a point? Give me Skyrim in 64KB of RAM. Go ahead! I dare you!
Speaking of the size: my first PC, built by a family friend, had a 80MB disk, split into two partitions. The second 40MB partition had Windows 3.1 and about two Norton Commander columns full of games on it, largest of which were Wolfenstein 3D and Lost Vikings with about 1.4MB each. Truly a different era.
I never figured out how they did the turtle graphics in this game. The C64 didn't have whole screen bitmaps, you could either use sprites or user defined character sets, neither of which made this straightforward.
And the loading screens were also amazing, particularly for tape loading.
We made the most of limited resources back then. Back in 1980, I was living large with my 64KB Apple II with dual 140KB floppy drives and a 10 inch (9 inch? I can’t quite remember) amber monochrome monitor. Most had less.
I remember playing a version of this game on ZX Spectrum but I cannot find it on the internet. I remember it had bees that you had to avoid and a boat which you were able to untie so that it floats down a stream.
Many mobile J2ME games in the 2005-2015 had a similar size and were impressive too. Sometimes a time window appears and creates the economic incentives for optimization ingenuity.
40kb and it felt like a full world... I'm burning through tokens to get AI to decide whether to go to the tavern or the market. Something went wrong somewhere
Despite being a mid-late millennial, I can see how this played out. Even compared to the second family computer my parents got in the late 90s, which was an absolute monster at the time, I realize how many corners developers had to cut to get a game going in a few hundred megabytes, seeing mobile games today easily exceeding 10 times that, and not just now but even 10 years ago when I was working at a company that made mobile games.

These days, developers automatically assume everyone has what are effectively unlimited resources by 90s standards (granted they haven't transitioned to slop-coding, which makes it substantially worse).

Personally, I have a very strange but useful habit: when I find myself with some spare time at work, I spin up a very under-powered VM, run what is in production, and try to find optimizations. One of my data pipelines is pretty much insanity in terms of scale, and running it took over 48 hours. Last time (a few weeks ago, actually), I did the VM thing and found a few optimizations which were completely counter-intuitive at first; everyone was like "na, that makes no sense". But now the pipeline runs in just over 10 hours. It's insane how many shortcuts you force yourself to find when you put a tight fence around yourself.
Go's "GC disadvantage" is turned on its head by developing "zero allocation" libraries which run blazingly fast with fixed memory footprints. Similarly, rolling your own high-performance, efficient code where it matters can save tremendous amounts of memory.
Of course more features and safety nets will consume memory, but we don't have to waste it like there are no other things running on the system, no?
> And this is all before you take into account that modern graphics and audio is bitmap / PCM and running at resolutions literally orders of magnitude greater than anything supported by 80s micro computers.
This demo [0] is a 4kB executable. 4096 bytes. A single file. It contains all assets, graphics, music and whatnot, and it can run at high resolutions with real-time rendering.
This is [1] 64kB and this [2] is 177kB. This game from the same group is 96kB with full 3D graphics [3].
[0]: https://www.pouet.net/prod.php?which=52938
[1]: https://www.pouet.net/prod.php?which=1221
[2]: https://www.pouet.net/prod.php?which=30244
[3]: https://en.wikipedia.org/wiki/.kkrieger
Others carefully select the ingredients, construct the parts they don't already have, spend the time to get the temperatures and oxygenation aligned, and then sit down to a humble meal for one.
Not many programmers, these days, do code-reading like baddies, as they should.
However, kids, the more you do it the better you get at it, so there is simply no excuse for shipping someone else's bloat.
Do you know how many blunt pointers are lined up underneath your BigFatFancyFeature, holding it up?
> Go's "GC disadvantage" is turned on its head by developing "Zero Allocation" libraries which run blazingly fast with fixed memory footprints. Similarly, rolling your own high performance/efficient code where it matters can save tremendous amounts of memory where it matters.
The savings there would be negligible (in modern terms) but the development cost would be significantly increased.
> Of course more features and safety nets will consume memory, but we don't have to waste it like there are no other things running on the system, no?
Safety nets are not a waste. They’re a necessary cost of working with modern requirements. For example, if your personal details were stolen via a MITM attack then I’m sure you’d be asking why that piece of software wasn’t encrypting that data.
The real waste in modern software is:
1. Electron: but we are back to the cost of hiring developers
2. Application theming. But few actual users would want to go back to plain Windows 95 style widgets (many, like myself, on HN wouldn’t mind, but we are a niche and not the norm).
> This demo [0] is a 4kB executable. 4096 bytes. A single file. All assets, graphics, music and whatnot, and can run at high resolutions with real time rendering.
You quoted where I said that modern resolutions are literally orders of magnitude greater and assets are stored as bitmaps / PCM, then totally ignored that point.
When you wrote audio data in the 80s, you effectively wrote midi files in machine code. Obviously it wasn’t literally midi, but you’d describe notes, envelopes etc. You’d very, very rarely store that audio as a waveform, because audio chips then simply didn’t support a high enough bitrate to make that audio sound good (nor was there the storage space to save it). Whereas these days, PCM (eg WAV, MP3, FLAC, etc) sounds waaaay better than midi and is much easier for programmers to work with. But even a 2 second long 16-bit mono PCM waveform is going to be more than 4KB.
And modern graphics aren’t limited to 2 colour sprites (more colours were achieved via palette swapping) at 8x8 pixels. Scale that up to 32bits (not colours, bits) and you’re increasing the colour depth by literally 32 times. And that’s before you scale again from 64 pixels to thousands of pixels.
You’re then talking exponential memory growth in all dimensions.
I’ve written software for those 80s systems and modern systems too. And it’s simply ridiculous to compare graphics and audio of those systems to modern systems without taking into account the differences in resolution, colour depth, and audio bitrates.
> Application theming
Software 30 years ago was more amenable to theming. The more system widgets you use, the more effective theming works by swapping them.
Now, we have grudging dark-mode toggles that aren't consistent or universal, not even rising to the level of configurability you got with Windows 3.1 themes, let alone things like libXaw3d or libneXtaw where the fundamental widget-drawing code could be swapped out silently.
I get the impression that since about 2005, theming has been on the downturn. Windows XP and OSX both were very close to having first class, user-facing theming systems, but both sort of chickened out at the last minute, and ever since, we've seen less and less control every release.
I think what you're describing as "theming" is more "custom UI". It used to be reserved for games, where stock Windows widgets broke immersion in a medieval fantasy strategy simulator and you were legally obliged to make the cursor a gauntlet or sword. But Electron said to the entire world "go to town, burn the system Human Interface Guidelines and make a branded nightmare!" when your application is a smart-bulb controller or a text editor that could perfectly well fit with native widgets.
This also isn’t a trend that Electron started. Software has been shipping with bespoke UIs for nearly as long as UI toolkits have been a thing.
>But Electron said to the entire world "go to town, burn the system Human Interface Guidelines and make a branded nightmare!"
TBH this sounds pretty medieval too.
> The savings there would be negligible (in modern terms)
A word of praise for Go: it is pretty performant while using very little memory. I inherited a few Django apps, and each thread just grows to 1GB. Running something like Celery quickly eats up all memory and starts thrashing. My Go replacements idle at around 20MB, and are a lot faster. It really works.
> The savings there would be negligible (in modern terms) but the development cost would be significantly increased.
...and this effort and small savings here and there is what brings the massive savings at the end of the day. Electron is what "4KB here and there won't hurt", "JS is a very dynamic language so we can move fast", and "time to market is king, software is cheap, network is reliable, YOLO!" banged together. It's a big "Leeroy Jenkins!" move in the worst possible sense, making users pay everyday with resources and lost productivity to save a developer a couple of hours at most.
Users are not cattle to milk, they and their time/resources also deserve respect. Electron is doing none of that.
> You quoted where i said that modern resolutions are literally orders of magnitude greater and assets stored in bitmaps / PCM then totally ignored that point.
Did you watch or run any of these demos? Some (if not all) of them scale to 4K and all of them have more than two colors. All are hardware accelerated, too.
> And modern graphics aren’t limited to 2 colour sprites (more colours were achieved via palette swapping) at 8x8 pixels. Scale that up to 32bits (not colours, bits) and you’re increasing the colour depth by literally 32 times. And that’s before you scale again from 64 pixels to thousands of pixels.
Sorry to say that, but I know what graphics and high performance programming entails. Had two friends develop their own engines, and I manage HPC systems. I know how much memory matrices need, because everything is matrices after some point.
> Safety nets are not a waste.
I didn't say they are waste. That quote is out of context. Quoting my comment's first paragraph, which directly supports the part you quoted: "Yes, but this doesn't prevent you from being mindful and selecting the right tools with smaller memory footprint while providing the features you need."
So, what I argue is, you don't have to bring in everything and the kitchen sink if all you need is a knife and a cutting board. Bring in the countertop and some steel gloves to prevent cutting yourself.
> I’ve written software for those 80s systems and modern systems too. And it’s simply ridiculous to Compare graphics and audio of those systems to modern systems without taking into account the differences in resolution, colour depth, and audio bitrates.
Me too. I also record music and work on high performance code. While they are not moving much, I take photos and work on them too, so I know what happens under the hood.
Just watch the demos. It's worth your time.
> Electron is doing none of that.
I agree. I even said Electron was one piece of bloat I didn’t agree with in my comment. So it wasn’t factored into the calculations I was presenting to you.
> Did you watch or ran any of these demos? Some (if not all) of them scale to 4K and all of them have more than two colors.
You mean the ones you added after I replied?
> I didn't say they are waste. That quote is out of context.
Every part of your comment was quoted in my comment. Bar the stuff you added after I commented.
> Had two friends develop their own engines
I have friends who are doctors but that doesn’t mean I should be giving out medical advice ;)
> Just watch the demos. It's worth your time.
I’m familiar with the demo scene. I know what’s possible with a lot of effort. But writing cool effects for the demo scene is very different to writing software for a business which has to offset developer costs against software sales and delivery deadlines.
I’m also not advocating that software should be written in Electron. My point was modern software, even without Electron, is still going to be orders of magnitude larger in size and for the reasons I outlined.
> writing cool effects for the demo scene is very different to writing software for a business which has to offset developer costs against software sales and delivery deadlines.
The point is not "cool effects" and "infinite time" though. If we continue talking about Farbrausch, they are not a bunch of nerds who pump out raw assembly for effects. They have their own framework, libraries and whatnot. Not dissimilar to business software development. So their code is not that different from a business software package.
For the size, while you can't fit a whole business software package to 64kB, you don't need to choose the biggest and most inefficient library "just because". Spending a couple of hours more, you might find a better library/tool which might allow you to create a much better software package, after all.
Again, for the third time, while safety nets and other doodads make software packages bigger, cargo culting and worshipping deadlines and ROI more than the product itself contributes more to software bloat. That's my point.
Oh I overlooked this gem:
> I have friends who are doctors but that doesn’t mean I should be giving out medical advice ;)
Yet, we designed some part of that thing together, and I had the pleasure of fighting with GPU drivers alongside them, trying to understand what the driver was doing while it ignored our requests.
IOW, yep, I didn't write one, but I was neck deep in both of them, for years.
> I did no edits after your comment has appeared. Yep, I did edits, but your reply was not visible to me while I did these.
Which isn’t the same thing as what I said.
I’m not suggesting you did it maliciously, but the fact remains they were added afterwards so it’s understandable I missed them.
> Yet, we designed some part of that thing together, and I had the pleasure of fighting with GPU drivers with them trying to understand what it's trying to do while neglecting our requests from it.
That is quite a bit different from your original comment though. This would imply you also worked on game engines and it wasn’t just your friends.
That C64 demo doing sprite wizardry and 8088MPH come to my mind. The latter one, as you most probably know, can't be emulated since it (ab)uses hardware directly. :D
As a trivia: After watching .the .product, I declared "if a computer can do this with a 64kB binary, and people can make a computer do this, I can do this", and high performance/efficient programming became my passion.
From any mundane utility to something performance sensitive, that demo is my northern star. The code I write shall be as small, performant and efficient as possible while cutting no corners. This doesn't mean everything is written in assembly, but utmost care is given how something I wrote works and feels while it's running.
Much respect to the people who've managed to retrofit it: there are guerrilla-translated versions of some Japanese-only games.
> this is all before you take into account that modern graphics and audio is bitmap / PCM and running at resolutions literally orders of magnitude greater
Yes, people underestimate how much this contributes, especially to runtime memory usage.
By comparison, given how they optimized games for 8- and 16-bit machines, I should have been able to compile Cataclysm DDA:BN on my potato netbook, and yet it needs GIGABYTES of RAM to compile. It's crazy that you need swap for something that required far less RAM 15 years ago for the same features.
If the game were reimplemented in Golang it wouldn't feel many times slower. But no, we are suffering the worst from both sides of the coin: C++, something that should have been replaced by Inferno (from the plan9 people, the creators of C and Unix and now of Golang, their cousin), with its horrible compile times, horrible and incompatible ABIs, featuritis, crazy template syntax and, if you are lucky, memory safety.
Meanwhile I wish the forked Inferno/Purgatorio got a seamless (no virtual desktops) mode so you could fire up the application in a VM integrated with the guest window manager, a la Java, and that's it. Limbo+Tk+SQLite would have been incredible for CRUD/RAD software once the GUI was polished up a little, with sticky menus as in Tcl/Tk and the like. In the end, if you know Golang you could learn Limbo's syntax (same channels too) with ease.
Bad code is bad code, and poor choices are poor choices, but I think it's often pretty fair to judge things harshly on resource usage.
> all contribute massively to software “bloat”.
Could you point to an example where those gigs were really "massively" due to crash handling, bounds checks, etc.?
>Sure, if you don’t count safety features like memory management, crash handling, automatic bounds checks and encryption cyphers; as anything useful.
>Networking stacks, safety checks, encryption stacks, etc all contribute massively to software “bloat”.
They had most of this stuff in the 1980s, and even earlier really. Not on the little $299 8-bit microcomputer you might have had as a kid, but it certainly existed on large time-sharing systems used in universities, industry and government. And those systems had only a tiny fraction of the memory that a typical x86-64 laptop has now.
https://basic10liner.com/
".. and time and again it leads to amazingly elegant, clever, and sometimes delightfully crazy solutions. Over the past 14 editions, more than 1,000 BASIC 10Liners have been created — each one a small experiment, a puzzle, or a piece of digital creativity .."
Instead - here's [0] Ben Daglish (on flute) performing "Wastelands" together with the Norwegian C64/Amiga tribute band FastLoaders. He unfortunately passed away in 2018, just 52 years old.
If that tickled your fancy, here's [1] a full concert with them where they perform all songs from The Last Ninja.
[0] https://www.youtube.com/watch?v=ovFgdcapUYI [1] https://www.youtube.com/watch?v=PTZ1O1LJg-k
Or a convincing representation of that. A lot of old tricks mean that the games are doing less than you think they are, and are better understood when you stop asking “how do they do that” and start asking “how are they convincing my brain that this is what they are doing”.
Look at how little RAM the original Elite ran in on a BBC Model B, with some swapping of code on disk⁰. 32KB, less the 7.75KB taken by the game's custom screen mode² and a little more reserved for other things¹. I saw breathless reviews at the time, and have seen similar nostalgic reviews more recently, talking about “8 whole galaxies!” when the game could easily have had far more than that and was at one point going to. They cut it down not for technical reasons but because having more didn't feel usefully more fun and might actually have put people off. The galaxies were created by a clever little procedural generator, so adding more would have only added a couple of bytes each (to hold the seed and maybe other params for the generator).
Another great example of not quite doing what it looks like the game is doing is the apparently live-drawn 3D view in the game Sentinel on a number of 8-bit platforms.
--------
[0] There were two blocks of code that were swapped in as you entered or left a space station: one for while docked and one for while in-flight. Also, the ship blueprints were not all in memory at the same time; a different set was loaded as you jumped from one system to another.
[1] The CPU call stack (technically up to a quarter KB, though the game code only needed less than half of that), scratch space on page zero mostly used for game variables but some of which was used by things like the disk controller ROM and sound generator, etc.
[2] Normal screen modes close to that consumed 10KB. Screen memory consumption on the BBC Master Enhanced version was doubled, as it was tweaked to use double the bit depths (4bpp for the control panel and 2bpp for the exterior, instead of 2bpp and 1bpp respectively).
https://bunsen.itch.io/the-snake-temple-by-rax
We lost something in the bloat, folks. It's time to turn around and take another look at the past, or at least re-adjust the rearview mirror to actually look at the road and not one's makeup...
The Z-machine games, ditto. A few KBs and an impressive simulated environment that runs even on 8-bit machines via a virtual machine. Of course z3 games have fewer parsing/object-interaction features than z8 games, but anything from a 16-bit machine up (nothing much by today's standards; a DOS PC would count) will run z8 games and deliver pretty complex text adventures. Compare Tristam Island or the first Zork I-III to Spiritwrak, where a subway is simulated, or Anchorhead.
And you can code the games with Inform6 and Inform6lib on maybe a 286 or 386 with DOS and any text editor. Check the Inform Beginner's Guide and DM4.pdf. And not just DOS: Windows, Linux, BSD, Macs... even Android under Termux. And the games will run under Frotz for Termux, Lectrote, or Fabularium. Under iOS, too.
NetHack/Slash'EM weighs MBs and has tons of replayability. Written in C. It will even run on a 68020 System 7 based Mac, emulated under 9front with a 720 CPU as the host. It flies on anything from a 486 up.
Meanwhile, Cataclysm DDA uses C++ and needs a huge chunk of RAM and a fast CPU to compile today. Some high-end Pentium 4 with 512MB of RAM will run it well enough, but you need to cross-compile it.
If I had the skills I would rewrite (no AI/LLMs please) CDDA:BN in Golang. Compile times would plummet and CPU usage would be nearly the same. Of course the GC would shine here, pruning tons of unused code and data from generated worlds.
Feels like they were closer to programs, while modern games are closer to datasets.
Honestly though, I don't read much into the sizes. Sure, they were small games and had lots of game play, for some definition of game play. I enjoyed them immensely. But it's hard to go back to just a few colors, low-res graphics, often no way to save, etc. For me at least, the modern affordances mean something. Of course I don't need every game to look like Horizon Zero Dawn. A Short Hike was great. It's also 400 MB (according to Steam).
We can't compare a 40 KB image today to a 40 KB image from 1980-something, if the contemporary one relies on 100 MB of external cruft, like a rich programming language runtime (fetched and installed separately) and packages.
By comparison, COD Modern Warfare 3 is 6,000,000 times larger at 240GB. Imagine telling that to someone in 1987.
Most of my games are roughly in that range though. I think my MMO was 32KB, and it had a sound effects generator and speech synth in it. (Jsfxr and SAM)
I built it in a few days for a game jam.
I'm not trying to brag, I'm trying to say this stuff is easy if you actually care. Just look at JS13K. Every game there is 13KB or below, and there's some real masterpieces there. (My game was just squares, but I've seen games with whole custom animation systems in them.)
Once you learn how, it's pretty easy. But you'll never learn if you don't care.
You have to care because there's nothing forcing you. Arguably The Last Ninja would have been a lot more than 40KB if there weren't the hardware limitations of the time.
They weren't trying to make it 40KB, they were just trying to make a game.
In my case, I enjoy the challenge! (Also I like it when things load instantly :)
I think I'll make a PS1 game next. I was inspired by this guy who made a Minecraft clone for Playstation:
https://youtu.be/aXoI3CdlNQc?is=sDNnrGbQGJt_qnV6
P.S. most Flash games were only a few kilobytes, if you remove the music!
Elite was £20 in 1984 and that would be £66 today, which is not very different from what a good game for the PS5 costs today.
Except that games then were made by one or two people and nowadays games are made by teams with coders, musicians, artists, etc.
> “The Last Ninja” was 40 kilobytes
I have got 1.1 GB of MP3s with just remixes of the music from the three games, some of which are from a Kickstarter from the composer for the second.
> ... 40 kilobytes.
How times have changed. My best-selling program "Apple Writer", for the Apple II, ran in eight kilobytes. It was written entirely in 6502 assembly language.
It is quite a big game: the main executable is 117KB, plus around 50 overlay files of 1.5KB each for the different dungeons and cities, plus the graphics files. I guess it was even too big for the average PC hardware at that time, or it was a limitation inherited from the original Apple II version: when you want to cast a spell you have to enter the number of the spell from the manual, maybe because there was not enough memory to fit the names of the 94 spells into RAM. Apart from that, the limited graphics and the lack of sound, the internal ruleset is very complete. You have all kinds of spells and objects, capabilities, an aging mechanism, shops, etc. The usual stuff that you also see in today's RPGs.
The modern uninstall.exe that came with it (I bought the game on GOG) was 1.3MB big.
https://youtu.be/lC4YLMLar5I
Previously: https://news.ycombinator.com/item?id=38707095
Want to prove a point? Give me Skyrim in 64KB of RAM. Go ahead! I dare you!
And the loading screens were also amazing, particularly for tape loading.
Anybody remember this one?
I never finished the game, sadly.
Of course, luckily, our SSDs got bigger too.