A super minor nitpick: it’s jarring to see the Amiga referred to as 16-bit. It wasn’t described that way at the time: it was universally (that I saw, anyway) called a 32-bit machine, and reasonably so. It had a flat 32-bit address space (although the 68000 itself didn’t bring out all those address lines, because what kind of supercomputer would need 4GB of RAM?). All the registers and operations were 32-bit. Some of the internal operations were implemented in 16 bits, but that was invisible to programmers. Newer models with definitively 32-bit CPUs like the 68060 were nearly 100% backward compatible with older models at the CPU instruction level, even if newer OSes weren’t backward compatible at the API level. In fact, the only program not forward compatible at the instruction level that I remember offhand was Microsoft’s AmigaBASIC. It used the top bits of pointers to store data, because the 68000 would ignore them when accessing RAM due to that lack of address lines.
I just don’t see a way to justifiably call the Amiga a 16 bit machine. Although the A1000 had some 16 bit hardware paths, a maxed out A3000 definitely wasn’t 16 bit, and they were nearly completely compatible with each other minus newer features.
The Amiga was a full-on 32-bit machine. It’s weird to hear it called anything else.
While the 68000's registers are 32-bit, the data bus is 16-bit; the A1000, A2000 and A500 that defined the range had 16-bit-fetching chipsets; and they literally had 24-bit address buses. None of this says "32-bit". It can't be overlooked.
Many games crashed on the 32-bit-clean A3000, A1200, A600 and A4000 because programmers used the upper byte of addresses for their IQ or whatever. (Similar issues with ARM2 to ARM3 in Acorns; even RISC OS itself can be categorized into '26-bit' and '32-bit clean' varieties, due to Acorn assuming the memory space ignores the upper 6 bits so they could store what they liked there.)
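That upper-byte trick can be sketched concretely. A minimal Python model (made-up addresses; the 24-bit mask stands in for the 68000's missing address lines) shows why tagged pointers worked on a 68000 but broke on 32-bit-clean machines:

```python
# Sketch of high-byte pointer tagging (the AmigaBASIC trick) and why it
# broke on 32-bit-clean machines. Addresses and tag values are made up.

ADDR_MASK_24 = 0x00FFFFFF  # what a 68000's 24 address lines effectively see

def tag_pointer(ptr, tag):
    """Stash an 8-bit tag in the otherwise-unused top byte of a pointer."""
    return (tag << 24) | (ptr & ADDR_MASK_24)

def deref_68000(tagged):
    # 68000: the upper 8 bits never leave the chip -- the tag is harmless.
    return tagged & ADDR_MASK_24

def deref_68020(tagged):
    # 68020/030/040: all 32 address lines are driven -- the tag becomes
    # part of the address and points somewhere wild.
    return tagged & 0xFFFFFFFF

p = tag_pointer(0x00045678, tag=0x7F)
assert deref_68000(p) == 0x00045678   # fine on an A500
assert deref_68020(p) == 0x7F045678   # crash on an A3000
```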
The competition before the Amiga's launch solidly called itself "8-bit". The next generation called itself "16-bit" to hype itself. Later machines touted their "32-bit"ness, and then came the Nintendo 64 and PSX on MIPS processors...
All the hedges you made, "don't look here, look there" can be reversed to emphasize the 16-bitness!
Does this say something about you? Did you come to the Amiga later in its life, e.g. 1991-1993, when 68020s/030s/040s were an option? Or were you there in 1985 when it debuted?
The Opteron had a 32 bit HyperTransport bus. Modern CPUs only implement 48 address lines. And yet we’d call all of those 64 bit systems. We wouldn’t call them 32 bit systems, and surely not 48 bit.
The 68k’s ISA is 32-bit through and through, however the underlying implementation looks. It has been since I bought my A1000, marketed as a 32-bit system, in 1985.
I'm sure there must have been some, but most of Commodore's early Amiga ads didn't mention the number of bits at all, and from looking through old magazines it doesn't seem most vendors did either.
This is the classic vintage computing pissing contest...it's been argued by geniuses and cretins and it's hubris to think you have the One True Answer.
The Z80 is a 4-bit processor (its ALU is double pumped and 4 bits wide), right? So is the MasPar super (not 32 bits). Obviously, the 8088, mc68008 and ns32008 are 8-bit (size of data bus). And the i386SX is 16-bit. Oh, wait...the 8088 and 68008 are 20-bit processors (address bus width). Unless it's the PLCC version of the 68008, which is a 22-bit processor. IBM 360/20? 16-bit ALU, so there you go. Connection Machines? 1-bit (they're bit-serial), just like the Bendix a couple of generations earlier.
> While the 68000's registers are 32-bit, the data bus is 16 bit, the A1000, A2000 and A500 that defined the range had 16-bit fetching chipsets, they literally had 24-bit address buses. None of this says "32-bit". It can't be overlooked.
Oh, please...you're the one overlooking things, and marketing isn't a very good source for architectural definitions. It makes a shitload more sense to say "it's a 32-bit processor because the registers are 32-bit" than it does to say "well...the 68000 and 68010 are 24-bit, the 68008 is 20 bits unless it's a 68008FN where it's 22 bits, and the 68012 is 30 bits, and everything else is 32 bits, even though the fundamental unit of computation since the original spin is 32 bits".
I remember the Amiga always being compared to other "16-bit" machines, like the Apple IIgs, Atari ST, and early Macs.
I also remember the 68000 being referred to as 16/32-bit. Still, from a programmer perspective, the 68000 looked like a 32-bit machine, similar to what Intel did with the 386DX and SX.
This is a classic dispute when it comes to the 68000. I'm inclined to agree with your perspective, actually, but my impression is that it's highly contested.
Commodore and Atari marketed their 68K machines as 16/32-bit, which is I guess technically the most correct. And other 68000-based machines, like the Sega Mega Drive/Genesis, were marketed as 16-bit - it even says it right on top of the unit!
The bus always seemed like the oddest part to zero in on. By analogy, an Opteron in 2003 was a 64 bit CPU with a 32 bit HyperTransport bus, but no one called an Opteron system 32 bit. The width of a particular internal implementation detail is a strange duck IMO.
I think part of it was that, to hardware companies, the bus width is actually extremely important - the whole system is built around it - while the programming model the software guys work with matters less to them.
And then the other part of it is the marketing angle: everyone knew full 32-bit inside and out chips were just on the horizon. Downplaying the 68k’s 32-bitness would give them a selling point for the 68020.
All ALU operations are also more expensive with 32-bit operands. So: 16-bit data bus, 24-bit address bus, slower arithmetic with 32-bit operands. I never thought of it as a 32-bit CPU.
I recall the A500 series as being thought of as 16-bit in the UK -- the 32-bit marketing started with the A1200, and devices based on it, like the CD32 (hence the name).
That was because the A1200 was the first Amiga to have a 68020 as the native CPU on the motherboard. The 68020 had 32-bit data registers and 32-bit address registers. Earlier Amigas were designed around the 68000 CPU, which was instruction set compatible with later 680x0 CPUs (which featured backward-compatible supersets). The 68000's data registers only had 16 data lines connected externally, requiring two cycles to read or write 32 bits, and the 32-bit address registers didn't have their upper 8 bits connected to external pins, limiting the directly addressable RAM to 16MB (24 bits). These compromises allowed the CPU to fit in a 64 pin DIP package, while the standard 68020 came in a 114 pin PGA package and was fully 32-bit internally and externally.
However, it's confusing because the A1200 had a lower cost version of the 68020, the 68EC020, which also didn't have the top 8 address lines connected and came in a smaller 100 pin QFP package. So technically, it had the same addressable RAM limit as the 68000 (although it had other instruction set and clock speed improvements).
Prior to the A1200 (1992) there was an earlier Amiga model, the A2500 (1989), which came with a full 68020 CPU, but it was a 68000-based A2000 with Commodore's A2620 add-on accelerator card pre-installed, so it had both CPUs (although the 68000 was unused once the accelerator was added).
For me at least, I always remember it being referred to as 16-bit in all the gaming and computer magazines etc., as part of the 16-bit home computers; I remember the Atari ST being referred to that way as well.
I don’t remember seeing references to 32-bit until the 386/486 days on the home computer side and Sega 32X on the console side.
To someone who was around at the time, this sounds silly. Is the Commodore 64 then a 16-bit machine, because its address pointers are 16 bits? No, the Amiga and related 68000-based machines were generally considered to be 16-bit machines, and their predecessors were all considered to be 8-bit machines.
The 6510 operates internally as an 8-bit processor. The 68000 operates internally as a 32-bit processor for the most part - its registers are 32-bit and its instructions operate on 32-bit values.
We don't consider the original IBM 5150 PC to be an "8-bit" machine even though the situation is very similar to the 68000 - internal 16-bit operation, but 8-bit data bus.
The 68000 series has always been 32-bit, even if some implementations have used 16-bit connectivity to the rest of the board. Thus, the Amiga has also always been a 32-bit platform.
Would you consider early versions of MacOS to be running on a "24-bit" platform, since the high byte of pointers was often used for non-addressing functionality? No, the 68k Mac has also always been a 32-bit platform, since day one, albeit one that wasn't always "32-bit clean". The Amiga never had this issue, however.
In the 80s it was fairly common to consider the C64, Amstrad 464 and ZX Spectrum 8-bit, while the Amiga and Atari ST were 16-bit. In Italy we even had two separate video game magazines: Zzap! for 8-bit and The Games Machine for 16-bit.
While I too am a huge fan of the legendary 68000, as well as the proud owner of many Amigas from 1985 onward, the marketing and media reports sometimes glossed over important technical details. The 68000 CPU, which all Amigas from 1985 to 1990 were designed around, does have 32-bit data and address registers but that doesn't mean it was purely a 32-bit architecture - even internally. Some important internal components like the ALU were only 16-bit. Additionally, the external data width was 16-bit, requiring two accesses to read or write a 32-bit register to RAM, which did have a meaningful performance impact since memory access is a critical bottleneck, especially in a CPU with no cache. As you note, at least this 'double pumping' was automatic and mostly hidden from programmers.
The 68000's address registers didn't have their upper 8 bits connected to external pins, limiting the directly addressable RAM to 16MB (24-bits). These external width compromises allowed the 68000 to fit in a 64 pin DIP package while the standard 68020, which did connect all 32 data and address lines, came in a 114 pin PGA package. Large packages with more pins were a significant cost while double-pumping data accesses and a 16MB limit on addressable RAM weren't significant issues for most 1980s desktop computers - especially since the 68000's elegantly orthogonal instruction set was so performant in other ways.
Thus, many of us more technically literate fans broadly thought of the 68000 as having 32-bits internally but 16-bit data / 24-bit address width externally. However, that was incorrect because the arithmetic logic unit (ALU) and two arithmetic units were also 16-bit only, generally requiring at least twice as many cycles even for purely internal 32-bit math operations, whereas the 68020 and later CPUs didn't. That's why the 68000 is probably best described as "a hybrid 16/32 bit internal architecture with 16-bit external data width and 24-bit addressing."
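The ALU cost is easy to model. A toy Python sketch (purely illustrative; actual 68000 microcode and cycle counts aside) of how a 16-bit ALU performs a 32-bit add in two passes with a carry between them:

```python
# Toy model of why 32-bit adds cost extra on the 68000: its 16-bit ALU
# handles the low word first, then the high word plus the carry out.

def add32_via_16bit_alu(a, b):
    lo = (a & 0xFFFF) + (b & 0xFFFF)                  # pass 1: low words
    carry = lo >> 16
    hi = ((a >> 16) + (b >> 16) + carry) & 0xFFFF     # pass 2: high words + carry
    return (hi << 16) | (lo & 0xFFFF)

assert add32_via_16bit_alu(0x0001FFFF, 0x00000001) == 0x00020000
assert add32_via_16bit_alu(0xFFFFFFFF, 0x00000001) == 0  # wraps at 32 bits
```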
It gets even more confusing because some later Amiga models like the A1200 (1992) didn't have a standard 68020 but instead a lower cost version, the 68EC020, which also didn't have the top 8 address lines connected and came in a smaller 100 pin QFP package. So technically, it had the same addressable RAM limit as the 68000, although it had full 32-bit internal and external data widths, a full 32-bit ALU, a 256 byte cache and many other instruction set and clock speed improvements common to later 680x0 CPUs. The way a lot of us thought of the 68000's 16/32 architecture as being limited just in memory addressing was really a more appropriate description of the difference between a full 68020 and a 68EC020. The 68000's ALU being 16-bit is the inarguable smoking gun that makes it incorrect to think "it's really a 32-bit CPU internally", as I used to.
However, that should take nothing away from just how incredible the 68000 was. My first computer had the 68000's little brother, the 6809, which was generally the fastest 8-bit CPU clock-for-clock due to being an 8/16-bit design in much the same way the 68000 was 16/32-bit. While the 6809 was incredibly fast, when I got a 68000-based A1000 in 1985 and programmed it in assembly language, it blew my mind how fast it was. Then in 1988, when I added an A2620 accelerator card to my A2000, its full 68020 with 32-bit internals and direct 32-bit read/write to 4MB of RAM was like going from a Ferrari to a Lear Jet. Despite how the 68000 was confusingly marketed and inaccurately described by some media, it was truly a leap forward, but the reality is the 68020 was the first true 32-bit CPU in the line.
FFS ruining it for ppl that are old enough as well. I really wanted to try this out :/
Access Restricted for Australian Visitors
As of March 16, 2026, Civitai is no longer accessible to users in Australia.
This is due to Australia's Age-Restricted Material Codes, registered under the Online Safety Act and enforced by the eSafety Commissioner. These codes require platforms that host user-generated content — including AI-generated imagery — to implement age verification measures such as facial age estimation, digital identity wallets, or photo identification checks before allowing access to age-restricted material. Simple self-declaration of age is no longer considered sufficient. Non-compliance carries civil penalties of up to AUD $49.5 million per breach.
There’s something about the Amiga-era font and graphic style that I love and always feel is unique to the Amiga, but I’ve had trouble pinning it down to a particular developer or graphics artist. Ruff 'n' Tumble is a good example, with its chunky futuristic font, the strong gradients over everything, and even the colours. It’s not common to all games though.
I met Jim at user groups and trade shows and had the opportunity to hang out with him several times. Not only an incredible artist but extremely humble and just a very nice human.
Yeah, I agree. I also had C64 and DOS, and while both had tons of games, the Amiga was a bit different. In a way the Amiga was a stronger predecessor to e.g. the Xbox and similar consoles. (There were also TV console games, of course, and I played them too, so those may more appropriately be called the forerunners of the Xbox and other consoles, but I feel the Amiga was positioned in both places, whereas DOS was more on the application and business side than the games side, even though there were also many good DOS games. Master of Orion 1 is one of my all-time favourites; Master of Orion 2 extended many things, but the gameplay also got slower and I did not like that. I loved the fast play style that was possible, also in other games: Civilization 1, SimCity 1 and so forth.)
I'm getting older and forgetting a lot, but I hope I never forget the feeling of seeing this as a kid in 1989.
You can see and experience old things, but it's impossible to recreate the context in which they were originally experienced. You can't erase your experience of 40 years of technical progress which makes this sort of thing feel merely quaint in comparison.
Color cycling in the picture file format was so epic!
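For anyone who never saw it: color cycling animates by rotating palette entries rather than redrawing any pixels. A minimal sketch, with a made-up palette and range bounds standing in for the values an ILBM color-cycling range would supply:

```python
# Minimal sketch of palette color cycling. Every pixel indexing the
# rotated entries appears to animate, with zero pixel writes.

def cycle_palette(palette, low, high):
    """Rotate palette entries [low..high] by one position (one cycle step)."""
    span = palette[low:high + 1]
    return palette[:low] + [span[-1]] + span[:-1] + palette[high + 1:]

pal = ["red", "orange", "yellow", "green", "blue"]
pal = cycle_palette(pal, 1, 3)   # cycle only entries 1..3
# pixels indexing entries 1..3 now show shifted colors; nothing was redrawn
```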
Fun memory: I was with my best friend at another friend's place when his father called him to do some chore - he had to quickly mow the small lawn or something like that. So we decided to prank him: I don't remember all the details, but basically we launched Deluxe Paint and simulated an Amiga "guru meditation" using a font that wasn't even correct (I think because we were in 320x256 while the real guru meditation used a mode with smaller pixels). Then in broken English we wrote something like this:
"Hardware failure. If you reboot or turn off your computer it is going to broke forever"
We then set up color cycling between red and black for one of the colors and put the drawing software in "full screen".
When our friend came back, we played dumb and said we had no idea what happened, but that apparently we really shouldn't turn the computer off. We managed to keep it up for something like ten minutes while he thought his computer was done for good, but we were dying inside.
P.S.: as a side note, with the help of Claude Code CLI / Sonnet 4.6 I managed to recompile a 30+ year old game I wrote for DOS in the early 90s (and for which I still have the source files and assets, but not the tooling). I was using a converter (which I wrote back then) to convert files between the .LBM format and a "tweaked" (320x200 / 4 planes) DOS mode I was using for the game (which allowed double-buffering without tearing). I don't remember the details, but I take it that if we had .LBM picture files, the artist and I were using Deluxe Paint on the Amiga.
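Since .LBM came up: the BODY chunk in those files is usually compressed with ByteRun1, the PackBits-style run-length scheme from the IFF ILBM spec. A hedged sketch of the decompressor:

```python
# Sketch of ILBM ByteRun1 (PackBits-style) decompression, as used by
# Deluxe Paint for BODY chunks in .LBM files.

def unpack_byterun1(data: bytes) -> bytes:
    out = bytearray()
    i = 0
    while i < len(data):
        n = data[i]
        i += 1
        if n < 128:                       # literal run: copy the next n+1 bytes
            out += data[i:i + n + 1]
            i += n + 1
        elif n > 128:                     # replicate run: repeat next byte 257-n times
            out += bytes([data[i]]) * (257 - n)
            i += 1
        # n == 128 is a no-op by convention
    return bytes(out)

# 0x02 -> copy 3 literal bytes; 0xFE (254) -> repeat next byte 257-254 = 3 times
assert unpack_byterun1(bytes([0x02, 1, 2, 3, 0xFE, 9])) == bytes([1, 2, 3, 9, 9, 9])
```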
I missed out on the Amiga (introduced in 1985) at the time, being an early PC adopter. Went from CGA (1981) directly to VGA (1987).
In terms of colors, the most popular VGA modes (320x200 or 320x240, 256-color palette from an 18-bit color space) are superior to the most popular Amiga graphics modes (320×200 or 320x256, 32-color palette from a 12-bit color space).
So for anyone looking into old school graphics programming, bit planes are pretty confusing when you don't understand why they exist.
Two big reasons. First, it's about running memory chips in parallel to increase bandwidth. Image data was hard to get to the screen fast enough with hardware in that era.
Second, it allowed for simple backwards compatibility. Programs were used to writing directly to video memory, and on an EGA card the start of the video memory was valid CGA data. The rest of the colour data was in a separate bit plane.
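To make bit planes concrete: each pixel's palette index is scattered across the planes, one bit per plane, so turning planar data into "chunky" pixels means gathering those bits. A minimal sketch, assuming MSB-first pixel ordering within each byte:

```python
# Planar-to-chunky conversion for one scanline. With 4 bitplanes, bit i
# of a pixel's 4-bit palette index lives in plane i, 8 pixels per byte.

def planar_to_chunky(planes, width):
    """planes: one byte sequence per bitplane, covering a single scanline."""
    pixels = []
    for x in range(width):
        byte, bit = divmod(x, 8)
        index = 0
        for p, plane in enumerate(planes):
            # MSB is the leftmost pixel on Amiga/EGA-style hardware
            if plane[byte] & (0x80 >> bit):
                index |= 1 << p
        pixels.append(index)
    return pixels

# 2 planes, 8 pixels: pixel 0 gets bits (1,1)=3, pixel 1 (0,1)=2, pixel 2 (1,0)=1
assert planar_to_chunky([[0b10100000], [0b11000000]], 8) == [3, 2, 1, 0, 0, 0, 0, 0]
```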
This is great stuff! As a side note, I wonder if anyone has created a HAM viewer that runs in the browser? I remember HAM flickering by necessity and being amazed by 4096 colors on-screen at once. There was a certain quality of HAM images on the Amiga that made them instantly identifiable.
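A browser HAM viewer is very doable, because the decoding rule is simple. A rough Python sketch of HAM6's per-scanline logic - two control bits choose "set from the 16-color palette" or "hold the previous RGB and modify one 4-bit channel", which is how 4096 colors fit on screen at once:

```python
# Rough sketch of Amiga HAM6 decoding for one scanline.
# Each pixel is 6 bits: 2 control bits + 4 data bits.

def decode_ham6_line(pixels, palette, border=(0, 0, 0)):
    """pixels: 6-bit values; palette: 16 (r,g,b) tuples, 4 bits per channel."""
    out = []
    r, g, b = border                 # HAM restarts from the border color each line
    for p in pixels:
        ctrl, val = p >> 4, p & 0x0F
        if ctrl == 0b00:
            r, g, b = palette[val]   # set from the 16-color palette
        elif ctrl == 0b01:
            b = val                  # hold R,G; modify blue
        elif ctrl == 0b10:
            r = val                  # hold G,B; modify red
        else:
            g = val                  # hold R,B; modify green
        out.append((r, g, b))
    return out

pal = [(0, 0, 0)] * 16
pal[1] = (15, 0, 0)
line = decode_ham6_line([0x01, 0x1F, 0x3F], pal)
assert line == [(15, 0, 0), (15, 0, 15), (15, 15, 15)]
```

The "fringing" artifacts that made HAM images instantly identifiable come straight from this: a big color change can take up to three pixels to complete, one channel at a time.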
I couldn't afford the Amiga in its day, but I often drooled over its imagery in magazines etc. I really need to pick up a MiSTer FPGA setup and see what I missed out on back then. Any recommendations for hardware for that? I can and do build my own hardware, but I think there are a bunch of options nowadays and likely some are better than others...
I liked the Amiga. I would not really use it today, but I recall having played many games in the 1980s. Those kinds of games are mostly dead now (save for a few indie games perhaps). Today's games are usually the same - a 3D engine with some fancy audio and video and dumbed-down gameplay. (Not all games, mind you; for instance, I liked the idea behind Little Nightmares. I never played it myself, don't have the time, but I watched several clips on YouTube and found the gameplay different from the "canonical" games we now have, with their perpetual repetition of money grabs.)
Somewhat related: a new version of the Amiga Vision collection just dropped. A very high quality product you can get for free if you are an Amiga fan. Can't get enough of the included demos on my MiSTer setup.
This brought back some memories. So nice to see art from an era where you really needed talent to be able to produce it. Such a nice contrast to the AI slop which takes no talent to produce!
There's just something uniquely special about hand-drawn pixel art at resolutions around 320 x 200 with 16 to 256 colors - especially when viewed on analog CRT screens with their scanlines and phosphor glow, which blend colors and soften the hard pixel edges some people today think of as "retro" (which isn't at all how this art actually looked in the 80s to the artists or their audiences).
I think a key aspect of the magic is that the technical constraints force art to be representational instead of photo-realistic. There just weren't enough pixels or colors, so artists had to make intentional choices about where to focus their limited pixels and palette to imply the detail they couldn't fully draw, and that made their images evocative in ways photo-realism usually isn't. Earlier digital graphics with 4 to 16 colors and resolutions around 160 x 120 were generally 'moving icons', as seen in arcade games like Pac-Man, Donkey Kong and Galaga and most late-70s and early-80s home computers (Apple II, Atari 400/800, C64, etc). Of course, this wasn't just due to pixel and palette limitations but also to 8-bit CPUs at sub-4 MHz clock speeds and limited memory (usually 8K to 32K game size).
It wasn't until around the mid-80s, when arcade and personal computer hardware with 16-bit CPUs at 8 MHz+ and 256K of memory arrived, that we hit that magic middle ground we see as unique to that era of computer and arcade graphics. By the mid-90s it was already starting to vanish as palettes grew beyond 256 colors and resolutions exceeded 15 kHz analog video (roughly 240 lines high). A great example of the peak visuals possible from the painstaking care and artistic virtuosity of this era can be seen in the incredible hand-drawn sprites of "Street Fighter II": https://fabiensanglard.net/sf2_sheets/index.html.
The other reason I think so many of us see the art style of this era as uniquely special is that it ended suddenly, with a huge leap to deep color palettes, higher resolutions and 3D rendered graphics. Because of the nature of analog 15 kHz video and the desire to avoid interlace flicker, resolutions for most consumer-priced computers and game consoles maxed out in the mid-80s at fewer than 240 vertical lines. Since artists generally want to work in roughly square pixels, this limited horizontal resolution to around 320. So, for nearly a decade, the benefit of using the televisions consumers already had capped the visual output of home computers and game consoles at 240 lines. It even froze the evolution of most arcade machines, due to the cost savings of using CRTs made for TVs: even one of the last 2D arcade hardware platforms, Capcom's 1996 CPS III, was limited to 384 x 224 resolution.

After this unprecedented 'hold' of nearly ten years on the march of pixel progress, the next increment most consumers saw was a huge and seemingly sudden leap: a doubling of vertical and horizontal resolutions, and a jump from 4 and 8-bit palettes (16 to 256 colors) straight to 16-bit color (65,536 colors). And this happened at almost the same moment the rush to 3D rendered graphics killed any interest in hand-drawn pixels. In just a few years, virtually all the computer and game pixels consumers saw changed dramatically in both scope and style, creating a clear divide between hand-drawn 2D pixel art at analog resolutions and everything that came after.
There isn't one canonical answer.
It followed on from our 8-bit ZX Spectrum and Commodore 64 home computers.
I also used that LoRA and some video models to try to make a little movie in the same style[2]
Here's a guide on how to generate LoRAs too, if you're interested[3]
Finally, there's a DeluxePaint clone someone released that is pretty cool to play around with[4]
[1]: https://civitai.com/models/875790/amiga-deluxepaint-or-fluxd
[2]: https://www.youtube.com/watch?v=_18NBAbJSqQ&feature=youtu.be
[3]: https://reticulated.net/dailyai/creating-a-flux-dev-lora-ful...
[4]: https://github.com/mriale/PyDPainter
Access Restricted for Australian Visitors

As of March 16, 2026, Civitai is no longer accessible to users in Australia.
This is due to Australia's Age-Restricted Material Codes, registered under the Online Safety Act and enforced by the eSafety Commissioner. These codes require platforms that host user-generated content — including AI-generated imagery — to implement age verification measures such as facial age estimation, digital identity wallets, or photo identification checks before allowing access to age-restricted material. Simple self-declaration of age is no longer considered sufficient. Non-compliance carries civil penalties of up to AUD $49.5 million per breach.
https://wormhole.app/E4zA1z#Q3aQLs6wRmlLkeghIYyZEQ
Amiga Graphics Archive - https://news.ycombinator.com/item?id=38431514 - Nov 2023 (20 comments)
Amiga Graphics Archive - https://news.ycombinator.com/item?id=17783531 - Aug 2018 (27 comments)
The Amiga Boing Ball Explained - https://news.ycombinator.com/item?id=12330689 - Aug 2016 (56 comments)
The Amiga Graphics Archive - https://news.ycombinator.com/item?id=10972849 - Jan 2016 (24 comments)
Jim Sachs was one of the early masters. The Wikipedia article about him does not do him justice: https://en.wikipedia.org/wiki/James_D._Sachs
One amazing thing was that even after the Amiga became available, he continued simultaneously making great art on the C-64.
You can see and experience old things, but it's impossible to recreate the context in which they were originally experienced. You can't erase your experience of 40 years of technical progress which makes this sort of thing feel merely quaint in comparison.
Fun memory: I was with my best friend at another friend's place and his father called him to do some chore. He had to quickly mow the small lawn or something like that. So we decided to prank him: I don't remember all the details but basically we launched Deluxe Paint and simulated an Amiga "guru meditation" using a font that wasn't even correct (I think because we were in 320x256 while the real guru meditation was using a mode with smaller pixels). Then in broken english we wrote something like this:
"Hardware failure. If you reboot or turn off your computer it is going to broke forever"
We then set up a color cycle between red and black for one of the colors and put the drawing software in "full screen".
When our friend came back, we played dumb and said we had no idea what had happened, but that apparently we really shouldn't turn the computer off. We managed to keep it up for something like ten minutes while he thought his computer was done for good, and we were dying inside.
All three of us remember that prank to this day.
https://en.wikipedia.org/wiki/Guru_Meditation
P.S.: as a side note, with the help of Claude Code CLI / Sonnet 4.6 I managed to recompile a 30+ year old game I wrote for DOS in the early 90s (and for which I still have the source files and assets, but not the tooling). I was using a converter (which I wrote back then) to convert files between the .LBM format and a "tweaked" (320x200 / 4 planes) DOS mode I was using for the game (which allowed double-buffering without tearing). I don't remember the details, but I take it that if we had .LBM picture files, the artist and I were using Deluxe Paint on the Amiga.
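The core of any such converter is turning the .LBM (ILBM) bitplane rows into chunky palette indices before re-packing them for the VGA mode. A minimal Python sketch of that first step (the function name is mine, not from the original converter):

```python
def bitplanes_to_chunky(rows, width):
    """Convert one scanline stored as N Amiga-style bitplane rows
    into a list of chunky palette indices (one int per pixel).
    rows[p] holds the bytes of plane p for this scanline."""
    out = []
    for x in range(width):
        byte, bit = divmod(x, 8)
        mask = 0x80 >> bit          # ILBM packs the leftmost pixel in the MSB
        index = 0
        for p, row in enumerate(rows):
            if row[byte] & mask:
                index |= 1 << p     # plane p contributes bit p of the index
        out.append(index)
    return out

# an 8-pixel line with 2 planes: the indices count down 3,2,1,0 twice
line = bitplanes_to_chunky([bytes([0b10101010]), bytes([0b11001100])], 8)
# line == [3, 2, 1, 0, 3, 2, 1, 0]
```

Going the other way (chunky back to planes, or to the Mode X byte-plane layout where pixel x lives in plane x % 4) is the same loop inverted.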
In terms of colors, the most popular VGA modes (320x200 or 320x240, 256-color palette, 18-bit color depth) are superior to the most popular Amiga graphics modes (320x200 or 320x256, 32-color palette, 12-bit color depth).
But somehow Amiga graphics is still often nicer.
Two big reasons. First, it's about running memory chips in parallel to increase bandwidth. Image data was hard to get to the screen fast enough with hardware in that era.
Second it allowed for simple backwards compatibility. Programs were used to writing directly to video memory, and in an EGA card the start of the video memory was valid CGA data. The rest of the colour data was in a separate bit plane.
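Both points follow from the planar layout: in a 16-color EGA/VGA mode, a pixel's four color bits all sit at the same byte offset, one bit in each of four planes, so one memory cycle fetches data for eight pixels from every plane in parallel. A minimal sketch of the addressing (helper name mine, assuming the standard 320-pixel-wide packed layout):

```python
def ega_pixel_location(x, y, width=320):
    """For 16-color planar EGA/VGA modes: return (byte_offset, bit_mask)
    for pixel (x, y). The same offset is used in all four bit planes;
    each plane contributes one bit of the pixel's 4-bit color index."""
    byte_offset = y * (width // 8) + x // 8   # 8 pixels per byte per plane
    bit_mask = 0x80 >> (x % 8)                # leftmost pixel in the MSB
    return byte_offset, bit_mask
```

Since the four planes are on separate memory chips, reading that one offset delivers 32 bits of pixel data (8 pixels complete) per cycle, which is the bandwidth win the parent describes.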
https://en.wikipedia.org/wiki/Hold-And-Modify
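The HAM6 trick linked above squeezes 4096 colors out of 6 bits per pixel: the top two bits either load a color from a 16-entry palette or hold the previous pixel's color and modify one RGB channel. A minimal decoder sketch (function name and the zero seed color are mine; real hardware seeds each line from the border color):

```python
def decode_ham6_line(pixels, palette):
    """Decode one scanline of Amiga HAM6 data.
    pixels: iterable of 6-bit values; palette: 16 (r, g, b) tuples, 4 bits each.
    Control bits (top two): 00 = load from palette, 01 = modify blue,
    10 = modify red, 11 = modify green (the other channels are held)."""
    r = g = b = 0                      # assumed seed; hardware uses the border color
    out = []
    for v in pixels:
        ctrl, data = v >> 4, v & 0x0F
        if ctrl == 0:
            r, g, b = palette[data]    # set all three channels from the palette
        elif ctrl == 1:
            b = data                   # hold r, g; replace blue
        elif ctrl == 2:
            r = data                   # hold g, b; replace red
        else:
            g = data                   # hold r, b; replace green
        out.append((r, g, b))
    return out
```

The catch, visible in many HAM images, is that a big color jump needs up to three pixels to complete, which is why HAM art tends toward smooth gradients and fringing along hard edges.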
I think a key aspect of the magic is that the technical constraints force art to be representational instead of photo-realistic. There just weren't enough pixels or colors, so artists had to make intentional choices about where to focus their limited pixels and palette to imply the detail they couldn't fully draw, and that made their images evocative in ways photo-realism usually isn't. Earlier digital graphics with 4 to 16 colors and resolutions around 160 x 120 were generally 'moving icons', as seen in arcade games like Pac-Man, Donkey Kong and Galaga and most late-70s and early-80s home computers (Apple II, Atari 400/800, C64, etc). Of course, this wasn't just due to pixel and palette limitations but also the 8-bit CPUs at sub-4 MHz clock speeds and limited memory (usually 8K to 32K game size).
It wasn't until around the mid-80s, when arcade and personal computer hardware with 16-bit CPUs at 8 MHz+ and 256K of memory arrived, that we hit the magic middle-ground we see as unique to that era of computer and arcade graphics. By the mid-90s it was already starting to vanish as palettes grew beyond 256 colors and resolutions exceeded 15 kHz analog video (roughly 240 lines high). A great example of the peak visuals possible from the painstaking care and artistic virtuosity of this era can be seen in the incredible hand-drawn sprites of "Street Fighter II": https://fabiensanglard.net/sf2_sheets/index.html.
The other reason I think so many of us see the art style of this era as uniquely special is that it ended suddenly with a huge leap to deep color palettes, higher resolutions and 3D rendered graphics. This happened due to the unique nature of analog 15 kHz video and the desire to avoid interlace flicker, causing resolutions for most consumer-priced computers and game consoles to max out in the mid-80s at less than 240 vertical lines. Since artists generally want to work in roughly square pixels, this limits horizontal resolution to around 320. So, for nearly a decade, the benefit of using the televisions consumers already had limited the visual output of home computers and game consoles to 240 lines. It even froze the evolution of most arcade machines due to the cost savings of using CRTs made for TVs. Even one of the last 2D arcade hardware platforms, Capcom's 1996 CPS III, was limited to 384 x 224 resolution. After this unprecedented 'hold' of nearly ten years on the march of pixel progress, the next increment most consumers saw was a huge and seemingly sudden leap - a doubling of vertical and horizontal resolutions and a jump from 4 and 8-bit palettes (16 to 256 colors) straight to 16-bit palettes (65,536 colors). And this happened at almost the same moment the rush to 3D rendered graphics killed any interest in hand-drawn pixels. In just a few years, virtually all the computer and game pixels consumers saw changed dramatically in both scope and style, creating a clear divide between hand-drawn 2D pixel art at analog resolutions and everything that came after.