Warcraft 1 (1994), Warcraft 2 (1995), and StarCraft (1998) all use power-of-2 aligned map sizes (64 blocks, 128 blocks, and 256 blocks) so the shift-factor could be pre-computed to avoid division/multiplication, which was dang slow on those old 386/486 computers.
Each map block was 2x2 cells, and each cell was 8x8 pixels, which made rendering background cells and fog-of-war overlays very straightforward in assembly language.
All of Warcraft/etc. had only a few thousand lines of assembly language to render maps/sprites/fonts/fog-of-war into the offscreen buffer, and to blit from the offscreen buffer to the screen.
The rest of the code didn't need to be in assembly, which is too time-consuming to write for code where the performance doesn't matter. Everything else was written in portable assembler, by which I mean C.
Edit:
By way of comparison, Blackthorne for Super Nintendo was all 65816 assembly. The Genesis version (Motorola 68000) and DOS version (Intel 80386) were manually transcribed into their respective assembly languages.
The PC version of Blackthorne also had a lot of custom assembler macros to generate 100K of rendering code to do pixel-scrollable chunky-planar VGA mode X (written by Bryan Waters - https://www.mobygames.com/person/5641/bryan-waters/).
At Blizzard we learned from working on those console app ports that writing assembly code takes too much programmer time.
Edit 2:
I recall that Comanche: Maximum Overkill (1992, a voxel-based helicopter simulator) was written in all assembly in DOS real mode. A huge technical feat, but so much work to port to protected mode that I think they switched to polygon-rendering for later versions.
It's a shame that when a Redditor discovered the source code for the original StarCraft "gold master" on a CD, they sent it back to Blizzard in exchange for some fucking Blizzard merch [1].
EA a while back released the source code to (most of) the old Command & Conquer games [2], though it interestingly left out Tiberian Sun and Red Alert 2, StarCraft's closest competitors at the time.
Would've been nice for historical preservation to be able to peek behind the curtain and see StarCraft's code in a similar fashion
This entire Reddit thread aged really poorly now that Blizzard is a shell of its former self. If anything, the attitude in that thread is what paved the way for Blizzard's decline: complete disrespect for its origins.
The StarCraft source code is something that must be kept behind closed walls, under tight control by Blizzard, even though the original people working on the game at Blizzard have already left and there is nothing to protect here other than eternal shame.
If you worked on Lost Vikings I'd like to thank you for the entertainment during my childhood. Given your background did you ever get involved in the demo scene?
I did work on Lost Vikings; it was the first original game I had occasion to help develop, and it was a standout moment for Blizzard, née Silicon & Synapse: we proved to ourselves that we could make a game from scratch.
I never developed standalone demos -- I was already working so many hours at work there wasn't much left over for, y'know, regular things like having a life.
Were you at Blizzard when they lost their source code server and had no backups? I was there for a short time consulting around the time WC3 was released.
Whoops! I didn't hear about that one, though it was a common occurrence in the earlier days of the game industry. Please share details, if you're able.
On a related note, I worked on Battle Chess, porting it from DOS/Amiga to Windows 3.1, and later to Windows 95 ("MPC Battle Chess"). Every few years a new producer from Interplay would call me and ask for a copy of the source code so they could rebuild it, because they lost their own copies. One of those times it was because they discovered their weekly backup tapes were all blank.
Both Comanche and Settlers 1 were so magical to me as a kid. You learned to work with DOS in text mode; the shiniest thing on the PC was WordPerfect. And suddenly your text computer was capable of displaying graphics and ... games. Hooked me for life.
> By way of comparison, Blackthorne for Super Nintendo was all 65816 assembly
Weren't almost all console games up to that era written in assembly? What high-level languages were used? I recall hearing that many Atari 7800 (?) games were written in Forth, but I may be mistaken.
> Imagine a programmer asking a game designer if they could change their formula to use an 8 instead of a 9.5 because it is a number that the CPU prefers to calculate with. There is a very good argument to be made that a game designer should never have to worry about the runtime performance characteristics of binary arithmetic in their life, that’s a fate reserved for programmers
Numeric characteristics are absolutely still a consideration for game designers even in 2026, one that influences what numbers they use in their game designs. The good ones, anyways. There are, of course, also countless bad developers/designers who ignore these things these days, but not because it is free to do so; rather, because they don't know better, and in many cases it is one of many silent contributing factors to a noticeable decrease in the quality of their game.
> Numeric characteristics are absolutely still a consideration for game designers even in 2026, one that influences what numbers they use in their game designs. The good ones, anyways.
I used to think like this, not anymore.
What convinced me that these sort of micro-optimizations just don't matter is reading up on the cycle count of modern processors.
On a Zen 5, integer addition is a single cycle, multiplication 3, and division ~12. But that's not the full story: the CPU can have about 5 multiplications and about 3 divisions in flight simultaneously.
Back in the day of RCT there was much less pipelining. On the original Pentium, a multiplication took 11 cycles and a division could take upwards of 46, and these CPUs ran at around 100 MHz. So not only did these operations take more cycles to finish and resist pipelining, the CPUs were also operating at 1/30th to 1/50th the clock rate of common CPUs today.
And this isn't even touching on SIMD instructions.
Integer tricks and optimizations are pointless. Far more important in a modern game is memory layout: that's where the CPU is actually going to burn most of its time. If you can create and operate on an int[], you'll be MUCH faster than doing operations against a Monster[]. A cache miss means anywhere from a 100- to 1,000-cycle penalty, which blows away any hit you take cutting an operation from 3 cycles to 1.
Absolutely. I have written a small but growing CAD kernel which is seeing use in some games and realtime visualization tools ( https://github.com/timschmidt/csgrs ) and can say that computing with numbers isn't really even a solved problem yet.
All possible numerical representations come with inherent trade-offs around speed, accuracy, storage size, complexity, and even the kinds of questions one can ask (it's often not meaningful to ask if two floats equal each other without an epsilon to account for floating point error, for instance).
"Toward an API for the Real Numbers" ( https://dl.acm.org/doi/epdf/10.1145/3385412.3386037 ) is one of the better papers I've found detailing a sort of staged-complexity technique for dealing with this, in which most calculations are fast and always return (arbitrary-precision calculations can sometimes run forever, or until memory runs out), but one can still ask for more precise answers at the cost of more compute. There are also entirely different options, like interval arithmetic, symbolic algebra engines, etc.
One must understand the trade-offs else be bitten by them.
Yeah, I’m quite surprised at this comment. Commercial video games are mass-produced products, and as much as I dislike designers being bogged down in technical minutiae, having a sense of industrial design for the thing you’re making is an incredible boon.
Fumito Ueda was notably quite concerned with the technical/production feasibility of his designs for Shadow of the Colossus. [1] Doom was an exercise in both creativity and expertise.
"and in many cases it is one of many silent contributing factors to a noticeable decrease in the quality of their game"
Game designers are not so constrained by the limits of the hardware anymore, unless they want to push boundaries. The quality of a game is not just its runtime performance; it is mainly a question of whether the game is fun to play. Do the mechanics work? Are there severe bugs? Is the story consistent and the characters relatable? Is something breaking immersion? So ... frequent stuttering because of bad programming is definitely a sign of low quality, but if the game runs smoothly on the target audience's hardware, improvements are better made elsewhere.
That makes no sense, since multiplication has been fast for the last 30 years (since the PS1) and floating point for the last 25 (since the PS2). Anyway, numbers relevant to game design are usually used just a few times per frame, so only program size matters, and that hasn't been significantly constrained for the last 40 years (since the NES).
Related to that, for a consumer electronics product I worked on using an ARM Cortex-M4 series microcontroller, I actually ended up writing a custom pseudorandom number generation routine (well, modifying one off the shelf). I was able to take the magic mixing constants and change them to things that could be loaded as single immediates using the crazy Thumb-2 immediate instructions. It passed every randomness test I could throw at it.
By not having to pull in anything from the constant pools and thereby avoid memory stalls in the fast path, we got to use random numbers profligately and still run quickly and efficiently, and get to sleep quickly and efficiently. It was a fun little piece of engineering. I'm not sure how much it mattered, but I enjoyed writing it. (I think I did most of it after hours either way.)
Alas, I don't think it ever shipped because we eventually moved to an even smaller and cheaper Cortex-M0 processor which lacked those instructions. Also my successor on that project threw most of it out and rewrote it, for reasons both good and bad.
I remember the older driving games. They'd progressively "build" the road as you progressed on it. Curves in the road were drawn as straight line segments.
Which wasn't a problem, but it clearly showed how the programmers improvised to make it perform.
> The same trick can also be used for the other direction to save a division:
>
> NewValue = OldValue >> 3;
>
> This is basically the same as
>
> NewValue = OldValue / 8;
>
> RCT does this trick all the time, and even in its OpenRCT2 version, this syntax hasn’t been changed, since *compilers won’t do this optimization for you*.

(emphasis mine)
Not at all true. Assuming the types are such that >> is equivalent to /, modern compilers will implement division by a power of two as a shift every single time.
> The same trick can also be used for the other direction to save a division:
> NewValue = OldValue >> 3;
> This is basically the same as
> NewValue = OldValue / 8;
> RCT does this trick all the time, and even in its OpenRCT2 version, this syntax hasn’t been changed, since compilers won’t do this optimization for you.
The author loses a lot of credibility by suggesting the compiler won't replace multiplying or dividing by a power of 2 with the equivalent bit shift. That's a trivial optimization that's always been done. I'm sure compilers were doing that in the 70s.
What language is this article talking about where compilers don't optimize multiplication and division by powers of two? Even for division of signed integers, current compilers emit inline code that handles positive and negative values separately, still avoiding the division instruction (except when optimizing for size, of course).
I was reminded of the Factorio blog. That game's such a huge optimization challenge even by today's standards, and I believe the optimization work goes hand in hand with the design.
One interesting thing I remember: if you have a long conveyor belt of 10,000 copper coils, you can basically simplify it so that only the entry and exit tiles are actually active. All the others don't have to move because nothing changes, as long as the belt is fully or uniformly saturated. So you design around mechanics that would break that saturation.
> When reading through OpenRCT2’s source, there is a common syntax that you rarely see in modern code, lines like this:
> NewValue = OldValue << 2;
I disagree with the framing of this section. Bit shifts are used all the time in low-level code. They're not just some archaic optimisation, they're also a natural way of working with binary data (aka all data on a computer). Modern low-level code continues to use lots of bit shifts, bitwise operators, etc.
Low-level programming is absolutely crucial to performant games. Even if you're not doing low-level programming yourself, you're almost certainly using an engine or library that uses it extensively. I'm surprised an article about optimisation in gaming, of all things, would take the somewhat tired "in ye olde days" angle on low-level code.
Is there a place to find stories of recent game optimization, ideally ones as ridiculous as the fast inverse square root? As someone who spent way too much time V-Raying in a prior life, I still can't believe we got real-time ray tracing.
On huge games produced by large studios, I wonder whether using a real-world technical challenge as a "feature" within the game would ever be considered genius. Consider a coder and a game designer who are on different teams and don't attend the same meetings.
But if you look at creative writing, story arcs are all about obstacles. A boring story is made interesting by an obstacle; it is what our protagonist needs to overcome. A one-man-band game dev who simultaneously holds the story and the technical challenge in their head might spot the opportunity to use a glitch or limitation as, I dunno, a mini-game that riffs on the glitch.
I guess I'm showing my age, but having read stuff like Michael Abrash's books years ago, this article was a bit underwhelming.
Like, no one who wrote any code back in the 80s or 90s, even for a homebrew game, was skipping this stuff; it was in almost every book and tutorial. Bit shifting was extremely common, and a lot of games had design choices informed by coding challenges. Lots and lots of code had data structures aligned on byte/word boundaries, or data massaged to fit the limits of the hardware so reads would complete in a certain number of cycles, etc. Certainly almost all console games had lots of fascinating design choices like this.
This game may have been exceptionally well optimized but it feels like if the original code is not in the public domain these weren't the best examples.
When he started writing about there being a clean-sheet re-implementation, I thought there was going to be a benchmark comparison of the modern rewrite vs. the original on old hardware or something; that would have been interesting.
Thankfully or not I'm just barely young enough that I never had to write anything professional in assembly, though if I had gone into games maybe I would have.
This is a fun read; it's one of my favorite games growing up by far, with countless hours sunk into it. I didn't need this write-up to know that Chris Sawyer was a god among men and that the open-source version is a huge labor of love, but it's a good reminder :) I'll need to give OpenRCT2 a try some time; I've tried a little OpenTTD and really enjoy it, but RCT was always my jam.
For the lesson here, I think re-contextualizing the product design in order to ease development should be a core tenet of modern software engineering (or really any form of engineering). This is why we so often say we need to shift left on problems: discussing the constraints up front lets us show designers how a few early design tweaks can save big time on the calendar. All of the projects I loved being a part of in my career did this well; all of the slogs employed a leadership-driven approach that amounted to waterfall.
I'm still kind of sad that application development no longer uses much assembly language. WriteNow, which was ~100,000 lines of assembler, is still my gold standard for a word processor and its performance.
The article refers several times to the benefits of the game designer and the coder being the same person. I've often felt that this is the only way to build anything impressive, and in fact I'm amazed that corporations with their hierarchical organisation model ever get anything built at all but I suppose you can brute force anything with enough employees.
It does make you wonder if the future of AI-assisted development will look more like the early days of coding, where one single mind can build and deliver a whole piece of software from beginning to end.
Also, I remember the excitement of a new game that looked different from the others.
Somehow, even as a child, I just knew that it would be a whole new emergent gameplay experience.
Of course I didn't know what went into making RollerCoaster Tycoon, but I could tell just from a couple of screenshots that this was clearly a ground-up new game with new mechanics that would be extremely fun to play.
I don't get this feeling anymore, as I just assume everything is a clone of another game in the same engine, generally.
Unless it's been a decade in production, like Breath of the Wild or GTA 5, I just don't expect much.
Compilers won't turn multiplication by a power of two into a bit shift for you? I remember reading in ~2000 that the only thing writing a<<2 instead of a*4 will do is make your compiler yawn.
I have to wonder how much of the original assembly source looked a lot more succinct than what's in OpenRCT2, due to the use of macros. Looking at his gameography on MobyGames, Chris had been writing games since 1984 by the time RCT came out in 1999; it's hard to imagine he was still writing every single opcode out by hand, given that even I had macros in the assembler I fooled around with on my C64 back in the eighties.
[1] https://old.reddit.com/r/gamecollecting/comments/68xzxt/star...
[2] https://github.com/electronicarts
> peek behind the curtain and see StarCraft's code in a similar fashion
Oh, you wouldn't want to, believe me. You'd be scarred by code like this:
It's remarkable the game functioned as well as it did. See https://www.codeofhonor.com/blog/tough-times-on-the-road-to-...

PS: Still waiting to hear about the "Westwood's reaction" blog post! :p
[1] https://www.designroom.site/shadow-of-the-colossus-oral-hist...
"Interview with RollerCoaster Tycoon's Creator, Chris Sawyer (2024)" https://news.ycombinator.com/item?id=46130335
"Rollercoaster Tycoon (Or, MicroProse's Last Hurrah)" https://news.ycombinator.com/item?id=44758842
"RollerCoaster Tycoon at 25: 'It's mind-blowing how it inspired me'" https://news.ycombinator.com/item?id=39792034
"RollerCoaster Tycoon was the last of its kind [video]" https://news.ycombinator.com/item?id=42346463
"The Story of RollerCoaster Tycoon" https://www.youtube.com/watch?v=ts4BD8AqD9g
> At first this sounds like a strange technical obscurity

Do we not know binary in 2026? Why is this a surprise to the intended audience?
https://youtu.be/twU1SsFP-bE
He has lots of videos that are deep dives into how RCT works and how things are implemented!
The more I actually dig into assembly, the more this task seems monumental and impossible.
I didn't know there was a fork and I'm excited to look into it
> The same trick can also be used for the other direction to save a division:
> NewValue = OldValue >> 3;
You need to be careful, because this doesn't work if the value is negative. An arithmetic right shift rounds toward negative infinity, while C's integer division rounds toward zero: -7 >> 3 is -1, but -7 / 8 is 0.
I really wish I could see the source code.