Ada was also ignored because the typical compiler cost tens of thousands of dollars. No open-source or free compiler existed during the decades when popular languages could be had for free.
Ada’s failure to escape its niche is overdetermined.
Given the sophistication of the language and the compiler technology of the day, there was no way Ada was going to run well on 1980s microcomputers. Intel built the iAPX 432 “mainframe on a chip” with a bunch of Ada concepts baked into the hardware for performance, and it was still as slow as a dog.
And as we now know, microcomputers later ate the world, carrying along their C and assembly legacy for the better part of two decades, until they got fast enough and compiler technology got good enough that richer languages were plausible.
A huge factor. I used Ada for years, and the fact that everyone I worked with did hobby projects in other languages didn’t help it. And most of us liked Ada.
It had other warts: the string handling wasn’t great, which was a huge problem. It was slow, too, at a time when that mattered more (we had C and Ada in our code base). I remember the concurrency not using the OS’s facilities, so the one place we used it was a pain. HP-UX had amazing quasi-real-time extensions, so we just ran a bunch of processes.
GNAT has existed since at least the mid-90s, and in that time period plenty of companies used non-OSS compilers.
In that era, the largest blocker for Ada was that it was viewed as having a lot of overhead for things that weren't generally seen as useful (safety guarantees). The reputation was that it only mattered if you were working on military stuff, etc.
The article gives another reason "A second answer is aesthetic. Ada's syntax is verbose in a way that programmers with a background in C find unpleasant. if X then Y; end if; instead of if (x) { y; }. procedure Sort (A : in out Array_Type) instead of void sort(int* a)."
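Laid out side by side, the article's two examples look like this (C spellings in comments; the `in out` mode annotation is the part the C signature leaves to convention):

```ada
if X then                                  --  if (x) {
   Y;                                      --      y;
end if;                                    --  }

procedure Sort (A : in out Array_Type);    --  void sort(int *a);
--  `in out` declares that the parameter is both read and written.
```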
I think this should not be underestimated. There is a huge number of small C compilers. People write their own C compiler because they want to have one.
That doesn't happen with Ada. Very few people liked Ada enough to write a compiler for even a subset of the language. For example, an Ada subset similar to the feature set of Modula-2 should be quite doable with modest effort.
Not really, the state of compilers pretty much sucked back then. GCC was the only real free compiler in the 80s, and it wasn't really ready for prime time until the late 80s. You were paying (lots of) money for a compiler no matter what language you chose. And if you were targeting a new language, the compiler was sure to suck.
Even in the late 90s, Jamie Zawinski had a rant against C++. His argument for not using it? The compilers suck! C++ was the main "competitor" of Ada, and its compilers were a decade or more behind Ada's for most of that time.
The "killer feature" of C++ against Ada (when it came to fighting against compiler maturity) was really that you could pretend to be writing C++ code but really just keep writing C-with-classes.
If Ada had put a modula or pascal compatibility mode in the language and produced a reference compiler that was based on a stable compiler in one of those languages, the history may have been different because people could have just written "PascAda" while waiting for the compilers to catch up.
Given some of the other issues, I’m not sure it would have mattered, but it certainly didn’t even allow the experiment to be run. I would not have wanted to compile Ada in the 1980s on that hardware. Given all the checking, the compiler must have been horribly slow (imagine compiling Rust on that same 1980s hardware).
I like Ada. I can’t believe this whole discussion about how types are handled missed the entire ML family of languages. ML, Standard ML, Concurrent ML, Caml, OCaml, and more have structural types, supported and enforced by the compiler.
Ada has one of the same primary issues as PL/I, PHP, and Perl. As much as one might like it, it’s a huge language with loads of syntax and semantics baked into the core language. The article keeps saying that’s a selling point. To some extent and to some people that’s true. However, it also touts the annexes as something wonderful. That’s also true, and more true in my opinion. If only more of the language had been in standardized annexes with a smaller core it may have seen far more adoption.
I find multiple "strange" flaws with the article, even for my appreciation of Ada _and_ the article as an essay:
* The article claims only Ada has true separation of implementation vs specification (the interface), but as far as I am able to reason, also e.g. JavaScript is perfectly able to define "private" elements (not exported by an ES6 module) while being usable in the module that declares them -- if this isn't "syntactical" (and semantical) separation like what is prescribed to Ada, what is the difference(s) the article tries to point out?
* Similarly, Java is mentioned where private apparently (according to the article) makes the declaration "visible to inheritance, to reflection, and to the compiler itself when it checks subclass compatibility" -- all of which is false if I remember my Java correctly -- a private declaration is _not_ visible to inheritance and consequently the compiler can ignore it / fast-track in a subclass since it works much the same as it has, in the superclass, making the "compatibility" a guarantee by much the same consequence
I am still reading the article, but having discovered the above points, it detracts from my taking it as seriously as I set out to -- wanting to identify value in Ada that we "may have missed" -- a view the article very much wants to front.
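For what it's worth, Java's actual behaviour sits between the article's claim and this rebuttal: a `private` field is not nameable from a subclass, but it is part of the inherited object's state, and reflection can still reach it. A minimal sketch (class names are invented for illustration):

```java
import java.lang.reflect.Field;

// A private field: not nameable outside Counter, but present in the
// state of every subclass instance.
class Counter {
    private int count = 41;
    public int get() { return count; }
}

class LoudCounter extends Counter {
    // Referring to `count` here would be a compile error.
}

public class PrivateDemo {
    public static void main(String[] args) throws Exception {
        LoudCounter c = new LoudCounter();
        System.out.println(c.get());     // 41 — only via the public API

        // Reflection sees the private field on the declaring class.
        Field f = Counter.class.getDeclaredField("count");
        f.setAccessible(true);
        f.setInt(c, 42);
        System.out.println(c.get());     // 42
    }
}
```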
I like the article overall, but the continually repeated "Language X didn't have that until …" is very grating after the first ten or so.
I also wish there were concrete code examples. Show me what you are talking about rather than just telling me how great it is. Put some side by side comparisons!
I really don't want this to be AI writing because I enjoyed it, but as other commenters have pointed out, the rate of publishing (according to the linked Twitter account) is very rapid. I'm worried that I can't tell.
The US Air Force intended to use Ada, but had to use JOVIAL instead because Ada took so long to be developed. Most people have never heard of JOVIAL, but it still exists in the USAF as a legacy.
I worked with JOVIAL as part of my first project as a programmer in 1981, even though we didn't even have a full JOVIAL compiler there yet (it existed elsewhere). I remember all the talk about the future being Ada, but it was only an incomplete specification at the time.
My work on DoD Ada projects tended to focus on DOD-STD-2167 (mid-to-late 1980s).
Sadly, the review meetings focused on document structure instead of thoughtful software design and analysis. Ada didn't help; it was cumbersome to get working well, and Ada experience in the contracting agencies was low. The waterfall approach made the projects slow to implement.
"JavaScript's module system — introduced in 2015, thirty-two years after Ada's — provides import and export but no mechanism for a type to have a specification whose representation is hidden from importers."
Then:
"in Ada, the implementation of a private type is not merely inaccessible, it is syntactically absent from the client's view of the world."
Am I missing something -- a JavaScript module is perfectly able to declare a private element by simply not exporting it, accomplishing what the author prescribes to Ada as "is not merely inaccessible, it is syntactically absent from the client's view of the world"? Same would go for some of the other language author somewhat carelessly lumps together with JavaScript.
I loved the article, and I have always had curiosity about Ada -- beyond some of the more modern languages in fact -- but I just don't see where Ada separates interface from implementation in a manner that's distinctly better or different from e.g. JavaScript modules.
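A sketch of the point being made here, with hypothetical names. An IIFE stands in for the module boundary, since a real two-file ES module example would not be self-contained; the hidden representation lives in a module-scoped `WeakMap` that importers can never name:

```javascript
// Hypothetical "point module": only the returned object (the "exports")
// crosses the boundary; `repr` is unreachable from outside.
const pointModule = (() => {
  const repr = new WeakMap();            // hidden representation store

  function makePoint(x, y) {
    const handle = Object.freeze({});    // opaque handle, no own properties
    repr.set(handle, { x, y });
    return handle;
  }

  function getX(p) { return repr.get(p).x; }

  return { makePoint, getX };
})();

const p = pointModule.makePoint(3, 4);
console.log(pointModule.getX(p));        // 3
console.log(Object.keys(p).length);      // 0 — representation not visible
```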
> The verbosity was deliberate — Ichbiah wanted programs to be readable by people other than their authors, and readability over time favours explicitness — but it was experienced as bureaucratic and un-hacker-like, and the programming culture that formed in the 1980s and 1990s was organised around the proposition that conciseness was sophistication. Ada was the language of procurement officers. C was the language of people who understood machines. The cultural verdict was delivered early and never substantially revisited.
imo, the real value of Ada/SPARK today is that it enforces a clear split between specification and implementation, which is exactly what your LLM needs.
You define the interface, types, pre/post conditions you want in .ads file, then let the agent loose writing the .adb body file. The language’s focus on readability means your agent has no problem reading and cross referencing specs. The compiler and proof tools verify the body implements the spec.
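A minimal sketch of that split, with an invented package and contracts (Ada 2012 aspect syntax; GNATprove is the proof tool assumed here):

```ada
--  counter.ads — the spec the human writes and the agent must honour
package Counter is
   Max : constant := 100;

   function Value return Natural;

   procedure Increment
     with Pre  => Value < Max,
          Post => Value = Value'Old + 1;
end Counter;

--  counter.adb — the body the agent is free to write; the proof tools
--  check that it actually satisfies the Pre/Post contracts above
package body Counter is
   Count : Natural := 0;

   function Value return Natural is (Count);

   procedure Increment is
   begin
      Count := Count + 1;
   end Increment;
end Counter;
```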
When I was a young lad, must have been 20 I came across some programming books, including programming in Ada.
I read so much of it but never wrote a line of code in it, despite trying. Couldn't get the build environment to work.
But the idea of contracts in that way seemed so logical. I didn't understand the difference this article describes, though. I learned Java and thought interfaces were the same.
I've written a few small projects in Ada, and it's a better language than it gets credit for.
Yes, it's verbose. I like verbosity; it forces clarity. Once you adjust, the code becomes easier to read, not harder. You spend less time guessing intent and more time verifying it. Or verify it, ignore what you verified, then go back and remind yourself you're an idiot when you realize the code you ignored was right. That might just be me.
In small, purpose-built applications, it's been pleasant to code with. The type system is strict but doesn't yell at you a lot. The language encourages you to be explicit about what the program is actually doing, especially when you're working close to the hardware, which is a nice feature.
It has quirks, like anything else. But most of them feel like the cost of writing better, safer code.
Ada doesn't try to be clever. It tries to be clear, even if it is as clear as mud.
I remember learning Ada at uni in the 90s and not loving it, because of the syntax and it being slow to work with. I also remember the Ariane 5 rocket crash in 1996 being blamed on a software bug, and the software being written in Ada. Now I understand that it was not a pure software issue, but still, all that safety did not prevent the major disaster that it was.
It could be a nice article if it wasn't full of mistakes and incorrect assumptions about mechanisms and their origins, unsubstantiated statements, and so forth.
Right now, it sounds like an Ada fanfic - a misinformed one, to be more precise.
Ada is a language that had a lot of useful features much earlier than any of the languages that are popular today, and some of those features are still missing from the languages easily available today.
In the beginning, Ada was criticized mainly for two reasons: it was claimed to be too complex, and it was criticized as too verbose.
Today, the criticism about complexity seems naive, because many later languages have become much more complex than Ada. In many cases this happened because they started as simpler languages to which extra features were added later; since the need for such features had not been anticipated during the initial language design, adding them later was difficult and increased the complexity of the updated language.
The criticism about verbosity is correct, but it could easily be solved by preserving the abstract Ada syntax and just replacing many tokens with less verbose symbols. This can easily be done with a source preprocessor, but this is avoided in most places, because then the source programs have a non-standard appearance.
It would have been good if the Ada standard had been updated to specify a standardized abbreviated syntax besides the classic syntax. This would not have been unusual, because several old languages specified abbreviated and non-abbreviated syntactic alternatives, including IBM PL/I and ALGOL 68. Even C had standardized alternative spellings: trigraphs for characters missing from some character sets, and later the verbose operator macros of `<iso646.h>`, which have almost never been used, but which all conforming compilers nonetheless had to support alongside the standard syntax.
However, the real defect of Ada has been neither complexity nor verbosity, but expensive compilers and software tools, which have ensured its replacement by the free C/C++.
The so-called complexity of Ada has always been mitigated by the fact that besides its reference specification document, Ada always had a design rationale document accompanying the language specification. The rationale explained the reasons for the choices made when designing the language.
Such a rationale document would have been extremely useful for many other programming languages, which frequently include some obscure features whose purpose is not obvious, or which look like mistakes, even if sometimes there are good reasons for their existence.
When Ada was introduced, it was marketed as a language similar to Pascal. The reason is that at that time Pascal had become the language most frequently used for teaching programming in universities.
Fortunately the resemblances between Ada and Pascal are only superficial. In reality the Ada syntax and semantics are much more similar to earlier languages like ALGOL 68 and Xerox Mesa, which were languages far superior to Pascal.
The parent article mentions that Ada includes in the language specification the handling of concurrent tasks, instead of delegating such things to a system library (task = term used by IBM since 1964 for what now is normally called "thread", a term first used in 1966 in some Multics documents and popularized much later by the Mach operating system).
However, I do not believe that this is a valuable feature of Ada. You can indeed build any concurrent applications around the Ada mechanism of task "rendez-vous", but I think that this concept is a little too high-level.
It incorporates two lower-level actions, synchronisation and data transfer, and for the highest efficiency it may sometimes be necessary to access those lower-level actions separately. This means that sometimes using a system library for the communication between concurrent threads can provide higher performance than the built-in Ada concurrency primitives.
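Concretely, an entry-call rendezvous bundles those two actions (block until the partner arrives, then transfer the data) into one language construct. A sketch:

```ada
--  Each `accept` both synchronises the two tasks and performs the data
--  transfer, in a single language-level step.
task type Mailbox is
   entry Put (X : in Integer);
   entry Get (X : out Integer);
end Mailbox;

task body Mailbox is
   V : Integer;
begin
   loop
      accept Put (X : in Integer) do   --  caller blocks until accepted
         V := X;                       --  the copy happens inside
      end Put;                         --  the rendezvous itself
      accept Get (X : out Integer) do
         X := V;
      end Get;
   end loop;
end Mailbox;
```

With a library, the wait (say, a condition variable) and the copy are separate calls, which is exactly what lets an implementation schedule or optimise them independently.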
"the language that built the languages" in the title is editorialising here. It's supposed to be referring to Ada, but it doesn't really make sense, nor is it argued in the article.
But why does "the industry ignored it" hold as the central framing when the actual story seems to be "the DoD mandated it, contractors used it, and it worked fine for exactly what it was built for"? The implicit assumption is that widespread adoption is the metric for a language succeeding, but Ada wasn't trying to win over web developers; it was trying to stop missiles from being maintained in 450 incompatible dialects, which... it actually did?
>Ada's deployment domain meant that Ada's successes were invisible. A software project that compiles without error, runs without race conditions, and has been formally verified to satisfy its specification does not generate incident reports or post-mortems or conference talks about what went wrong. Ada's successes — the aircraft that have not crashed, the railway signalling systems that have not failed, the missile guidance software that has not misguided — are invisible precisely because they are successes.
Um... this is most certainly not true. Back in the late 1990s and early 2000s, Ada was the language of choice at my Australian university for both computer science and software engineering degrees.
I distinctly recall my lecturer telling us a story about a fancy presentation of Ada in military tank (AFV) systems for the DoD. The story goes that during the presentation, in front of a live audience, the presenter AND the audience had to duck after the tank's turret began spinning around and around. The code had entered an infinite loop!
Wonderful article and a good fit with HN’s motto of “move slowly and preserve things” as opposed to Silicon Valley’s jingoistic “move fast and break things”.
It highlights the often perplexing human tendency to reinvent rather than reuse. Why do we, as a species, ignore hard-won experience and instead restart? In doing so, we often make mistakes that could have been avoided if we’d taken the time, or had the curiosity and humility, to learn from others. This seems particularly prevalent in software: “standing on the feet of giants” is the default rather than the exception.
That aside, the article was thoroughly educational and enjoyable. I came away with much-deepened insight and admiration for those involved in researching, designing and building the language. Resolved to find and read the referenced “steelman” and language design rationale papers.
> JavaScript's module system — introduced in 2015, thirty-two years after Ada's — provides import and export but no mechanism for a type to have a specification whose representation is hidden from importers.
What?
#1 JavaScript doesn't have formal types. What does it even mean by "representation"?
#2 You can just define a variable and not export it. You can't import a variable that isn't exported.
There are several little LLM hallucinations like this throughout the article. It's distracting and annoying.
Edit: Look, I know that complaining about downvotes is annoying, but I find this genuinely perplexing. Could someone just explain what the hell that paragraph was supposed to mean instead of downvoting me?
I think that is the biggest factor of all.
https://xcancel.com/Iqiipi_Essays
There is no named public author. Truly amazing productivity for such a short time period, and generously, the author does not take any credit.
"These are not positions. They are proposals — structures through which a subject might be examined rather than verdicts about it."
The entire site is AI written.
> Every language that has added sum types in the past twenty years has added, with its own syntax, what Ada's designers put in the original standard.
While true, that doesn't mean that other languages' sum types originated in Ada. As [1] states,
> NPL and Hope are notable for being the first languages with call-by-pattern evaluation and algebraic data types
and a modern language like Haskell traces its origins to Hope (from 1980) through Miranda.
[1] https://en.wikipedia.org/wiki/Hope_(programming_language)
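For comparison, the construct in the original Ada standard that the article is presumably pointing at is the discriminated variant record, which is a sum type, though without call-by-pattern. A sketch (type names invented):

```ada
--  Ada 83 variant record: the discriminant selects which fields exist,
--  and the runtime checks that only the active alternative is touched.
type Shape_Kind is (Circle, Rectangle);

type Shape (Kind : Shape_Kind := Circle) is record
   case Kind is
      when Circle =>
         Radius : Float;
      when Rectangle =>
         Width, Height : Float;
   end case;
end record;

--  Dispatching on the alternative is an explicit case statement,
--  not pattern matching as in Hope or Haskell:
function Area (S : Shape) return Float is
begin
   case S.Kind is
      when Circle    => return 3.14159 * S.Radius * S.Radius;
      when Rectangle => return S.Width * S.Height;
   end case;
end Area;
```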
https://en.wikipedia.org/wiki/DOD-STD-2167A?wprov=sfti1
> The verbosity was deliberate — Ichbiah wanted programs to be readable by people other than their authors, and readability over time favours explicitness — but it was experienced as bureaucratic and un-hacker-like, and the programming culture that formed in the 1980s and 1990s was organised around the proposition that conciseness was sophistication. Ada was the language of procurement officers. C was the language of people who understood machines. The cultural verdict was delivered early and never substantially revisited.
IMO, this was the telling paragraph.
Great article, great language.
We still use Ada in industrial applications. Develop, implement, install, and forget it, because it just keeps running.
Well, that and the proprietary compilers
And every time I fail.