If you program, you should read this piece.
-
I think that's unfair?
Everybody charged the DoD 10x "because of Ada" - simply because they could get away with it, provided Ada didn't become mainstream.
The perverse incentives of military procurement are not in any way a relevant factor when judging a programming language as a programming language.
The point about everybody else converging on where they could have started 45 years ago is, IMO, totally fair.
@bsdphk Ada was entirely the result of DoD procurement, intended to solve a DoD problem. Dijkstra, rightly, criticized both the design process and the final language, and his writing on the subject should be required reading.
Ada suffered from the same problem PL/I did, and was almost immediately fragmented into the infamous “profile” subsets that resulted in it failing to meet DoD requirements.
It was not a good language to start from, revisionist views notwithstanding.
-
"Ada's successes — the aircraft that have not crashed, the railway signalling systems that have not failed, the missile guidance software that has not misguided — are invisible precisely because they are successes. The languages that failed visibly, in buffer overflows and null pointer exceptions and data races and security vulnerabilities, generated the discourse. [Ada did not]"
@bsdphk this reads like pure AI slop, BTW: too much hand-waving, too many inaccuracies. Looking at the top level just confirms the sloppiness. https://www.iqiipi.com
-
@DesChips @pfriedma @bsdphk @whyrl Most launchers have had zero *software* failures though. Guidance of a space-launcher is not actually a hard problem, it can be fully simulated beforehand without much trouble. I believe some rockets, e.g. Japan's Lambda flew without any computer at all, using purely a timer to steer the thing.
-
@bsdphk There's a lot of, to put it politely, inaccuracy in that piece. Also it's clearly generated with or by AI. But this is the world we're in now, so get used to it. History being rewritten by the machines.
-
@bsdphk
When I was in the biz I constantly chafed at the hacker culture of celebrating undisciplined, indecipherable code. Even in the comments here I see that sneering attitude permeating. I have always disliked it intensely. Thanks for sharing this essay.
-
@pfriedma @bsdphk @whyrl
I was thinking it might be nice to use a language with Ada's checking features, but I don't know if I can stomach the C++-like exception semantics. The problem has always been that it completely destroys everyone's ability to reason about control flow. It's kinda glaring for a language that otherwise emphasizes the importance of provable correctness.
(Contrast with modern languages like Swift, where you're forced to annotate call sites that can possibly throw.)
-
@pfriedma @bsdphk @whyrl
...and then on top of that, it's going to unwind the stack, and thereby destroy evidence that would be useful in locating the root cause of a bug, in the event of programming mistakes like null dereferences!?
https://learn.adacore.com/courses/intro-to-ada/chapters/exceptions.html
I really hope there's a way to disable or alter this behavior and make it trap instead, so that the stack is preserved for debugging...?
(C++ unwinds the stack too, but (1) generally not on UB, and (2) we have sanitizers for UB.)
-
-