So I'm having a "This is why we still use Fortran" moment today.

Uncategorized · 63 Posts · 18 Posters
  • curtosis@mastodon.social

    @arclight @AlgoCompSynth The secret to Julia is it’s essentially a Lisp with infix syntax. Multiple dispatch is how they solve the problem of fast numerical solutions being extremely sensitive to types, which is probably the only way you can sensibly do it, but it has taken a lot of work to get it to where it is now (which AIUI is much better than it was even a year or two ago).
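The dispatch-on-argument-types idea can be sketched in Python with a registry keyed on type tuples. This is a toy illustration with invented names; Julia's compiler specializes and compiles methods rather than doing a dictionary lookup at runtime.

```python
# Toy sketch of multiple dispatch: implementations registered per argument-type tuple.
# Names are invented for illustration; this is not how Julia implements it internally.
_registry = {}

def defmethod(*types):
    """Decorator: register an implementation for a specific tuple of argument types."""
    def wrap(fn):
        _registry[types] = fn
        return fn
    return wrap

def dispatch(*args):
    """Look up the implementation matching the runtime types of all arguments."""
    key = tuple(type(a) for a in args)
    fn = _registry.get(key)
    if fn is None:
        raise TypeError(f"no method for argument types {key}")
    return fn(*args)

@defmethod(int, int)
def add_ints(a, b):
    return a + b

@defmethod(float, float)
def add_floats(a, b):
    return a + b
```

The key difference from single dispatch (e.g. ordinary OO method calls) is that *all* argument types participate in selecting the implementation, which is what lets numerical code stay generic without giving up type-specialized fast paths.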

    curtosis@mastodon.social
    #39

    @arclight @AlgoCompSynth Speaking of different syntaxes… Coalton is super interesting. Sort of a Lisp-ML-Fortran hybrid.

    A Preview of Coalton 0.2

    By Robert Smith Coalton is a statically typed functional programming language that lives inside Common Lisp. It has had an exciting few years. It is being used for industrial purposes, being put to its limits as a production language to build good, reliable, efficient, and robust products. Happily, with Coalton, many products shipped with tremendous success. But as we built these products, we noticed gaps in the language. As such, we’re setting the stage for the next tranche of Coalton work, and we’re going to preview some of these improvements here, including how Coalton can prove $\sqrt{2+\sqrt{3}} = \sqrt{2}(\sqrt{3}+1)/2$ exactly.


    The Coalton Programming Language (coalton-lang.github.io)

    The section in this post on “Real algebraic numbers and xmath” in particular seems like it might activate some of your numerics neuroreceptors.

    • arclight@oldbytes.space

      That apparently translates to 8 lines of impenetrable Scheme:

      danl1240@mastodon.gamedev.place
      #40

      @arclight As someone who can read Scheme reasonably well, the only way this is workable is within a REPL-oriented environment like DrRacket or Emacs. Might be just me though.

      • arclight@oldbytes.space

        And mind you, this isn't a He-Man Lisp Hater's rant. It's more of a mope that I feel I'm wasting my time, like William from Mallrats who camps out in front of the random-dot-stereogram all day but can never see the sailboat. I just don't see it and I can't tell if I'm looking at it wrong or if there's just nothing there. I assume the former which has put me down this particular rabbit hole. At some point I need to stop digging.

        arclight@oldbytes.space
        #41

        I think @jannem identified a key pain point related to expected programmer usage - a very specific type or structure of program that languages like Scheme and Forth excel at. @peterb summed it up when quoting this thread - "batteries not included".

        Implementing this example in Forth would probably make this issue clearer. You're not so much programming as building a DSL from the ground up. There's an excruciatingly small - powerful but spartan - set of commands available, mostly limited to building other commands. The expectation is that your problem lends itself to being decomposed into a set of increasingly detailed expressions. All the comforts of intermediate state or structures beyond expressions are missing. This might be great for the interpreter but it's hell on human parsing and interpretation, diagnostics, and efficiency.

        There's a further unstated expectation with Scheme that your problem can be reduced to simple transaction processing. Go process this arbitrarily long list or sequence of identical simple things or generate a sequence starting with a single simple element.

        The expectation of simplicity and uniformity, of an elegant problem, does not lend itself to most forms of engineering analysis. Here's an example of what I deal with on a regular basis: calculate the diffusion coefficient for use in a higher-level function which models aerosol behavior. This is taken from DOE HD-10216, Vol. VIII "Modifications for the Development of the MAAP-DOE Code Volume VIII: Resolution of the Outstanding Nuclear Fission Product Aerosol Transport and Deposition issues WBS 3.4.2" if you want the full context.

      • arclight@oldbytes.space
          #42

          What I want you to note in this notation is that in most cases the full argument list for each expression is not written. There are a few key variables (radius/volume, temperature) and a whole bunch of ancillary parameters (material properties, physical constants, system conditions). The latter are omitted for clarity, but in a code implementation every ancillary parameter has to be a function argument. Modules let you import parameters and functions directly from within a function, but that's compile-time syntactic sugar; get rid of modules and all that baggage moves into the function interface. I'll leave it as an exercise to write out D(r) with the full list of required arguments and parameters.

          And this is a _simple_ example - it's just a long sequence of tedious arithmetic. No conditional statements, decision logic, or iterative solution; just a big tedious pile of algebra reducible to a single function call.
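To make the argument-list explosion concrete, here is a hedged Python sketch. This is *not* the MAAP-DOE correlation from the report above - it's a generic Stokes-Einstein diffusion coefficient with the Cunningham slip correction, with invented parameter names, shown only to illustrate how the ancillary parameters the printed notation omits all pile into the function signature:

```python
import math

# Illustrative only: parameter names are invented, and this is the textbook
# Stokes-Einstein form with Cunningham slip correction, not the MAAP-DOE model.
# Note that only `radius` and `temperature` appear in the printed notation;
# the rest are the "ancillary" baggage that still has to cross the interface.
def diffusion_coefficient(radius, temperature,
                          gas_viscosity, gas_mean_free_path,
                          boltzmann_constant=1.380649e-23):
    """Particle diffusion coefficient D(r), in m^2/s for SI inputs."""
    knudsen = gas_mean_free_path / radius
    slip = 1.0 + knudsen * (1.257 + 0.4 * math.exp(-1.1 / knudsen))
    return (boltzmann_constant * temperature * slip
            / (6.0 * math.pi * gas_viscosity * radius))
```

Even this stripped-down version carries three ancillary parameters; a real correlation with material properties and system conditions carries dozens.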

          The only way we make sense of these programs as engineers is by introducing intermediate quantities as abstractions and ignoring the details we are not focused on.

          The object-oriented approach helps a huge amount by allowing elements to be composed of other elements, which hides detail while keeping it accessible. But a factory function that populates an object capable of calculating this diffusion coefficient would suffer the same fate: it needs the same mile-long argument list as the straight Scheme expression. This is essential complexity in these sorts of problems, but it's treated like the constant term in big-O complexity analysis - it's not interesting, so it's ignored. Which is great until your O(n log n) algorithm gets smoked by an O(n²) algorithm because the constant terms dominate at the actual value of n for your application.
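The factory/parameter-object pattern can be sketched like this (hypothetical names; the physics is again the generic slip-corrected Stokes-Einstein form, not the report's model). The mile-long list doesn't disappear - it's paid once at construction instead of at every call site:

```python
import math
from dataclasses import dataclass

K_B = 1.380649e-23  # Boltzmann constant, J/K

# Hypothetical parameter object: bundles the ancillary baggage once.
@dataclass(frozen=True)
class GasProperties:
    viscosity: float       # Pa*s
    mean_free_path: float  # m
    temperature: float     # K

def make_diffusion_model(gas):
    """Factory returning a one-argument D(r) with the baggage closed over."""
    def D(radius):
        kn = gas.mean_free_path / radius
        slip = 1.0 + kn * (1.257 + 0.4 * math.exp(-1.1 / kn))
        return K_B * gas.temperature * slip / (6.0 * math.pi * gas.viscosity * radius)
    return D
```

The higher-level aerosol code can now call `D(r)` with the same shape as the printed notation; the constructor call is where the essential complexity actually lives.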

          The diffusion coefficient calculation situation occurs far more frequently in practice than Harper & Sussman's ODE situation. It's tedious and uninteresting from their perspective - they're trying to show the flexibility of (history t n) in an environment without the affordance of indexed (random access) lists so the constant terms aren't their focus. It's fair to do that but it's left for the reader to recognize **they aren't solving the whole problem.**

          As an engineering analysis application developer, I am responsible for identifying and solving the whole problem, and for communicating the solution to people who may lack either the subject matter background (software developers) or the software engineering background (engineering analysts). Notation - how we name things - is incredibly important. So while I'm not faulting Harper & Sussman for focusing on the central point of their book, it is excruciatingly difficult to put their ideas into practice in real engineering applications because of their insistence on a batteries-not-included implementation language. It's minimal and elegant (in concept at least), but their Scheme syntax maps poorly to the underlying subject-matter notation when read with human eyes on print media. A good IDE makes it easier, but that's now a hidden external dependency for the practitioner. If I want to effectively make sense of their work, the IDE requirement is a serious barrier.

          You might call me out for being a troglodyte, wed to dead-tree media. I'll counter that dead-tree media is perpetually accessible and immutable. It may accumulate errata and obsolescence, but what I read from the printed page is guaranteed to be deterministic and repeatable. Requiring the subject to be explained or understood via an IDE makes it ephemeral. This is off in the weeds, but we really don't want the medium to be the message here.

        • di4na@hachyderm.io
            #43

            @arclight Nah, I agree with you. I think the object model has its own share of problems, but this is the same reason I don't like most Lisps, at least as a human-facing interface.

            It's basically machine language with no sugar, and even then it's limited to an old and reductive model.

          • arclight@oldbytes.space
              #44

              My conclusion for the moment is that the functional approach is extremely limited in its application to the sort of software I deal with. Conceptually, the notions of immutability and lack of side effects are valuable. Recursion has efficiency issues and subtle but catastrophic failure modes compared to known-finite iteration or array operations. Much of the problem seems to be bound up in notation and the expected solution form of specific functional languages. The notational issue is obvious, but the expectations of uniformity, simplicity, and elegance of problem are not reasonable for even simple (but tedious) non-repetitive calculation. There are good concepts to understand here, but practical applicability is extremely limited because of the nature of physical modeling and the extensive need for varied and structured data.
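The recursion failure mode mentioned above is easy to demonstrate. A hedged sketch in Python (which, unlike many Scheme implementations, has no tail-call elimination - but the stack-depth hazard generalizes to any non-tail recursion):

```python
# A naive recursive sum dies on input sizes an iterative loop handles trivially:
# each recursive call consumes a stack frame, and the depth equals the input size.
def total_recursive(xs):
    if not xs:
        return 0.0
    return xs[0] + total_recursive(xs[1:])

def total_iterative(xs):
    acc = 0.0
    for x in xs:
        acc += x
    return acc

data = [1.0] * 100_000
try:
    total_recursive(data)        # blows past the recursion limit
except RecursionError:
    print("recursive version failed")
print(total_iterative(data))     # the loop just works
```

The iterative version's cost is known and bounded up front; the recursive version's failure depends on runtime input size, which is exactly the "subtle but catastrophic" character of the problem.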

              I appreciate Harper & Sussman's efforts, but ultimately I think the problem is a lack of detailed and realistic case studies of actual physical analysis. The points they are trying to get across would quickly be mired in tedious calculation and coding. That should be a warning that their specific approach may not be able to address simpler, more common issues. Maybe those are addressed in a different book; maybe in my quick scan I skipped over where they acknowledge these issues or point to a broader reference - I don't know, because I jumped to a quick, recognizable example. I'm not trying to blame them for not addressing an issue they didn't set out to address. I'm just noting my frustration at failing to find any example of a purely functional approach in a common language applied to broadly data-heavy problems from a straightforward domain-specific problem statement. It's a lot of work to do even a small example, and nobody wants to put in the effort to document the process of solving a problem with the wrong tool. But how do you know a tool is the wrong tool if you don't actually explore how it fails in practice and identify the characteristics of problems that are a bad fit for a language (e.g. hardware interfacing, text processing, expression evaluation in Fortran)?

              This isn't a complaint about Scheme. The language is what it is. That shouldn't be a surprise given that it descends from Lisp, the other long-lived language with roots on the IBM 704 in the late 1950s. It's simply not designed to solve problems with a broad set of unique (non-sequential) data with a moderately high degree of coupling. But as far as I can tell, nobody actually says that in a single coherent sentence. It's always vaguely alluded to but never spelled out. A lot of it, I think, is the myopia of the language community or instructors in not understanding the breadth of applications - the classes of problems actually solved by people outside computer science, and how and which languages get used. There's virtually no feedback from engineering analysis into computer science the way web applications or databases or embedded firmware or search or text processing feed back. CS hits floating-point numerics and calls it a day, without looking at the actual applications and classes of problems that use those numerics.

            • datarama@hachyderm.io
                #45

                @arclight I think Lisp isn't a good tool for that sort of problem, really, unless you involve some DSL.

                Where it shines, in my opinion, is for problems shaped like slinging around tree-shaped data structures. Not all problems can be hammered into that shape, but many ones I like puttering with can with little effort. 🙂

              • mdhughes@appdot.net
                  #46

                  @arclight There are, of course, objects in Scheme, just like in Common Lisp. Typically we start with records, which are pretty primitive. CLOS and Scheme class systems (there are many, many variants; everyone's written one or more) are more useful for data collection and hiding. let-over-lambda objects are the Schemiest and most flexible, but harder to build big structures out of.

                  Nobody uses an IDE for Scheme/CL. Some of us use just REPL & vi (or ex, in one case I know), others use emacs & SLIME.
                  #scheme
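The "let-over-lambda" idea translates to any language with closures: a local binding supplies private state, and the returned procedures are the only way to touch it. A rough Python sketch of the pattern (the counter example is hypothetical, not from the thread):

```python
def make_counter(start=0):
    """A 'let-over-lambda' style object: state captured in a closure."""
    count = start  # the 'let' binding: private, invisible from outside

    def increment(step=1):
        nonlocal count
        count += step
        return count

    def value():
        return count

    # the 'lambda' part: expose procedures keyed by message name,
    # as in classic closure-based object systems
    return {"increment": increment, "value": value}

c = make_counter(10)
c["increment"]()
c["increment"](5)
print(c["value"]())  # 16
```

Nothing outside `make_counter` can reach `count` except through the two returned procedures, which is the data-hiding property the post is describing.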

                  1 Reply Last reply
                  0
                  • datarama@hachyderm.ioD datarama@hachyderm.io

                    @arclight I think Lisp isn't a good tool for that sort of problem, really, unless you involve some DSL.

Where it shines, in my opinion, is for problems shaped like slinging around tree-shaped data structures. Not all problems can be hammered into that shape, but many of the ones I like puttering with can, with little effort. 🙂
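As a minimal illustration of what "slinging around tree-shaped data" looks like, here is a Python sketch with nested lists standing in for Lisp trees (the function and sample data are mine, for illustration only):

```python
def tree_map(f, node):
    """Apply f to every leaf of a nested-list 'tree', preserving shape.

    This is the kind of structural recursion that Lisp-family
    languages make almost effortless.
    """
    if isinstance(node, list):
        return [tree_map(f, child) for child in node]
    return f(node)

tree = [1, [2, [3, 4]], 5]
print(tree_map(lambda x: x * x, tree))  # [1, [4, [9, 16]], 25]
```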

                    arclight@oldbytes.spaceA This user is from outside of this forum
                    arclight@oldbytes.spaceA This user is from outside of this forum
                    arclight@oldbytes.space
                    wrote last edited by
                    #47

@datarama Exactly! Which is one reason I find the incessant cramming of unavoidable functional features into general-purpose languages (say, Python) really annoying: they make my life difficult. They keep bending the language away from certain classes of problems for what I can only see as fashion or whim.

                    1 Reply Last reply
                    0
                    • arclight@oldbytes.spaceA arclight@oldbytes.space

                      @AlgoCompSynth I tried Haskell and within 30 minutes had hard-locked my desktop and needed to power-cycle it to get it back. Hadn't had that happen in decades. I looked at Julia; it's designed for research code with odd design choices plus this breathless fascination with multiple dispatch. Didn't seem worth pursuing.

I'm still having a big problem finding anything but C++ and Modern Fortran for writing production code. Ada was too hard to get traction with, and it's intended more for embedded systems than for desktops and servers. Everything else is single-source; the implementation is the spec. Great until the maintainers decide that slop PRs are acceptable and you're chained to that sinking ship. 😕

                      jannem@fosstodon.orgJ This user is from outside of this forum
                      jannem@fosstodon.orgJ This user is from outside of this forum
                      jannem@fosstodon.org
                      wrote last edited by
                      #48

                      @arclight @AlgoCompSynth
                      Maybe Go or possibly Rust might be possible alternatives for you.

                      But. What you're looking for is an imperative language for numerical computing. And modern Fortran is right there. I honestly don't think you'll find anything that can really improve a lot on that.

                      C++ has, to me, sort of the same issue as Scheme. A language committee keeps creating a new, incomprehensible language every few years and insists it should still be called C++.

                      algocompsynth@mastodon.socialA 1 Reply Last reply
                      0
                      • jannem@fosstodon.orgJ jannem@fosstodon.org


                        algocompsynth@mastodon.socialA This user is from outside of this forum
                        algocompsynth@mastodon.socialA This user is from outside of this forum
                        algocompsynth@mastodon.social
                        wrote last edited by
                        #49

                        @jannem @arclight I earned a good living with a mix of assembly and Fortran for decades. I haven't touched Fortran since 1990; I'm happy with R because the "slow" parts were moved to C or Fortran libraries decades ago.

                        jannem@fosstodon.orgJ 1 Reply Last reply
                        0
                        • algocompsynth@mastodon.socialA algocompsynth@mastodon.social


                          jannem@fosstodon.orgJ This user is from outside of this forum
                          jannem@fosstodon.orgJ This user is from outside of this forum
                          jannem@fosstodon.org
                          wrote last edited by
                          #50

                          @AlgoCompSynth @arclight
                          I believe a major concern here is software longevity. You need to be able to grab the source in 15-20 years and it should verifiably still just work, and in the same way it used to.

                          R, for all its good points, is not meant for that. Similar issues would rule out Python, Perl, Ruby, JS and so on.

                          algocompsynth@mastodon.socialA 1 Reply Last reply
                          0
                          • jannem@fosstodon.orgJ jannem@fosstodon.org


                            algocompsynth@mastodon.socialA This user is from outside of this forum
                            algocompsynth@mastodon.socialA This user is from outside of this forum
                            algocompsynth@mastodon.social
                            wrote last edited by
                            #51

@jannem @arclight I have access to a collection of PDP-11 Fortran 77 code that I worked with in the 1980s. gfortran can't deal with it; the team is rewriting most of it in Python. 😉

                            1 Reply Last reply
                            0
                            • arclight@oldbytes.spaceA arclight@oldbytes.space

My conclusion for the moment is that the functional approach is extremely limited in its application to the sort of software I deal with. Conceptually, the notions of immutability and absence of side effects are valuable. Recursion has efficiency issues and subtle but catastrophic failure modes compared to known-finite iteration or array operations. Much of the problem seems bound up in notation and the expected solution form of specific functional languages. The notational issue is obvious, but the expectations of uniformity, simplicity, and elegance of problem structure are not reasonable for even simple (but tedious) non-repetitive calculation. There are good concepts to understand here, but practical applicability is extremely limited because of the nature of physical modeling and the extensive need for varied and structured data.
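The recursion failure mode is easy to demonstrate in any implementation without guaranteed tail-call elimination. A Python sketch (the linear-recursive sum is deliberately naive, chosen only to show the stack-depth hazard; the 100,000-element input is an arbitrary size that exceeds CPython's default recursion limit of 1000):

```python
def sum_recursive(xs):
    # Elegant, but call depth grows linearly with input size.
    if not xs:
        return 0
    return xs[0] + sum_recursive(xs[1:])

def sum_iterative(xs):
    # Known-finite iteration: constant stack use regardless of input size.
    total = 0
    for x in xs:
        total += x
    return total

small = list(range(100))
assert sum_recursive(small) == sum_iterative(small) == 4950

big = list(range(100_000))  # comfortably exceeds the default recursion limit
try:
    sum_recursive(big)
except RecursionError:
    print("recursive version blew the stack")
print(sum_iterative(big))  # 4999950000
```

The iterative version's resource use is predictable from the input size alone, which is exactly the "known-finite" property the post values.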

I appreciate Hanson & Sussman's efforts, but ultimately I think the problem is a lack of detailed and realistic case studies of actual physical analysis. The points they are trying to get across would be quickly mired in tedious calculation and coding. That should be a warning that their specific approach may not be able to address simpler common issues. Maybe those are addressed in a different book; maybe in my quick scan of their book I skipped over where they acknowledge these issues or point to a broader reference - I don't know, because I jumped to a quick, recognizable example. I'm not trying to blame them for not addressing an issue they didn't set out to address. I'm just noting my frustration with finding any example of a purely functional approach using a common language to solve broadly data-heavy problems from a straightforward domain-specific problem statement. It's a lot of work to do even a small example, and nobody wants to put in the effort to document the process of solving a problem with the wrong tool. The question is: how do you know a tool is the wrong tool if you don't actually explore how it fails in practice and identify the characteristics of problems that are a bad fit for a language (e.g., hardware interfacing, text processing, or expression evaluation in Fortran)?

This isn't a complaint about Scheme. The language is what it is. That shouldn't be a surprise given that it descends from Lisp, the other long-lived language originally designed for the IBM 704 in the late 1950s. It's simply not designed to solve problems with a broad set of unique (non-sequential) data with a moderately high degree of coupling. But as far as I can tell, nobody actually says that in a single coherent sentence. It's always vaguely alluded to but never spelled out. A lot of it, I think, is the myopia of the language community or instructors in not understanding the breadth of applications - the classes of problems actually solved by people outside computer science, and how (and which) languages get used. There's virtually no feedback from engineering analysis to computer science the way web applications or databases or embedded firmware or search or text processing feed back. CS hits floating-point numerics and calls it a day, without looking at the actual applications and classes of problems that use those numerics.

                              arclight@oldbytes.spaceA This user is from outside of this forum
                              arclight@oldbytes.spaceA This user is from outside of this forum
                              arclight@oldbytes.space
                              wrote last edited by
                              #52

Reading the book in earnest now, starting with the preface and introduction, and realizing that I have a companion for Michael Kupferschmid's "Classical FORTRAN" https://www.routledge.com/Classical-Fortran-Programming-for-Engineering-and-Scientific-Applications-Second-Edition/Kupferschmid/p/book/9781138116436 Both books are well-reasoned and well-written but are based on values which are actively harmful and destructive in my environment. Kupferschmid rejects the bulk of Modern Fortran and focuses on best practices for working within the capabilities of F77. The premise is flatly wrong and backwards, but otherwise it's a great book.

With this book, the problem is the notion that software malleability and flexibility are an unalloyed good. That's truer among researchers and hacker hobbyists, because they are writing software only for themselves and the consequences of failure are low. I'm in the camp of writing software for a small group of others in a very risk-averse framework. I actively don't want applications to be modified once they pass acceptance and installation testing. Eval, and treating code as data and data as code, are liabilities, not features.

                              An example from That R Code: the input file is actually R code that's eval'd to set input parameters. It's quick and convenient and avoids the tedium of writing an input processor. Unfortunately, it also allows a user to add unfiltered executable code to the application and has virtually no input verification to warn of potentially garbage inputs. "Configuration data as executable code" is a concept that has no place in our environment and probably not in any environment where there are safety or security consequences.
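As a sketch of the distinction, here is the eval-the-config pattern next to a parse-and-validate alternative in Python (the parameter names, values, and bounds are invented for illustration, not taken from the R code in question):

```python
import json

# Dangerous pattern, analogous to eval'ing an R input file:
# any expression in the "config" runs with full interpreter privileges.
unsafe_config = 'print("arbitrary code ran"); mesh_size = 0.5'
exec(unsafe_config)  # nothing stops __import__("os").system(...) here

# Safer pattern: configuration is inert data, parsed and then validated.
config_text = '{"mesh_size": 0.5, "max_iterations": 200}'
config = json.loads(config_text)

# Explicit validation catches garbage inputs instead of executing them.
if not (0 < config["mesh_size"] <= 1.0):
    raise ValueError(f"mesh_size out of range: {config['mesh_size']}")
if config["max_iterations"] < 1:
    raise ValueError("max_iterations must be positive")
print(config["mesh_size"])  # 0.5
```

A malformed or malicious JSON input fails at the parser or the validation checks; it never gets a chance to execute anything.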

Can you build flexible code without such gaping security and assurance holes? Maybe someone can; I don't think this book is going to be much help for people who write code where failures have consequences greater than developer inconvenience. Which, IMO, should describe most paid industrial or commercial development (heavily implying that a too-large fraction of code developed for hire is inconsequential - side-eye at the majority of web code).

                              I worked on web systems for 13 years; most were garbage then and the situation has only gotten worse since. But I digress.

                              I'll continue with the book because I'm curious about some of the concepts. Hopefully I can get some ideas that work within a high-integrity environment. If not, I still learn something interesting and have much more specific risks to watch for when someone naïvely suggests we need to move to a more Functional code model. 😕

                              Honestly, the book has paid for itself for the discussion it prompted. I read maybe 3 pages, got excellent feedback and discussion, and it all greatly clarified my thinking and understanding.

                              arclight@oldbytes.spaceA 1 Reply Last reply
                              0
                              • arclight@oldbytes.spaceA arclight@oldbytes.space


                                arclight@oldbytes.spaceA This user is from outside of this forum
                                arclight@oldbytes.spaceA This user is from outside of this forum
                                arclight@oldbytes.space
                                wrote last edited by
                                #53

I wonder sometimes whether the people in positions of authority and influence, writing textbooks like this, ever deal with software of consequence. I don't expect every senior CS professor to have worked on DO-178C qualified avionics software, for example. But it would be nice if the field as a whole taught students that bad engineering kills people and that you need to take quality and safety seriously, at least for the duration of the students' education. Take your work seriously and do your best to avoid shipping shoddy, dangerous code. Have some standards and a conscience. #MinimumViableEthics

                                kevinr@masto.free-dissociation.comK dacmot@sunny.gardenD 2 Replies Last reply
                                0
                                • arclight@oldbytes.spaceA arclight@oldbytes.space


                                  kevinr@masto.free-dissociation.comK This user is from outside of this forum
                                  kevinr@masto.free-dissociation.comK This user is from outside of this forum
                                  kevinr@masto.free-dissociation.com
                                  wrote last edited by
                                  #54

                                  @arclight this was one of the first papers we read in my systems engineering course in college and it was very formative for me

                                  https://web.stanford.edu/class/archive/cs/cs240/cs240.1236/old//sp2014/readings/therac-25.pdf

                                  kevinr@masto.free-dissociation.comK dcnorris@scicomm.xyzD 2 Replies Last reply
                                  0
                                  • kevinr@masto.free-dissociation.comK kevinr@masto.free-dissociation.com


                                    kevinr@masto.free-dissociation.comK This user is from outside of this forum
                                    kevinr@masto.free-dissociation.comK This user is from outside of this forum
                                    kevinr@masto.free-dissociation.com
                                    wrote last edited by
                                    #55

                                    @arclight many years later I discovered the rest of Prof. Leveson’s work and I won’t say that I made it my whole personality but it’s a lot of it

                                    1 Reply Last reply
                                    0
                                    • kevinr@masto.free-dissociation.comK kevinr@masto.free-dissociation.com


                                      dcnorris@scicomm.xyzD This user is from outside of this forum
                                      dcnorris@scicomm.xyzD This user is from outside of this forum
                                      dcnorris@scicomm.xyz
                                      wrote last edited by
                                      #56

                                      @kevinr @arclight TY for reminding of this. Found a cleaner scan here btw https://sci-hub.st/10.1109/MC.1993.274940

                                      1 Reply Last reply
                                      0
                                      • arclight@oldbytes.spaceA arclight@oldbytes.space


                                        dacmot@sunny.gardenD This user is from outside of this forum
                                        dacmot@sunny.gardenD This user is from outside of this forum
                                        dacmot@sunny.garden
                                        wrote last edited by
                                        #57

@arclight in Canada, anyway, we have a few Software Engineering programs that attempt to teach safety concerns and consequences. In the program I did 25 years ago, we had some professors who had worked on nuclear power plants, avionics, and medical equipment like pacemakers.

                                        1 Reply Last reply
                                        0
                                        • arclight@oldbytes.spaceA arclight@oldbytes.space

@nyrath APL is simultaneously genius and batshit. It's incredible what you can do with 2-3 sigils, but it is cryptic as hell. It's the sort of language used by people who talk to crows.

                                          jonocarroll@fosstodon.orgJ This user is from outside of this forum
                                          jonocarroll@fosstodon.orgJ This user is from outside of this forum
                                          jonocarroll@fosstodon.org
                                          wrote last edited by
                                          #58

                                          @arclight @nyrath dare I curse you with the knowledge that is LispE?

5.3 A la APL · naver/lispe Wiki (GitHub)

An implementation of a full-fledged Lisp interpreter with data structures, pattern programming, and high-level functions with lazy evaluation à la Haskell.

(° '* '(2 3 4) '(1 2 3 4))
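If I'm reading the LispE expression correctly as an APL-style outer product (an assumption on my part; I haven't verified LispE's ° operator), a rough Python equivalent would be:

```python
# Hypothetical reading of (° '* '(2 3 4) '(1 2 3 4)): apply the operator
# to every pairing of elements from the two lists, APL outer-product style.
def outer(op, xs, ys):
    return [[op(x, y) for y in ys] for x in xs]

print(outer(lambda a, b: a * b, [2, 3, 4], [1, 2, 3, 4]))
# [[2, 4, 6, 8], [3, 6, 9, 12], [4, 8, 12, 16]]
```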

                                          nyrath@spacey.spaceN arclight@oldbytes.spaceA 2 Replies Last reply
                                          0