
as a toolmaker, there's an inherent tradeoff that I encountered years ago when I just started working at ChipFlow; what I was asked was essentially to develop Amaranth further as a way to de-skill the hardware design (RTL) field.

Uncategorized · 46 Posts · 13 Posters
david_chisnall@infosec.exchange:

    @whitequark

    This is very close to where I parted ways with the FSF. There's always a tension between enabling people to create the desirable thing and enabling people to make the undesirable. Their view is that it should be very hard to make the undesirable thing, and slightly easier to make the desirable thing. My view is that you should make it so easy to make the desirable thing that people always have a choice and then, once the desirable thing exists, you can apply other pressures to get rid of the undesirable thing.

    I don't think deskilling is the right framing for a lot of these things; it's about where you focus cognitive load. There's a line from the Stantec ZEBRA's manual (1956) that says that the 150-instruction limit is not a real problem because no one could possibly write a working program that complex. Small children write programs more complex than that now. That's not a loss to the world: the fact that you don't have to think about certain things means you can think about other things, such as good algorithm and data structure design.

    There was research 20-ish years ago comparing C and Java programs, which found that the Java programs tended to be more efficient for the same amount of developer effort, because Java programmers would spend more time refining data structure and algorithmic choices and improve entire complexity classes, whereas C programmers spent the time tracking down annoying bug classes that are impossible in Java and doing micro-optimisations. Of course, under time pressure, Java developers will simply ship the first thing that works and move on to new features rather than doing that optimisation. C programmers would take longer to get to the MVP level, and their poorly optimised code was often faster than poorly optimised Java.

    I see LLMs as very different because they don't provide consistent abstractions. A programmer in a high-level language has a set of well-defined constraints on how their language is lowered to the target hardware and can reason about things, while allowing their run-time environment to make choices within those constraints. Vibe coding does not do this, it delegates thinking to a machine, which then generates code that is not working within a well-defined specification. This really is deskilling because it's not giving you a more abstract reasoning framework, it's removing your ability to reason.

    Letting people accomplish more with less effort, in an environment where their requirements are finite, ends up shifting power to individuals, because it reduces the value of economies of scale.

    whitequark@social.treehouse.systems (#11):

    @david_chisnall this is an interesting view, I'll have to think about it.

    whitequark@social.treehouse.systems:

      as a toolmaker, there's an inherent tradeoff that I encountered years ago when I just started working at ChipFlow; what I was asked was essentially to develop Amaranth further as a way to de-skill the hardware design (RTL) field. I agreed because I don't really value the skill of knowing every one of the five hundred different ways in which SystemVerilog is out to fuck you over; I think we'd be better off with tooling that doesn't require you to spend years developing this skill, and that would be a lot more friendly to new RTL developers, and people for whom RTL isn't the primary area of work.

      I also knew that ChipFlow was on the lookout for opportunities to shoehorn AI somewhere into the process. (at first this was limited to "test case generation", a frankly ill-conceived idea but one I could hold my nose at and accept; nowadays they've laid off everyone and gone all-Claude.) however, it was clear pretty early on that making hardware development more accessible to new people inherently means making it more accessible to new wielders of the wrong machine. benefiting everyone (who isn't a committed SystemVerilog developer) means benefiting everyone, right?

      you can trace this trend in adjacent communities as well. Rust and TypeScript have rich type systems that generally help you write correct code—or bullshit your way towards something that looks more or less correct. I'm pretty sure it's a part of the reason Microsoft spent so much money on TypeScript.

      so today I find myself between a rock and a hard place: every incremental improvement in tooling that I build that makes the field more accessible to new people also means there's less of a barrier to people who just want to extract value from it, squeezing it like Juicero (quite poorly but with an aggressively insulting amount of money behind it). so what do I do now?..

      unlambda@hachyderm.io (#12):

      @whitequark I feel like the rise of LLMs is mostly just an acceleration of the trend that already existed.

      There's long been a tension within FLOSS software between making technology more accessible and helping out greedy capitalists who just want to squeeze every ounce of profit out.

      This new trend of LLMs has absolutely accelerated that, and I can definitely see why it causes people to pause and wonder what it is that they're doing and whether they want to support that.

      But I still think it's good to provide better, accessible tooling for new people.

      Shitty people are going to squeeze every ounce of profit and control out of everything no matter what you do. But in the meantime, Amaranth does make it possible for people who want to actually engage with their own brains to learn about hardware design, and I think that's still valuable, whether or not there are awful people out there exploiting it for personal profit.


        whitequark@social.treehouse.systems (#13):

        @unlambda It's a bit different if you are (or were, in my case) on their payroll!

        whitequark@social.treehouse.systems:

          @riley sort of? baroque processes are reactionary in nature: they help the incumbent keep its position. if you like the incumbent this is useful. if you don't like the incumbent, like @xgranade didn't like the AI-fication of Calibre, then you get to spend months of your life fixing the plumbing that would otherwise wash out the foundations.

          riley@toot.cat (#14):

          @whitequark

          I used to do time at Google. Passed the interviews, and, in between engineering, got the training to administer them, and took a bunch of interviews of new applicants before I left.

          A running theme in their interviewing criteria, at least back then — it's been a while — was, they looked for an applicant's ability to shift between levels of abstraction.

          In a recruitment context, this tends to be conceptualised as a matter of skill and knowledge, but it's actually also a matter of design, to a significant degree. When more effort is put into plugging abstraction leakage, fewer people have practical "everyday" reasons for moving across those tightly plugged boundaries, get the experience of doing it, and, well, both de-skilling and baroquisation can set in as a result.

          Maybe putting effort into well-designed abstraction leakages, rather than trying to abolish them, would be a useful and pro-social subthread in the work against enshittification. I'm also going to argue that literate programming is a useful tool for managing and understanding (some kinds of) well-designed abstraction leakages.

          @xgranade


            riley@toot.cat (#15):

            @whitequark Another perspective on this sort of thing is how many basic MUDs and IF languages offer tools for linking up rooms to their immediate neighbours, but more realistic world-building would often also require offering some faraway view of the next area over (say, the forest that you can see across the river, or a valley), and many early data models of these kinds of languages really weren't very good at making such set-ups convenient. Even though the pattern being modelled is common in real life, it does not come up often enough in Thinking About Thinking kinds of discussions (and object-oriented programming classes), and so people designing (meta-)systems often tend to ignore it.

            @xgranade


              diondokter@fosstodon.org (#16):

              @whitequark As a fellow toolmaker, I feel you!

              I know Microsoft has created a couple of drivers using my tools. Let's say it's 3 drivers and it saved a junior San Fran engineer 1 day of work each. Then by my rough estimation they'd have saved about $1500.

              In the meantime, I have seen none of that value in return.

              Idk what to think of it. I made the tool for people like me and I want them to have it for free. But yeah, then MS also gets it for free unless I do weird license things.


                whitequark@social.treehouse.systems (#17):

                @riley @xgranade I think designing around a high-skill-specialization expectation has historically been harmful in this industry; consider how the expectation of needing to know C (a language notoriously lacking in guardrails and good tooling) to do systems programming has both directly contributed to the pervasive gatekeeping and also created a barrier to entry for people not willing to dedicate their life to navigating the social and technical aspects of it. it's pretty difficult for me to see how this could be turned around to be prosocial.


                  whitequark@social.treehouse.systems (#18):

                  @diondokter I don't really mind that particular bit because my goal with OSS/OSHW is less "creating value" (that's on the agenda but it's more incidental) and more "terraforming", changing the rules by which the world works. I think this is a more interesting mindset to approach OSxx with because a lot of the systems we've been building in the last two decades are of such a high quality that no commercial entity would possibly purchase them (since it's not justifiable to build something like that for a business that would run just fine with a much shittier version of the same thing).

                  yes, under a different economic system, you could have (maybe?) captured some of that value. but under our current one, if Microsoft had to pay you $1500 they would've probably not used your tools at all (because the overhead of figuring out how to get you that money multiplies it severalfold and takes up valuable time of administrative and legal staff). my overall feeling about it, personally, is just "shrug"; I build tools for different reasons


                    coral@empty.cafe (#19):

                    @whitequark I think the processes of value-extraction under capitalism have - structural limitations? - which mean tools like Amaranth are unlikely to be used as part of a destructive and alienating hype bubble.

                    namely, for a tool to contribute to a hype bubble, it's not enough that a given end is easier than before; it has to be _the easiest_ way to achieve hype at any given moment; RTL design is never the fastest way to a consumer demo; so Amaranth isn't going to be implicated?


                      whitequark@social.treehouse.systems (#20):

                      @coral oh, without overtly violating my NDA, I'll just say that flashy demos were absolutely involved


                        riley@toot.cat (#21):

                        @whitequark I'm thinking something like "An abstraction shall not leak without a good reason to", and considering it an important principle of good design that it is the end engineer (or end user) who gets to ultimately override what reasons are good enough, should upstream reasonings turn out to be problematic. Things like "Thou shalt not hamper logic probe access" would then inherently follow.

                        @xgranade


                          giacomo@snac.tesio.it (#22):
                          @david_chisnall@infosec.exchange

                          To be honest, I think you are misrepresenting the #FSF's ethical position on this matter, which is perfectly aligned with your own: hence the freedom to use software for any purpose, which is a strong requirement of any #FreeSoftware license.

                          @whitequark@treehouse.systems

                            whitequark@social.treehouse.systems (#23):

                            @riley @xgranade I... don't think that's how things work? all abstractions leak: they take a wide set of possibilities and narrow it down to make it easier to reason about the things you care about, at the cost of making your life harder if you hit one of the things you've decided to leave aside.


                              riley@toot.cat (#24):

                              @whitequark Or I might be misunderstanding your argument. Would you like to elaborate on it?

                              @xgranade

                              • giacomo@snac.tesio.itG giacomo@snac.tesio.it
                                @david_chisnall@infosec.exchange

                                To be honest, I think you are misrepresenting the #FSF's ethical position on the matter, which is perfectly aligned with your own: hence the freedom of use for any purpose, which is a strong requirement of any #FreeSoftware license.

                                @whitequark@treehouse.systems
                                whitequark@social.treehouse.systemsW This user is from outside of this forum
                                whitequark@social.treehouse.systems
                                wrote last edited by
                                #25

                                @giacomo @david_chisnall I think you'll find that using search to insert yourself uninvited into conversations with people you don't know is a poor way to promote your cause, whatever that is.

                                  riley@toot.catR This user is from outside of this forum
                                  riley@toot.cat
                                  wrote last edited by
                                  #26

                                  @whitequark Yes, all abstractions leak.

                                  But sometimes, people like to pretend, and/or make laws about pretending, that some don't, or mustn't, or "it's impossible to cross this abstraction boundary, so anybody who does it must be harshly punished" kind of thing. Likewise, some design cultures[1] like to build elaborate wrappers for hiding abstraction leakages, because of the simplistic notion that such leaks are bad design.

                                  [1] Particularly the "enterprise software" school of thought, in what I've seen. But the idea can also be seen outside big corporate environments.

                                  @xgranade

                                  • whitequark@social.treehouse.systemsW whitequark@social.treehouse.systems

                                    @diondokter I don't really mind that particular bit because my goal with OSS/OSHW is less "creating value" (that's on the agenda but it's more incidental) and more "terraforming", changing the rules by which the world works. I think this is a more interesting mindset to approach OSxx with because a lot of the systems we've been building in the last two decades are of such a high quality that no commercial entity would possibly purchase them (since it's not justifiable to build something like that for a business that would run just fine with a much shittier version of the same thing).

                                    yes, under a different economic system, you could have (maybe?) captured some of that value. but under our current one, if Microsoft had to pay you $1500 they would've probably not used your tools at all (because the overhead of figuring out how to get you that money multiplies it severalfold and takes up valuable time of administrative and legal staff). my overall feeling about it, personally, is just "shrug"; I build tools for different reasons

                                    diondokter@fosstodon.orgD This user is from outside of this forum
                                    diondokter@fosstodon.org
                                    wrote last edited by
                                    #27

                                    @whitequark Yeah agreed. The fact that MS has used my tool didn't cost me anything either.

                                    But like I said, I've been building it to help people like me and I think it's succeeding at that. And it generally makes me happy seeing people use it successfully.

                                    > such a high quality that no commercial entity would possibly purchase them

                                    lol yeah, seems paradoxical, but very likely true

                                      whitequark@social.treehouse.systemsW This user is from outside of this forum
                                      whitequark@social.treehouse.systems
                                      wrote last edited by
                                      #28

                                      @diondokter

                                      > lol yeah, seems paradoxical, but very likely true

                                      I didn't come up with that; it's a rephrasing of a very good post on the topic I've read and subsequently neglected to bookmark

                                        whitequark@social.treehouse.systemsW This user is from outside of this forum
                                        whitequark@social.treehouse.systems
                                        wrote last edited by
                                        #29

                                        @riley @xgranade I think we're talking past each other; whatever culture you've encountered at Google sounds borderline traumatizing but I've avoided it by ghosting the recruiter since the culture at the on-site interview location kinda creeped me out; so I don't have your context

                                          david_chisnall@infosec.exchangeD This user is from outside of this forum
                                          david_chisnall@infosec.exchange
                                          wrote last edited by
                                          #30

                                          @giacomo @whitequark

                                          I think you're misunderstanding my point. The FSF decides to promote the creation of Free Software (a goal I agree with) by creating complex licenses.

                                          Developing software that reuses software under any license requires understanding that license. The FSF's licenses are sufficiently complex that I have had multiple conversations with lawyers (including some with the FSF's lawyers) where they have not been able to tell me whether a specific use case is permitted. This places a burden on anyone developing Free Software using FSF-approved licenses, because there are a bunch of use cases that the FSF would regard as ethical, but where their licenses do not clearly permit the use.

                                          It places a larger burden on people doing things that the FSF disapproves of. They have to come up with exciting loopholes. Unfortunately, it turns out that this isn't that hard and once you've found a loophole you can keep using it. The FSF responds with even more complex licenses.

                                          EDIT: To be clear, the FSF and I have very similar goals. I just think that their strategy is completely counterproductive. Complex legal documents empower people who can afford expensive lawyers. We're increasingly seeing companies using AGPLv3 to control nominally-Free Software ecosystems.
