

I generally prefer the MIT license for my personal projects.

44 Posts 14 Posters 113 Views
  • crazyc@mastodon.social wrote:

    @gloriouscow The biggest problem I see with AI isn't the technology but the hype trying to convince people it can do things it can't (although this is a problem with a lot more than AI). For example, this PR https://github.com/mamedev/mame/pull/15031 made me a bit angry because the guy was certain that Claude had written a good change when it had misread a datasheet, and he didn't look at it himself to see if it was correct.

    gloriouscow@oldbytes.space wrote (#17):

    @crazyc You'd think @bagder's well-documented bug bounty water torture experience would have gotten enough traction that anyone would feel a certain amount of personal shame submitting an AI-generated PR, but I suppose ignorance is the cup that runneth over.

    Since I'm abusing analogies this evening, my feelings on AI kind of match my feelings on firearms. Would I trust you with a gun? Probably, you're a smart person. Do I trust myself? Well yeah, I'm certainly not going to blow my own foot off by accident. Probably.

    Do I trust John Q. Fucking Public? Absolutely fucking not, please stop waving your Claude .45 around until you have taken a safety class.

    • gloriouscow@oldbytes.space wrote:

      The idea that you can cleanroom a codebase with an LLM to safely pivot licensing is really not anything I need to waste words arguing is the thought process of the worst sort of dipshit tech bro.

      If you're on the fediverse you know this already.

      At least this latest indignity to human creativity doesn't seem to involve Rust, a language I deeply love but one that also has a serious Bro problem and is being wielded in similar sorts of license-washing.

      jfaulken@mastodon.gamedev.place wrote (#18):

      @gloriouscow right! It doesn't matter if it's illegal, it's unethical.


      • gloriouscow@oldbytes.space wrote (#19):

        @crazyc

        Also thank you for reminding me I still need to refactor my 765 emulation.

        Without just stealing it wholesale from MAME, it has been somewhat baffling at times, so I feel for lil' ol' Claude.

        My favorite thing is when I find some copy-protected title that seems to want a result flag one way, but that breaks another title that seems to want it the other way, and then I'm stuck trying to find whatever I'm missing that makes the paradox make sense.


        • crazyc@mastodon.social wrote (#20):

          @gloriouscow Yeah, it's hard to get right since the copy protection authors probed every corner for undocumented behavior.

          • gloriouscow@oldbytes.space wrote (#21):

            @yakmacker

            These are all good points, although from the perspective of a consumer of medicine I may not appreciate any personal philosophies beyond a successful diagnosis.

            Let me share a window into my personal temptations - I have more projects rattling around in my brain than I will ever be able to make real in the remaining time I have on this earth. The devil on my shoulder says: you know that Gemini could write that Python script to convert that 9 GB of JSON your Arduino just dumped on your hard drive, right?

            And I can hem and haw about whether writing the miscellaneous glue and tooling and ephemera of my trade, for what it is, is my real passion or not, or if I lose anything by outsourcing it, in the way that many talented scientists with more ideas than time (which I am well aware I have no business comparing myself to) employed various assistants.

            That's the hook - just a little Python, it couldn't hurt. That's how it will start, and then next year I'm going to have a 6' rack of Mac Minis running OpenClaw, all vibe coding a MartyPC MMO, while I occasionally stop stuffing Cheetos into my grass hole long enough to give a suggestion regarding the exact shade of purple to use in the UI.


            • gloriouscow@oldbytes.space wrote (#22):

              @jfaulken

              from my limited 'murican perspective:

              collectively, we used to ask if something was ethical.

              at some point, the question simply became whether it was technically legal.

              now we are in the era where the question is 'will anyone do anything about it to stop me?'

              • gloriouscow@oldbytes.space wrote:

                If I just made you sad thinking about that scene, I am sorry.

                retrofan64@oldbytes.space wrote (#23):

                @gloriouscow (getting even)

                "They look like big, strong hands"

                • gloriouscow@oldbytes.space wrote:

                  My worry is that the MIT license itself will become something like a scarlet letter. I am really not a proponent of GPL-by-default.

                  If someone wants to take my code and use it in an indie game or something I want them to be able to do that and not feel like they need to release their source code or pay me or do anything other than have my name in a readme somewhere.

                  It just makes me happy every time I get even the slightest hint that something I put effort in could be used in some way by someone else.

                  These are different kinds of liberties. I respect that the GPL prevented wholesale looting of volunteer efforts by corporations and the world would be a worse place without it.

                  But there is a space, I think, for unencumbered code - just ideas floating freely in the intellectual aether that anyone is free to pluck down and use as they please.

                  brouhaha@mastodon.social wrote (#24):

                  @gloriouscow
                  One can publish code under the GPL, with a statement that the author is willing to consider requests for alternative licensing on a case-by-case basis.
                  I'd license my code for an indie game for free, but if a big company calls, I expect them to pay.
                  I've sold commercial licenses for some of my GPL'd open source, and the licensees seemed quite happy with the terms and pricing I offered.
                  On occasion, when requested, I've relicensed my code under less restrictive licenses like MIT.


                    • brouhaha@mastodon.social wrote (#25):

                    @gloriouscow
                    My personal default is GPL-v3.0-only.
                    In some cases, where I've anticipated specific non-open-source uses I want to foster, I've chosen MIT or BSD 2-clause up front.


                      • gloriouscow@oldbytes.space wrote (#26):

                      @brouhaha And that's cool, and I respect your ability to choose, and it's cool you'll relicense.

                      I've actually asked in a few cases whether I could use people's code that had some sort of MIT-incompatible clause. It's good to point out that you have that option, although, if you're lucky enough to start a very popular project, that GPL is going to become very sticky unless you have all your contributors on speed dial or, apparently, unless you have Claude and lack a moral compass.

                      I'm also a smelly, antisocial hermit and I don't want to talk to you about your indie game, just take my code and leave me alone lol


                        • brouhaha@mastodon.social wrote (#27):

                        @gloriouscow
                        And I respect that choice as well.

                        The reality is that most of my published code is so obscure and eclectic that few people even want it. The commercial license requests really took me by surprise.


                          • cr1901@mastodon.social wrote (#28):

                          @brouhaha @gloriouscow Also, you're flexible with licensing... e.g. m5meta (GPL3) usage in Sentinel (BSD-2).

                          Not that I would abuse that kindness :D!


                            • brouhaha@mastodon.social wrote (#29):

                            @cr1901 @gloriouscow
                            I was happy to do it, and I've been delinquent in making that change to the official repo.

                            • gloriouscow@oldbytes.space wrote:

                              I see a lot of derisive dismissal of AI on grounds other than ethical ones, and I somehow feel it is a mistaken approach - almost like a vegan trying to convince you that all steak tastes bad.

                              I feel it is a dangerous underestimation of the immense resources in both talent and money being brought to bear on the problem.

                              Too many people focus on where AI currently is, forgetting where it was just scant years ago, and ignoring its current velocity.

                              I feel like anyone actually paying attention and testing each model that comes out knows that laughing it off as "slop" is not going to remain particularly amusing for long.

                              Only a year ago ChatGPT couldn't write Hello World in x86 assembly, and now it will emit a complete, working, 32-bit MS-DOS Mandelbrot generator in a single prompt.

                              The slop is starting to not look so very sloppy.

                              The only argument that I predict will not age extremely poorly is the ethical one.

                              After all, it is not as if, were ChatGPT to stop hallucinating and glazing and regurgitating its inputs tomorrow, you'd suddenly be okay with it - so why use any argument other than that it is a leviathan in the hands of the oligarchy?

                              Slop or Shakespeare, that doesn't change.

                              asie@mk.asie.pl wrote (#30):

                              @gloriouscow@oldbytes.space I think a key reason LLMs do better with programming than other fields is that code is much more hopelessly repetitive than we like to admit to ourselves. To borrow your example, how many Mandelbrot renderers were written on GitHub? And that's a niche example - think of things people write for a living, CRUD services, REST APIs, login pages, parsing libraries, wrappers...

                              I agree, and have said for a while now, that it is a disservice to frame the opposition to the LLM boom in terms of anything other than (a) opposition to Big Tech's view of the world and (b) a kind of labor dispute. Copyright laws can be changed; power efficiency can improve; slop can be made less sloppy by making the number of weight-monkeys approach infinity - under the condition that the music doesn't stop first - which I think is what companies like OpenAI and Anthropic are banking on.

                              Personally, my key issue is the idea of what I call "digital sovereignty". I do not want to be beholden to a cloud subscription to do the most basic elements of my job or my passion, because I have seen where that road takes us: enshittification, price hikes, customer-hostile changes, even geopolitical problems. Notably, this doesn't apply to so-called "open weight" models - but the "good ones" are both still behind SOTA and unviable for all but the largest polycules, not to mention the RAM/SSD pricing upheaval.

                              I am also concerned about the copyright angle, deskilling, AI psychosis, cultural impact, et cetera - but for more practical reasons. I also still believe LLMs are an evolutionary dead end for artificial intelligence, even if they have gotten considerably further than I anticipated.

                              In addition, I've seen many groups concede that while they are not interested in AI generated art or music (Adam Neely's video on Suno AI raises a lot of good points about that), they don't mind, say, AI generated code. This personally makes me a little sad, but I understand that for most people art is an end, but code is merely a means to an end.

                              But I don't believe the technology itself, as in the mathematical equations or the idea of generating tokens using LLMs in response to inputs, is inherently evil. I really like viznut's essay on that matter:
                              http://viznut.fi/texts-en/machine_learning_rant.html - but I've also seen LLM efforts which try to avoid, say, the mass copyright infringement problem, and while their results certainly look more impressive than I anticipated, they also aren't really commercially viable, so to speak.

                              Final note - a lot of people trying LLM-based technology compare it to a slot machine, in that the quality of the result you get is highly unpredictable. I think, outside of niche tech circles, some don't realize that so many things have already become akin to gambling. Sports, mobile games, software bugs, cloud services, apparently the news, etc. - in that lens, ChatGPT becomes just another unreliable tool, not something uniquely unreliable.


                                • aliceif@mkultra.x27.one wrote (#31):
                                @gloriouscow@oldbytes.space Do not let them make you think that.

                                Also consider looking into permissive licenses that scare boring people away; The Unlicense is on Google's shitlist but not the FSF's, for example.

                                  • asie@mk.asie.pl wrote (#32):

                                  @gloriouscow@oldbytes.space

                                  (And I continue to question how good these tools have become in a general sense. I've seen a community member try, I believe, Gemini 2.5 Flash to summarize its own scraped Discord posts (in particular, overseas travel advice). It, uh, it didn't go well. Though we did laugh a lot, between the conversations about consent it provoked.)

                                  • gloriouscow@oldbytes.space wrote:

                                    To be fair I've seen the opposite happen as well, where people will take code released into the public domain and write Rust bindings for it and release those as GPL or some other more restrictive license, and I think that sucks too.

                                    How hard is it to just - keep the same license? Just preserve the author's intent. They had a vision in mind and made a choice when they put their creative energies out into the world. Pass that forward.

                                    xanathar@hachyderm.io wrote (#33):

                                    @gloriouscow bindings are a different thing, though; they are little more than a header file on steroids. For example, I totally see the case (and there are plenty of examples) for MIT bindings to LGPL libraries. They don't alter or remove the original licensing terms, so it makes sense for them to be under the least legally binding license, imho.


                                      gloriouscow@oldbytes.spaceG This user is from outside of this forum
                                      gloriouscow@oldbytes.spaceG This user is from outside of this forum
                                      gloriouscow@oldbytes.space
                                      wrote last edited by
                                      #34

                                      @asie

                                      The first thing I did, of course, was try to find out whether it had copied something. There were not a lot of examples of ASM Mandelbrots to go through on GitHub - many were 16-bit, and most that were 32-bit used the FPU, or instructions not available on the 386, or had some other disqualifier ruling out direct plagiarism.

                                      After coming up empty on GitHub, I spent a fair bit of time pulling down Mandelbrot demos from Pouët, as they sometimes include source code.

                                      There were clear and apparent differences in every example I looked at - I learned a rather interesting trick for getting a pointer to the VGA framebuffer going through those!

                                      In any case, it was clear that the demo-coders were more skilled, keeping everything in registers in the main iteration loop, whereas the GPT example used several temporary variables in RAM.

                                      But I was just impressed that it worked at all.
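For a sense of why FPU-less 386 code is interesting here - a minimal sketch (mine, not the GPT output or any of the demos mentioned) of the kind of integer-only, 16.16 fixed-point Mandelbrot iteration such a program has to fall back on when floating point isn't available:

```rust
// 16.16 fixed point: a value v is stored as round(v * 2^16) in an integer.
// All arithmetic below is integer-only, as on a 386 without an FPU.
const FRAC: i64 = 16;
const ONE: i64 = 1 << FRAC;

fn fx(v: f64) -> i64 {
    // Convert a float to 16.16 fixed point (for setting up test inputs).
    (v * ONE as f64).round() as i64
}

fn fx_mul(a: i64, b: i64) -> i64 {
    // Fixed-point multiply: widen, multiply, shift the extra FRAC bits off.
    (a * b) >> FRAC
}

fn mandel_iters(cr: i64, ci: i64, max_iters: u32) -> u32 {
    // Escape-time iteration for c = (cr, ci), both in 16.16 fixed point.
    let (mut zr, mut zi) = (0i64, 0i64);
    for i in 0..max_iters {
        let zr2 = fx_mul(zr, zr);
        let zi2 = fx_mul(zi, zi);
        if zr2 + zi2 > fx(4.0) {
            return i; // |z|^2 > 4: the point has escaped
        }
        zi = fx_mul(2 * zr, zi) + ci; // z = z^2 + c, imaginary part
        zr = zr2 - zi2 + cr;          // z = z^2 + c, real part
    }
    max_iters // never escaped within the budget: assumed inside the set
}

fn main() {
    // c = 0 is inside the set and never escapes; c = 2+2i escapes at once.
    println!("{}", mandel_iters(fx(0.0), fx(0.0), 32)); // prints 32
    println!("{}", mandel_iters(fx(2.0), fx(2.0), 32)); // prints 1
}
```

A register-starved demo would keep zr/zi/zr2/zi2 live in registers across the loop rather than spilling them to RAM, which is exactly the difference described above.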

                                      The entire point was a request it would have failed at miserably a year prior, and one leaning toward having as little training data available as possible - but when these companies have scraped every single corner of the internet by now, it might be difficult to pinpoint any particular task that doesn't have some sort of preceding example they can leverage.

                                      It's difficult for me to measure improvement in quantifiable terms other than by giving it these sorts of challenges - you can see the various scores on things like ARC-AGI trending upwards with every new model, but that sort of thing is a rather abstract measure - what does it relate to in practical terms?

                                      I feel like the AI companies must thank their lucky stars that coding ended up being AI's "killer app". OpenAI would never succeed with something as vapid as Sora as their flagship product.

                                      The greater acceptance of generative AI by programmers is a very interesting phenomenon. There's probably quite a few psychology thesis papers to mine out of that topic. I'm not really ready to be completely cynical regarding the motivations of programmers vs visual artists or musicians. There may be something more fundamental at play.

                                      asie@mk.asie.plA 1 Reply Last reply
                                      0
                                      • gloriouscow@oldbytes.spaceG gloriouscow@oldbytes.space

                                        @asie


                                        asie@mk.asie.plA This user is from outside of this forum
                                        asie@mk.asie.plA This user is from outside of this forum
                                        asie@mk.asie.pl
                                        wrote last edited by
                                        #35

                                        @gloriouscow@oldbytes.space

                                        I don't think observing a difference in values is cynical. If you value productivity more than digital sovereignty or ecology, or if you don't hold a positive view of copyright, or if you hold a positive view of modern-day corporate capitalism, why wouldn't you use these tools?

                                        The most cynical thing I believe about generative AI users is that using LLMs often enables a narcissistic-leaning tendency to treat that feedback loop as a first resort over other humans. It was particularly apparent to me in the case of the music generation tool Suno AI, where users were hard-pressed to name other AI-music creators who inspire them, or even other AI-generated music they listen to! I don't think that's a good change.

                                        And, of course, I am worried about the backlash against AI-generated works pivoting toward humans who aren't skilled enough to avoid being accused of being LLM tool users. I mean, this has already been happening.

                                        gloriouscow@oldbytes.spaceG 1 Reply Last reply
                                        0
                                        • xanathar@hachyderm.ioX xanathar@hachyderm.io

                                          @gloriouscow bindings are a different thing, though; they are little more than a header file on steroids. For example, I totally see (and there are plenty of examples of) the case for MIT bindings to LGPL libraries. They don't alter or remove the original licensing terms, so it makes sense for them to carry the least legally binding license imho

                                          gloriouscow@oldbytes.spaceG This user is from outside of this forum
                                          gloriouscow@oldbytes.spaceG This user is from outside of this forum
                                          gloriouscow@oldbytes.space
                                          wrote last edited by
                                          #36

                                          @xanathar the way Rust FFI bindings work, though, is that they typically end up wrapped in a crate together with the original source, and that's problematic, because the LGPL code is inside. I noted several FFI binding crates on crates.io that were marked MIT - the implication being that you can just happily cargo add them to your MIT-licensed project and go about your day, but you're actually now linking against LGPL code.

                                          The bindings themselves are useless without the code they bind to, so I see no compelling reason to use a different license.
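To make the mechanics concrete - a hypothetical sketch of the build script of a typical `-sys` binding crate (the crate name `foo-sys`, library name `foo`, and path `vendor/libfoo.c` are all invented for illustration; the `cc` crate really is what such build scripts commonly use):

```rust
// build.rs of a hypothetical `foo-sys` crate. This compiles vendored
// C source (here standing in for LGPL-licensed code) directly into the
// consumer's final binary. Whatever license the binding crate declares
// in its Cargo.toml, the vendored library's terms travel with the
// compiled artifact - which is the problem described above.
fn main() {
    cc::Build::new()
        .file("vendor/libfoo.c") // hypothetical vendored LGPL source
        .compile("foo");         // statically linked into downstream crates
}
```

This is a build-configuration sketch, not a runnable standalone program: it assumes a `cc` build-dependency and the vendored source file exist in the crate.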

                                          xanathar@hachyderm.ioX 1 Reply Last reply
                                          0