Can we just put it bluntly?

Uncategorized · 23 Posts · 8 Posters · 3 Views
yosh@toot.yosh.is:

    @soph

    I guess I am slightly more cynical about copyright law. I view it as a tool by capital, for capital — doubly so in countries like the US where bribery is legal in all but name.

Right now the stock market is fully leveraged on AI. I don't see the US ever finding itself in a position where a Supreme Court ruling would intentionally put the entire economy in peril.

soph@grrl.me (#13):

    @yosh

    Perhaps, though the recent ruling could have massive impacts. I suspect you're still right, until it becomes convenient for the people in power to enforce things and attack their enemies with it.

Link: The Supreme Court doesn't care if you want to copyright your AI-generated art (Engadget, www.engadget.com): "The highest court in the US declined to review a case about copyrighting artwork created with the help of AI."
yosh@toot.yosh.is:

      @soph

Am I right that by "vibe-coding" you mean "generating code, with little to no human involvement in the process"? That would be different from "using tools to generate code, but with a human actively in the loop".

      I believe the crux of the case in the US was that the defendant claimed they did not create the works, a machine did, and because non-humans cannot claim IP protections they lost the case. Or did I misunderstand something about that case?

soph@grrl.me (#14):

      @yosh

      No, I mean generating code using an LLM at all. Though it'll be up to the lawyers how it's applied. If the machine is the one generating the code, even if you're the one telling it what to generate, then it's still producing things you didn't type or research or... etc

yosh@toot.yosh.is (#15):

        @soph

Ah, ok! In practice I expect there's going to be a pretty big difference between the two.

        Once you get down to brass tacks: if a human is the one driving then it becomes hard to come up with language that does ban LLMs, but does not also ban things like compilers and digital cameras.

        Because both of those are also instances of: "I pressed a button and it automatically generated binary output – none of which was produced directly by me."

soph@grrl.me (#16):

          @yosh

This is a bit different from those examples, though. In the case of code, we're talking about the source code itself, regardless of what further tools are applied to it.

If the code itself cannot be copyrighted, then how that plays into the IP required to participate in open source becomes the issue.

yosh@toot.yosh.is (#17):

            @soph

The funniest outcome, for sure, would be a ruling that makes all software de facto illegal.

            "To the maintainers of this open-source project. We are the big co legal department. We would like to get your written sign-off that no 'AI assistive tooling' has ever been used in this project. It is important for our supply chain. We expect a reply within 5 days."

            If anything AI has touched becomes devoid of legal protections, then that would probably implode the tech sector overnight.

soph@grrl.me (#18):

              @yosh

I think you're maybe saying something a bit more grandiose than what I'm trying to get at, which is about the code itself being generated by AI.

Here I'm not talking about things like autogenerated version bumps, but genuinely stuff trained on unknown IP.

If some projects need to roll back and try again, I don't think that would be devastating. Sure, newer projects might suffer, but there was plenty of waste when the hype was around stuff like blockchain, too.

yosh@toot.yosh.is (#19):

                @soph

I guess what I'm trying to get at is that if *any* amount of AI code is considered uncopyrightable, that would become a poison pill for any project that has had any amount of AI code contributed to it.

                It's not like every line of code authored by an LLM has a label that says: "I was written by an LLM." If I'm not mistaken there are OSS projects like the Linux kernel which will accept PRs that were partially authored by LLMs. I don't see how that could be untangled.

zkat@toot.cat (#20):

@yosh @soph I think the issue at hand is that they may have very well screwed the pooch by doing that.

poliorcetics@social.treehouse.systems (#21):

@yosh @soph the funniest timeline would be someone leaving a FAANG-type company with all the code made after 2025, then relaxing in court with the argument that, since it was mostly LLM-written, it's not copyrighted.

ids1024@mathstodon.xyz (#22):

@yosh @soph That part doesn't really seem like a problem, honestly, as I understand it.

It's already the case that Linux kernel contributors (like those of most OSS projects) retain copyright on their contributions. The "Linux kernel" can't sue anyone for copyright infringement; only the specific copyright holders can, for the code they own.

A particular contributor's contributions being public domain is presumably similar, as far as actual copyright enforcement goes, to that person not being interested in joining a copyright lawsuit as a plaintiff.

                      (Of course, if the LLM's output were found to be *infringing* that could be a bigger problem.)

ids1024@mathstodon.xyz (#23):

@yosh @soph Or perhaps, rather, it *is* a problem; but for most projects it's an existing problem, not a new one.

                        The GNU project in contrast generally wants copyright assignment from contributors exactly to help avoid this sort of issue with license enforcement: https://www.gnu.org/licenses/why-assign.html
