Skip to content
  • Categories
  • Recent
  • Tags
  • Popular
  • World
  • Users
  • Groups
Skins
  • Light
  • Brite
  • Cerulean
  • Cosmo
  • Flatly
  • Journal
  • Litera
  • Lumen
  • Lux
  • Materia
  • Minty
  • Morph
  • Pulse
  • Sandstone
  • Simplex
  • Sketchy
  • Spacelab
  • United
  • Yeti
  • Zephyr
  • Dark
  • Cyborg
  • Darkly
  • Quartz
  • Slate
  • Solar
  • Superhero
  • Vapor

  • Default (Cyborg)
  • No Skin
Collapse
Brand Logo

CIRCLE WITH A DOT

lucy@shrimp.starlightnet.workL

lucy@shrimp.starlightnet.work

@lucy@shrimp.starlightnet.work
About
Posts
10
Topics
1
Shares
0
Groups
0
Followers
0
Following
0

View Original

Posts

Recent Best Controversial

  • I've seen a few people say "eating vegetables that have fat and salt and stuff on them isn't healthy" and it just makes me want to bang my head into a wall.
    lucy@shrimp.starlightnet.workL lucy@shrimp.starlightnet.work
    @gsuberland fat and salt and stuff isn't per se unhealthy either.

    Dose makes the poison.

    Leaving a whole component/building block out of your diet is arguably just as unhealthy
    Uncategorized

  • We heard you: the new Framework Laptop 13 Pro.
    lucy@shrimp.starlightnet.workL lucy@shrimp.starlightnet.work
    @frameworkcomputer why would I wanna buy your laptop when you're just gonna give it to that racist guy anyways?

    I'd rather go with mnt research at this point, tyvm
    Uncategorized

  • ha, nice.
    lucy@shrimp.starlightnet.workL lucy@shrimp.starlightnet.work
    @mrmasterkeyboard you could test the librewolf port that's WIP atm >.>
    Uncategorized netbsd bsd unix tech technology

  • We gotta fight back.
    lucy@shrimp.starlightnet.workL lucy@shrimp.starlightnet.work
    @meowmashine have fun with your stochastic labubu. I'm not trying to convince you to not use it.

    I'm trying to prevent you from giving it my data, because that's not your choice to make. And i encourage everyone who also doesn't want you to give it their data to do the same.

    Also, comparing AI companies stealing private peoples shit with pirating media from huge ass conglomerates is disingenuous as fuck. Don't bullshit me, buddy. You know it's bs just as much as I do, and that's not gonna work with me.

    I think this conversation should end now. I don't accept the premise of assholes.
    Uncategorized

  • We gotta fight back.
    lucy@shrimp.starlightnet.workL lucy@shrimp.starlightnet.work
    @meowmashine bruh
    But still, why?
    are you really asking me why i don't wanna get stolen from? really? i don't think i don't have to explain, do I?
    Personally I want the world to use and work with my software, update and fix it [...] in any way possible.
    My main focus right now with the AI poison-pilling isn't code, it's text, e.g. for a personal website.

    Personally, I want to be a world where we avoid destroying the world just for a little bit of perceived (but disproven) convenience.
    I don't really think that licenses really work, if source is open, people or companies just gonna steal shit.
    Maybe you should research legal cases against companies who breached licenses.
    If ai will learn from my code to create code that runs like clockwork in bad conditions, I am in.
    AI-generated code is provably unreliable, terrible to maintain, riddled with security holes and LLMs poison our air and boil our oceans while generating it. It has been proven that LLMs don't make you more productive, they make you less productive.

    Besides, you are aware that LLMs are just stochastic word-guessers, right?
    Uncategorized

  • We gotta fight back.
    lucy@shrimp.starlightnet.workL lucy@shrimp.starlightnet.work
    @meowmashine

    i would give zero fucks about LLMs if it wouldn't be built using shit they stole from us, and if they didn't ruin the environment in the process.

    As for the "acting like a dog trying to chomp off random car bumpers": you missed the point. If anything, it's more akin to installing proper locks at ones door, so those fucks have a harder time stealing our shit.

    These companies are not entitled to our shit. Period.

    Our data is ours, and we can do whatever the fuck we want with it. If they wanna train their LLM on it, they can ask us nicely, and/or respect the license our shit is under.

    Until then, we need to make sure they can't just take our shit, because apparently, they never heard of the concept "consent".
    Uncategorized

  • We gotta fight back.
    lucy@shrimp.starlightnet.workL lucy@shrimp.starlightnet.work
    We gotta fight back.

    Contacting other open source projects and asking them to (hopefully) adopt a no-AI policy is a good first step, but we need to go further.

    We need to make it harder for users to use AI with our content, and we need to make it harder for these companies to steal our shit.

    I'm working on a tool, called cyanide, in order to poison LLMs at inference-level so they're absolutely useless when told to summarise content from a given website.

    I encourage you to do the same. figure out ways to poison and otherwise break LLMs when they deal with your content. At inference level, at training level, it doesn't matter.

    The more people do this, the more diverse tooling we build to stop these AI bro fucks from stealing our shit, the harder it is for the AI companies to clean up our shit, and the more obvious the failures and shortcomings of LLMs as search engines become for the average user.

    Here are a couple resources I found helpful when researching LLM security topics:
    -
    promptfoo.dev, a database of LLM vulnerabilities with links to research papers, which models are affected, etc.
    - [google scholar, since these things are an active field of research](scholar.google.com)
    - [openrouter.ai, cheap easy testing, especially when tested against multiple models since it's one simple API](openrouter.ai)

    feel free to ping me for resources to add, and other tooling to break LLMs with the data they scrape, parse or train on itself (so while iocaine is cool and helpful, it doesn't really poison inputdata itself, it feeds different data depending on who accesses the page)
    Uncategorized

  • i should write a (not proper) research paper on something, idek know what, and then put it under a name no one will see coming and see how people will think of it
    lucy@shrimp.starlightnet.workL lucy@shrimp.starlightnet.work
    @mrmasterkeyboard SURVEY TIME!!!
    Uncategorized

  • i should write a (not proper) research paper on something, idek know what, and then put it under a name no one will see coming and see how people will think of it
    lucy@shrimp.starlightnet.workL lucy@shrimp.starlightnet.work
    @mrmasterkeyboard statistical analysis: impact of yuri on code quality /silly
    Uncategorized

  • Launched a general effort against all generative AI everywhere on @the's Forgejo with some people from their IRC server's (starlightnet.work) #no-ai channel.
    lucy@shrimp.starlightnet.workL lucy@shrimp.starlightnet.work
    @mrmasterkeyboard @the @projectanchorage Hell yeah, let's goo!

    I wonder if we can get an AI-incompatible license going, that's still free and legally enforcable? Something that would survive the ai act?
    Uncategorized fuckai noai tech technology
  • Login

  • Login or register to search.
  • First post
    Last post
0
  • Categories
  • Recent
  • Tags
  • Popular
  • World
  • Users
  • Groups