I've been saying for a long time now that "open source AI" isn't really open source. We are being manipulated.
I'm glad to hear Meredith Whittaker is fully in agreement on this point.
Signal's Meredith Whittaker says calling "open source AI" open is "narrative arbitrage"
Jump to about 6:25 in the video where she is describing the significant differences between open source software packages and "open source" AI models.
(humansare.social)
-
BTW, you owe it to yourself to watch this entire interview with Whittaker. It's not every day you watch a video and come away feeling much smarter than you were before. I learned a lot from it and have plenty of food for thought to chew on.
-
Brian Long, in this interview, speaks alarmingly several times about the ability of "open source" models to evade controls against these sorts of fakes.
Not only do these models lack the signature benefits many of us have traditionally associated with "open" (offering little more than the ability to run the black boxes locally), but "open source" is also being associated directly with this sort of exploitation.
Teenage target of sexualized deepfakes fights back with online tools
Elliston Berry was only 14 years old when a fellow high school student used artificial intelligence to create nude deepfakes of her and several classmates, and then posted them online.
(www.wbur.org)
-
@idlestate wow, OK I'll bookmark that.
It touches on another, somewhat related problem: people associate "open source" with "good", and that's a fallacy. Just because a technology is open source does not mean it is inherently good.
-
The Debian project also had a proposal to greatly limit what can be called an open model by their standards, requiring the release of the actual training data too:
Edit: the proposal was withdrawn, though, with the proposer saying that it needed more time to be properly discussed before the vote. I hope we see a clear definition like that again soon.
-
Agreed. At most, we can call those "open weights" - but declaring a model open source means that it must be trained exclusively on data compatible with free software and free culture licenses, and preferably that each token can be traced back to the work it was originally trained on, for credit purposes. The closest thing I've seen to this is the Comma LLM model, and even that seems to lack the latter capability. huggingface.co/common-pile/com…
-
@jaredwhite Calling a model “open weight” is like saying notepad.exe is “open machine code” because you get to look at the bytes. It’s total bullshit.
-
@jaredwhite IMO it was a mistake for the OSI to rush to create a label for AI that includes the term "open source" when it is no such thing. It's possible that a model could be fully open source if the training data was also made available, but how many companies are going to do that and expose what went into the training? Not many.
-
Narrative arbitrage is a _hell_ of a phrase.
-
@jaredwhite "Open source AI" is a scam. Just like Facebook, they tell users to volunteer to improve locations in-their-wall-garden for free. Then booom, Facebook kill this feature and get the big database to make ton of money from target advertising.