One of the most important things we can do right now to give something back to society is to use AI to write free software: more free software than ever. If AI becomes hard to access in the future, then the more software that is released free, the better. If instead things go well (as I hope), AI will simply multiply the impact of the OSS we write today and tomorrow. Either way, writing free software with AI is a good idea, IMHO. I believe LLMs are the incarnation of software democratization, which aligns very well with why I started writing OSS in the first place. LLMs "steal" ideas, not verbatim code; you can force them to regurgitate some verbatim material, but most of what they capture is ideas, and we humans likewise re-elaborate things we have seen and avoid emitting the same stuff verbatim, just as LLMs are able to do. Software ideas can't be patented, for very good reasons, and LLMs capture all this value that is not subject to intellectual property and provide it to people who lack the right tools and knowledge. And they allow people who can already code to code 100x more.
this argument is nonsense…I write code on a macbook running macos. it’s not a subscription, but some people also pay a subscription for a proprietary IDE. so any FOSS written with proprietary paid software doesn’t count to you? only if it’s a subscription model?
> I write code on a macbook running macos. it’s not a subscription
You already answered yourself, but let's pretend yours is a valid point: if you lose access to a JetBrains IDE, you can still code in another free IDE or text editor and still give back to society, without heavily relying on AI somewhere in the cloud of the tech bros, who don't want to give back to society; they want to be the gatekeepers of programming.
and you can switch AI providers, or use local LLMs. again, a nonsense point to raise about how FOSS is developed. coding “by hand” also doesn’t go away. if you lose your proprietary tools (laptop, OS, IDE, or coding agent) you can always work around it.
LLMs work precisely because they converge on the abstract ideas in our languages. Anything verbatim is de minimis, or more likely filtered out through the lens of the above doctrines.
He's not a generative AI fan at all, but he argues that if artists think tightening up copyright law to make it harder to train models is a good idea, they're going to find themselves strengthening the very corporate overlords they've been trying to escape.