I do not think there is any solid answer.
The interpretation of these laws is highly in flux, swinging depending on who is lobbying whom on any given day.
I think we're headed towards the worst option, though: the one people think they want on impulse, but actually really don't.
I think we're going to see regulatory capture on a scale impossible to recover from.
Regulatory capture where the biggest AI companies simply buy up the rights to everyone's information to such a degree that it becomes impossible to compete with them, giving them de facto control over AI and its censorship. Everyone trying to avoid being swept up will inevitably be shovelled into using platforms that these companies already hold the rights to.
I think the way it should work is that AI can take in whatever input it wants, and, just as copyright is intended to work, people can claim the resulting outputs that qualify as copyrightable material.
This would both create more incentive for end users to run local AI (less centralization) and encourage ownership amongst ordinary citizens.
What we're heading towards now might completely eliminate any chance we ever have of AI that isn't massively controlled by corporate goals and right-wing billionaire feelings, and no matter how some people try to avoid it, the effect on the population will be impossible to ignore.