• 0 Posts
  • 13 Comments
Joined 1 year ago
Cake day: June 21st, 2023

  • Defederation is a double-edged sword

    Agreed. It’s not the solution.

    The reality is that it’s a whole bunch of entirely separate environments and we’ve walked this path well with email

    On this I disagree. There are many fundamental differences. Email is private, while federated social media is public. Email is primarily one-to-one, or one-to-few; social media is broadcast-style. The law would see them differently, and the abuse potential is also different. @faeranne@lemmy.blahaj.zone also used e-mail as a parallel, and I don’t think that model works well.

    The process here on Mastodon is to decide for yourself what is worth taking action on.

    I agree for myself, but that wouldn’t shield a lay user. I can recommend that a parent sign up for Reddit, because I know what they’ll see on the front page. Asking them to moderate for themselves can be tricky. As an example, if people could effectively moderate content themselves, we wouldn’t have climate skeptics and Holocaust deniers. There is an element of top-down housekeeping to be done for a platform to function as a public service, which is what I assume Lemmy wants to be.

    Otherwise there’s always the danger of it becoming a wild-west platform that attracts extremists more than casual users looking for information.

    Automated action is bad because there’s no automated identity verification here and it’s an open door to denial of service attacks

    Good point.

    The fediverse actually helps in moderation because each admin is responsible for a group of users and the rest of the fediverse basically decides whether they’re doing their job acceptably via federation and defederation

    The way I see it, this will inevitably lead to a concentration of users, defeating the purpose of federation. One or two servers will be seen as ‘safe’, and people will recommend those to their friends and family. What stops those two instances from becoming the Reddit of 20 years from now? We’ve seen what the concentration of power in a few internet companies has done to the Internet itself; why retread the same steps?

    Again, I may be very naive, but I think that alongside the big idea that is federation, what is sorely lacking is a robust federated moderation protocol.
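    To make that concrete, here is a rough, purely hypothetical sketch (Python; every name and field is invented for illustration, not part of any existing Lemmy or ActivityPub feature) of the kind of signal such a protocol could federate between instances, so that admins see pooled reports rather than each acting alone:

    ```python
    from dataclasses import dataclass
    from collections import defaultdict

    # Hypothetical moderation "flag" that instances could exchange alongside
    # content, so the network aggregates signals instead of relying on one admin.
    @dataclass(frozen=True)
    class ModerationFlag:
        content_id: str          # globally unique ID of the flagged post/comment
        reporting_instance: str  # e.g. "instance-a.example"
        reason: str              # e.g. "spam", "harassment"

    def aggregate_flags(flags):
        """Count how many distinct instances flagged each piece of content."""
        by_content = defaultdict(set)
        for f in flags:
            by_content[f.content_id].add(f.reporting_instance)
        return {cid: len(instances) for cid, instances in by_content.items()}

    # Three independent instances flag the same post; one flags another.
    flags = [
        ModerationFlag("post/123", "instance-a.example", "spam"),
        ModerationFlag("post/123", "instance-b.example", "spam"),
        ModerationFlag("post/123", "instance-c.example", "harassment"),
        ModerationFlag("post/456", "instance-a.example", "spam"),
    ]
    print(aggregate_flags(flags))  # {'post/123': 3, 'post/456': 1}
    ```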



  • Understood. Yes, I did misread it as sarcasm. Thanks for clearing that up :)

    However, I disagree with @shiri@foggyminds.com on one point: Lemmy, and the Fediverse, are interfaced with as monolithic entities, not just by people on the outside, but even by their own users. There are people here saying how much they love the community on Lemmy, for example. It’s just the way people group things, and no amount of technical explanation will prevent this semantic grouping.

    For example, the person who was recently arrested for CSAM was running a Tor exit node, but that didn’t help his case. As shiri pointed out, defederation works for black-and-white cases. But what about cases where things are a bit more gray, like disagreement over hard political viewpoints? We’ve already seen the open internet devolve into bubbles with no productive discourse. Federation has a unique opportunity to solve that problem by starting from scratch and learning from previous mistakes. Defederation is not the solution; it isn’t granular enough to be one.

    Another problem with defederation is that it is after-the-fact and depends on moderators and admins. There will inevitably be a backlog (as pointed out in the article). With enough community reports, could there be a holding-cell style mechanism in federated networks? I think there is space to explore this more deeply, and the study does the useful job of pointing out liabilities in the current state of the art.
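    As a thought experiment (nothing Lemmy actually implements; the class, threshold, and names below are made up for illustration), the holding cell could be as simple as automatically hiding content once reports from enough distinct users cross a threshold, pending human review:

    ```python
    # Hypothetical report-threshold "holding cell": content is hidden once enough
    # distinct users report it, then waits in a queue for a human moderator.
    class HoldingCell:
        def __init__(self, threshold=5):
            self.threshold = threshold
            self.reports = {}         # content_id -> set of reporting user IDs
            self.quarantined = set()  # content hidden pending moderator review

        def report(self, content_id, user_id):
            """Record a report; quarantine the content if the threshold is crossed."""
            reporters = self.reports.setdefault(content_id, set())
            reporters.add(user_id)
            if len(reporters) >= self.threshold:
                self.quarantined.add(content_id)

        def review(self, content_id, allowed):
            """A moderator later clears or removes quarantined content."""
            self.quarantined.discard(content_id)
            self.reports.pop(content_id, None)
            return "restored" if allowed else "removed"

    cell = HoldingCell(threshold=3)
    for user in ("u1", "u2", "u3"):
        cell.report("post/789", user)
    print("post/789" in cell.quarantined)  # True: hidden until a moderator reviews it
    ```

    The obvious risk, as noted above, is that report thresholds invite denial-of-service abuse when there’s no identity verification, so any real mechanism would need more than a raw count.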


  • It doesn’t help to bring whataboutism into this discussion. This is a known problem with the open nature of federation. So are bigotry and hate speech. To address these problems, it’s important to first acknowledge that they exist.

    Also, since federation is still in its early stages, now is the time to experiment with mechanisms to control these problems. Saying that the problem is innate to networks only sweeps it under the rug. At some point there will be a watershed event that will force these conversations anyway.

    The challenge is in moderating such content without being ham-fisted. I must admit I have absolutely no idea how; this is just my read of the situation.