The best part of the fediverse is that anyone can run their own server. The downside of this is that anyone can easily create hordes of fake accounts, as I will now demonstrate.

Fighting fake accounts is hard, and most fediverse implementations currently have no effective way of filtering them out. I’m sure that the developers will step in if this becomes a bigger problem. Until then, remember that votes are just a number.

  • @[email protected]
    56 points • 1 year ago

    This is the problem. All the algorithms are based on the upvote count. Bad actors will abuse this.

    • @Derproid
      10 points • 1 year ago

      So maybe more weight should be put on comment count? Much harder to fake those.

        • @[email protected]
          3 points • 1 year ago

          So the question becomes: how do we rank posts and comments in a way that is based on neither upvotes, downvotes, nor number of comments? I could see a trust value being computed for each user, based on trusted users marking others as trusted combined with a personal trust score, but that puts a barrier on new users and enforces echo chambers.

          What else could be tried?

          • @[email protected]
            3 points • 1 year ago

            “that puts a barrier on new users and enforces echo chambers”

            Only if trust starts at 0. A system where trust started high enough to not filter out posts and comments would avoid that issue.

          • @[email protected]
            1 point • 1 year ago

            Maybe instances should be assigned a rank for how dependable they are: length of time active, number of active users… stuff like that. Each instance keeps track of its own rankings for every instance it is federated with, then puts the upvotes and those stats in a magic box to calculate the effective upvote value.
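The “magic box” idea in the last comment can be sketched concretely. This is a hypothetical illustration only, not part of any actual fediverse implementation: all names, weights, and saturation thresholds below are assumptions, chosen just to show how instance age, user count, and a locally tracked trust score could discount upvotes from suspect instances.

```python
from dataclasses import dataclass


@dataclass
class Instance:
    """Stats one server tracks about a federated peer (all fields hypothetical)."""
    name: str
    days_active: int    # length of time the instance has been active
    active_users: int   # number of active users
    local_trust: float  # 0..1 trust this server assigns to that peer


def reputation(inst: Instance) -> float:
    """Combine longevity, size, and local trust into a 0..1 reputation score."""
    age_factor = min(inst.days_active / 365, 1.0)     # saturates after a year
    size_factor = min(inst.active_users / 1000, 1.0)  # saturates at 1000 users
    return (age_factor + size_factor) / 2 * inst.local_trust


def effective_upvotes(raw_upvotes: int, inst: Instance) -> float:
    """Scale the raw upvote count by the sending instance's reputation."""
    return raw_upvotes * reputation(inst)


# A freshly spun-up, single-user instance contributes almost nothing per vote,
# while an established peer's votes count nearly at face value — blunting the
# fake-account horde attack the post demonstrates.
established = Instance("lemmy.example", days_active=730,
                       active_users=5000, local_trust=0.9)
suspicious = Instance("spam.example", days_active=3,
                      active_users=1, local_trust=0.5)

print(effective_upvotes(100, established))
print(effective_upvotes(100, suspicious))
```

Because each server keeps its own reputation table, no global authority is needed, which fits the federated model; the trade-off is that legitimate new instances start with little influence until they age in.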