Somewhat new to the AI porn thing, and I've been scrolling around this community a bit. I've noticed, though, that a lot of the posts here feature nude photos of women who look like they are under 18.

Obviously they are not real, so there isn't a way to prove it either way, and the exact line of "looks underage" is blurry. Not interested in getting into the weeds of morality and different countries' laws, I just don't want to see that content mixed in.

Just wondering what the intended rules are on this. Is it just down to the individual mods to remove content at their own discretion? If so, I may make another community that errs in the other direction.

  • Triple Underscore@lemmynsfw.com (mod) · 1 year ago

    Report them and we'll remove them. This community is starting to get a bit more active than I anticipated, so I can't really look at every post by myself like I could at the start.

  • Padded Person@lemmynsfw.com · 1 year ago

    The instance-wide rule is blurry, but it definitely leans towards removing content that could be underage rather than keeping it.

    That being said, it requires individual mods to enforce it, as there are only so many admins and they usually have bigger issues to deal with (such as making sure mods are actually enforcing the rule, rather than doing it themselves).

    Looking at the modlog for this community, it does look like they are removing posts they feel are too youthful-looking. It's possible they just haven't seen the posts you have an issue with.

    I would suggest that if you find the content too young, you report the individual posts to highlight them for the mods and make it easier for them to review.

    • Mario@lemmynsfw.com (mod) · 1 year ago

      if you find the content too young, you report the individual posts to highlight them for the mods

      Do this. Mods don’t leave stuff in the mod queue for long and nothing sketchy or even borderline slips through. IF it gets reported!

  • Quetzacoatl@aiparadise.moe · 1 year ago (edited)

    I honestly think splitting communities into smaller and smaller subsections is detrimental. A look at how things worked during the formative years of the internet (specifically the WWW) might be helpful: allow all content that's not straight-up illegal, and leave it to the users to filter whatever they don't want to see. You as a user can block certain people/communities, or switch to the "Subscribed" view to only see what you like, just like on Reddit. That's a better use of everyone's time than a) getting more and more granular about what is and isn't allowed in AI (!) art, and b) immediately resorting to the nuclear option, which is a community split, defederating from another instance, or similar measures.

    In general, we should avoid treating instances like specific subreddits, with very micromanaged rules and such. The instance is infrastructure; it should only deal with content insofar as it is unhealthy for the network as a whole (banning bot instances, for example). Content, on the other hand, should be moderated by moderators at the community level (if you don't like what you see, just make a new community and stop subscribing to the old one).

    This is better for network and community health, for the admins, and for the proliferation of Lemmy in general.