Edit - This is a post to the meta group of Blåhaj Lemmy. It is not intended for the entire lemmyverse. If you are not on Blåhaj Lemmy and plan on dropping in to offer your opinion on how you disagree with the way we are doing things, your post will be removed.
==
A user on our instance reported a post on lemmynsfw as CSAM. Upon seeing the post, I looked at the community it was part of, and immediately purged all traces of that community from our instance.
I approached the admins of lemmynsfw and they assured me that the models with content in the community were all verified as being over 18. The fact that the community is explicitly focused on making the models appear as if they’re not 18 was fine with them. The fact that both myself and a member of this instance assumed it was CSAM was fine with them. I was in fact told that I was body shaming.
I’m sorry for the lack of warning, but a community skirting the line trying to look like CSAM isn’t a line I’m willing to walk. I have defederated lemmynsfw and won’t be reinstating it whilst that community is active.
For anyone wondering, this is lemmynsfw’s take on the situation.
On a personal level, the vibes are off. Their response seems really defensive and immediately moves to reframe the situation as body shaming. There’s a difference between an adult who looks underage posting porn of themselves and a community dedicated to porn of adults who look underage. Reducing the latter down to body shaming seems like unfair framing to me.
Did you check the community in question? I’m quite surprised to hear one could think that’s CSAM. To me it looks just like your typical low-effort onlyfans content. None of the models even looked “barely legal” but more like well over 20 in most cases.
The community in question listed “child-like” in their sidebar until after this defederation. Gross.
That’s a stumble, but it was because they copied and pasted the dictionary definition of “adorable” into the sidebar. The same community has more than two million members on Reddit and has been a staple for almost a decade. However, they simply wrote “It must be adorable.” instead of defining adorable like Lemmy did, so there’s that.
Idk, it just seems weird to be outraged when everything is legal, consensual, and not even a fringe kink. This is like Australia banning small-titted pornstars in their late twenties in a recent project against CSAM, because these adults aren’t shaped in morally appropriate ways.
It seems weird to me that you’re on the meta for another instance complaining about a decision that doesn’t affect you.
There’s plenty of other comments here about other awful shit going on at lemmynsfw and I don’t feel like recapping that. From what I can see concerning the actual defederation, it is at worst the right choice for the wrong reasons.
The rest of the argument seems to be about the community’s intent. Including “child-like” in the sidebar of a porn community because it was copy/pasted from a dictionary definition may have been a ‘stumble’ but it was still negligent.
When I checked their communities most were basically empty?
And I didn’t see a community that fits that description.
Edit: I did try to enable nsfw content and tried from other accounts I have on other instances.
Your instance just defederated from lemmyNSFW. You can’t see any new content there anymore with that account.
I tried in private browser mode and from accounts I have on lemmy.ml and Beehaw
I still didn’t see anything?
IDK what’s up
deleted by creator
As of a few hours ago beehaw was still federated with lemmynsfw.com, but the way the lemmynsfw admin team is handling this seems like that might not last
Yeah. I don’t think they’re sincerely trying to “be inclusive”. I think they’re just trying to misuse progressive concepts to their own advantage.
They know full well what they’re doing. The fact that it isn’t legally CP is just a technicality.
I think it’s really strange to call that a technicality. Adults with babyfaces and braces doing porn (which appears to be what this was about, as far as I can tell) is worlds apart from children being abused. Calling that a “technicality” is like saying the difference between a slasher movie and a snuff film is a “technicality.” People who watch slasher movies aren’t actually wanting to see snuff films deep down inside. And people who find adults with babyfaces attractive aren’t actually lusting after kids deep down inside.
They literally said in the post no one looks too young to be lusted after. Major red flag right there.
I feel this needs to be clarified. The point is that anyone of legal age deserves to be lusted after if that’s what they want. You telling them “you look too young, no one is allowed to find you attractive” is a bit… fucked.
To an extent, but there kind of has to be a line somewhere. I hope beyond hope that they can find fulfilling love and lust, if that’s what they want, in their personal lives. I’m all for body positivity in general. I’m just saying I wouldn’t be comfortable if Shauna Rae started posting sexually explicit content of herself. Maybe a bit of an extreme example, but they did say nobody.
I thought we already drew that line: 18 years of age, able to consent, and consenting. For context: I looked at this article, which says Shauna Rae is 22 years old but her growth was stunted due to a condition. I don’t think we should tell her not to date or do explicit things with her partner, when she finds one. It’s her body, and she is an adult. Similar to others with growth related conditions, such as dwarfism, or simply people who look petite even after they’ve come of age, who also get thrown under the bus regularly.
Let’s actually go that extra step and pretend she did make sexually explicit content. Now what? It immediately feels very wrong. Put that aside. I’m guessing most people are going to be worried about those with certain urges getting their rocks off…? (Honestly, not sure what to call them here, I was already unfamiliar with the term “CSAM”, so I’ll just leave it at that.) Now there’s content that’s legal and hasn’t harmed a child. That seems … better than the alternative?
I don’t think a person with unhealthy sexual urges gets to choose whether they have these urges or not. Demonizing them to the degree that we are, leads to most of them not being able to get the help they need. If it can’t be done by other means such as therapy, or therapy is not available, an outlet might help. And whether that’s “questionable” but legal porn, roleplaying, or other content or activities involving consenting adults that seems to tick the right boxes, … that’s up to them, not us. Again, miles better than the alternative, even if the immediate reaction is to be disgusted.
It’s an incredibly delicate problem. I’d say the right approach would be to do more scientific studies, but I imagine not many have or will be done because of the societal taboo. It’s also very iffy trying to search for existing research on this matter on the internet, and even if I could find some, I don’t have the expertise to know how scientifically sound it is.
In fact, in writing this and continuously re-reading my comment, I keep feeling like the points I’m making are scarily close to those of an apologist, or worse, someone who wants to normalize the sexualization of minors. I want to make it clear that I’m 100% against this. But I’m also against shaming the bodies of adults, telling them what they can and can’t do, because it makes me feel uncomfortable. (And I want to note that this is not meant to be an argument relating to the thread as a whole, as I would not want to tell the admins to host the content I hypothesized about in this post.)
HEY ADA, THIS GUY ISNT FROM BLAHAJ, WHY ISNT HE CENSORED? I THOUGHT THIS WAS FOR BLAHAJ INPUT ONLY? WHY THE DOUBLE STANDARD, ADA? WHY THE HYPOCRISY? I MEAN, WE ALREADY KNOW, BUT I WANNA SEE YOU SAY IT!
Oh please just shut up
No. 🤠
Removed by mod
“If someone made a community intended to fool people into thinking it was kiddy porn, that would be a real problem. If someone of age goes online and pretends – not roleplays, but pretends with intent to deceive – to be a child and makes porn, that is a real problem. Nobody here is doing that.”
JFC what a shitty take. Roleplay of CP is still fucking disgusting.
all I’m seeing is you showing me them saying that they’re also not okay with it?
Read it again? They excluded roleplay from the acts they find reprehensible
An adult role playing as a kid isn’t any different than them role playing as a dog, or a car, or a dragon. Are you going to tell me I can’t role play as a dragon while my partner role plays as a car?
Role playing as children having sex/being represented sexually is absolutely different from role-playing as a dragon fucking a car. If you can’t see that you might want to rethink some things.
Ageplay is absolutely a thing, but the point is they are adults. Pretending to be something else doesn’t change what they are.
It’s creepy, I would certainly not take part. But the bottom line is, in reality it’s just two adults playing pretend.
role-playing as a dragon fucking a car.
Is… is that a thing?
I’m going to need some photos to be able to fairly judge…
I think both instance admins have a valid stance on the matter. lemmynsfw appears to take reports very seriously and if necessary does age verification of questionable posts, something that likely takes a lot of time and effort. Blahaj Lemmy doesn’t like the idea of a community that’s dedicated to “adults that look or dress child-like”. While I understand the immediate (and perhaps somewhat reactionary) concern that might raise, is this concern based in fact, or in emotion?
Personally I’m in the camp of “let consenting adults do adult things”, whether that involves fetishes that are typically thought of as gross, dressing up in clothes or doing activities typically associated with younger ages, or simply having a body that appears underage to the average viewer. As the lemmynsfw admin mentioned, such persons have the right to lust and be lusted after, too. That’s why, as a society, we decided to draw the line at 18 years old, right?
I believe the concern is not that such content is not supposed to exist or be shared, but rather that it’s collected within a community. And I think the assumption here is that it makes it easy for “certain people” to find this content. But if it is in fact legal, and well moderated, then is there a problem? I don’t believe there is evidence that seeing such content could change your sexual preferences. On the other hand, saying such communities should not exist could send the wrong message, along the lines of “this is weird and should not exist”, which might be what was meant with “body shaming”.
I’m trying to make sense of the situation here and possibly try to deescalate things, as I do believe lemmynsfw’s approach to moderation otherwise appears to be very much compatible with Blahaj Lemmy. Is there a potential future where this decision is reconsidered? Would there be some sort of middle ground where admins from both instances could meet and come to an understanding?
is this concern based in fact, or emotion?
Ada was clear in another comment thread that yes, emotion was absolutely involved in her decision. That isn’t a bad thing. Why is there a social attitude that decision-making is only valid if it’s cold and unfeeling?
Personally I’m in the camp of “let consenting adults do adult things”
Me too. I don’t think anyone is arguing against that. Anyone can still access LemmyNSFW’s content elsewhere, Blahaj Zone simply isn’t going to relay it anymore because some of it is incompatible with Ada’s goals in nurturing this community.
But if it is in fact legal, and well moderated, then is there a problem?
Yes. Legality has nothing to do with acceptability. This instance already bans lots of content that doesn’t actually violate any laws. It’s a judgment call.
Why is there a social attitude that decision-making is only valid if it’s cold and unfeeling?
Probably because everyone agrees that we don’t make the best decisions when emotional? In fact we tend to make our worst decisions when emotional? There’s a pretty significant difference between society judging people for being emotional, and society disapproving of emotional decisions. Because people making significant choices when they aren’t thinking clearly is pretty obviously a bad idea.
Yes. Legality has nothing to do with acceptability. This instance already bans lots of content that doesn’t actually violate any laws. It’s a judgment call.
And yet teen porn is one of the most popular categories around. This sounds like a subcategory confined to a single community, and precisely what the block function is for. There’s a pretty big difference between Exploding Heads and a single disliked community.
Edit: After finally seeing a link to the lemmynsfw discussion, it’s not a kink community or anything fringe. It’s literally a community around cute pornstars.
Yeah, see, it’s that conflation of “emotional” and “not thinking clearly” that bothers me. Those aren’t the same thing, despite the dominant cultural narrative to the contrary. Sometimes they go together, sometimes they don’t.
Are they not…? I mean, thinking clearly and intense emotions genuinely don’t go together. Crimes of passion, riots after sports games, getting “carried away” in the heat of the moment. Temporary insanity being an actual legal defense.
There’s a reason that a lot of good advice when handling intense emotions is all about taking a minute to step back and breathe, clarify what you’re feeling, accept it, and then express it safely. There’s nothing wrong with being emotional, but arguing that there’s nothing wrong with making decisions while emotionally charged is just really not a good idea. The fact that the acronym for managing intense emotions is STOPP should be a bit telling.
Sometimes they go together, sometimes they don’t.
I read that, I’m just drawing a blank for moments where intense emotions and thinking clearly go well together beyond something like “I saw a bear and ran”.
Removed by mod
And do you care to provide examples of when high emotion and thinking clearly pair together? And by the way, when I say thinking clearly, I mean, being able to adapt to new information and actually think critically about situations. I don’t think being reactionary is the definition of thinking clearly.
Just a personal anecdote. I have intense emotions when dealing with transphobia but I think I’m able to think clearly. I think there absolutely are times where intense emotions can cloud thoughts but I believe the converse isn’t true.
“Intense emotions can interfere with clear thinking” does not imply that “clear thinking is impossible when there are intense emotions”
It’s rough that you have to deal with that, and I applaud the restraint and poise that goes hand in hand with operating while under intense emotional strain. That said, emotional biases are problems precisely because their influence can range from the subtle to the obvious, and they’re a lot harder to see from the inside. It’s one of the reasons why STOPP has self analysis when experiencing powerful emotions. Most people don’t need it, but it’s always good to take a breath and evaluate every now and then.
For one, I think I speak for everyone when I say that seeing a huge guy flip out and start screaming in public is alarming, because you no longer trust that they will make decisions based on the normal rules of public interaction. I’m not saying that we shouldn’t listen to our emotions, they exist for a number of very important reasons, and paying attention to them is linked to better decisions. That said, making decisions while emotional is tempting because it often narrows attention and jumps to actions with immediate effects, which often feels like clarity when it’s really just expedience.
To sum everything up, intense emotions push for quick, immediate actions to deal with whatever is causing said emotions (a simplification, but it works). This is really great when startled by predators or protecting someone, but not when presented with complex situations lacking easy solutions. So I wouldn’t say that clear thinking is literally impossible when experiencing intense emotions, but I’d say there’s a very strong reason that emergency drills and procedures are set up so that people in high stress situations don’t actually need to think. I spent a bit of time reading up on it to provide a more complete argument than just appealing to general wisdom, so apologies for the pile of words.
The reason I brought up emotion in my reply was because I’ve felt that the lemmynsfw admins have been able to explain their decision quite reasonably and seemed to be open to conversation, whereas Ada was set on one goal and, upon finding disagreement, wasn’t in the right mindset to continue a constructive conversation. Which, to be fair, due to the nature of the content, is understandable.
If the content that the Blahaj Lemmy admins are concerned about is limited to certain communities, and part of the issue is the concentration of content in said communities in the first place (at least, as I speculated in my original reply), then I don’t quite understand why blocking only these communities isn’t something that was considered, rather than defederating the entire instance. I do respect Blahaj Lemmy’s decision not to want to host such content. Or is there some technical limitation that I’m not aware of?
I don’t quite understand why blocking these communities only isn’t something that was considered, rather than defederating the entire instance
Because I am not ok federating with a space that is ok with content that looks like CSAM. “It’s technically legal” isn’t sufficient in this case.
But whether it’s technically legal is exactly what does or doesn’t make it CSAM. “Looking like” is going to be highly subjective, and I don’t understand how the admins of the other instance are supposed to handle reports, other than to verify whether or not it actually is the case or not.
Are petite looking people not supposed to make explicit content while dressing up cute? Should a trans man not share explicit pictures of himself, because he might look like an underage boy? Do we stop at porn that gives the appearance of someone being young? What about incest or ageplay? Like, what if you or someone else was made sufficiently uncomfortable by some other kind of porn? How do you decide what is and isn’t okay? How do you avoid bias? What would you be telling a model when they ask why you removed their content?
Apologies for going on with this when I’m sure you’re already sick of dealing with this. I had just felt like some of the points I brought up (like in my original reply) were entirely overlooked. Putting effort into an (attempted) thought-out reply doesn’t mean I get to receive a response I was hoping for, but I was at least hoping for something you hadn’t already said elsewhere.
but I was at least hoping for something you hadn’t already said elsewhere.
There is no more to this. I don’t have a list of endless reasons.
The reason is that it looks like CSAM and appeals to folk looking for CSAM. I’m a CSA survivor myself. A space that appeals to folk looking for CSAM isn’t a community that I’m willing to share space with.
I guess the core of the disagreement is that one side values safety more highly while the other values expression? It could be argued that moderation can take care of anyone stepping over the line. People can be unwelcome creeps regardless of what they’re into, and they would be attracted to other dedicated communities anyway. I imagine someone could have the same concerns you do, for similar reasons, when it comes to consensual non-consent roleplay. Interestingly enough, this actually is temporarily restricted on lemmynsfw, which could be because an appropriate moderation policy has not yet been agreed upon.
Reminds me of a lot of the debates around kink at pride/ddlg kink stuff. The latter is really not my thing and makes me uncomfortable, but I recognise that that’s a personal thing between me and my partners that I can’t, and shouldn’t, police among others.
There’s also ethical debates to be had on porn in places like Lemmy/pornhub/etc. – we can’t know that the person has consented to being posted, or that they have recourse to get it taken down and stop it being spread if they do not.
Then there’s the realpolitik of, regardless of ethics, whether it’s better to have porn of this type in visible, well moderated communities, or whether it’s better to try to close off ethically dubious posting.
It’s one I don’t really have squared off in my head quite yet. Similarly with kink at pride; I’ve read about the historic importance of kinksters and recognise that, but at the same time I want there to be a space where queer kids can be involved with pride without being exposed to kink. Is that just prudish social norms talking? Idk; I’m still working it through.
For what it’s worth, I feel like while society has become more socially accepting of people being different (imperfectly, but we have), at least in the US we’ve become more and more prudish when it comes to sex itself. Part of the changing era has led to a reduction in exploitation and in things that were generally viewed as sketchy but not all that big of a deal (kids inheriting porn mags, sexual harassment, imbalances in power), where now sketchy behavior is quickly called out.
That said, I feel like a lot of hard conversations have been completely avoided because they’d be awkward and uncomfortable and instead we just pretend they aren’t there.
Like in theory, anyone under 18 in the US can’t legally see so much as a titty (unless it’s art), read sexually explicit material, or see a movie or tv show with explicit content. And then, literally nobody wants to talk to teenagers about sex. I watched a reddit thread eat itself alive because a dad was furious that his wife had bought their daughter a dildo after he had confiscated her laptop when catching her looking at them and asked his wife to deal with it. People were calling for her to be reported for sexual abuse, while actual women were being attacked for sharing their own experiences as teens. Things just seem a little crazy.
People are so uncomfortable with the concept that they want to disappear anything that reminds them that 18 isn’t actually a magical division between childhood and adulthood. And then you have this thread, where lemmynsfw was banned because a community sharing “cute” pornstars was a step too far despite being actual professional adults. Idk, it seems exactly like Australia’s whole thing where they started banning pornstars in their late twenties because they have small tits as part of a project to “fight” child porn.
Yeah, this seems very well thought through. For what it’s worth, I’m UK based so will be talking from that perspective. I’m in agreement that sex education is absolutely dire – I can’t see any objection to a dispassionate education in both the cultural and scientific aspects of sex. I don’t even see it as an ‘oh well, if we have to’, since sex forms such an integral part of our cultural identities (of course, including when people fall outside the societal sexual norms).
More broadly on society’s difficulty with dealing with sex (and even the criminal aspects) I’ve read some interesting books on anti-carceral feminism recently that helped give me a different perspective on how I think about sexual crimes and its perpetrators beyond the simple instinctual judgements.
For the people like me that don’t know the term: CSAM is Child Sexual Abuse Materials. It’s the term used instead of CP as “pornography” is more commonly used for pleasure or conveys the idea of consent.
As for the porn that uses people that look under age, it’s no different than the anime children that are thousands of years old. It doesn’t matter how old they are, they look like children and it’s gross.
The world is messed up. I feel like advertising any adult material as “barely legal” should be banned too. It skirts the boundary too close. Not as close as the aforementioned thousand year old child body but it feels almost as bad imo.
I agree with you but not on the last point. There is a difference, since they are real people, adults, and they consent to being found sexually attractive and arousing. I am not attracted to young looking bodies, but that’s a notable difference to me. Also, I don’t know how I feel about a community (in a broader way than a lemmy comm) focusing on and fetishizing young looking adults (I do know that it disturbs me, but I want to talk about it society-wise), but I understand that some people are attracted to young looking and/or juvenile bodies, and I feel like adults who consent to answer those desires is better than CSAM
And that’s where the body shaming comes in, you’re literally telling this 20-something that their body is gross and no one should find them attractive. How would it make you feel if someone said that about you?
I’m not on this instance, but thank you for being so swift and resolute in your actions. Happy to see all due caution is being taken. Not so happy that such a community made its way here to the fediverse. Hopefully I won’t see any of it while doomscrolling.
I enjoy NSFW content, but I certainly don’t want to stumble into “how close to CSAM can we get while staying technically legal?” content. And the bullshit lie about this being “body shaming” pisses me off.
This admin decision obviously isn’t up for a vote, but it’s just so obviously the right call. Thank you Ada for handling this, and I’m sorry (in the Canadian way, not the guilty way 😉🇨🇦) you had to see any of that.
Removed by mod
was fine with them
That’s surprising since their rules say that not even fictive under-18 content is allowed:
Posting content involving any person who is under 18 is strictly forbidden. This includes real, drawn, and fictional content.
It could be argued that it’s fine since the person is over 18, regardless of how they look. This happens in actual mainstream porn; Piper Perri doesn’t look very old but she’s definitely over 18. I think as long as they don’t outright say they are underage, they wouldn’t be breaking those rules
I get the feeling there’s going to be a lot of comments here from people who disagree.
This is not your instance. This is not even my instance, I am just signed up here (and thank you Ada, I like it here and I approve of this decision. CSAM-like porn is icky). There is no need to focus on the morality of sharing porn that ends up being viewed as CSAM. Hosting porn involves legal risk, and federating with an instance that has porn on it means that eventually you will host porn images. If you have your account here and you don’t like this choice, consider moving instances or hosting your own.
Not only that, does anyone remember /r/jailbait on reddit? They did not do anything about that subreddit because the images were “legal”, but the userbase they attracted began sharing real CSAM in the DMs. To be clear: I don’t know what community we’re talking about (lemmynsfw does not appear to have a jailbait community, I did not look hard) but you do not want the sort of people around that this attracts.
edit: remove unintentional link
Let’s be honest; the only reason Reddit ever did anything with that subreddit is because CNN brought bad PR to them.
I totally forgot about this. It’s so sad that you’re right. Sharing stuff in DMs was probably just the justification they needed to ban them without conflict (and oh my god, there was still so much drama.)
the same community (adorableporn) is also on reddit btw with 2.2m subscribers.
i have no grand moral opinion on this type of content. for me it is the same as femboy content for example, where people also push for a youthful, girly aesthetic.
as long as the content is made by consenting verified adults, i don’t care.
it’s like adults cosplaying with japanese school uniforms or calling your partner “mommy” or “daddy”.
probably not the best move in terms of sexual morals for sure, in the grand scheme of things tho this is just how people express their sexuality i guess.
it’s like adults cosplaying with japanese school uniforms or calling your partner “mommy” or “daddy”.
No, it’s not, because no one mistakes those things for actual underage children
No, it’s not, because no one mistakes those things for actual underage children
That’s not what happened here. No one would mistake the image in question. You say there were other images; the admins there give a story that contradicts yours. They say there were no such images. Didn’t see those images removed in the modlog, either.
Could it possibly be that someone has blown things out of proportion and got emotional?
That and the lack of humility afterwards make for a poor leader.
i had no problem distinguishing the models on the community from children.
maybe it’s more difficult in some cases without looking for the onlyfans link or sth similar of the model somewhere in the post, but that’s just human anatomy.
that’s why the guy at the gas station asks for my ID card, because it is not always super clear. but apparently clear enough for reddit admins and PR people from ad companies.
i agree playing into the innocent baby aspect is probably not great for sexual morals and i wouldn’t recommend this comm to a local priest or a nun, but this type of content thrives on pretty much every mainstream platform in some shape or form.
i get it, if this instance wants to be sexually pure and removed from evil carnal desires tho. that’s kind of cool too for sure.
i had no problem distinguishing the models on the community from children.
You didn’t see the content I saw. Content that was reported as CSAM by someone on this instance, who also thought it was CSAM.
maybe it’s more difficult in some cases without looking for the onlyfans link or sth similar of the model somewhere in the post, but that’s just human anatomy.
Again, a group that is focused on models where that is the only way you can tell that they’re not underage is a group that is focused on appealing to people who want underage models. That is a hard no.
Spin it how you like, but I am not going to be allowing material that is easily mistaken for CSAM
I thought about this some more and I can feel a lot more sympathy for your decision now.
It must be horrible to get a user report about CSAM and then see a picture, which could be really CSAM on first glance.
Even if every user report was wrong from now until infinity, that initial CSAM suspicion, because of the false user report, probably makes moderating just a soul-crushing activity.
It is great if admins from other instances are willing to deal with these horror reports, just to give their users a bigger platform, but this service is not something that can be taken for granted.
I’m sorry for coming across as ignorant, I just did not consider your perspective that much really.
“Even if every user report was wrong from now until infinity, that initial CSAM suspicion, because of the false user report, probably makes moderating just a soul-crushing activity.”
Then they shouldn’t be doing it. If seeing something that looks even slightly off-putting causes this level of over-reaction, Ada doesn’t need to be moderating a community for marginalized/at-risk people. I myself am a CSA survivor, and seeing my trauma being equated to some legal adults playing pretend is fuckin’ bullshit. Seeing my trauma being equated to drawn pictures is fuckin’ bullshit. My trauma being equated to AI generated shit is fuckin’ bullshit. I’ll tell you one thing, as a CSA kid, one thing I cannot stand is someone making decisions on my behalf. To protect me. Fuck you, I’ll fuckin bite anyone that tries to take away my free agency again.
I myself am a CSA survivor
FYI, so am I
Cool, welcome to the real world where one size does not fit all. We handle our trauma differently. But I don’t subject others to my hangups. I don’t use it as a cudgel to squash dissent. Your trauma is not your fault, but it is your responsibility, not ours, to deal with.
I totally get that and definitely don’t blame Ada for defederating (although I don’t think it’s likely it was actually CSAM, nor that the community it was on is Inherently Problematic, as long as everyone in the posts is 18+, people’s kinks are none of my business).
The thing I don’t get is why reports made by blahaj users on lemmynsfw communities would go to the blahaj moderators at all. That seems like a design flaw in Lemmy, instance mods have no power to moderate content on off-instance communities, so why would they be notified of reports? That seems like it would clutter mod-logs for no reason and cause unnecessary drama (as happened here). Like if every subreddit post report immediately went to the Site Admins, that would be Terrible.
Though if Lemmy really is built like this for whatever reason, I would probably have done the same thing. I wouldn’t want to have to be Subjected to everything that could be reported on an NSFW instance, there’s probably some Heinous Shit that gets posted at least Occasionally, and I wouldn’t want to see all of it either. I just think it’s Really Stupid that lemmy is built this way, we need better moderation tools
The thing I don’t get is why reports made by blahaj users on lemmynsfw communities would go to the blahaj moderators at all.
Reports go to the admins on the instance the reporter is from, to the admins on the instance the reported account is from and to the admins of the instance the community the post was made to is from. The report also goes to the moderators of the community that the content was posted to.
Each instance only gets a single report, however many of those boxes it ticks, and that report can be dealt with by admins or moderators.
However, the results federate differently based on who does the action. So for example, me deleting content from a lemmynsfw community doesn’t federate; it just removes it from my instance. A moderator or an admin from lemmynsfw removing lemmynsfw content, on the other hand, will federate out.
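For anyone who finds this easier to follow as code, here’s a minimal TypeScript sketch of that report fan-out and removal behaviour, assuming the routing works exactly as described above. The type and function names are made up for illustration and aren’t Lemmy’s actual internals.

```typescript
// Hypothetical sketch of the report fan-out described above; names and
// shapes are illustrative assumptions, not the real Lemmy implementation.

interface Report {
  reporterInstance: string;   // instance the reporting account is from
  reportedInstance: string;   // instance the reported account is from
  communityInstance: string;  // instance hosting the community posted to
}

// Each instance receives at most one copy of the report, no matter how many
// of the three roles it happens to fill.
function reportRecipients(report: Report): Set<string> {
  return new Set([
    report.reporterInstance,
    report.reportedInstance,
    report.communityInstance,
  ]);
}

// A removal only federates out when performed by a moderator or admin of the
// instance that hosts the community; a remote admin's removal stays local.
function removalFederates(actorInstance: string, communityInstance: string): boolean {
  return actorInstance === communityInstance;
}

// Example: a blahaj.zone user reports a post in a lemmynsfw.com community.
const recipients = reportRecipients({
  reporterInstance: "blahaj.zone",
  reportedInstance: "lemmynsfw.com",
  communityInstance: "lemmynsfw.com",
});
console.log([...recipients]); // ["blahaj.zone", "lemmynsfw.com"] — one report each

console.log(removalFederates("blahaj.zone", "lemmynsfw.com"));   // false: removal stays local
console.log(removalFederates("lemmynsfw.com", "lemmynsfw.com")); // true: removal federates out
```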
deleted by creator
“You didn’t see the content I saw.”
Probably because it was removed for being against the rules?
Context always matters. I always check if adult material has actually been made by consenting adults. I would feel sick, if not enough information had been provided for that, but I at least have never encountered CSAM fortunately.
I had classmates in high school with balding or even graying hair and full beards. Some adults older than me look younger than my nephews. Revenge porn and creepshots are common. (or at least were, I’m not on platforms where these are popular)
Without context, porn will always be a morally grey area. Even commercialized hyper-capitalist porn is still an intimate affair.
That’s why I didn’t use pornhub for example, before every user had to verify themselves before posting. Before that I only read erotica or looked at suggestive drawings.
I understand your perspective tho. You hardly get paid to keep this instance running, and looking at pictures that without context could be CSAM could make this volunteer work very mentally taxing. This is how NSFW works tho.
Without context, any pornographic material featuring real humans could in truth be some piece of evidence for a horrible crime.
Context always matters. I always check if adult material has actually been made by consenting adults. I would feel sick, if not enough information had been provided for that, but I at least have never encountered CSAM fortunately.
If I can’t tell, if I have to look something up because the people I’m looking at look like they’re underage, then it doesn’t matter what the answer is, because the issue is that it looks like CSAM even if it’s not. And a community designed in a way that attracts people looking for underage content is not a space I’m willing to federate with.
Isn’t it kind of shitty to tell an adult woman she can never be attractive or sexy because she looks too young? Do you truly believe that said person should never be allowed to find love, because it’s creepy? Is she supposed to just give up because you think her body is icky?
I’ve covered this many times already.
The issue isn’t individuals that happen to look younger than they are. The issue is with a community gathering sexual content of people that appear to be children.
The community that initiated this isn’t even the worst offender on lemmynsfw. There is at least one other that is explicitly focused on this.
Anyone wanna save me from having to google what CSAM is?
Child Sex Abuse Material
Thanks
Access portal to government services
If it also means something else then my country is going to be exceptionally often flagged
deleted by creator
Thank you. Just the spam in new was bad enough, but CSAM? Holy crap.
To be clear, it is not CSAM. It is legal porn deliberately designed to look like CSAM
In some jurisdictions that is still considered csam. Even if it’s animated or whatever excuse…
Like when Australia floated banning smaller-boobed women from porn, which is also something that everyone agreed with.
Only big tiddy goth gf allowed
-Australian government
Floated? This is law
Great. Even better that they decided to actually enforce that only girls with big tits count as women and everything else is a child. That is an entirely rational and reasonable approach.
Would you mind posting chat logs like the lemmynsfw team did, for transparency’s sake? Not trying to cause more drama, but I think the whole thing just needs to be more transparent. Sorry if this is an out of line request.
The ones the lemmynsfw admins posted are accurate.
This post isn’t for the Fediverse, it’s an announcement to the users of Blåhaj as it impacts their experiences here.
I don’t intend to get into he said/she said over it with the wider Fediverse
Thanks, wasn’t trying to. Totally not here to pick sides or start trouble. Just wanted to end the speculation I was reading on both sides… I just wanted to make sure it was accurate. Thanks for the verification. Aggressive support to you and your community still 🫡 I’ll be off to my own areas.
Thanks :)
Never mind, see the comments below.
Child sexual abuse materials
Ugh. They need to be more than defederated, even if isn’t actually CSAM. Sick people.
Thank you. I was just about to search the acronym CSAM to find out what it meant. Eww.
It still feels like it’s in the grey area, just like some anime
My “favorite” was the vampire who has the body of a little girl, but the argument was “in the story, she’s actually hundreds of years old, so she’s not a minor!” 🙄
Tbh shoutout to Connect, lets me block whatever. My block list is primarily the hundreds of gross, weird porn communities that have popped up on this site.
You don’t appreciate 46 different sub-genres of furry porn, each with a separate community, filling up your feed?
Honestly almost all the porn I see on here is straight, I’ve only seen furry a handful of times with casual scrolling.
I’ve seen the joke a few times now, I just think it’s become the thing to joke about. I’m on a yiff instance and have 3 yiff communities of my own here and my feed is still 90% human porn.
The instances that run the degenerate porn are all basically isolated and don’t get much traction. And people in the main fediverse don’t ever hear of them because trying to point one out never develops into much of a reaction (except Burggit, because Loli is an easy target)
Best app I’ve used so far. Why are there so many furry porn communities anyway? They’re pretty much my whole blocklist
It’s ironic this went down over adorableporn and not fauxbait
If this really is about that post sharing an image of /u/Im_Cherry_Blossom I’m a bit on the fence about this, but I leave it to Ada’s discretion.
I acted on the report I saw. By the sounds of it, I’d have acted exactly the same way if the report was for the other community
I’m surprised as well, until I read your comment, I thought this was about fauxbait. That community is a giant red flag IMO.
or the other one about posting pictures of women you know in real life without their knowledge
Removed by mod
There is a stark contrast between fetishizing body parts and fetishizing underage people. Calling it “shitty” is an understatement by several orders of magnitude. If you are attracted to someone because they look underage, I’d have to question your moral judgement when it comes to the real thing. People aren’t known for making wise choices when under the influence of hormones, and I think that the venn diagram of pseudo-pedos and real ones may overlap a lot more than you seem to want to accept.
I am all for blocking this instance and defederating it. If someone wants to ride that line, they know that address, but now they have to type it into their browser instead of hoping those images show up in their feed.
They didn’t defederate the instance for the content alone, but specifically for being on board with intentionally making content seem like CSAM. That’s a long, long step beyond the subject of an image just looking less than 18.
specifically for being on board with intentionally making content seem like CSAM. That’s a long, long step beyond the subject of an image just looking less than 18.
Do we actually know what the community was about? I haven’t seen anything remotely as described on lemmynsfw. I have no clue as to how the admins here ran into such a thing.
We also don’t know what the admins talked about between themselves. I’m not buying it. Seems more likely that the admins here personally didn’t like whatever that was, and added the CSAM label to stop any discussion (cuz if you raise any question you must be a pedo right?). It’s their instance they can do as they please.
Another more feasible thing is that they just don’t want to deal with NSFW content. Just like Reddit & imgur did under the guise of ‘protecting users’.
Both of those seem way more likely and with precedent than admins of a server willingly fostering CSAM-like stuff
Their sidebar states
All things adorable but NSFW. Adorable: lovable, sweet, cute; inspiring great affection; delightful; charming;
If you like fresh, young starlets, this is the place for you!
deleted by creator
Being real? I have pretty much all of the porn C/s blocked now, so I had to go look specifically for what the problem was.
Nobody with eyes and basic reading comprehension could mistake anything on that community for underage people. The instance has very clear rules about age verification, and the posts all followed them.
Someone seeing them as underage would take either horrible ability to discern age, or just being so wound tight that they weren’t willing to consider otherwise.
Which, again, this is their instance, they can make the decision for no reason at all, and I’m okay with that. It’s just that the stated reason is, bluntly, malarkey.
Kinda glad that server is blocked. I’ve had to block over 100 subs from them over the last few weeks.
The amount of porn that was coming in on my feed was crazy.
Fr, I love browsing all but there is a LOT of porn there.
To be fair, it IS a porn instance. I think I would rather blame the sorting algorithms of lemmy that allow popular communities, including memes, to spam the All page due to the sheer number of posts and hide smaller ones.
I feel like I’ve blocked like 5+ different yiff subs that were almost all the exact same name lol
Hey now, pregnant yiff is clearly way different than thick lady yiff, c’mon.
I’m confused on why you didn’t just turn off NSFW if you didn’t want porn in your feed?
that also hides things that aren’t porn and are just marked as NSFW because there’s a wound or a mention of violence, for example.
Like, I still want to see a lot of non-porn NSFW posts. Lemmy REALLY needs specific tags for content so we can filter things properly.
It’s not just porn that uses the NSFW flair. I really wish Lemmy would adopt a new flair, since reddit had this exact same issue.
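To make the difference concrete, here’s a small TypeScript sketch comparing the single NSFW boolean Lemmy has today with a hypothetical tag-based filter. The `ContentTag` type and the `tags` field are invented for the example; nothing like them exists in Lemmy right now.

```typescript
// Illustrative sketch only: posts currently carry a single boolean `nsfw`
// flag, so hiding NSFW hides gore *and* porn alike. The tag-based shape
// below is a hypothetical alternative, not an existing Lemmy feature.

type ContentTag = "porn" | "gore" | "violence" | "spoiler";

interface Post {
  title: string;
  nsfw: boolean;        // what exists today: all-or-nothing
  tags?: ContentTag[];  // hypothetical finer-grained labels
}

// With only the boolean, the choice is hide everything NSFW or see everything.
const hideAllNsfw = (posts: Post[]) => posts.filter((p) => !p.nsfw);

// With tags, a user could hide porn while keeping, say, injury/violence posts.
const hideTags = (posts: Post[], blocked: ContentTag[]) =>
  posts.filter((p) => !(p.tags ?? []).some((t) => blocked.includes(t)));

const feed: Post[] = [
  { title: "First aid for deep cuts", nsfw: true, tags: ["gore"] },
  { title: "Some porn post", nsfw: true, tags: ["porn"] },
  { title: "Cat pictures", nsfw: false },
];

console.log(hideAllNsfw(feed).map((p) => p.title));
// ["Cat pictures"] — the boolean filter also drops the first-aid post

console.log(hideTags(feed, ["porn"]).map((p) => p.title));
// ["First aid for deep cuts", "Cat pictures"] — only porn is hidden
```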
Right!? I was desperately wishing for a way to block the whole instance so this is great news for me
Thanks for the feature idea! I’ll add a “Block Instance” function to my app.
I guess Trans Littles can just go fuck off then? One of the biggest Trans comic artists is openly a little. Why are we in the business of regulating what consenting adults do?
Don’t be disingenuous. Genuine consent practices also consider that not everyone else consents to witnessing their play, so they don’t do it where it’s not welcomed. And it’s not welcomed on Blahaj Zone, in this case. That’s all.
Excuse me, but you’re the one being disingenuous. A NSFW instance had what!? Porn!? Stop the fucking presses. Are we going to defederate from all porn instances or just the ones you find icky? Where can I post my objection to having to be subjected to porn at all?
Let me know when you find out what the word “disingenuous” means.
Responses are pointless, I’m going to remake my account in another instance when I’m done with work. Enjoy insulting people and living your prudish dreams
I think it’s a little more complicated than that. If it were just a matter of not consenting to seeing their play, that community would be blocked. But instead, the entire instance has been defederated, so that’s not really a fair comparison.
I’m sorry, could you rephrase that? I’m not sure I understand
Leigh made an analogy to how in kink communities, it’s generally not cool to involve people in your kink who didn’t consent to it, and that defederating was basically just Blahaj Zone saying that they don’t consent to seeing any sort of ageplay-adjacent content.
I’m saying that’s a bad analogy, because they could just ban the offending content if that was the only concern; instead they’ve banned the entire instance by association. I’m not saying it’s a bad call, just that it’s a step beyond “don’t involve me in your kink”, it’s now “I don’t want to see anyone who lives in the same house as you while you do your kink, even if it has nothing to do with your kink.”
It goes beyond even that, actually. This isn’t one individual making the decision for themselves, this is one individual making the decision for their entire household. “I don’t want anyone who lives in the same house as me to see anyone who lives in the same house as you, because you did your kink, even if it has nothing to do with your kink.”
At some point the metaphor starts falling apart a bit…
Ah, I see. I agree
Is it actually possible to instance-level ban a community that’s hosted by another instance without defederating? I’m under the belief that it isn’t, but if I’m wrong on that, then I think I’d agree with you here.
It is possible as far as I’m aware, I think Ada mentioned doing it before.
That makes this all so much more infuriating then!
No one is looking at a little and thinking that they’re physically 15.
I wrote a comment but it got more aggressive than I intended. My overall point though is there are young looking adults, and there are old looking kids. Making a sweeping statement like you did is just wrong
Young looking adults also aren’t the issue.
The issue is a community that focuses heavily on models that are framed to look like they’re not adults.
Not adults roleplaying. Not adults that incidentally happen to look younger than they are.
But they are adults, no?
Again, the issue is a community with models that are framed to look like they’re not adults.
There is no scenario where something that can be mistaken for CSAM will have a space here.
And again, these are adults on an instance that was explicitly designated for NSFW works. Defederating was entirely within your right but these justifications seem really poorly thought out, and could have unintended consequences.
Should we shun non consensual play? Should we defederate from anything that shows BDSM? Because I can’t see any reason why your justifications wouldn’t apply to them
Defederating was entirely within your right but these justifications seem really poorly thought out, and could have unintended consequences.
You disagreeing with the decision doesn’t make it poorly thought out.
Because I can’t see any reason why your justifications wouldn’t apply to them
There is no outcome here that leads to me saying “Ah, good point, this makes me ok with content that can be mistaken for CSAM”
Hypotheticals and what ifs do not change the fact that I encountered something that looked like CSAM, and when I looked at the community in question, I encountered more of it.
That’s a hard no.
Removed by mod
The topic was about porn of a straight couple, but be a disgusting gremlin I guess