Discuss Scratch

ladnat
Scratcher
66 posts

Ideas about moderation. Help ST stop being overworked!

Ideas About Moderation

There's a lot of talk about Scratch moderation. Many users claim that lots of users/projects were unfairly banned. And that could be the case. Other users say that the majority of the users under a ban are just saying that to not take responsibility for their actions or are simply unaware. This could also be true. I understand that the community guidelines and the other unspoken rules are strict since this platform is meant to house people ages 8 and up. The problem is, no one knows what the actual situation is. Not even the Scratch Team.

ST member Paddle2See wrote:

And, as others have pointed out, I don't think the current error rate is very high. But I don't have statistics to back that up.
quote origin: here

The only true way we would know how many Scratchers were unfairly banned is if we looked at every single report, every single action, and all the context behind each, and we don't have time before the heat death of the universe for that.

So what could we do?

We need a plausible solution for a non-profit organization with minimal staff. So far I have two solutions to the permanently understaffed situation, and here are the pros and cons (that I can see) of both.

Volunteers

Already I see the biggest problem. "No no! You can't just trust anyone and everyone to be able to remove projects and users!" But I have a few workarounds. Make it teams of two; two people are better than one. They'll be in communication with each other and have the community guidelines on hand. Better yet, they could be equipped with a list of all the reasonable grounds on which something could be taken down. However, even teams of two could be problematic. That's why the power of volunteers would be limited:
  • They can censor projects but have no ability to delete them.
  • They can give detailed warnings/alerts about what specifically is wrong.
  • They both have to agree in order to make a decision. (In the event of a tie, a Scratch moderator will settle it.)
  • They will not have the ability to take rash action. (As in deleting accounts, forever muting a user, or any other strict punishment) Any large issue will have to have an official Scratch moderator review it.
Another refinement of the volunteer system is to shuffle the teams regularly. That way good volunteers are less likely to be cut off, and the appropriate action is more likely to be taken. To keep volunteers from repeatedly taking false action, a user can appeal to the ST to change or remove a punishment. If the ST finds that volunteers are taking false action, it will give those volunteers a warning. If a project/user/comment is reported and no action was taken when it should have been, the volunteers who took no action will likewise receive a warning. Three strikes and the bad volunteer is permanently removed.
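To make the rules above concrete, here's a minimal sketch in Python of the two-volunteer decision flow. Everything here (the function names, the vote labels, the strike bookkeeping) is invented purely for illustration; it's not any real Scratch system.

```python
# Hypothetical sketch of the two-volunteer review rules described above.
# All names and labels are made up for illustration only.

STRIKE_LIMIT = 3  # "three strikes and the bad volunteer is permanently removed"

def review_report(vote_a: str, vote_b: str) -> str:
    """Both volunteers must agree; otherwise an ST moderator settles it."""
    if vote_a == vote_b:
        return vote_a            # e.g. "censor", "warn", or "no_action"
    return "escalate_to_st"      # tie: an official Scratch moderator decides

def record_strike(strikes: dict, volunteer: str) -> bool:
    """Add one warning; return True if the volunteer is now removed."""
    strikes[volunteer] = strikes.get(volunteer, 0) + 1
    return strikes[volunteer] >= STRIKE_LIMIT
```

Note that there's deliberately no "delete" vote here, matching the limited powers in the bullet list above.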

This is all just an idea. It doesn't have to look exactly like this.

Pros
  • The ST would have more time to spend on harder cases, reviewing users' requests to get their accounts back, and understanding more context (in theory).
  • The response time should improve.
  • It would certainly satisfy many users' complaints about moderation (again, in theory).
Cons
  • This type of system could still be exploited by malicious users. Polish will be necessary.
  • The system might give the ST more work to respond to instead of less. (Of course, it relieves some weight but could add more than it removes. It depends how much power the teams are given.)
  • Kids would likely try to help moderate content unsafe for them.

Better automated systems

Another solution could be to implement an automated system. Just look at all the latest advancements in artificial intelligence: AI that can tell you what's in any given image, text-predicting AI that seems to have some common-sense reasoning, and more! This is the safer solution. Instead of entrusting the public with a good chunk of reports, why not an AI that can learn how to moderate Scratch consistently and evenly? The biggest issue is creating such a program. That takes resources that Scratch, being a non-profit, might not be able to afford. They'd need to train the AI to recognize what's good, what's bad, and (if possible) why it was bad. As far as I can tell, Scratch is understaffed, so devoting time to this would take a hefty toll. It could be a fantastic plan if it weren't so costly.
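As a toy illustration of the "train it on good and bad examples" idea, and of the bias pitfall listed below, here's a tiny word-count classifier. This is not a real moderation model and nothing here is from Scratch; it just shows that the verdicts are only as balanced as the training data.

```python
# Toy word-count classifier: illustrates "train on good vs. bad examples"
# and why skewed training data gives skewed verdicts. Not a real model.
from collections import Counter

def train(examples):
    """examples: iterable of (text, label) pairs -> per-label word counts."""
    counts = {"ok": Counter(), "bad": Counter()}
    for text, label in examples:
        counts[label].update(text.lower().split())
    return counts

def classify(counts, text):
    """Pick the label whose training words overlap the comment the most."""
    scores = {label: sum(c[w] for w in text.lower().split())
              for label, c in counts.items()}
    return max(scores, key=scores.get)
```

If the "bad" examples mostly cover one kind of rule-breaking, everything resembling it gets flagged and everything else slips through; that's the skewed-training con in the list below.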

Pros
  • ST will definitely have more time to solve hard problems. (Once the system is applied and running.)
  • The time it takes for action is almost immediate.
  • It would be able to judge every report evenly.
Cons
  • It is a costly operation in both time and money. (Developers need to be paid, after all.)
  • It could still end up being biased if training is skewed.
  • It still wouldn't be able to look at or understand the context of every given situation.
  • Heavy on CPU power; could slow the site as a result. -@Zydrolic (#4)

Conclusion

Overall, I'm trying to suggest ideas to give the ST a break. Appeals have been known to take forever to be looked at, and with the ST's lack of time, that's understandable. A related problem is that since the ST has no time to waste, they can't fully look at all the context behind a user's ban or mute, and they might skim over their reasons for taking action. And so I propose we try to find a solution to the ST's lack of resources.

Ideas or constructive criticism? Reasons why you do or don't agree?
Share them below! Let's see if we can improve Scratch's resource budget. There's a lot more to Scratch than just moderation.

Last edited by ladnat (March 18, 2025 21:28:14)

BigNate469
Scratcher
1000+ posts


Partially rejected:

TOLORS wrote:

7.4 Allow Scratchers to moderate the website
A community moderator program used to exist where Scratchers could moderate the website, but it was removed due to some very inappropriate things showing up in the report queue, among other reasons. A volunteer program could also encourage trolls to use moderator tools to cause problems on the website. As a result, the Scratch Team has decided to only allow adults to moderate the website, including the forums, and those adults would need to join the Scratch Team as a paid position in order to moderate the website.

If you are 18 or older, legally allowed to work in the United States, and are interested in moderating the website, check out the “Jobs” link at the bottom of the website to see if there are any openings for the “Community Moderator” position.

Also, AI is highly unreliable and requires a lot of processing power to run.

Then again, there is something you can do to help with this.
ladnat
Scratcher
66 posts


BigNate469 wrote:

Partially rejected:

TOLORS wrote:

7.4 Allow Scratchers to moderate the website
A community moderator program used to exist where Scratchers could moderate the website, but it was removed due to some very inappropriate things showing up in the report queue, among other reasons. A volunteer program could also encourage trolls to use moderator tools to cause problems on the website. As a result, the Scratch Team has decided to only allow adults to moderate the website, including the forums, and those adults would need to join the Scratch Team as a paid position in order to moderate the website.

If you are 18 or older, legally allowed to work in the United States, and are interested in moderating the website, check out the “Jobs” link at the bottom of the website to see if there are any openings for the “Community Moderator” position.
That's why I was trying to find a workaround. I understand that malicious users would attempt to take advantage of it; that's why the other volunteer and the ST keep them in check. Buuut the best solution is to make the teams larger while keeping an even number (so ties can be settled by the ST). People have less power when put in a team.
Zydrolic
Scratcher
1000+ posts


BigNate469 wrote:

Also, AI is highly unreliable and requires a lot of processing power to run.
I'd like to add: it's not just processing power we're talking about. We're talking about possibly over a dozen million prompts/requests to it daily, which may end up straining the servers quite a bit, especially since each one has to wait for a response that is slower in general.
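For a sense of scale, quick arithmetic on the dozen-million figure above (that figure is the poster's rough estimate, not real traffic data):

```python
# Back-of-the-envelope load math for the estimate quoted above.
requests_per_day = 12_000_000          # "over a dozen million" requests
seconds_per_day = 24 * 60 * 60         # 86,400 seconds in a day
avg_rps = requests_per_day / seconds_per_day
print(round(avg_rps))                  # ~139 sustained model calls per second
```

And that's the average; peak-hour traffic would be several times higher.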

ladnat wrote:

Is why I was trying find a workaround. I understand that malicious users would attempt to take advantage of it. That's why the other volunteer and ST are keeping them in check. Buuut the best solution is to make the teams larger but still an even number. (So ties can be settled by ST) People have less power when put in a team.
Fake your IP, spoof your user-agent, and you can easily concoct a recipe for disaster, even if it may take long to boil.

This site isn't invasive to the point of sending away users who don't want their IP logged with every fetch, nor does it even ask you to disable an adblocker, even if all you really block is Google Tag Manager.

I'd surmise not a lot of people would love having to see the opposite on a site that did the opposite for several years without end.
ladnat
Scratcher
66 posts


Zydrolic wrote:

I'd surmise not a lot of people would love having to see the opposite on a site that did the opposite for several years without end.
I'm not sure what you are saying here. Could you rephrase that?

Zydrolic wrote:

Fake your IP, spoof your user-agent, and you can easily concoct a recipe for disaster, even if it may take long to boil.
I mean, yeah. You can't exactly block a determined somebody forever. But, OOhhhhh, I see what you're saying now. One guy spoofs the user-agent and poses as multiple volunteers simultaneously. I see. If the teams are shuffled, and who they get paired with is out of the volunteers' control, they'd have to brew up thousands of alts, and they might, just might, get full control of a few reports. That feels like a difficult task even for the most determined.
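The shuffling argument above can be checked with quick probability math. The numbers below are purely illustrative; k and N are made-up values, not anything about the real volunteer pool.

```python
# Back-of-the-envelope check of the shuffling argument above: if a troll
# controls k of N volunteers and pairs are drawn at random, the chance a
# given pair is ENTIRELY troll alts is (k/N) * ((k-1)/(N-1)).

def both_seats_troll(k: int, n: int) -> float:
    """Probability both members of a randomly drawn pair are the troll's alts."""
    if k < 2:
        return 0.0
    return (k / n) * ((k - 1) / (n - 1))

# e.g. 20 alts hidden in a pool of 1000 volunteers:
print(both_seats_troll(20, 1000))   # ~0.00038, roughly 1 pair in 2600
```

So with random pairing, a troll needs to control a large fraction of the pool before they reliably own both seats of any report.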
Zydrolic
Scratcher
1000+ posts


ladnat wrote:

Zydrolic wrote:

I'd surmise not a lot of people would love having to see the opposite on a site that did the opposite for several years without end.
I'm not sure what you are saying here. Could you rephrase that?
Having to watch a site that doesn't even have to block adblockers (which, they wouldn't even be able to, thanks to European regulations, God bless) do so anyway, or block VPNs, just because it cuts down the amount of moderation work necessary.
It's just looking at it from the privacy side of things. (Sorry if I sound like gibberish; sleep fatigue takes its toll, so I'm getting off right after this lol)
I mean, yeah. You can't exactly block a determined somebody forever. But, OOhhhhh, I see what you're saying now. One guy spoofs the user-agent and poses as multiple volunteers simultaneously. I see. If the teams are shuffled, and who they get paired with is out of the volunteers' control, they'd have to brew up thousands of alts, and they might, just might, get full control of a few reports. That feels like a difficult task even for the most determined.
One slot wouldn't be enough with the system you suggest, however, as the other volunteer may not be in on the “little trolling”.
Either they hit a triple seven, or they hit a brick wall and the whole thing has to be started over. They're gambling either way.

All-in-all, by the way, I'd argue this would just be more people to supervise, and do note that reports definitely get 18+. With a site promising that its content, including its content moderation, gives an all-ages delivery, I'm not certain how you'd align that.
You can fake your age as easily as ever, and having to show your literal government ID would feel excessively intrusive even if it helps the goal you're trying to achieve.
mingo-gag
Scratcher
1000+ posts


Oh great, another one of those “Scratch Moderation is Bad because I said so” topics

Maybe Scratch Team Isn't so bad after all article

If you have an issue with Scratch moderation, then use the Contact Us button.
50st
Scratcher
8 posts


BigNate469 wrote:

Partially rejected:

TOLORS wrote:

7.4 Allow Scratchers to moderate the website
A community moderator program used to exist where Scratchers could moderate the website, but it was removed due to some very inappropriate things showing up in the report queue, among other reasons. A volunteer program could also encourage trolls to use moderator tools to cause problems on the website. As a result, the Scratch Team has decided to only allow adults to moderate the website, including the forums, and those adults would need to join the Scratch Team as a paid position in order to moderate the website.

If you are 18 or older, legally allowed to work in the United States, and are interested in moderating the website, check out the “Jobs” link at the bottom of the website to see if there are any openings for the “Community Moderator” position.
As BigNate469 said, this is rejected.

Aah wrong account

Last edited by 50st (March 18, 2025 23:25:04)

MagicCoder330
Scratcher
1000+ posts


mingo-gag wrote:

Oh great another one of those “Scratch Moderation is Bad because I said so”

Maybe Scratch Team Isn't so bad after all article

If you have an issue with scratch moderation then use the contact us button.
you did not read the OP.

I don't think the problem here is really people being bad actors (although that is a problem, and you guys have valid points about it).
I think the problem is that reports are for removing anything bad, and “bad” can constitute anything from simple tag spam to certain kinds of adult images. Kids shouldn't be seeing this.

If anything, I would restrict it to ONLY “exact copy of project”, “scary”, “uses image/art without credit”, “is disrespectful to a scratcher or group”, and “I am worried for creator's safety.” This would keep inappropriate content to a minimum. Again, it does not address the problem of people pulling a Cult of the Lamb once they are elected, but it could make it more feasible.
Sasha-mouse
Scratcher
500+ posts


(deleted, but created a new comment. This happened because of a bug: I wanted to edit the old one, but instead Scratch posted a new comment.)

Last edited by Sasha-mouse (March 19, 2025 15:04:56)

Sasha-mouse
Scratcher
500+ posts


Zydrolic wrote:


All-in-all by the way, I'd argue this would just be more people to supervise, do note that reports definitely get 18+. With a site promising to have it's content, including it's content moderation, attempt to give an All Ages delivery, I'm not certain how you'd be aligning that.
You can fake your age easy as ever, having to show your literal government ID would feel excessively intrusive even if it helps to the goal you're trying to achieve.
But Scratchers must see 18+ content to report it (90% of the time).
Zydrolic
Scratcher
1000+ posts


Sasha-mouse wrote:

But Scratchers must see 18+ content to report it (90% of the time).
Yes, however they aren't the ones who have to witness every report made, many of which are possibly about more 18+ content than the user reporting it sees.

MagicCoder330 wrote:

I don't think the problem here is really people being bad actors (although that is a problem, and you guys have valid points about it).
I think the problem is that reports are for removing anything bad, and “bad” can constitute anything from simple tag spam to certain kinds of adult images. Kids shouldn't be seeing this.

If anything, I would restrict it to ONLY “exact copy of project”, “scary”, “uses image/art without credit”, “is disrespectful to a scratcher or group”, and “I am worried for creator's safety.” This would keep inappropriate content to a minimum. Again, it does not address the problem of people pulling a Cult of the Lamb once they are elected, but it could make it more feasible.
Yup, true. That aside, people may agree on things that wouldn't even need to be punished, regardless of how we align it, which is also a downside.
I'd argue, though: maybe don't allow “I am worried for the creator's safety”? That's… going into content that may already be forbidden by the DSA, and I doubt two people (whose ages we don't know, keep in mind; we have no idea how old a user being granted such permissions is, which is also a point worth discussing) would really have the mental strength to properly tell the creator to call a (mental health assistance) hotline like they should, let alone view that content without risking a downward spiral themselves, depending on what the content really is and whether the intent is to vent out bad feelings or to give people bad feelings.
VedanshS933
Scratcher
1000+ posts


Youth Advisory Board. What does that do then?
GratefulGamer9398233
Scratcher
1000+ posts


VedanshS933 wrote:

Youth Advisory Board. What does that do then?
From Google:
“…Taking part in consultation work by listening to other young people and hearing their views…”
BigNate469
Scratcher
1000+ posts


VedanshS933 wrote:

Youth Advisory Board. What does that do then?
From the announcement about it:

ceebee wrote:

What is the Youth Advisory Board?
The Youth Advisory board is a new program designed to enable youth leaders within the Scratch online community to collaborate directly with the Scratch Team, providing insights from a youth perspective and contributing to the continued growth of Scratch.
TL;DR they're Scratchers that serve as advisors to the Scratch Team.
kip23s
Scratcher
500+ posts


bump
coder2310
Scratcher
100+ posts


I agree with many statements here, like how young moderation volunteers would, a lot of the time, be exposed to inappropriate content.

And how AI is expensive to run and hard to teach.

Though the moderation system is slow and flawed, to the point where even the Contact Us button can fail on the odd occasion, and the system could probably do better,

I don't think there are any other options that wouldn't inhibit safety, break a core part of Scratch, or make the system even worse.
SMG4fan7236
Scratcher
100+ posts


ladnat
Scratcher
66 posts


Yeah, uh… I was throwing something at the wall to see if it would stick, but I think the counterpoints overpower this idea.

So… I'm closing this topic as a rejected idea.
