12 Comments
James Hanlon

I think a lot of the controversy has stemmed from leadership feeling their oats about taking an ideological stand + people misunderstanding the way content moderation works on Substack. It's a hamstrung system. They don't want to have to deal with a lot of internal moderation, so they outsource as much as possible to users and publication owners with the Report feature.

Which would be okay if they made it clear to everyone that there actually is a way to get content removed via Content Violations, but they bent over backwards to take the absolutist position. I mean, it seems like it would have calmed people a little if Hamish literally just explained how to actually get problematic publications removed. He could have included the link in his response and made everyone look like they were overreacting!

NickS (WA)

Yes, I mostly agree with Noah's comment: https://substack.com/@noahberlatsky/note/c-47045733

"Instead the substack founders grabbed a large blunt object and started to smash themselves in the face with it. it’s completely unnecessary."

There are a couple of cases in which I can understand why the founders are sticking to their guns. (For example, I do understand why Chris Best said, in his interview, that he didn't want to discuss hypothetical moderation scenarios . . . that's an easy path to going down a rabbit hole. But he should have added, "if you want to follow up and get clarification about Substack policies, I will give you the name of someone to contact after the interview.")

But it's _remarkable_ how frequently their public statements sound like they haven't given the question more than 20 minutes of thought and are just extemporizing about free speech.

That said, I think there's an encouraging side to that, simply because there are a lot of easy gains to be made by publicizing the existing resources (like the content violation form) and pushing Substack for more transparency.

Lately, when talking to people who aren't the most committed free speech people but who start from a baseline intuition of "why not let everyone do what they want and only intervene _after_ it becomes a problem," I've been pushing harder by saying, "Substack already does moderation. Even if you don't think they should do _more_ of it, wouldn't it be good for them to do it _better_ and more carefully?"

As an aside, I want to thank you for your writing on the topic. I appreciate the long posts that you've put together that are well documented and well linked.

Also, this discussion reminds me of an example which I might write up as a short post.

James Hanlon

Thanks Nick, I appreciate your takes as well. I'm also trying the "earnestness" angle πŸ˜‚

Ultimately I think if any serious moderation ends up taking place here on Substack, it's not going to come from people employed by Substack. It's going to be unpaid volunteers going out of their way to find the darkest corners and shining a light. If the best we can do is equip them with better flashlights and spread awareness, it's something at least.

CharleyCarp

Agreed.

I would just add that Substack shouldn't be led to believe that they've brought the thing to a close.

NickS (WA)

I agree, and for what it's worth, the particular point _I_ want to emphasize right now is: "distinct from the question of _what_ should be moderated, it's really important to try to do moderation _correctly_."

At the very least: (1) the interface to report items should be clearer, so people know whether they're making a report to Substack Trust & Safety or to a Substack author; (2) when posts do get flagged for violations, the person who wrote them should be informed, and there should be an appeals process; and (3) there should be (monthly? quarterly?) transparency reporting on moderation.

That wasn't where I would have started, but as I was reading on the issue I came across the "Santa Clara Principles," which are mostly concerned with protecting free speech, but I think they raise good concerns.

CharleyCarp

I don't get why you think there is more to come. Ownership's ideological commitments are clear. Do you think Casey Newton or Anne Helen Petersen can talk them in off the ledge? Why?

NickS (WA)

This is some progress. I'm not surprised that they will only act in response to complaints, but it is a minimal step. I'm happy that it's been announced so quickly. I'm a little surprised that Casey didn't say anything about the number of accounts. I'm very curious to know what, if anything, this means: "We are actively working on more reporting tools that can be used to flag content that potentially violates our guidelines, and we will continue working on tools for user moderation so Substack users can set and refine the terms of their own experience on the platform."

https://www.platformer.news/p/substack-says-it-will-remove-nazi?comments=true

NickS (WA)

This is quite bad, and certainly evidence that they are really not taking the issues seriously: https://substack.com/@katz/note/c-46840863

NickS (WA)

I am not completely optimistic. I'm glad that Casey and Anne are continuing to press, and I don't think Substack's current position is going to hold up. It is concerning that they now seem to be lying to gain temporary advantage. So I think _something_ else will happen, and it could be good or bad.

CharleyCarp

Yeah, I've been following this on Casey's Discord. I think Substack was looking to do the absolute least they could do while still saying they're doing something. Better than nothing, but I think this'll keep going.

User moderation means you can hide Nazi content from yourself. But Nazis can still organize, and proliferate on the platform undisturbed.

NickS (WA)

Yes, I don't mean to give them any more credit than necessary; I'm still deeply suspicious, but for the moment my response is, "that's enough to work with."

I definitely think they're only acting under duress; but this was also a logical next move for them . . .