

A Modest Proposal In Support of Free Speech and Expression

Perhaps alternatively titled, "How can we deal with the problem of Censorship on social media?"

To the owners / operators of this site (and all social sites):

If your goal is truly to value 'free speech' and the 'marketplace of ideas', then why not simply take a hands-off approach (apart from actually illegal speech or content) and provide controls for the user communities themselves to deal with whatever issues arise?

Add the ability to upvote / downvote in the form of a drop-down button. Simply clicking it would update a 'likes/dislikes' counter, but clicking one of the drop-down items could begin to gently invoke some shielding on the message, giving readers an idea of what they are about to encounter before they click to read it.

So on the upside you might have items such as 'thought-provoking', 'insightful', 'funny' and other things that could help people spot the "better" messages without entirely censoring out the others.

On the flip side, people could click on drop-down items from the 'downvote' side such as 'vulgar', 'trolling', 'nsfw', and the like-- perhaps permit the site members to suggest new ones as they become necessary. Downvotes such as this would NOT remove or censor the item, but simply put a notice over it detailing the various item(s) that have been noted for the post or comment, along with a shield to obscure it from view until a prospective reader chooses for themselves whether to click again to see it or else keep scrolling to avoid it.
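To make the mechanism concrete, here is a minimal sketch of the tag-and-shield idea in Python. The tag names, the threshold of 5, and the notice wording are all illustrative assumptions on my part, not a spec:

```python
from collections import Counter

# Hypothetical tag sets -- the proposal suggests letting members add new ones.
UP_TAGS = {"thought-provoking", "insightful", "funny"}
DOWN_TAGS = {"vulgar", "trolling", "nsfw"}

SHIELD_THRESHOLD = 5  # assumed "magic number" of down-tags before a shield appears


class Post:
    def __init__(self, body):
        self.body = body
        self.tags = Counter()  # tag name -> vote count

    def vote(self, tag):
        """Record one reader's tag; unknown tags are ignored."""
        if tag in UP_TAGS | DOWN_TAGS:
            self.tags[tag] += 1

    def shield_notice(self):
        """Return a warning string once enough down-tags accumulate, else None.

        The post itself is never removed -- only obscured behind the notice,
        one click away for anyone who chooses to read it anyway."""
        flagged = {t: n for t, n in self.tags.items()
                   if t in DOWN_TAGS and n >= SHIELD_THRESHOLD}
        if flagged:
            return "Community notice: " + ", ".join(
                f"{t} ({n})" for t, n in sorted(flagged.items()))
        return None
```

Up-tags never trigger the shield; they only help readers spot the "better" messages, exactly as described above.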

For groups or posts that are known / recognized to have potentially sensitive content, and/or judged by the community over time to have sensitive content-- such measures could be put in place more 'globally' to shield an entire post and its responses, or even an entire group. But at all times, a reader MAY CHOOSE to click and read/view the sensitive content for themselves, or not--as THEY ELECT. NO CENSORSHIP.

In this way you could support REAL free speech AND recognize that there is a general consensus among the users regarding what is / isn't appropriate without actually censoring anybody at all-- ever. And even better, the social consensus can change over time.

Today the users are more sensitive to this particular set of issues, tomorrow they could be sensitive to some other set of issues. In both cases-- and in all cases, the users themselves, as a community, would be able to decide and GENTLY add some additional measures to avoid exposure to things that-- as a community-- they really don't want to be exposed to-- without censorship. Simply warning and shading, and then allowing a prospective reader to make up their minds about how to proceed accordingly.

Thank you for listening. Please tell me what you think-- could this idea work to reduce the amount of actual censorship and throttling of ideas and opinions?

Final note-- I realize that this topic could be construed as better suited for the "how can we improve this site" thread. And for what it's worth, I put it there too. But I also think the idea is worthy on its merit for 'philosophical' consideration on the larger, more general notion of How can we deal with the problem of Censorship on social media? If you want to comment on that, please do!!

jwhitten 7 Apr 9


4 comments



I don't want to be a contrarian either, but I'm happy with the status quo. Can you add a poll to get a sense of how the majority feel? Thanks.

My own particular interest is in solving the problem as it was presented. I'm not even sure that I want to have it implemented. Except to say that if anything was implemented, I think this might be a workable solution to prevent censorship. Permit people to self-censor what they don't want to see and let everybody else speak freely. Seems pretty rational to me. (shrug) I'll think about the poll. Someone else asked me to do it too.


Let users block, let users create lists of those they block with reasons for each block, and let others subscribe to those block lists.

Then people can block because they see something they don't like, and people can let others do their blocking for them (think SPLC fans) and the rest of us can just talk. No censorship needed. Admin can maintain a block list that people can subscribe to, etc.
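A rough sketch of how subscribable block lists might work, assuming plain user-name strings (a real site would use account IDs):

```python
class BlockList:
    """A user-maintained list of blocked users, each with a stated reason."""

    def __init__(self, owner):
        self.owner = owner
        self.entries = {}  # blocked user name -> reason given by the list owner

    def block(self, user, reason):
        self.entries[user] = reason


class Reader:
    """A reader who may subscribe to other people's block lists."""

    def __init__(self, name):
        self.name = name
        self.subscriptions = []  # BlockLists this reader has chosen to trust

    def subscribe(self, block_list):
        self.subscriptions.append(block_list)

    def is_hidden(self, author):
        # Hidden only for readers who opted in -- nothing is removed site-wide.
        return any(author in bl.entries for bl in self.subscriptions)
```

The key property is that blocking is per-reader: an admin-maintained list is just another `BlockList` that people may subscribe to or ignore.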

I agree. Read my reply from earlier to 'chuckpo' where I outlined a filtering system that users can apply for themselves to help them filter / block what they don't want to see. It's very much like what you just described.


Okay, not to be contrarian, but systems that give users the ability to down vote become weaponized and even encourage tribalism. Eventually, USERS get down-voted and not each idea judged separately from a user. People become consumed with their 'score', and that's not very useful either. The feature becomes part of the content. This site isn't homogeneous, but it does seem fairly heavily weighted in one direction. IF the owner of the site is interested in making a place for all ideas, I wouldn't be very supportive of a rating system. Because people. I think you make a lot of reasonable points, and it's great you're trying to solve a real problem. I'm just not a big fan of rating systems.

That's why I'm not suggesting anything other than a 'shield' be added if (some number) of votes are made on a particular post or comment. The comment is still there to see / view, but merely a note that other people have considered it thus and the opportunity granted for the reader to continue or skip it at their discretion. Perhaps there could be more done to make it easier to make the judgement, but my thought is exactly in opposition to the type of knee-jerk behavior which occurs on other sites while still giving the members / community a way of dealing with content they collectively find sensitive. Personally I think it's a reasonable compromise. Perhaps some other people, such as yourself, could suggest ways to improve the idea?

@jwhitten, I've had some experience administering discussion forums on the internet. I've never found an answer to the psychological warfare that occurs (bleeding into the real world now, btw). I've seriously looked and tried a bunch of stuff. There just doesn't seem to be a way to win. I totally am with you in intent. I guess an answer is just beyond me.

It's like how do you deal with trolls? Well, clearly nobody has found a solution to these kinds of problems yet. Just too much human variation. I'm all for you if you figure it out. Per the voting systems, their success is dependent upon the people who use them. It seems to always turn into bully/victim or outright clashes, and then ultimately stratification.

I remember years ago getting irate about some forum moderation. Then, when I became the admin, I was like, 'oh'. Hahaha, complete clusterF. It's like babysitting. And, if you apply 'law', then the free-speech people start pressing on the gas. I feel better knowing people smarter than me haven't figured this out yet.

Maybe talking about it more will help. You have an idea. How can that idea defeat the problems within it to win ultimately? One thought, if you mark something 'undesirable', you ensure EVERYONE will read that one, haha. Maybe as you continue to hone the idea, something will spark and I can be of some help. I'm willing--just don't have anything but my own experiences to offer--for whatever that's worth.

@chuckpo Okay, that's a start. And also to point out, I'm not trying to downvote anybody's posts or comments into obscurity. The post could have a billion downvotes and it would still be one click away to view. One possible addition to the concept might be the ability for each user to set their own filtering, possibly even on a post-by-post / group-by-group basis: in some groups you (i.e. the user) would be open to pretty much anything; in other groups-- perhaps ones you've visited before and know can be contentious-- you could set the filters differently to flag more content. In any case, it would still be you, and not the site owner or administrator, who was regulating the content you view, and no censorship would occur.

In my mind's-eye view of this, I come across your post that says 'so and so is a muckity-muck' and that offends my sensibilities. So I can mark it as 'trollish' or 'rude' or whatever I like and the post gets that as a mark. Someone else might see your post and agree heartily and mark it 'insightful' or 'funny' and it gets that mark too. When enough people see it and vote it using one of the 'down' buttons, it eventually hits some magic number and the shield is applied, but the post is still available; the shield simply tells you that a number of people in the community have all marked it in a down way for the following reasons... (whatever the list).

You have it in your power to:

1. Turn off the shields completely and ignore the rating system. No censorship is applied.

2. Turn on whatever combination of filters help you make it through the day. No censorship (beyond your own) is applied. Other readers may continue to read the post freely as they choose. You (and others like you) will be the only ones with your heads in the proverbial sand...

3. Turn on filters for the entire post. Same result as #2. No actual censorship is applied.

4. Turn on filters for the entire group-- or site-- or user-- whatever you like, but no censorship is actually applied. Everybody else continues to be free to view-- or not view-- the content as they choose according to whatever filters they've selected-- and you could even go a step further to include thresholds as well-- so if it's a little bit 'raunchy' you'd be okay with it, but if it's downright filthy and obscene-- well, if you're like me, you'd push that one up to the top of the list to read first... but I digress, no actual censorship is applied.

Even if the site or moderators "do you a favor" and pre-select your filters for you, so that by default you are presented with a pretty generic and milquetoast view of the site, you could still adjust your filters however you want to view the content in whatever way suits you, and no actual censorship is applied.

Only in the actual event of real, honest-to-gosh Illegal content would site administrators step in and remove / delete actual content.

IMO, you could even tell advertisers to go pound sand if they object because nobody has to read / view anything that they don't want to that way. They see everything through their own filters and NO CENSORSHIP needs to be applied.

So take THAT Zuckerberg. Up your fat, fascist behind!

@jwhitten, the system you're talking about sounds technologically cool. My hesitation continues to be that people may not use it AND people will find a way to weaponize it. We have a 'block user' feature here now. I almost used it because in the last day or two I had a member of this site baiting me personally. But, then I think maybe this guy isn't a choad all of the time--maybe he'll contribute something at some point that I'll find of value. It's an inconvenience for sure, but what is more important, dealing with worthless personal junk or the possibility of learning something?

Then, there are the side-effects of blocking. Sometimes you lose context in discussion because people are responding to this part you're not seeing, and you don't know what they're responding to. It changes the flow of threads.

So, I feel like negative-Nick. I want to be positive about your endeavor. It's something I've thought a lot about. And, the psychology of what's going on is really fascinating. Why don't you ask everyone? Create a detailed survey (poll). Ask members if they use the block function, and why they do or don't. Ask them if they can see themselves using your system. Ask them for suggestions that would improve the system or ideas that help to solve the problem. Ask them general questions about the problems--trolls, baiting, flaming, undesirable content (overtly racist, sexist, homophobic, blah, blah, blah). Still wonder if tribes will band together to make the other side pay by voting as a bloc to 'filter' someone. Being filtered becomes a social status. 'Undesirables have their posts filtered.'

OH, I did have this technological idea for you to consider. What about the dating site feature of swiping right or left--or even up and down--as a way to add your 'filter'. Maybe this will spark something for you.

If you pull this off, you win the internet...

@chuckpo, @Admin May I point out that if you block someone because you 'think they're a choad', that's still you doing it, so the consequences of your action-- the experience you have-- is still your own. And, I would add, you have the ability to undo it in order to expand your horizons again at any time. No censorship required. Perhaps what is needed is to add the ability to permanently block a user, block a user for a thread or a group, or to block a user for some specified period of time, which could be conveniently suggested in a drop-down list if you're feeling uninspired and can't think of an appropriate period on your own...

"Effects of blocking"... Yup. Sooner or later you have to make a choice. Do you want to see it or don't ya? The world is not a tame place, despite our best efforts to put up handrails for children (not implying you're a child), there are consequences that accompany the choices we make.

"Negative Nick"... Here's the thing. If you put these elements in place, you don't have to use them. Neither does anybody else. Perhaps just having them might help some people who feel 'triggered' now and then feel like they have some small amount of control over what they're seeing, thinking and feeling in response to some content they've read or viewed. Simply marking a post with one of the 'downvote' buttons won't censor the comment, but rather mark it with a note regarding your feeling / assessment of it. Other people can CHOOSE to set their filters in such a way that they can (A) ignore your markings or (B) include your markings. They may even choose to set their filters more strongly than yours so their view is even more restricted and myopic 😉

EDIT / AFTERTHOUGHT-- it's possible that some downvote buttons would invoke the shield (originally I had thought that maybe all of them apart from the simple numerical counter would invoke it). But in mulling over the 'Filtering' aspect as the second half of the scheme, it occurs to me that the shield doesn't really have to be invoked for anybody except for people with the same filter settings. So if you're not concerned about 'NSFW' for instance, or maybe 'GRAPHIC CONTENT' might be another one-- you will never experience the shield, though you would see the tags and the numeric counts. That way anybody can tag, and it operates immediately for them (depending on their filters), but it doesn't affect anybody else unless they want it to-- again via their filter settings.

Asking everybody... good idea. I'll need to write it up more concisely, I think. I'll give it some thought. Perhaps one of the @Admin 's would weigh in as well. Don't know how to attract one or call one over though... maybe the '@' thing will work its magic...

'Swiping'... not everybody is using this site as an app. I'm using it in my desktop browser for instance. I don't like phone apps as much, at 56 I don't see as well as I used to...

BTW, thanks for all of your really useful thoughts and feedback. I think you have definitely helped flesh out some of these ideas and helped shape and refine them quite a bit. If I win the Internet, you'll deserve at least half! 😉

MORE AFTERTHOUGHTS...

The filters could be set up with a numerical threshold-- perhaps with the assistance of a drop-down to help people make useful choices. One choice might be any notice, just 1 would do it. Another choice might be some specific number-- 5, 10, 25, 100-- whatever. A third option might be some percentage of the number of viewers, 10%, 25%, 50%, etc. So you could really fine-tune the filter settings for your personal level of comfort.

Speaking for myself, I would probably want the 'NSFW' and 'GRAPHIC CONTENT' filters to be on all the time, regardless of the time, place, thread or group. I might want to filter out 'TROLLING' or 'VULGARITY' if 50% of the viewers agree. And so on.
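The threshold idea above could be sketched like this. It's a toy model; the tag names, numbers, and the percentage-of-viewers rule are illustrative assumptions, and the point is only that the shield trips per reader, according to that reader's own filters:

```python
class Filter:
    """One reader's filter: shield a tag at an absolute count or a % of viewers."""

    def __init__(self, tag, count=None, percent=None):
        self.tag = tag
        self.count = count      # e.g. shield after just 1, 5, 25... votes
        self.percent = percent  # e.g. shield after 50% of viewers tag it

    def triggered(self, votes, viewers):
        if self.count is not None and votes >= self.count:
            return True
        if self.percent is not None and viewers > 0:
            return 100.0 * votes / viewers >= self.percent
        return False


def shielded_for(reader_filters, post_tag_votes, post_viewers):
    """Return the tags that trip a shield for THIS reader only.

    Readers with no matching filters never see a shield at all; the tags
    and counts stay visible to everyone, and nothing is removed."""
    return sorted(f.tag for f in reader_filters
                  if f.triggered(post_tag_votes.get(f.tag, 0), post_viewers))
```

So my own settings above would be roughly `[Filter("NSFW", count=1), Filter("GRAPHIC CONTENT", count=1), Filter("TROLLING", percent=50)]`, while another reader could run with no filters and see everything unshielded.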

When you join the site, you might get a generic and pretty tame set of filters so that you have some handrails in place. But on the other hand, you're gonna miss some stuff too-- no censorship though, just 'shields'-- unless maybe you elect not to see 'shields'. But even then I would suggest that there should be a note somewhere indicating the total number of posts (or whatever) you're NOT seeing based on your current filters along with a helpful on-the-spot button to whatever dialog box helps you re-select them.

YET MORE...

Getting back to your thought about whether people would use them, consider that most people would only tend to use a feature like that if some content they read or viewed had a real and severe emotional impact on them. Everybody not so affected would probably NOT do anything in particular, unless maybe they LIKED the content, in which case they could use one of the upvote buttons. So the filter settings are not just for negative reactions; they can also be used to home in on the more positive, upbeat, informative, funny, insightful or whatever posts. Some of these ideas have been around a while and been successfully implemented on other sites-- such as Slashdot. While my ideas, what we're talking about in this thread, do not mimic Slashdot completely, there is some underlying similarity in some aspects.

Okay, I'm really done editing this comment now! 😉

@jwhitten, just one final thought. You creating something that works isn't dependent at all on me being negative or positive in any way. What do I know? I'm just a clown living in central Texas, and I'd rather you be right than me--that's honest. I'll check out slashdot.


This is a pretty solid take on things.
