“At the end of the day, they are not interested in seeing things in their work timeline that make them uncomfortable, or distracts them from what they’re interested in.”
This attitude is endemic in the tech industry.
https://www.platformer.news/p/-what-really-happened-at-basecamp
@laura there's nothing wrong with that view.
@sneak unless you're making something for public consumption, in which case it's downright irresponsible.
@laura i don't agree that doing the thing you're working on and only the thing you're working on is irresponsible. i think the burden of proof for irresponsibility is on the people who claim it is so.
you're totally allowed to be single-minded and it's not irresponsible. silly, ridiculous, simple-minded, sure - irresponsible, no.
@laura @icedquinn do you believe that people who produce end-to-end encryption systems (where the server can't censor certain messages due to crypto) have a moral obligation to put client-side filtering tools into their clients to avoid being "irresponsible"? or is being content-neutral an acceptable choice?
@laura this is a good concrete example. also, filtering encrypted messages isn't moderation, it's censorship.

do service providers have a moral obligation to censor or not? would remaining completely content-neutral render them "irresponsible" in your view?
@sneak I’m not sure where you got the impression that I think encrypted messages should be censored. I don’t think service providers have an obligation to censor content any more than I think pencils should be banned because you could write harmful messages with them. But I also think that the architecture and design of a platform (usually those seeking to exploit engagement and personal data) should be designed so that people can avoid the content they don’t want to experience from others.
@sneak but none of this is really relevant to Basecamp, where I think the issue is that the leadership don’t want to engage with the social issues affecting their staff and customers, which negatively affects their lives.
@sneak I’m not talking about content moderation, I’m talking about designing for prevention and having an organisation that acknowledges its ability to cause harm.