“At the end of the day, they are not interested in seeing things in their work timeline that make them uncomfortable, or distracts them from what they’re interested in.”

This attitude is endemic in the tech industry.

platformer.news/p/-what-really

@sneak unless you are making something for public consumption, in which case it’s downright irresponsible.

@laura i don't agree that doing the thing you're working on, and only the thing you're working on, is irresponsible. i think the burden of proof is on the people who claim that it is.

you're totally allowed to be single-minded, and it's not irresponsible. silly, ridiculous, simple-minded, sure - irresponsible, no.

@sneak @laura if they truly intend to be apolitical then i can respect that, as long as it's not code for "only be political in ways the company approves of."

@icedquinn @sneak I disagree, but I have a feeling we won’t agree on this. I also believe that there’s no such thing as apolitical. Nothing produced by people for people exists in a vacuum. The only possible exception I can think of is when you produce something for your own personal use and don’t release it publicly.

@laura @icedquinn do you believe that people who build end-to-end encryption systems (where the server can't censor messages, because it can't read them) have a moral obligation to put client-side filtering tools into their clients to avoid being "irresponsible"? or is being content-neutral an acceptable choice?

@sneak I’m not talking about content moderation, I’m talking about designing for prevention and having an organisation that acknowledges its ability to cause harm.


@laura this is a good concrete example. also, filtering encrypted messages isn't moderation, it's censorship.

do service providers have a moral obligation to censor or not? would remaining completely content-neutral render them "irresponsible" in your view?

@sneak I’m not sure where you got the impression that I think encrypted messages should be censored. I don’t think service providers have an obligation to censor content, any more than I think pencils should be banned because you could write harmful messages with them. But I also think a platform’s architecture and design (especially on platforms that seek to exploit engagement and personal data) should let people avoid content from others that they don’t want to experience.

@sneak but none of this is really relevant to Basecamp. The issue there, I think, is that the leadership don’t want to engage with the social issues affecting their staff and customers, and that refusal negatively affects those people’s lives.
