Working for an Ethical Software Company
Some short thoughts on what an ethical tech company can (and should) look like from my views.
Ethics. A governing set of rules that help people determine conduct, according to Wikipedia. These kinds of rules have become more and more important as software continues to "eat the world". Recently, Paul Jarvis asked an open question about what it means to be an ethical software company. The answer is always going to be subjective - people disagree about what counts as "right" or "wrong". For example, I believe imprisonment of any being is wrong. Angela Davis taught me that. Yet we have companies that have no problem working on a mission to accelerate and "effectively" put people into prisons or detention centers. Ethics - subjective as hell.
Any company that wants to be seen as ethical and actually walk the walk has to understand that its impact doesn't end at deploy. It continues into the actions it takes with the money (and resources) it collects from users to sustain its livelihood. This is a line that tech (for some reason) is sometimes willing to ignore, and it's what makes the actual difference between a company that's ethical for show and one that's in the paint.
It has to start with the people in the company. An ethical company values all of the work produced by its workers (think cooperative economics). This ensures that, by design, the company is oriented toward the well-being of its workers and not toward vacuuming up excess profits for shareholders. This focus on sustainable growth spills over into other things like:
- collecting and sharing user data only when it's necessary, not just because you can
- listening and responding to feedback from users about how the product impacts their lives, since workers have a more direct stake in the company's success
- responding to events in workers' lives and supporting their well-being
This definitely comes off as "this company is creeping into their personal life" and yeah, that's a bit of the idea. When companies decided to become "the new social interaction layer" or "the new movie theater", they took on the responsibilities that come with these deep social vectors of people's lives. That obligation has been taken extremely lightly in the tech sector and has resulted in a lot of mistakes and harm for the sake of "innovation" and "scale".
Companies won't be perfect - nothing really is. But they can do better (we're past the saying part). And this ties into what Paul was really asking - what do they do?
They stand behind the values that they profess. If you're a company that works to connect people, don't fund or back things that actively disconnect people from their families. If you're aiming to be a transport layer for ideas, you have to understand that one user is not representative of all, and you have to build tooling for that landscape. You truly have to put yourself in the lives of others (actual empathy) and understand how supporting and working with organizations at scale can impact your users. It's doubly ironic when companies claim to want to reach the "developing world" while indirectly making people's lives there more difficult. This is an obvious jab at companies like Microsoft, Amazon, Google, Facebook and whatever letter you can fit into the FAMG acronym.
Donating money away doesn't matter if it's not proportional to what you take in, nor does it undo the harm initially caused. It's like a conditional apology (and those aren't apologies).
For those who don't wanna read the wall of text above, just try to apply the following:
- Give users complete say over their data on your platforms
- Invest time into not being a member of the Attention Economy
- Understand what you spend money and/or invest in
- Be as transparent about your choices with your users as you would be with stakeholders (users are the validation of your product, after all)
- Pay your workers equitably, fairly, and transparently.
- Avoid surveillance capitalism (it's antithetical to ethics by design - unless your ethics have no problem with the abuse of individual privacy).
- Grow from the mistakes and listen to people who are negatively impacted by those mistakes.
Published by Jacky Alciné.