How about Ethical Software?

July 1, 2024
3 minutes
Software

There has been, and should be, a lot of talk about Ethical AI. Over the last several weeks, I have been revising Sturdy’s Ethical AI policy. I am trying to convey that we don’t do shady stuff and won’t let our customers do it, either.

(If you are interested in Ethical AI, we have a webinar coming up at the end of the month; the registration link is in the comments.)

Writing the policy, I realized we need to talk about ethics writ large, not just as it relates to AI.

Consider the case of Allstate and Arity, as reported in a June 9 NYT story, “Is Your Driving Being Secretly Scored?” Allstate apparently owns Arity. Arity builds phone apps for things like finding gas stations. Their apps also track how you drive, although they bury that minor detail in their “consent” pages (that no one reads). They then share this data with Allstate.

Not a lot of gray area here. This is unethical.

My co-founder, Joel Passen, coined this mantra at our first startup 20-ish years ago:

“Build what you’d want to use, sell it how you’d want to be sold, and service it how you’d want to be serviced.”

I don’t think anyone downloading a gas station finder app wants their driving data sent to Allstate. I would not. And I would not build it.

So, instead of an “Ethical AI” policy, I’ve decided we need an “Ethical Software Policy”. It will encompass our use of AI, our platform, and how we expect our software to be used.

Here’s a bit of a summary so far…

Sturdy’s Ethical Software Policy (WIP):

  • Our product is only to be used to improve how businesses make decisions so they can be better vendors to their customers;
  • We will not support use cases that do not directly relate to our problem set. The use cases for our product will be obvious;
  • We do not have ulterior motives for our customers’ data or their users;
  • We will not let any entity, business, government, or person use our product in a way that violates a person’s privacy;
  • We will not score or rank human beings, nor will we allow our product to be used to do so;
  • Our product will be engineered to prevent deception and must never be used to deceive people;
  • Finally, if we feel that one of our customers is using our product in a way that violates our principles, we will terminate their service.

The problem is that many “Ethical Policies” are only as good as the paper they are written on. They are a checkbox on an RFP. None of us want to live in this world. Maybe it’s time to try to live in a better one.

At some point, somewhere along the corporate food chain, executives need to say, “No.”

It is hard to say “no” to revenue. Do the hard things.

Let me know your thoughts.

Steve
