Design Ethics 

As product developers, we need to recognize that the choice architecture we put in front of users shapes user behavior. Interaction design is all about removing friction to guide a user toward an action. Every design intervention leads someone down a path; there is no neutral design. Even a seemingly neutral chair encourages people to sit, not stand. Therefore, designers and companies face an ethical question: how do we influence user behavior responsibly?




Designs can be either “manipulative” or “supportive.” A manipulative design works against a user’s long-term best interest. Whether through deception or by exploiting a user’s inability to stop herself, these designs end up hurting their users in the long run. Manipulative designs exploit cognitive biases to maximize corporate profit. They are all around us: advertisers sell products to people who didn’t want them in the first place; dark-pattern design on the web tricks people into signing up for things they didn’t mean to; casinos design slot machines to be ever more addictive.

A supportive design, by contrast, has three characteristics:

  • It has a dual purpose: it maximizes gain for both the business and the user.

  • It considers “user need” as a continuous experience rather than a singular event.

  • It foresees the “ripple effect” of the actions it enables and measures the benefit this design provides over time.


The last point is particularly important: when a company is too focused on short-term growth and fails to consider a design’s long-term impact, the consequences can end up hurting the company’s brand. Mark Zuckerberg’s recent testimony before Congress is one example.


Here is an example of how the same cognitive bias—in this case, inertia—can be used in both a manipulative and supportive way:

Manipulative use of inertia


In 2009, under user growth pressure, Facebook changed the default sharing setting to “Public,” which allowed everyone on the Internet, not just individual users’ friends, to view users’ content. This resulted in more revenue for Facebook, but it also made users more vulnerable to scammers, fake news merchants, and other bad actors. It also broke relationships and ruined lives. Bobbi Duncan was accidentally “outed” by Facebook when she joined a queer chorus Facebook group as a college freshman. Her parents found out and disowned her. Bobbi later attempted suicide. Facebook was criticized by privacy advocates for this decision and changed the default share setting back to “Private” in 2014, five years later.


Supportive use of inertia


The IRS reports that about 30% of workers who are eligible to participate in a retirement plan at work fail to do so. According to behavioral economist Richard Thaler, one of the primary reasons people don’t participate is inertia—they simply don’t get around to signing up. In his 2008 book Nudge, Thaler describes the experience of one company that adopted automatic enrollment and saw its plan participation rate skyrocket from 65% to 98%. In the years since, similar results have been widely reported, and this work won Thaler the Nobel Prize in Economics in 2017. In situations like these, a simple default can either help or harm users. As research in decision-making advances, designers can leverage these insights to advocate for users instead of taking advantage of them.



It’s not the design itself, but instead the embedded values, that define a product.

Here are a few questions you can ask yourself when you are designing the next feature of your service/product/system:

  • How would you feel about being known for building this service/product/system?

  • How would your users feel during and after using this service/product/system?

  • Would you let your child or your grandmother use the service/product/system you built?


  • Is this service/product/system good for both the company and the consumer? What about in the long-run? It’s of course impossible to predict or control the outcome 100%, but try to foresee the ripple effect.