
Building Awareness and Solving for Bias in Design


The country is in the midst of a civil rights fight in which progress has come far too slowly, as Black, Indigenous, and People of Color (BIPOC) are treated unfairly, discriminated against, and harmed by those meant to protect them.


UNCOVERING DESIGN BIAS

Design can have a profound impact on social change, both positive and negative. As a UX and technology company, we recognize that our work can contribute to the perpetuation of racial bias, whether we intend to or not. At Useagility, we strive to advocate for the user and use a human-centered approach in all aspects of our work. However, we know that even with the best intentions, unconscious bias can creep in, and there is always room to improve.

One way we’re educating ourselves is by examining bias within our own industry. By understanding where bias exists, we can better train ourselves to recognize and overcome our own unconscious biases.

Here are some examples where bias in the design process has led to products that fail to adequately solve for inclusivity:

FAILING TO RECOGNIZE DARK SKIN TONE IN DESIGN
There are several instances where technology has failed to recognize and respond consistently to darker skin tones. Wearable fitness trackers such as Fitbit and the Apple Watch have been known to work unreliably on dark or tattooed skin, and AI for self-driving cars has had trouble recognizing Black individuals as pedestrians. The way these tools are programmed and built does not account for marginalized populations.


FACIAL RECOGNITION BIAS

Facial recognition technology has been fraught with issues since its inception. In 2015, Google Photos’ image recognition labeled two African American individuals as gorillas. The “solution”? Removing ‘gorillas’ from the software’s labels. Google, you can do better than that.

According to a study by the National Institute of Standards and Technology, “Asian and African American people were up to 100 times more likely to be misidentified than white men… Native Americans had the highest false-positive rate of all ethnicities.” An MIT researcher found similar results and added that women are misidentified more frequently than men. This is especially problematic as facial recognition software has become one of the fastest-growing tools for identifying criminal suspects and witnesses in law enforcement.

While some companies have called for national legislation to regulate the use of facial recognition, cities such as Boston and San Francisco have banned its use by law enforcement. Congress has yet to pass comprehensive legislation to regulate a technology with enormous potential for human rights violations.

UPHOLDING BIAS ‘NORMS’ IN FEATURE DESIGN

Instagram’s photo filters were designed to enhance images. However, when ‘enhancing’ photos, the filters lighten skin tone. This functionality, knowingly or not, caters to a long-standing bias in photographic standards: in the 1960s, Kodak used white skin tones as the standard for measuring film quality and light accuracy, and that design bias has carried over into the current digital ecosystem.

HOW DID THIS HAPPEN AND HOW CAN COMPANIES DO BETTER?

Much of this can be traced to a lack of diversity and inclusion among designers, as well as missed opportunities to research and test with real-world, diverse users during development. By including diverse populations in our research, design, and testing processes, we can create better solutions for everyone.

Nextdoor used research to counteract racial profiling on its neighborhood app.

  • COUNTERACTING RACIAL PROFILING IN DESIGN

    Nextdoor unintentionally created a perfect breeding ground for racial profiling in the Crime and Safety section of its app. In 2015, after a news story accused Nextdoor of creating a forum that perpetuated racism, the company discovered that several dozen users had reported suspicious behavior based solely on race. To counter racial profiling, the company took several intentional steps to research and understand where and how this behavior was happening. Staff underwent diversity training to better recognize the problem, and extensive research was conducted. Through stakeholder engagement and that research, Nextdoor used an iterative design approach to create friction in the UI that made users stop and think about what they were posting.

    Several months of iterative work resulted in a three-step process for reporting an incident, including blocking submissions that describe an individual based solely on race. These changes reduced such reports by 25%.

    Images from Nextdoor: incident report and complaint screenshots

MOVING FORWARD

Design is not a perfect science, and there is no easy solution. But we can and will do better. We must all continue to uncover our own biases, become more inclusive, and stay aware of who may be underrepresented in our work and how to design for them.

At Useagility, we are educating ourselves through our community, listening to and elevating BIPOC content and viewpoints. Here are some resources we are using to continue that education.