How a Teen Designer Turns Algorithmic Bias Into Wearable Activism
- May 28
- 2 min read
Updated: Sep 16
Vivianne Hartan, a high school student passionate about graphic design, technology, and data, explores algorithmic bias through visual storytelling. As part of the 501(c)(3) nonprofit Raising Awearness Campaign (RAC), she creates T-shirt designs that raise awareness of the social harms embedded in digital systems, using sustainable fashion to spark critical conversations.
In RAC's latest collection, “Coded Inequality,” Vivianne shines a light on the hidden injustices embedded in everyday technologies. Though these systems may appear neutral, they are often powered by biased data that reinforces existing inequalities. Her designs aim to decode and challenge these systems, revealing how algorithmic decisions affect our lives and disproportionately harm marginalized communities.
Each T-shirt in the collection serves as a wearable protest: a statement against systemic oppression and a call for accountability in tech design. Through bold graphics and compelling narratives, Vivianne invites us to wear our values, amplify unheard voices, and reimagine a more equitable digital future.

The angel stands in contrast to the grid and binary. She symbolizes emotion, softness, and humanity inside a rigid, coded system. She doesn’t belong there, and that’s the point. The surrounding elements (code, coordinates, net structures) reference how people are trapped, tracked, categorized, and often misread by systems that claim to be unbiased and fair.
The way the angel is placed in the design pushes back against the system. It’s surrounded by elements meant to control or sort people (code, grids, numbers), but it doesn’t follow their rules. It stands out on purpose. It’s not trying to fit in. It’s forcing a different way to be seen.
If you look closely and enter the coordinates in the bottom-left corner into Desmos…
you’ll uncover this:


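For readers curious how this kind of hidden image works, it is essentially connect-the-dots: plotting a list of (x, y) points in order traces out a picture. The sketch below illustrates the general idea with hypothetical placeholder points (a rough heart outline), not the actual coordinates printed on the design; entering a table of points into Desmos reveals a shape in the same way.

```python
import matplotlib.pyplot as plt

# Hypothetical placeholder coordinates (a rough heart outline), NOT the
# actual numbers from the shirt: plotting them in order reveals the shape.
points = [
    (0, 3), (2, 5), (4, 5), (5, 3), (5, 1),
    (0, -4),                       # bottom tip
    (-5, 1), (-5, 3), (-4, 5), (-2, 5), (0, 3),
]

xs, ys = zip(*points)
plt.plot(xs, ys, marker="o")       # connect the dots in the listed order
plt.axis("equal")                  # equal scaling so the shape isn't distorted
plt.title("Hidden shape revealed by plotting the coordinate list")
plt.show()
```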
I want my design to challenge the idea that algorithms are neutral. They’re not. They reflect choices: what’s prioritized, what’s erased, and who gets centered. They learn from data that comes from us, and humans, whether we like it or not, carry bias. And when you’re someone who doesn’t fit the mold the system was trained on, you start to feel like you’re the problem. But you’re not “broken.” You were just never considered in the first place.
It’s a reminder that we don’t have to accept the way things are. We can challenge the system. We can build better. We can ask: How can we do better? How can we code with conscience, with care, with everyone in mind?