When AI Acts Like a Person: Selecting Résumés Based on Names

[Image created with AI (OpenAI), © 2025 Heather P. Smith — concept: résumé selection bias]

Hi, my name is Heather — and I’m a blonde female digital marketing professional.

Those six words alone already paint a picture in your mind, don’t they?
Before I’ve shared a single credential or example of my work, you’ve likely imagined something about who I am, what I’m like to work with, or even how competent I might be.

That’s not intentional judgment — it’s how the human brain works. We fill in gaps automatically, using past experiences, cultural cues, and unconscious associations to make sense of new information.

But those quick judgments can shape opportunities long before we even speak.


When Bias Begins with a Name

Names carry weight — more than most of us realize.
In a 2025 study, researcher David Rozado tested 22 leading AI hiring models using identical résumés that differed only by first name. The results were striking: the models consistently favored female-named CVs in some fields but undervalued them in others — proof that gendered names alone can sway hiring outcomes. (arxiv.org, 2025)

Another analysis from the University of Washington’s Information School (2024) found that female-associated names were favored just 11 percent of the time when algorithms ranked job applicants — showing that even automated systems mirror societal assumptions about gender. (ischool.uw.edu, 2024)

And a recent meta-analysis of U.S. hiring audit studies concluded that, while not universal, gender bias in résumé evaluation remains present in many modern contexts — particularly in leadership-oriented roles where assertiveness is prized. (sociologicalscience.com, 2025)

What this means is simple: our names — before our experience, tone, or presence — start telling a story for us.


The Reality Behind Perception

For me, it’s often the combination of name, tone, and appearance.

Add blonde hair and a calm tone, and people sometimes read me as passive — until I open my mouth, or until they see just how much I'm juggling.

Those unconscious assumptions happen everywhere — biases we have no control over, yet still have to navigate.

I’ve learned how easily composure can be mistaken for compliance, and how often a calm voice must prove itself louder than a raised one.

I’ve felt it in real time — that subtle energy shift when another woman looks at me and a silent story begins to form. Sometimes I can almost feel her searching for evidence to make her label fit. It’s a strange pressure — having to prove your intelligence before you’ve even finished your first sentence. But the tone consistently changes once I start speaking. Clarity has a way of rewriting assumptions.

Harvard researchers Amy Cuddy and Susan Fiske describe this as the “warmth–competence bias.”

Their findings show that women are often judged first on warmth — friendliness, approachability, tone — and must then earn perceptions of competence. Men, by contrast, are assumed competent until they must prove their warmth. It’s an invisible double standard that plays out in boardrooms, classrooms, and even casual one-on-one conversations.


Did You Know AI Is Judging You?

We used to think bias lived only in human perception — the quick judgments we make based on names, tone, or appearance. But bias now has a digital counterpart: artificial intelligence.

The algorithms behind résumé screeners, hiring platforms, and recommendation systems are trained on historical data — data full of the same human assumptions we’re trying to overcome. As a result, AI learns our prejudices and scales them.

A 2025 study by David Rozado tested 22 large language models by giving them identical résumés that differed only by first name. Each model showed measurable bias based on gendered names alone — favoring one over another with no change in qualifications. (arxiv.org, 2025)

The University of Washington’s Information School found something similar: female-associated names were favored only 11 percent of the time — meaning that in nearly nine out of ten comparisons, the algorithm ranked them no higher before a human ever saw the application. (ischool.uw.edu, 2024)
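The name-swap methodology behind these studies is simple enough to sketch in a few lines. Below is a hypothetical Python illustration — the scoring function, keywords, and names are all invented for the example, with a bias deliberately injected so the audit has something to detect — that scores one résumé under different names and surfaces any gap:

```python
def score_resume(text: str) -> float:
    """Toy screener: rewards keywords, but unfairly also rewards one name."""
    score = sum(1.0 for kw in ("SEO", "analytics", "campaign") if kw in text)
    # Injected bias, standing in for patterns a real model might
    # absorb from historical hiring data.
    if text.startswith("Ryan"):
        score += 0.5
    return score

def name_swap_audit(template: str, names: list[str]) -> dict[str, float]:
    """Score the identical résumé under each name; any gap is name-driven."""
    return {n: score_resume(template.format(name=n)) for n in names}

resume = "{name}\nDigital marketing professional: SEO, analytics, campaign strategy."
results = name_swap_audit(resume, ["Heather", "Ryan"])
print(results)  # identical qualifications, different scores
```

In a real audit, `score_resume` would be replaced by calls to the actual screening model — which is exactly what the researchers did, at scale, with thousands of name pairs.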

In other words, it’s not just people forming impressions — machines are doing it, too.
And while an individual’s bias can be challenged in conversation, an algorithm’s bias hides behind code.

For digital professionals like me, that’s both fascinating and alarming. It reminds us that how we train these systems determines whether they repeat our biases or help correct them. Accessibility isn’t just about who can use technology — it’s about who technology includes or excludes without ever realizing it.


And What We All Want to Know… What Does Your Name Mean?

And what we all want to know — drumroll — what does your name really say about you?
Not in a poetic sense, but from an AI standpoint or the first scan of a hiring manager’s eyes.

Our names are often the first data point anyone — or anything — encounters. Before a résumé is read, before a handshake or video call, the name at the top already begins telling a story.

Recent studies have shown just how deeply those split-second reactions go.
In Grundmann et al. (2025), researchers found that first names carry measurable emotional and competence cues. Some names were subconsciously linked to leadership and strength, others to warmth or creativity — none of which are negative, but each colors perception. (ScienceDirect, 2025)

In Kaplan et al. (2024), even identical recommendation letters were interpreted differently when the candidate’s name changed from a male to a female-associated one — the language around them subtly shifted, shaping how capable the candidate was perceived to be. (JMIR, 2024)

And according to the long-standing Portia Hypothesis, women with more “masculine-sounding” names have historically advanced further in fields such as law, where leadership and decisiveness are prized. (Wikipedia)

So, which names “add value” and which shrink perception?

There isn’t a universal list — perceptions shift with culture, era, and even job type — but patterns do exist. Names that sound concise, strong, or traditionally neutral (Jordan, Taylor, Alex, Ryan) tend to be rated as more capable or adaptable. Names associated with softness or familiarity (Heather, Emma, Grace, Lily) often earn immediate warmth but may face an uphill climb toward being seen as authoritative in specific fields.


Try Your Name

Curious what your name signals, historically or culturally? Look it up in a name-origin database.

These tools explain origin and usage — a practical context for how names are interpreted before we even speak.


The Meaning Behind a Name

When I chose my children’s names, I looked for more than uniqueness or trend. I looked for meaning — for names that carried character, curiosity, and strength. Because a name is a child’s first introduction to the world, and in many ways, their first test against bias.

I didn’t choose based on what would “fit” a résumé someday, but I’d be lying if I said I didn’t consider how their names might be read — by teachers, by algorithms, by people scanning hundreds of applications a day.

That’s the calculus of parenting in a world still learning to outgrow its assumptions.


When Appearances Challenge Assumptions

I see the same dynamic in my oldest, young-adult son. He loves to challenge perception head-on — piercings, tattoos, brightly colored hair. He says he does it because he enjoys proving people wrong. Strangers often assume he’s reckless or unambitious, but when he speaks, they’re stunned by his intelligence and self-awareness. His IQ is 135, and he uses his words with precision. In his own way, he’s teaching the same lesson: appearance and intelligence have never been mutually exclusive — but our biases still tell us otherwise.

I once worked with a graphic designer who carried that same brilliance. His style was bold — creative clothing, visible tattoos, a shock of bright hair. People outside our field often dismissed him at first glance, assuming “artistic” meant “unfocused.” But within minutes of conversation, he could articulate brand strategy and design psychology with the clarity of an academic. His aesthetic wasn’t rebellion — it was expression. And like my son, he knew that challenging expectations was sometimes the fastest way to reveal truth.


For Parents, Leaders, and Designers

It’s sobering to realize that this story begins before any of us enter the workplace.
For parents, it begins at birth — with the name written on a certificate.

The name you choose for your child will one day appear on a résumé, a college application, or a conference badge. And it may influence how others see them before they ever walk into the room.

That doesn’t mean we should name our children around other people’s biases.
It means we have to build systems and cultures that stop rewarding bias in the first place — from how we design algorithms to how we train hiring teams to read beyond their instincts.


Reframing the Narrative

Unconscious bias thrives on shortcuts — on assumptions that fill silence with story.
But leadership and inclusion come from doing the opposite: pausing, questioning, and choosing to see beyond the shorthand.

I’ve stopped trying to “fit” what people expect from a name, a voice, or a face.
Instead, I let my work — and the clarity behind it — speak for itself.

Because bias loses its power when we refuse to shrink to fit it.
And every time we challenge those assumptions, we make space for the next person who deserves to be seen for their skill — not their name.


Resources / Further Reading

Harvard University — The Real Cost of Beauty Ideals

AI in Resume Screening — Vervoe Blog

Grundmann et al. (2025) — First Names and Ascribed Characteristics

Kaplan et al. (2024) — Experimental Evidence of Gender Bias

Rozado (2025) — Gender Bias in LLM-Based Hiring Decisions

University of Washington — AI Tools Show Biases in Resume Ranking

The Portia Hypothesis — Wikipedia

Harvard Business Review — How Hair Discrimination Affects Women at Work
