Podcast > Culture Fit > EPISODE 1
Monoculture
People from communities that have long been excluded by Silicon Valley’s monoculture share their personal stories and walk through the data on the lack of diversity in tech. When it comes to cognitive biases, what critical errors do our brains make without even realizing it?
In this episode
Frances Coronel
Software Engineer, Executive Director at Techquería
David Dylan Thomas
Author, speaker, content strategy advocate at Think Company
Jacqueline Gibson
Software Engineer, Digital Equity Advocate
Adam Recvlohe
Software Engineer, Founder of Natives in Tech
Josh Torres
Chief of Staff for Out in Tech, cofounder of LTX Fest for Latinx in Tech
LeRon Barton
Writer, Speaker, Artist, Network Engineer
Sinduja Rangarajan
Senior Data Journalist
Treavor Wagoner
Independent Writer, Lead Product Designer, Co-founder of Black UX Austin
Sydney Sykes
BLCK VC cofounder, Angel Investor
Dairien Boyd
Podcast host, Principal Designer at All Turtles, mmhmm
Stephanie Lampkin
CEO and Founder of Blendoor
Highlights
"The thing about culture fit, and this is what gets me, Dairien, is that it's so clear that it's being practiced. Right? You know, you can tell that these people, they like the same thing. They look like each other and they hang out. I understand in theory that if you work around people that you get along with, that you have similarities with them… That will allow the work atmosphere to be easier. But the problem with that is… it just becomes an environment that's very homogenous and very monochromatic." 5:42LeRon Barton
"Google has the best search engine in the world, but they can't find enough diverse people to fill the workforce." 10:06Frances Coronel
"Cognitive bias is really just a fancy word for the fact that you have to make something like a trillion decisions every single day. And, as a result, your mind is doing a lot of things on autopilot. The problem is that sometimes the shortcuts lead to errors and we call those errors cognitive biases. ” 23:50David Dylan Thomas
"Racism is designed. It doesn't just happen magically, right? Those patterns come from somewhere." 25:00, David Dylan Thomas
Episode Links
1
Design for Cognitive Bias
Book, David Dylan Thomas
2
Blendoor
Diversity and Equity platform, CEO Stephanie Lampkin
3
BLCK VC
A nonprofit with the goal of doubling the representation of Black investors by 2024.
4
LTX Fest
An annual conference for Latinx professionals in tech.
5
“Is Silicon Valley using culture fit to disguise discrimination?”
Article, LeRon Barton
6
Out in Tech
A not-for-profit uniting the LGBTQ+ tech community.
7
Techquería
A nonprofit that serves the largest community of Latinx in Tech.
8
“The Pinterest Paradox: Cupcakes and Toxicity”
Medium article by former Pinterest COO
9
Kapor Center Retention Study
The 2017 Tech Leavers Study is a first-of-its-kind national study examining why people voluntarily left their jobs in tech.
Show notes
Dairien: [00:00:00] Over the summer, a tidal wave of racial justice initiatives took storm, and tech was no exception. Following the public murder of George Floyd, big names like Apple and Cisco made huge financial contributions to diversity initiatives. It seemed like everyone in tech, from the smallest startup to the biggest company, put out a solidarity statement.
We can see the well-meaning intentions of people in tech who want to promote racial equality. Yet the industry as a whole hasn't made much progress. And we know it's hard to pinpoint [00:00:30] exactly what is the quote unquote tech diversity problem. Why are there so few Black and Latinx professionals in tech?
It's time to fully dig into it.
I'm Dairien Boyd. This is Culture Fit: Racial Bias in Tech.
If we met in person, one of the first things you would probably notice about me is that I'm not white. People in the U.S. usually think of me as a light-skinned Black guy, but that's only skin deep. [00:01:00] I'm the child of two veterans. I was actually born on an air force base. If there's anything more symbolic of America, please let me know.
As a kid near Philadelphia, I was taught to fit into a suburban white community. Since moving back to California in '06, I've learned to adopt tech culture. At All Turtles, my role is Principal Designer. I'm the only person at our studio who identifies as a Black man. I've worked at the heart of Silicon Valley for 10 years, and I'll be your guide as we hear stories from dozens of fascinating people. Collectively the [00:01:30] voices of this podcast will amplify perspectives from people that are often underrepresented or entirely excluded.
David: [00:01:37] Talking about race is a skill, because you're not born knowing how to do it.
Dairien: [00:01:42] That's author and researcher, David Thomas. He's one of the voices that we'll hear throughout the series. Together, you and I are going to build the skill to talk about race. We'll hear stories of unfair treatment, but we'll also practice love and compassion by listening to the voices of those who are dedicated to fostering a better and more inclusive [00:02:00] world. Let's examine racial bias in tech and go beyond culture fit.
Treavor: [00:02:06] The tech industry is, is bro-tastic, as they say.
Dairien: [00:02:14] That's Treavor Wagoner, a designer who's worked and consulted for different tech companies.
Treavor: [00:02:18] And by bro-tastic I mean white male, cis-male individuals rule the landscape of tech. And because of that, you have to [00:02:30] sometimes conform, or there's this pressure to conform, to leave your culture behind, to adopt theirs.
Dairien: [00:02:38] When you step into a tech space, the bro vibes can be palpable. If you've worked at any of the major tech companies within Silicon Valley, you've probably seen how this dominant culture plays out. It can be described as a monoculture. Monoculture means a sea of the same. The term refers to a common identity that repeats itself.
The tech workforce looks just like the founders of [00:03:00] prominent tech companies. Is that really a surprise? Hiring cycles established this pattern that accepts similar people over and over. So monoculture is a problem in tech because it leaves the industry with cultural blind spots to groups that are excluded.
And we'll talk more about how these blind spots perpetuate systemic racism. Monoculture is also used more specifically in computer science. It refers to a network of computers that all run on the same software. A monoculture of computers is vulnerable to being taken down by a [00:03:30] single attack. All this is to say both computer networks and the tech industry as a whole are strengthened through diversity.
In conversations about the lack of diversity in tech, tech leaders blame the pipeline problem. We'll get back to that later. We actually dedicated an entire episode to the pipeline topic. We also dedicated episodes to Silicon Valley's deep history, as well as how we can build a more equitable future.
But [00:04:00] first, we're going to take the time to understand how racial inequality manifests in tech today.
The power of the tech industry cannot be overstated. These companies are shaping every aspect of our lives. They're impacting how we work, how we navigate the world and how we connect with the people we care about the most. This power has created both good and bad for humanity. Working from home during a pandemic wouldn't have been possible without these tools, yet they seem to interfere with our elections, they over-police our poorest communities, and [00:04:30] perpetuate racial hierarchy structures that lead to inequality.
So it's important to understand who comprises this powerful industry so we can see who's being left out. Here's writer and software engineer, LeRon Barton.
LeRon: [00:04:42] I'm from Kansas City, right. And so we think of California as like this magical place.
Dairien: [00:04:48] What did you find about culture fit when you walked into the tech environment?
LeRon: [00:04:51] You know, first off, I was one of the few Black people there. I mean, gosh, I don't even remember any Black women, if [00:05:00] I'm going to keep it all the way live. Like the city isn't as great as I thought it was. And Silicon Valley ended up being kind of a bummer because it's very tribal, very segregated, just not a lot of people that look like myself.
Dairien: [00:05:13] LeRon writes about hiring practices that contribute to the lack of diversity, like culture fit. The problem with the idea of culture fit is who decides the culture also decides who doesn't.
LeRon: [00:05:23] The thing about culture fit, and this is what gets me, Dairien, is that it's so clear that [00:05:30] it's being practiced.
Right? You know, you can tell that, okay, these people, they like the same thing. They look like each other and they hang out. I understand that, yeah, in theory, hey, you know, if you work around people that you have similarities with, maybe that'll allow the work atmosphere to be easier. The problem with that is it just becomes very exclusive.
It just becomes an environment that's very homogenous and very monochromatic. I mean, [00:06:00] racism's everywhere; it's pretty rife in the tech industry.
Dairien: [00:06:03] First let's dive into the demographic data, who are the people building the products we use every day.
Stephanie: [00:06:08] You can't fix what you don't measure. And a lot of the time, problem solving is first realizing that you do have a problem.
Dairien: [00:06:16] That's Stephanie Lampkin, CEO of Blendoor, an inclusive recruiting and people analytics software company.
Stephanie: [00:06:22] When I first moved back to Silicon Valley to launch the company, my first interview was with an accelerator, and one of the panelists [00:06:30] who was doing the interview process said, "This diversity problem, like, I don't get it. Are you really solving a problem? I am colorblind. I make decisions on merit, not based on any other factors." And it was sort of the aha moment when I realized that there still needed to be a lot of education about the problem and the ways in which it impacts outcomes for different people.
I think it's important to start measuring, with the same level of [00:07:00] precision, your workforce demographics as you do your financials.
Dairien: [00:07:05] As Stephanie said, we need to look carefully at the data. Digging into these numbers, we see the percentage of Black employees at major tech companies is low. As the LA Times reports: Salesforce, 2.9% Black. Facebook, 3.8% Black. Google, 3.7% Black. Slack and Microsoft are 4.4% and 4.5%, respectively. Let's look at these numbers in another way, because math's hard, [00:07:30] especially over a podcast. So Salesforce has 49,000 employees. Only 1,400 of those employees are Black. Facebook similarly has 50,000 employees; only 2,000 are Black.
It's unclear just how many of those Black employees hold technical roles as opposed to lower-paying sales and service jobs. One thing's for sure: very few of them hold leadership roles. And we'll look closer at that later. Considering the fact that these companies build products that impact everybody on a global scale, the [00:08:00] numbers aren't great. And tech companies know this. Here's Stephanie again.
Stephanie: [00:08:04] There's been a lot of reluctance on the part of tech companies, oftentimes from the legal departments to really look into their numbers because the numbers really don't lie.
Dairien: [00:08:15] Fear of legal troubles is enough motivation to keep the numbers under wraps. To get a better sense of how reluctant these companies are to disclose their demographic data, we spoke with reporter Sinduja Rangarajan. She's a data journalist who previously reported on the lack of diversity in Silicon Valley for Reveal.
Sinduja: [00:08:30] A lot of companies have to submit official reports of their employee breakdowns by demographic, by gender, by the kind of work they do to the EEOC, the Equal Employment Opportunity Commission.
Those reports don't have to be public, but some companies have shared it or have started sharing it since 2014 or so. A lot of them don't share it and resist sharing it. I mean, certainly more companies have started releasing this data, but a lot of them also resist. Reveal [00:09:00] actually filed a lawsuit. It's still in progress.
We used the lawsuit to get information about Palantir and Pandora and a few other companies that were sort of resisting.
Dairien: [00:09:10] It's clear these companies are embarrassed to share their diversity numbers. And when a lawsuit is the only way to get them to release the numbers, you know there's something wrong. For the companies that have publicly provided data, here's what the numbers look like for historically underrepresented groups. You know the names: Google, Microsoft, Salesforce, Github, Intel, Slack, [00:09:30] Dell, and Cisco. Here are the average employee demographics at those companies: Black, 4.3%. Latinx, 6.9%. Indigenous or Native American, less than 1%.
Frances: [00:09:42] I think it's no secret that for Black and Latinx employees in tech, um, our numbers are still ridiculously low, even after many initiatives.
Dairien: [00:09:51] This is Frances Coronel, executive director of Techquería, a non-profit that empowers Latinx professionals to become leaders in the tech industry.
Frances: [00:09:58] Well, like Google has the [00:10:00] best search engine in the world, but they can't find enough like diverse people to fill in the workforce, is like the, the joke.
Dairien: [00:10:08] That's the kind of joke that's funny because it's true. Also, a quick note: you may have realized that we use the term Latinx. We do so because that's what our guests use. We want to acknowledge that some people prefer Latino or Latina as an identifier. It's just an example of the rich diversity that exists within the Latin American population, a richness that the tech industry usually misses. In addition [00:10:30] to the low representation of Latinx professionals, tech's exclusion of Native American and Indigenous populations is glaringly evident. This is something Adam Recvlohe has seen as a software engineer. Adam is Muskogee Creek Canadian-American. He and other members of his community saw a need, and so he founded Natives in Tech.
Adam: [00:10:49] Natives in Tech did not exist, you know, before, you know, our team started to work on it.
It's not to say that there weren't native people in tech, [00:11:00] but there just wasn't like a centralized organization slash like social circle that kind of embodied the native experience within technology. As I've kind of gotten to know other people in the tech industry, there's a bit of, because we're not really seen in the media, it's kind of like an out of sight out of mind type of thing.
Dairien: [00:11:29] Monocultures [00:11:30] drown out the voices of marginalized groups. As we've said, Indigenous and Native American employees make up less than half of a percent of the tech workforce. That's four times less than the percentage of the overall U.S. population. And many of us live on Indigenous land.
We're going to continue hearing from those who are often excluded by the tech industry. To get a complete picture, it's important to understand how tech is funded. [00:12:00] Just follow the money. Everyone knows the saying. In tech, that path will take you to venture capital. VC firms are financial gatekeepers.
They provide the capital that transforms startups into powerhouses, but who are these people, really?
Sydney: [00:12:13] The stats are really, in this situation, able to tell a pretty holistic story.
Dairien: [00:12:19] This is Sydney Sykes. She's an investor turned co-founder of BLCK VC. BLCK VC is an organization that connects and empowers Black venture capitalists.
Sydney: [00:12:27] A lot of times you can look at an [00:12:30] industry and not really know what it's like to be an underrepresented minority. But I think when you look at these stats, you really understand the picture: there are very, very few Black people in venture. Most firms don't have a single Black investor, and if they do, you're likely to be the only Black investor at your firm.
And so what that means is, if you're trying to get into venture capital and you're an underrepresented minority, specifically if you're a Black person, you're less likely to know someone who's already working [00:13:00] in venture. You're going to have more trouble getting through that interview process. You're going to have fewer people to turn to for advice about the venture world.
And then, equally importantly, if you're an entrepreneur and you're pitching to a venture firm, you're more likely than not to be pitching to a group of faces who don't look like you and might not understand as well some of the problems that you're trying to address. 81% of venture firms don't have a single Black investor. Depending on the data, [00:13:30] less than 3% of investors are Black, and many of those investors don't have check-writing power. Another statistic we talk about is the difference between seniority levels: there are way more associate-level Black investors than there are partner-level, and those partners are going to be the ones writing the checks.
Dairien: [00:13:49] Beyond the venture capitalists, who else is in the room when big decisions are made? Who's at the executive level, who's on the board?
Here's a breakdown of leadership positions within the same companies we mentioned earlier. [00:14:00] The average demographic percentages of leadership and management roles are only 2.7% Black, 5.6% Latinx, and only 0.2% Indigenous or Native American. The low leadership and executive numbers stood out to Sinduja in her research as well.
Sinduja: [00:14:15] Overall, these companies had like almost no executives from these underrepresented communities, which means the people who have the most power don't come from those communities. And therefore, when you look [00:14:30] at tech as an industry, as a whole shaping the future, you see that these voices are completely left out.
Dairien: [00:14:37] Apple's another clear example. Business Insider reported in 2018 that Apple had roughly 120 people at the executive and senior level. Only one of those individuals was Black. That's one out of 120. No matter how you spin the numbers or how you try to explain who was hired and who was promoted, 0.8% is not even close to reaching equitable representation at the executive and senior [00:15:00] levels.
How can tech include Black experiences if they don't include Black decision makers? Maybe things will start to improve at Apple, but they're going to have to do it without diversity chief Christie Smith. She resigned in June of 2020. As we've seen, representation at the leadership level for Latinx professionals is also low.
Josh: [00:15:21] My name is Josh Torres. I am the chief of staff with Out in Tech and I am a [00:15:30] Latinx millennial, who is Puerto Rican and queer.
Dairien: [00:15:33] Josh co-founded LTX Fest, an annual conference for Latinx in tech. Through his community organizing, he's had a front row seat to the way tech management roles are staffed. And he's seen who gets left out.
Josh: [00:15:44] Of the small representation, how many Latinx individuals are we actually seeing in senior leadership positions, as hiring managers, as C-suite executives, as board members? Because that's when I think we'll actually see the landscape start to shift.
And in so many cases, [00:16:00] these individuals are the first to kind of chart, chart that path. There aren't a lot of models or there aren't a lot of people to look to that look like them and have the shared experiences that they have.
Dairien: [00:16:10] The lack of Black, Latinx, and Indigenous managers and executives is a key data point, but all of this still left me with one question for Sinduja.
Through your research, when you would look at the reports released by various tech companies, did the data tell a story in terms of retention of the underrepresented [00:16:30] groups, or was it simply a snapshot of a population at a particular time?
Sinduja: [00:16:34] This is a really good question. Retention is a huge part of the missing data. An example that comes to mind: a woman wrote this big Medium post about gender disparities and toxic work culture at a company called Pinterest. When I was looking at Pinterest's data, it seemed to be really good, but it turns out that even when the data looks good, [00:17:00] it means nothing if you don't have the retention statistics, especially after that Medium post detailing the experiences of someone who was actually a C-suite executive with tons of experience, talking about, you know, a very toxic work culture that made it a really hard place for women to thrive. So that's a really good point, a great question on retention, and companies have a really long way to go to sort of start tracking it and putting it out.
Dairien: [00:17:30] There is some data on retention. In 2017, the Kapor Center released a national study examining why people left tech jobs. I'll summarize the report for you. People cited daily unfair treatment, in the form of stereotyping, harassment, and microaggressions, as the reason for leaving. The study even found that unfair treatment was more pronounced in tech companies than in non-tech companies. Underrepresented people of color experienced stereotyping at twice the rate of white and Asian employees, and the more [00:18:00] bullying an employee experienced, the shorter they stayed at the company. This is what true diversity requires: true equity. Here's LeRon again.
LeRon: [00:18:07] Diversity without equity is useless. I'm not about, you know, a quota. I'm about people coming to an environment and being able to contribute and change the environment. When a Black person leaves, they're going to tell other Black folks, hey, listen, this is, this is what's happened to me there. It's one thing to hire [00:18:30] people, you guys, but it's another thing to retain. Retention, that's the real key. And no one's addressing it. You know, you can recruit from an HBCU, right? 20 Black folks, 10 Black men, 10 Black women. But if 18 of them bounce after a year, are you really doing something, or are they just there for the window dressing?
Dairien: [00:18:52] For LeRon, retention was never on the table because he never really felt welcome to begin with.
LeRon: [00:18:57] It was very segregated and I never [00:19:00] seen that before. It wasn't an environment of togetherness. It was, it was just segregated. That was the last Silicon Valley company I worked for. And I'll never do it again.
Dairien: [00:19:10] A Pew Research study in 2018 revealed that over 60% of Black professionals in STEM roles have faced discrimination in the workplace. From my conversations with Black professionals, I would guess that number is probably closer to a hundred percent. The study found the same to be true for 42% of Latinx employees in similar roles.
Why would anyone stick with a job, or an entire industry for [00:19:30] that matter, that rejects their identity?
LeRon: [00:19:32] If we're gonna keep it funky, like, I'm a Black man in America and tech has never truly been friendly to me. So I'm already going into an environment, going into a lot of work, that is pretty much owned by white folks.
So when you are in an industry that prides itself on innovation, but isn't very innovative in the [00:20:00] people that make up said companies, it's just really interesting.
Dairien: [00:20:05] Another story of being alienated in the office is one we heard from Jacqueline Gibson. She's a software engineer and digital equity advocate.
Jacqueline: [00:20:12] I can speak from firsthand experience. It's hard being in a space where you are the only. I'm often the only Black person in the room, or the only Black person who is working on this project. I've had instances before, when I was an intern, where people questioned why I was [00:20:30] there, why I was in these spaces. I've had people ask me if I was someone here to deliver something.
And so I think that when you are constantly faced with being the only, and you're constantly faced with being sometimes the only voice for underrepresented identities, it can be exhausting. It can be disheartening and it's just upsetting.
Dairien: [00:20:57] Tech spaces can be a lonely place for Black women. [00:21:00] They're often the only Black woman, sometimes the only Black person, in that space, yet they use their voice to look out for others.
Who's looking out for them?
Data shows monoculture is present, and personal experiences like Jacqueline's show why this is a major problem. Our friend Treavor shares his own experience.
Treavor: [00:21:22] So I identify as a Black queer male. I carry that identity with me into tech [00:21:30] spaces. I'm a writer, I'm a designer, I'm an entrepreneur, I'm independent, I'm a consultant.
You know, there's a lot to it. And everybody wants you to be one thing and to fit into a box neatly, but I've never been able to do that. I know who I am and I know. I know what, what I bring to the table and my brand of blackness is, is important. It is a voice that needs to be [00:22:00] heard.
Dairien: [00:22:00] It's really interesting, right? This, this idea that we're expected to fit into a monoculture as creatives. That seems counterintuitive. I think creativity should be boundless. What do you think about that?
Treavor: [00:22:13] I can't really answer the question without discussing my history with adapting to white culture in general. I've always been able to straddle the [00:22:30] line between white culture and also Black culture.
Um, that's gotten me in trouble many times, but, um, being able to code switch has gotten me very far in my career. Um, but that doesn't change the fact that I look black. So I still get a lot of, um, sometimes vitriol amongst the microaggressions and everything else.
Dairien: [00:22:56] Code switching refers to fitting into a dominant culture while [00:23:00] also maintaining the plurality of one's own culture. So even if Treavor feels comfortable with the challenge of code switching, constantly switching to fit into white and Black spaces can be really strenuous.
Treavor: [00:23:11] The microaggressions, the conformity to white culture. It's, um, it's, it's a lot. It's very stressful.
Dairien: [00:23:22] Microaggressions can occur when we lack empathy or understanding for the person we're talking to.
It's coded language, saying one thing but meaning [00:23:30] another. We're all capable of making microaggressions as well as receiving them. And so it's no surprise that people who don't fit into the culture are often on the receiving end of microaggressions. To understand what fuels these bad behaviors, we got in touch with David Dylan Thomas. He's the author of Design for Cognitive Bias.
David: [00:23:45] Cognitive bias is really just a fancy word for the fact that you have to make something like a trillion decisions every single day. And as a result, your mind is doing a lot of things on autopilot. The problem is that sometimes the shortcuts lead to errors, and we call those errors [00:24:00] cognitive biases. A lot of the biases that lead to racial harm, that often come up within the tech world or just corporate roles in general, have to do with basic pattern recognition.
So if you're, let's say, hiring a web developer. And in your head, the pattern that's been set up for what a web developer looks like is skinny white dude. Even though if I were to ask you, "Hey, do you think men are better developers than women, or white people are better developers than Black people?" You'd say, of course not.
That's ridiculous. But with that pattern set up in your head [00:24:30] from watching TV or workplaces you've been in, if you see a name at the top of a resume that doesn't quite fit that pattern, you might start to give it the side-eye.
Dairien: [00:24:39] According to David, a job interviewer might not say a Black candidate is less qualified than a white candidate, but their unchecked bias is still heavily at play.
David: [00:24:46] I have taken the implicit association test. It basically shows you a lot of images very quickly and forces you to kind of just make snap judgments around what goes with what. The end result is, it will tell you if you have [00:25:00] an implicit association with one thing or another. It's not telling you that you're racist or sexist, or that this is what you actually believe.
It just tells you that, left to your own devices, in snap-judgment moments, your mind associates these two things together without you even realizing it. And it's important to understand, when we talk about bias, that something like 95% of cognition is happening below the conscious level. And so for me, one of the things I found out is that I associate weapons more with Black people and [00:25:30] non-lethal objects more with white people. And I'm, I'm Black.
And I can totally see where that comes from, right? It comes from the news, it comes from TV. It comes from, you know, all sorts of sources of information, where you're just seeing this pattern over and over and over.
Dairien: [00:25:43] Wondering how biased you might be? Spoiler alert, very biased. We all are. Find an implicit bias test online and
try it yourself. The way these patterns build up over time means that they have a significant influence despite going completely unnoticed.
David: [00:25:57] Racism is designed. It doesn't just happen magically, right? Those patterns come from somewhere. Um, and yeah, it's very deliberate to start to associate the term black with bad things and white with good things, and dark with bad things and light with good things.
Um, because that makes it easier to hold onto power. That makes it easier to make even the people you are disempowering feel like they should be disempowered. Right? There's a very subtle message there that not just is white better, but because black is worse, I shouldn't make as much money, I shouldn't be able to hold [00:26:30] onto my home, I shouldn't... you know, you start to internalize that.
Dairien: What David describes is an internalized hatred. And as David said, white supremacy is so pervasive that it's embedded in our language. Okay, so there's no question about the problem. We've been able to listen to experiences that support the dismal data: tech's monoculture excludes a lot of people.
So where do we start? How do we actually hold companies and ourselves accountable? I'll pass the mic back to Blendoor CEO Stephanie Lampkin. Blendoor provides a tool called Blendscore to assess diversity within an organization.
Stephanie: [00:27:01] Blendscore is a comprehensive diversity, equity, and inclusion index and rating system that measures a company's overall diversity strategy and outcomes.
Dairien: [00:27:15] Blendscore's index provides data points like demographics in the workplace, in leadership, and on the boards. Blendscore also takes into consideration employee benefit programs, partnerships, and social impact.
Stephanie: [00:27:25] So what started off as just sort of a metric [00:27:30] to kind of differentiate which companies were really putting their money where their mouth was when it came to diversity and inclusion is now evolving into a really interesting indicator of company sustainability and financial attractiveness.
Dairien: [00:27:48] And what are some of the companies that score highly on the blend score scale? And what does that mean for those companies?
Stephanie: [00:27:54] So larger, more legacy companies have often scored higher, [00:28:00] so HP, Intuit. And usually, the higher a company scores, it's more so a reflection of how comprehensive their diversity, equity, and inclusion strategy is. Meaning, they've made an effort to establish policies and have people in leadership positions that are very broad in terms of diversity, across race, gender, [00:28:30] sexual orientation, veteran status, disability, et cetera.
Dairien: [00:28:35] Stephanie is holding up the mirror so that companies can look at themselves honestly. Self-evaluation is a great first step. It takes the guesswork out of identifying how equitable opportunities are within your own organization.
A big question for that self-evaluation process: do you, as a co-founder or CEO or hiring manager, have a genuine desire to be around people that aren't like you? Here's LeRon.
LeRon: [00:28:55] If you don't have an experience, or if you don't have a [00:29:00] desire, to be around folks that are not like you, this is what's going to happen. You know, you're going to get environments like a Google or a Facebook where even if you're Black with a badge, people are going to be checking to see if it's real.
Dairien: [00:29:16] LeRon refers to situations where Black employees are assumed to be trespassing by security. That's the sort of thing that happens when one culture's sense of what's expected is reinforced by cognitive bias.
We're talking about decades of expecting certain categorizations of people [00:29:30] to fill specific roles and hiring cycles that continue the same pattern. The first step in altering these cycles is to openly talk about what's happening. Please join us throughout our six-episode series where we'll work together to dismantle this idea of culture fit.
And we'll make sure that the cultures we create are welcoming to the diversity of experiences all humans have to offer. We'll hold each other and the tech industry accountable. We're going to check our biases and listen to insightful stories from people who have been kept on the sidelines. And we'll look [00:30:00] into how we can encourage tech culture to better serve the entire world.
You heard a ton of incredible voices on the podcast today. So we're going to take a moment to give everyone a shout out. Here's a huge thanks to our guests. LeRon Barton, Treavor Wagoner, Stephanie Lampkin, Sinduja Rangarajan, Sydney Sykes, Josh Torres, David Dylan Thomas, Frances Coronel, and Adam Recvlohe. Visit all-turtles.com/podcast.
You'll find links to their work as well as some of the resources that they mentioned throughout the episode. We would be hugely appreciative if you went onto iTunes and left a review for Culture Fit. That way other people can find Culture Fit, it'll help us spread the word. Finally, a special thank you to the tiny, but very mighty team that helped make this episode possible.
Including Marie McCoy-Thompson; she produced, edited, and co-wrote the show. And thanks to Jim Metzendorf; he mixed our show. As well as [00:31:00] Dorian Love for laying down the music. I'm Dairien Boyd, and I'll catch you in episode 2, where we're going to dive into the history of racial bias in Silicon Valley.