Are you aware that gender bias is affecting lending approvals? In this episode, Tracy Hazzard, Inc. Magazine columnist and podcast content strategist, talks about how gender bias affects women when it comes to approvals, along with other ethical questions in the sector. Tracy provides insights on how fixing this code-based problem could make lending to women and other excluded communities more profitable. Join Tracy as she shares how algorithms are designed with filters, explores the world of biases, and explains why you need to question your understanding of these processes in finance to know what they really are.

Watch the episode here:

Listen to the podcast here:

Is Gender Bias Affecting Your Approvals? With Tracy Hazzard

We’re going to be talking about one of the most significant fundable subjects out there: privacy, bias, leaning in or leaning out. How do algorithms support all of this AI technology? How do algorithms support our approvals? We’re all about being f*able, which means being fundable, getting approvals. We’re going to do a deep dive with Tracy Hazzard. I’m a fan of this woman on four different platforms. We’ll discuss why. This is crazy. I am so excited to have this episode. First of all, it’s my inaugural long-distance interview episode. I’m brand new at this one. I’m so excited to have Tracy Hazzard.

I’m going to let her introduce herself because she has so many amazing accomplishments in her life. That’s the reason I’m a fan. I’m not a fan of a lot of people, but this woman, I’m all in. This episode is about conversations we had from a Facebook Live where I went off. It was one of my biggest rants about how human beings filter in their experience. We’re going to be talking a little further upstream about how algorithms are designed with these filters, with this bias. We had overlapping views, we shared some things and we disagreed on others, which I’m always into. Welcome, Tracy. I am so happy to have you here. 

Thanks, Merrill. I’m excited to be here. It’s such a pleasure to get to talk about something that I’m as passionate about as you are and for different reasons, but still passionate about it and I love that. You asked about my background and the reality is that most of what we’re going to talk about is relevant because I write a column for Inc. Magazine. It’s about to be my fifth year writing it.

Congratulations.

Thank you. My whole career has been focused on disruptive and evolving technologies. I write about these things like AI, 3D printing, blockchain, stuff most people have no grasp of or understanding about how they get started and where they go, virtual reality, augmented reality, all of these things. I see them at the very early stages and start to review them, understand them and say, “Is there an opportunity here? Is something going to happen? Is this really going to go mainstream?” That’s usually the viewpoint people expect from me and expect for my column. I get to comment on them. I also podcast about this. I have a podcast on blockchain called the New Trust Economy. I have a podcast for podcasters as you know, called Feed Your Brand, and we knew each other from other things too.

My very first podcast was on 3D printing, and I watched an industry that never did go mainstream like it was projected to. That’s the viewpoint I bring to it: sometimes these things happen, and I can see it because I deeply know and understand consumer buying habits; that’s my actual long-term career. For many years, I ghost-designed products that you buy every day at mass-market retail. I understand consumer buying behavior, especially the patterns and habits of women, who consume every single product category, including finance. They choose the banks, they choose the credit cards and they choose all those things. Buying habits in finance, real estate, all types of categories are mostly driven by women; in some categories, it’s up over 80% to 90% of buying decisions being driven by the way women think.

Do you guys see already why I love this woman? She’s a professional professional. She knows this inside and out. First things first, I want to weigh in that it is an absolute fact. The most recent numbers we have are from 2017, but 65% of all mortgages originated in the United States that year were driven by women. It was either the woman alone or the woman in the household as the lead. It could even be a joint mortgage, but the woman’s name is on top in 65%.

That’s bigger than I thought it was. That’s great.

When it comes to borrowing in nonmortgage relationships, depending on the metric you use for 2017, because there are credit cards and all kinds of instruments, between that same 65% and 80% of all debt consumption is engineered with a woman as the senior applicant. It’s not just the buying decisions but actually executing on the purchase: engaging the credit instrument that’s going to fund it. Of course, then they decorate it.


How do they use it? This is the thing I’ve spent my entire career focusing on and understanding, and most people don’t want to listen to it. The last thing my clients wanted to hear when I was designing products for Costco was, “This is going to sell better to women.” They didn’t want to hear it, which I thought was the most ridiculous, backward thing ever.

Talk about bias. 

Why shouldn’t you care? They would say, “We don’t care about niche markets. We sell to anyone.” I was like, “86% of your sales and memberships are driven by women.” That’s not a niche market. That’s what your numbers are showing too, right?

Yeah, completely. First things first, I want to do a deep dive into the actual tech development, but first I want to talk about your views on the philosophy of bias. I call them filters. In all of my work, in my writing and in the book my audience is aware of, The New F Word, the theme is a new paradigm in borrowing: how borrowers can be professional borrowers rather than consumers who get preyed upon by lenders because they don’t know what they’re doing. What are your thoughts on the human experience of filters? How we’re raised, the prejudgments we make, our prejudices, which is the fancy word for prejudgment. It’s judging not on the facts, but on some preconceived notion. What’s your big-picture philosophy, far upstream, about filtering in the human experience?

It’s natural. It’s part of our nature. It’s part of the way humankind has evolved and developed. I have two young daughters. You watch them when they’re around ages 3 to 5. My daughters are blonde, not like their mom, and they go through this age when they only want to play with the blonde girls. They don’t want to play with the boys and they don’t want to play with brunettes. They want to play with the blonde girls. My daughters go to school out here in Orange County and there are very few blondes actually, which is ironic, and everyone’s from somewhere else. They feel like the odd girl out and they’re like, “I don’t have anyone to play with.” We have to say, “How many girls have brown eyes? Because you have brown eyes.”

Categorization is a basic level of sorting and learning about the world. To think that we don’t carry that into adulthood is stupid. It’s not going to happen. We’re carrying that all the way through now. As parents, hopefully we’re doing a much better job of saying, “This is not appropriate behavior,” and of being inclusive. Let’s talk about what that looks like, but it doesn’t change the fact that our human nature is to categorize and box things because we have to filter information quickly. That’s our job, and it’s the job of those algorithms out there to filter people quickly. We want to get to an answer as quickly as possible.

That’s why we want to ask that upstream question. The reason why I bring this in is I believe we filter to create safety. Anything that’s different from us is either unknown or dangerous, and dangerous may be bad. We have to train ourselves consciously to understand that different is actually good. It expands our capabilities in life. I bring this up because when we start talking about software, we’re codifying prejudice. We’re hardwiring it in.

It’s what I call designed in, and my column is called By Design. All of this gets designed in. Early on in my career, I was lucky to be working with some great researchers who were doing research with NASA on color, pattern and texture, which sounds like the weirdest thing ever. What I learned from it was that our minds, our eyes, everything that we see, everything that we take in, is filtering and building out no differently than when we lived on the savannah. We had to worry about what we could eat and whether the lions were going to eat us. We still operate at that base level of our brain.

It’s our lizard brain. 

Gender Bias In Lending: FICO has more data than anyone else. They have more pieces of information in which to test out any algorithms and understand whether or not it’s filtering out the wrong or right people.

Our lizard brain, that’s right. We take in information that way. If we don’t build our environments, if we don’t build our systems to allow for that to happen, then we are already in resistance to it. The studies that we were doing on color showed that when you have these gray, bland environments, out in the space station all alone where it’s all white, silver and brown, it causes extreme stress because it puts you into this state of, “I don’t know what’s good or what’s bad.” Every time you turn around, a big red light starts flashing. If we put some texture in there, some color in there, our eyes and our brains can actually calm down and say, “There’s nothing standing out in alarm in this space. It only alarms me where it’s supposed to, so I do something about it and I don’t decompress.” We don’t build that in often enough. We don’t think it through. We have to accept that this filtering is going on because it’s a survival technique. That’s why we’ve been around.

We made it so far. I can’t predict long into the future, but so far we’re doing okay. I love it. I can totally see NASA engineers calling it monochromatic distress. It’s psychological conditioning.

It’s what we call discognizance. It’s upsetting the cognitive abilities in our brain when we’re at that level. Sometimes we call it resonance. When it’s not resonating, you know the difference because it causes that heightened prickliness you feel. It’s a good thing because then we know we need to be safe about something, but it’s also a bad thing if it’s constantly occurring.

Bias is not in and of itself bad. There are no true good things and bad things except the intention behind it, right?

Right. We can’t label it that.

Bias is not bad. It’s either discriminatory or harmful bias would be what is bad. Let’s give everybody a background. I went on a rant because this whole Apple and Goldman Sachs thing came out and there was a gentleman who did a blog or tweet. He was very popular and he basically said, “Something is wrong because I got twenty times the limit that my wife did, but we share the same finances. We share the same household income, we share all of these things. This is discriminatory.”

Like you, my first reaction was, “They don’t necessarily know what goes into the formula,” because, as I learned from your Funding Hackers program and from your Bootcamp, there are so many factors. In full disclosure here, I’m one of Merrill’s clients and I love it. Most people don’t have that nuanced understanding. Where it raised the red flag for me was when I saw Steve Wozniak and his wife come forward.

He’s the Cofounder of Apple.


Woz says, “I got ten times what my wife got.” California is a community property state. They’ve been married long enough to pass that period, which is 10 years or 7 years or something like that. That shouldn’t be the case. I can’t imagine that there isn’t as much stuff in her name as his name because of the kind of guy he is. I’ve met him, I’ve interviewed him, I’ve read his book and followed him.

He looks like a teddy bear and essentially is.

He also understands how these algorithms get built and how this stuff happens. He’s saying this doesn’t seem right, whether or not there are actual true results in there.

There’s discriminatory bias.

That still remains to be seen. What it does is raise the flag that there probably is something wrong. Here’s where I looked at that. The issue that most companies can’t handle, or most coders at that base level, is the unconscious bias that gets put in. You develop a system, but you lived in a world where women were property. You have bias built into how you build the algorithm because you don’t have that same American acceptance of equality that we do.

Let’s create the framework for what she said. Remember, the outsourcing of coding is popular in second-world and third-world countries. The coders are extremely intelligent, but they have not adopted all of our equality norms: the Equal Credit Opportunity Act, equal opportunity lending, and so on.

The property ownership, it’s different.

There may be a patriarchal bias to the system. In their subconscious, in their unconscious mind, there may be some version of discrimination and that’s really far upstream at the coding level.

Gender Bias In Lending: When you brute-force test a code or any kind of algorithm, you have to have a way to test the failures.

It can also happen in the planning levels. Obviously, somebody is sourcing out this coding or these things are happening, but they’re also planning it. When you plan it, because you live in this world with your own filters and your own ideas about the way things are, you don’t always think, “I have to say that. It’s understood.” The understood things maybe are not understood across cultures and genders. It doesn’t really happen. We know that there’s a significantly lower number of women, especially white women in coding positions, black women as well. These are a large portion of what you’re talking about as the lending community. Without that, they’re not represented at the base level in the coding. There isn’t equality of perspective or broadness of perspective being represented in how things are developed to begin with.

Also at that understood planning level, we again have not enough women at a high enough level or in planning positions, especially in technology companies, but in finance companies as well. We don’t get them involved in the process. They’re not putting in and saying, “I need to question our understanding of these points of what they are.” We’re developing a code based on maybe a flawed plan with cultural differences and gender differences already inherent in the team we’ve chosen to build the code. The third factor of it is that we don’t really fully test the failure because we can’t, because in the process you can’t test the failure.

If they were in fact testing every single one of the women who had failed in that group, Steve Wozniak’s wife, the other guy’s wife, if they had tested all of these failures and said, “Let’s put them through general underwriting like we would have normally done,” they shouldn’t have failed through the process. We then have a code problem because we’re excluding a community we could be making great money from. Why would we want to do that? That doesn’t make sense.

That was my exact point, and I came at it differently. I said, “If 65% to 80% of buying decisions are women’s, then why would we consciously do so?” Your counterargument was that we wouldn’t consciously.

We want to be profitable.

We may unconsciously. That’s where I told Sky, my producer, “We’ve got to get her on,” because it sounds like and it feels like the vitriol out there is discrimination.

I don’t always jump on that bandwagon. That’s not my first reaction to it. It’s, “There’s probably something going on here.” It’s actually hurting the company. It’s hurting Apple and Goldman Sachs as much as it’s hurting the potential consumer base out there. It’s not just the perception that they’re discriminatory. They’re probably excluding great people to lend to, because women are known for paying their debt faster, for fewer bankruptcies, for all of those things. This is a better lending group.

Goldman Sachs is a tier two and it’s an 80%, so you don’t want the card in the first place. Whether or not it’s discriminatory, you don’t want that. This is a moot point for all of us funding insiders. That isn’t what you want, even if it were perfectly designed. One of the things that I wanted to share with you was I came back from FICO and everybody’s been a part of this conversation. It was awesome. FICO had a class for high-level designers and engineers and it’s called Bias In, Bias Out. It’s exactly what we’re talking about. If there’s bias in, of course then the algorithms are going to deliver biased results.

How to Fight Biases in AI and Create Trust, that’s the name of FICO’s class. Of course they’re cutting edge. Alarming headlines appear at an increasing rate about biased AI leading to non-transparent decisions, discrimination and a lack of trust. They say newcomers to application development are confronted with multiple traps that can inadvertently lead to biased and unreasonable automated decisions. By comparison, FICO credit score developers have decades of experience in the art and science of mitigating biases through transparent, credible models. They go on to show how the FICO algorithm has checks and balances. I don’t know that Goldman Sachs or Apple uses FICO as one of their models. I’m going to say no.


We don’t know for sure either way. It sounds like not, but who knows.

I have not done so yet, but I would like to invite the Twitter guy and his wife. I would love to do a fundability analysis, put them through and say, “Here’s what they evaluate in doing a fundability analysis.” The very things they cite, like “Our checking account is shared, our assets are shared,” are not fundability metrics measured by AI software. I would love to do a compare and contrast to show whether there’s a bias. I can’t say that would remove all the bias, but I could at least compare it to what we already know are fundability metrics. I already know those things.

That would be wonderful. You should definitely do that.

Invite them and say, “Let’s do this. I’ll Venmo you the cost of the credit report. I’m happy to do this.”

I can see Wozniak and his wife for that matter. Go for it, Merrill.

Let’s do Steve Wozniak.

Let’s reach for the stars while we’re on it.

You want to go balls out when it comes to this stuff. Here’s the great thing. If I put them through those metrics, we will know to a significant degree whether bias is included, because all of the things that we measure, we know are measured by automatic underwriting systems. We know that, though not to the nth degree. I don’t have FICO’s code, I don’t have the lender’s code, but I do know that if the difference is within plus or minus 5%, then yes, there was bias. If there is a huge difference, then you’re measuring the wrong things and you’re incensed about the wrong things. That’s where our education increases.
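Merrill’s side-by-side test can be sketched as a toy model. Everything here is illustrative: the metric names, the weights, and the 5% tolerance are stand-ins for the roughly 30 fundability factors he describes, not FICO’s or any lender’s actual code.

```python
# Hypothetical sketch: run two applicants who share finances through the
# same set of measurable underwriting metrics, then compare outcomes.
# Metric names and weights are invented for illustration.

METRIC_WEIGHTS = {
    "payment_history": 0.35,
    "utilization": 0.30,
    "credit_age_years": 0.15,
    "account_mix": 0.10,
    "recent_inquiries": 0.10,
}

def fundability_score(metrics: dict) -> float:
    """Weighted sum of normalized metrics (each expected in 0..1)."""
    return sum(METRIC_WEIGHTS[name] * metrics[name] for name in METRIC_WEIGHTS)

def bias_flag(score_a: float, score_b: float, tolerance: float = 0.05) -> bool:
    """If the measured inputs match within tolerance but the lender's
    outcomes diverged widely, something outside these metrics drove it."""
    return abs(score_a - score_b) > tolerance

# Two applicants with identical measured inputs (shared finances).
husband = {"payment_history": 0.95, "utilization": 0.90, "credit_age_years": 0.8,
           "account_mix": 0.7, "recent_inquiries": 0.9}
wife = {"payment_history": 0.95, "utilization": 0.90, "credit_age_years": 0.8,
        "account_mix": 0.7, "recent_inquiries": 0.9}

# prints False: the metrics themselves show no gap, so a 10x or 20x
# difference in granted limits would point to something unmeasured.
print(bias_flag(fundability_score(husband), fundability_score(wife)))
```

The point of the sketch is the inference, not the numbers: identical measured inputs plus divergent outcomes means the divergence came from somewhere outside the published metrics.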

What FICO is talking about is brute-force testing. FICO has more data than anyone else. They have more pieces of information with which to test out any algorithm and understand whether or not it’s filtering out the wrong people or the right people, because they can follow someone all the way through their life and say, “They declared bankruptcy, then this happened, then they defaulted.” They can see failure in that process as much as they can see success. The algorithm at Goldman Sachs and all of those other places is set to create profitability. It’s set for a narrow margin of people that will give them the least amount of risk and the most amount of profit. That’s their goal. They probably set out some metrics in the process, saying, “We expect to have X percent of the people going through the approval process and achieving these rates and these credit limits.” When the algorithm achieves that, they don’t try harder to find out about all those other people.

Gender Bias In Lending: If FICO’s out here building this amazing core data and understanding it and now you build off of that, you have a higher likelihood for success in whatever algorithm and underwriting system you build.

“Green light, this works.”

“It’s working and it’s way faster than our other one.” Their model of measurement is about speed too: “We’re pumping through more people in this automatic underwriting than we could ever pump through underwriting before. It costs us a whole lot less because underwriting required a big team, training, support and all of that. Our profitability as a company is also way up because we aren’t supporting all these underwriters.” They’re also not questioning, “Could we be making more money?”

Good enough is good enough, not great. 

They’re not asking the question, “Who are we leaving behind? Where are the opportunities?” Not yet. At some point they will, but this is too early, especially on the day of Apple and Goldman Sachs launching this card. It’s too soon. They haven’t done enough of what we call brute-force testing. When you brute-force test a code or any algorithm, you have to have a way to test the failures. You have no view because it’s brand new to you, brand new to your company, and you have no model for how you used to do it before. You have even less ability to tell.

You’re not comparing apples to apples. Your argument is that we don’t take 10,000 manual applications and see what the failure rate is. We look at the profitability of the acceptance rate because it costs so much less. We’re not looking for, “82% succeeded in getting approved through manual underwriting.” They don’t push until they get 82% approval rates over here. The 42% is enough because it’s ten times more profitable: automatic underwriting is three seconds and a lot of code. There are no human beings, no evaluation. Good enough is good enough, not great.
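The gap being described here, roughly an 82% manual approval rate versus a 42% automated one, can be sketched as a toy comparison. The scores and cutoffs below are invented for illustration; real underwriting uses far more inputs than a single score.

```python
# Illustrative sketch: run the same applicant pool through a (simulated)
# manual-underwriting benchmark and an automated rule, then compare
# approval rates. Both thresholds are made-up numbers.

def manual_approve(score: int) -> bool:
    # Stand-in for the broader, judgment-based manual process.
    return score >= 620

def auto_approve(score: int) -> bool:
    # Automated rule tuned for minimum risk, not for matching manual breadth.
    return score >= 700

applicants = [580, 640, 660, 700, 720, 750, 610, 690, 710, 680]

manual_rate = sum(manual_approve(s) for s in applicants) / len(applicants)
auto_rate = sum(auto_approve(s) for s in applicants) / len(applicants)
print(f"manual: {manual_rate:.0%}, automated: {auto_rate:.0%}")  # manual: 80%, automated: 40%

# The gap: people manual underwriting would approve but the algorithm
# rejects. This is the population no one re-examines while profits are up.
left_behind = [s for s in applicants if manual_approve(s) and not auto_approve(s)]
print(left_behind)  # [640, 660, 690, 680]
```

Nothing in the automated rule checks whether `left_behind` would have repaid; if it costs ten times less per decision, the lower rate still clears the profit target, so no one asks.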

They’ve tightened that risk so tight that it is harder to qualify in a way. They’ve tightened that risk so tight because they’re afraid to make mistakes that could hurt the company. They always start out tighter and they rarely loosen. That’s the understanding. You start out tight because you need to make sure your company doesn’t fail at this and doesn’t approve a lot of the wrong people. From that point, we rarely get an algorithm that gets broader; that doesn’t normally happen. An algorithm learns and gets tighter and narrower. That’s what it’s supposed to do. It isn’t until it starts affecting and hurting profitability that they’ll go back and say, “What’s going wrong here?” What they really should have done is take 10,000 or whatever they imagined the brute-force number is.

It will be hundreds of millions.

See if FICO can do that; they’ve got a broader set of data points to work with across the board. They could even fake it through and say, “This person is applying and this person is applying,” based on somebody’s profile, without really approving them. They have real people and real information to utilize. You want to run them through side by side, automatic underwriting and regular underwriting, and see where the crossover is. I guarantee you they did that, but they probably did it on a very narrow and small subset. When they saw that the algorithm was beating out the underwriters, they said it’s good enough.
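Tracy’s point that these algorithms start tight and rarely loosen can be sketched as a one-way ratchet. The thresholds and trigger values below are invented; the point is only the asymmetry: any sign of risk tightens the cutoff immediately, while only a hit to profitability ever loosens it.

```python
# Toy model of the one-way ratchet described above. All numbers are
# hypothetical stand-ins, not any lender's actual tuning rules.

def update_threshold(threshold: float, default_rate: float,
                     profit_growth: float) -> float:
    if default_rate > 0.02:      # any sign of bad risk: tighten right away
        return threshold + 10
    if profit_growth < 0.0:      # only a profit decline triggers loosening
        return threshold - 10
    return threshold             # otherwise leave the cutoff where it is

t = 680
t = update_threshold(t, default_rate=0.03, profit_growth=0.05)   # tightens to 690
t = update_threshold(t, default_rate=0.01, profit_growth=0.05)   # stays at 690
t = update_threshold(t, default_rate=0.01, profit_growth=-0.02)  # loosens to 680
print(t)  # 680
```

Notice that good repayment behavior alone (low default rate, healthy profit) never broadens the model; the cutoff only moves down when the company itself feels the pain.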

That’s the “good enough is good enough.” She’s making a brilliant point: the second you are 2 to 3 times more profitable, you don’t need to be inclusive to be profitable. I want to illustrate it in reinforcing language. Your argument is perfect: the algorithm is designed to create less and less risk, and only when profitability is threatened will they even take a look at broadening the approval model. If it’s ten times less expensive, you only need to approve 20% of the previous applications to double your money. That’s a brilliant point, Tracy.


The other part I want to point out to everyone thinking about it is that this is also the opportunity. We may see a company emerge with a better-run algorithm, a better machine learning system in how they design it. Having this brought up, whether or not it was true, creates the perception that it’s true and that there’s discrimination.

It needs to be addressed. It needs to be looked at.

That data has to be trustworthy. It gets everyone questioning. When you’re questioning this, it means there’s an opportunity somewhere for someone to make a great card that’s going to make a killing, because it’s going to actually target the women who have been falling out of the other algorithms everywhere.

The 82% of manual approval rates, absolutely. 

That’s where we’re going to see something emerge that is really consumer-friendly. I’m always a fan of these things coming up, even if they’re a little hyped up, because they do raise awareness and get people taking a scrutinizing look at their algorithms.

That’s perfectly said. Everybody knows that I’m a FICO fanboy, but I don’t trust a lot of systems unless there’s this scrutiny. FICO has been doing this since the ’50s and has basically set the standard for the vast majority of organizations out there who subscribe to their models. It costs to invite this scrutiny. I don’t know who does and who doesn’t use FICO. I do know that there were over 600 banks and financial institutions represented at FICO World.

Here’s something that we heard, Merrill, because we’re in your program and we were out there. I’m looking at buying a new car. My car loan ran out, and you know how bad that is for your score. Seriously, my score dropped 30 points overnight. It’s huge.

I rest my case, this is a witness. 


I had the better score between Tom, my husband and partner, and me. Looking at that drop, we were like, “Can it only be the car loan?” Nothing else changed in that two-month period of time.

This is reverse bias. 

We only have one car here. I’ve been wanting to go buy a car. I’m like, “This is great.” We go out to investigate, we start to make some phone calls around to the banks, and we come to discover that most of them don’t know which one of the credit bureaus they would pull. They have no idea because it’s some mysterious automatic underwriting right now, except one of the banks. We called the Credit Union that I’ve been involved with since I was in college. Tom called the Credit Union, and they said, “We only pull Experian now. No one trusts Equifax because of the perception of them being hacked so much, and TransUnion is ridiculously slow at updating their systems. We can’t trust that they have current data.” I was like, “That was totally honest, and at least someone was paying attention.” It’s a Credit Union; their company is small, they’re regional and they have 6 or 7 branches. They probably have more information down at the base level than most big banks do.

There are lots of people who’ve pulled their contracts from Equifax. I am going to share some conversations I had with a VP of Equifax. They know that the data breach was gut-wrenching. It was horrible. They are now partnering on a platform with FICO, not just to shore things up; this is a market opportunity for Equifax to regain some level of trustworthiness and create a joint platform that I believe is going to be mind-boggling. Before this, credit reporting and credit scoring legally had to remain apart. An underwriting platform powered by that data and FICO could be a juggernaut. It’s always three steps backwards, seven steps forward. It’s the slingshot effect I was talking about.

When you have a company with as much data and information as they have, and who’s been as good at it for as many years as they have, you take that with the respect and authority they’ve earned. I was interviewing Jerry Cuomo, who is at IBM as the VP of their blockchain technology, which is a whole new division. Why is IBM going blockchain? That’s a question. I was interviewing him on the New Trust Economy and started to ask him some questions about this algorithm thing. They’re building what is called a private blockchain, which is different from a public one. A public one has a higher perception of trust because everyone can be involved at any point in it. It has the checks and balances of the community. It has the group. He said, “There are so many applications, corporations and situations where we can’t have a public blockchain.”

However, he said, “We built the coding basis for our blockchain out in the public space, open source, because we understood that while we have a lot of data and so many different clients at IBM, we cannot brute-force test it properly ourselves.” They put it out in open source, then pulled it back in, and from their learnings decided what was going to be part of their private blockchain, because they knew how they wanted to service their corporate clients with it. That’s the better way to do it. If FICO is out here building this amazing core of data and understanding, and you build off of that, you have a higher likelihood of success in whatever algorithm and whatever underwriting system you build.

There’s no time this time to talk about blockchain, but everybody’s going to be responsible for the security of everybody’s data. It’s going to be a bit here and a bit there, all online, and we all share the bits.

After the experience of going around to a bunch of banks on Monday, I felt a little like a bank robber casing the place. I put that in the Facebook group because that’s what it feels like when you go to three banks in one day to open accounts. Even though you’re giving them your checking money, it feels a little like you’re doing something wrong. After experiencing everybody’s security for online platforms and some of the archaic processes for getting your PIN, I was like, “Not enough has changed here.” I can see why the trust in banking systems and in our financial system is so low.

It’s because profitable enough is enough. There’s not a social consciousness to say, “We’re at 40% but it’s good enough.”

As long as you’re making money year over year, who cares, right?


Yes. The value of a human life is based on what the payout would be at GM for accidents they have to pay out on for poor design. It’s horrible modeling. Banks are in the same business of asking how profitable is profitable enough, given that we’re going to have some data breaches or some failures, but those failures are not enough to make us want to improve. Then, like Equifax, they have this massive failure. No credit bureau is a progressive institution, and now they’re partnering with somebody who has truly been a pioneer in every iteration of their algorithm development, even trying to do their best to build bias out of it, or at least make it conscious. It’s like everything else; it’s natural selection. Only when you fail are you willing to listen and improve. Things will get better, but look at all the failures in the banking system, the horrible security. When they say bank-level security, that isn’t saying much.

When somebody does ask me about blockchain, that’s what I tell them. Imagine you build a big fortress around all of your data and you put red bull’s-eye targets all over the thing. That’s why no data is safe in a non-blockchain world. That was the first opportunity I saw. Distributed data means an attacker doesn’t know what’s valuable and what’s not. That’s a whole lot safer. That’s the same thing here. What I wanted to say is that when you think about the FICO model, I think it’s around 30 points of information that they’re utilizing to value you and your ability to be lent to. When you look at that fundability score, the more data points, the better. If it were only our credit score, it would be so biased, because it’s one single point of data managed by companies getting paid.

With fundability, we have at least 30. That’s what drives it all. What we call fundability is also what lenders use in their software: am I willing to give them money based on the summation of this algorithm, all these data points that I’m measuring? We call it fundability to make our borrowers feel, “I’ve got some power in this process.”

On the AI side, it’s called risk mitigation. That’s mostly what they’re actually doing.

That’s their perspective.

Their perspective is not just profitability, but on the coding side, their job is risk mitigation. The goal is to net a result that is profitable, but their job is to mitigate the risk.
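The exchange above describes lender software as a summation of weighted data points checked against an approval cutoff. A minimal sketch of that idea, assuming entirely hypothetical field names, weights and threshold (this is not FICO’s or any lender’s actual model), might look like:

```python
# A toy fundability/risk score: a weighted sum over many data points.
# Every field name, weight, and threshold here is hypothetical -- real
# lender models use roughly 30 proprietary factors with learned weights.

BORROWER_WEIGHTS = {
    "payment_history": 0.35,
    "utilization": -0.30,       # higher utilization lowers the score
    "account_age_years": 0.15,
    "recent_inquiries": -0.10,
    "credit_mix": 0.10,
}

def fundability_score(profile: dict) -> float:
    """Sum each normalized data point times its weight."""
    return sum(BORROWER_WEIGHTS[k] * profile.get(k, 0.0)
               for k in BORROWER_WEIGHTS)

def approve(profile: dict, threshold: float = 0.5) -> bool:
    """The lender's risk-mitigation view: lend only above a cutoff."""
    return fundability_score(profile) >= threshold
```

The point about data points is visible even in this toy version: with many inputs, no single biased field dominates the score the way it would in a one-factor model.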

Let’s back out from the boots on the ground to the 100-foot, maybe the 1,000-foot view. What is it about you, your character traits, that makes you not just intelligent but perceptive about this cutting-edge technology? Your entire background is about evaluating the likelihood of cutting-edge, brand-new technology. What is it in your character that makes that so fascinating to you?

It’s basic curiosity and questioning. You never believe what you’ve read.

“It’s on the internet. It must be true.”

It must be true. I never believe the first thing I read. It doesn’t happen. My father instilled in me very early on the idea that when you wanted to learn about a subject, you should read 3 to 5 books about it. Back then, we didn’t have the internet. I’m sure now you’d say you probably have to go through twenty websites, but back then you could read 3 to 5 books and that way you got a better, broader perspective. One of those books always had to be literature, not a history or a biography or anything like that. It had to be literature from that day and age. Literature tells you more about what people thought. Especially if you’re looking at ancient or longer history, writers would disguise their views because it was the only way they could dissent. You would find the dissenting opinion in there somehow. That’s what he taught me.

There’s always this discerning lens I look through where I’d be like, “It’s got me curious. I’m interested to learn more. Where am I going to find more?” The next part of that is I learned very early on to get to the source material faster. You’ve gone to FICO World; that’s the source material. It’s so critically important to get to the source material of something. Our mutual friend Aaron Young, whose event you attended, assigned me The Science of Getting Rich as part of one of his courses that I’m taking. I had not realized it’s actually the source material for Think and Grow Rich.

It’s a Napoleon Hill original.

This is the original, from before Napoleon Hill, because it came out in 1910 and Napoleon Hill’s was 1937, somewhere around there. Taking a look at that, it’s so interesting to be able to say, “I’m at the source material for something.” I had not been curious enough about it to look into it deeper.

To go to 1910 instead of 1930. 

When you look at something like that, it brings an awareness of where people’s minds were at that time. I always like to go to the source material because you’re always going to find the fresher perspective. It’s really important to get that broader brush look at it, because between the lines is where the truth is. It’s in between the lines of the opportunity gaps. That’s what you’re finding and seeing.

This is why I didn’t need to take your bootcamp. I was going to hire you anyway. I took the bootcamp because I was curious. I wanted to understand it better because I thought it would broaden my view of things. It did. I sat through your two-day bootcamp, learned hundreds of points and took I don’t know how many notes, but now everything that your team does with me, I understand at a more intimate level. That makes a difference in how I approach the world, and it makes it so that I am capable of seeing where the innovation point can lie. That applies to evaluating someone, but also to doing it for myself, because my job is to innovate for my clients. If I’m at that innovation point, I know that if I mess with this 4% of something, that’s the critical point, because that’s the opportunity gap. That’s where we want to get to because that’s the true value. It’s the original value in any company, any corporation, any technology.

Everything we do personally and professionally derives from this little awesome mess inside of us that’s trying to filter the world and grow and become a responsible adult, sometimes out of the huge messes of our lives. Yet it’s the curiosity and getting to the source material, because I always teach that only the truth is actionable. The truth, I say, is a pendulum. You’ve got this view and that view, but all views are accounted for somewhere in the middle. Rarely is somebody truly wrong in the historical perspective. We’ve been doing a lot of things a lot of ways for a long time. Out of that comes this opportunity gap: as the pendulum comes to rest, that last 4% is where we get to see best practices and best opportunities. I am not attached to a single thing a lender or FICO does. I’m committed to finding the next best thing that hits that 4%, that new level of accuracy. We call it the bull’s-eye. Thank you for that.

The thing that is so important about what you’re doing in general is that you’re not attached to all of the things that happen along the way. I’ve worked with a lot of inventors over the years because of my product work. They’re so attached to their thing. They’re so in love with their thing, but when you’re not attached, that’s where you can see where the opportunities lie. How do things have to shift? Where do I need to go? This is our opportunity to be truly what I call competition-proof, because you’re always so far ahead of your competition in deep knowledge and understanding. They can copy one little detail of what you do and make a big marketing buzz about it, but at the end of the day, it’s missing all that deep, rounded data that is making your clients truly successful. When I look at that, I always think that’s the more important thing to do. You’re compounding the most important things again and again, and that’s going to create continued success.

What would you say would be the single most important thing in this aspect of personal bias and institutionalized bias? What would you tell my readers as the most important way to bias-proof themselves? If we do it personally, we have a good chance of getting it out of the institution. What would be your recommendation as a human being, as a powerful, intelligent, skilled and capable woman? How would you tell me and my readers how to bias-proof themselves? 


It’s not easy. This is hard. It’s a lot of inner work, and I don’t think we’re ever completely done with it. I don’t look at it as bias-proofing. I look at it as removing the negative bias. We want to remove the things that are harmful and hurtful, and what’s harmful and hurtful now is not necessarily what was harmful and hurtful several years ago. I’m not a “jump on the bandwagon and let’s go rewrite history” person. I don’t believe in that either. We need to say, “It’s my job to not be racist. It’s my job to not be sexist. It’s my job to make sure that my kids don’t grow up with that.” That’s the bias I want to pull out of everything that we do and say. A little example of this: a couple of years ago, my daughter’s Montessori school had a Mother’s Day and a Father’s Day event. The Father’s Day event was at the end of the day. The Mother’s Day event was mid-morning at 11:00. I’m like, “I’m busy. I’ve got interviews. This is inconvenient. Why can’t mine be at the end of the day?” I looked at my husband and I said, “If I say something about this, it’s not going to be received in the way that it needs to be unless you say something about it.”

We have to address the bias in a way that the bias can actually hear us.

They couldn’t hear it when I complained, because it was just another working mom complaining. When a dad said, “Do you realize you did this?” they heard it. The next year, it changed. Now it’s always at 9:00 AM for all parents: Mother’s Day, Father’s Day and Grandparents’ Day. Whenever you see something like that, it’s important not to get all in a huff about it, but to make a point and say, “Do you realize that you’re doing this and that it’s not actually the best solution?” That’s always a better approach. Every little thing matters over time. It’s what I tell my daughters all the time: every little bit of inclusion matters. Every time you see injustice, stand up to it, because every time you sit back, it’s as bad as accepting it. It’s as bad as doing it.

Those are the little things that matter in the way we personally address things. As for institutional bias, if you’re in the institution, if you’re building that company, it’s your job to speak up as fast as possible, because you cannot let it happen. Someone gets hurt. When I’m doing product development, people can get hurt by a product, and you see it. That’s why we have all those massive recalls and lawsuits. If you don’t speak up, you are potentially part of the harmful process. It is your job to raise the red flag. That’s why we have whistleblowers. It’s important in the process.

Like we were talking about with profit mode, being profitable is considered good enough. In our companies, it’s good enough. How do we end up supporting mediocrity? How do we end up supporting minimalism? How do we end up saying it’s okay to meet legal standards rather than keep raising the bar?

It can get overwhelming for people. Early in my career, I was working on sustainable textiles, recycled materials and things like that. Everyone was up in arms: “It’s never going to happen. There are too many things that have to occur.” Let’s take a small step, because every bit is better than it was yesterday. Every bit means something is out of the landfill. Always having that continuous-improvement mindset is important in everything we do. Don’t get overwhelmed and say, “It’s not good enough yet.” That’s how we fail to make progress, because that’s where people sit around and ruminate about it and bitch about it. “It’s not going to be good enough. It’s never going to be good enough. Why do anything?” That’s not making progress. If we truly want it, we design the systems in our companies to be continually learning. That’s why I really love AI at the end of the day: machine learning is supposed to learn.

It’s designed to learn.

People don’t always keep learning; they stop faster than a machine does. As long as we keep fiddling with the model and keeping our eye on its bias, that machine learning is valuable. When people ask me, “Do you mind that Alexa is listening in on your house?” I say no, because I know she’s learning my habits better. I know she’s learning about my daughters better. There’s this great game on there called Akinator. Go and say, “Alexa, play Akinator.” Wait until you play it with a five-year-old, because their logic is so different. She’ll say, “Think of a character,” or something like that. You think of a character and she’ll ask you a bunch of questions. Before you know it, she’ll have the answer to what’s in your head. It’s like she’s reading your mind.

It’s a really good insight, because this game is teaching Alexa how our children think, and children of different ages think differently. What she learned from my five-year-old was that my daughter didn’t understand the difference when Alexa asked, “Is it a male or a female?” In my daughter’s head, it was girl or boy. It took Alexa a long time. We sorted out that that was the flaw and ended up at the solution. That’s how the terminology gets better, all of those things. Play with it, learn with it. Brute-force test it on your own.
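The game Tracy describes works by asking attribute questions and eliminating candidates that don’t match the answers. Here is a toy sketch of that question-and-eliminate loop, with made-up characters and a small synonym map for the “girl” vs. “female” vocabulary gap (this is not Akinator’s actual algorithm):

```python
# A toy question-and-eliminate guesser in the spirit of 20 Questions.
# Characters and attributes are invented; real systems like Akinator
# learn their question tree from millions of past games.

CHARACTERS = {
    "Cinderella": {"female": True,  "animated": True},
    "Batman":     {"female": False, "animated": False},
    "Elsa":       {"female": True,  "animated": True},
    "Sherlock":   {"female": False, "animated": False},
}

# Mapping child vocabulary onto the model's terms -- the kind of
# terminology gap the five-year-old exposed ("girl" vs. "female").
SYNONYMS = {"girl": "female", "boy": "male"}

def eliminate(candidates: dict, attribute: str, answer: bool) -> dict:
    """Keep only candidates whose attribute matches the answer."""
    attribute = SYNONYMS.get(attribute, attribute)
    return {name: traits for name, traits in candidates.items()
            if traits.get(attribute) == answer}
```

Each answered question shrinks the candidate set, which is why the game converges on “what’s in your head” after a handful of questions; fixing the vocabulary mapping is exactly the kind of incremental model correction Tracy is describing.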

AI does not necessarily become Skynet from the Terminator series.

It can, we’ve got to watch it.

Tracy, your humanity comes out. This isn’t just tech. You should all binge the Get Fundable! Podcast because there’s so much great content, but at the end of the day, what matters is how this improves not just my standard of living, but my quality of life. How does it create more opportunities for me to bless the lives of my children, my loved ones, my family and even my community? Thank you, Tracy, for joining me. We share more in common than we are diverse. You brought a wonderful opportunity for me to expand the conversation after my rant, which is on a previous episode. Go watch the rant and then you’ll see Tracy’s logic, compassion and straightforward thinking, and you’ll go, “I totally get why she’s here.” Thank you for joining me on this spectacular episode of the Get Fundable! Podcast. Learn for yourself how you can remove bias as much as possible from your life so that we raise our little ones and love our big ones in a way that is inclusive, not exclusive. It has been a wonderful time. Thank you so much.


About Tracy Hazzard

Tracy Hazzard, Inc. Columnist & CEO of Hazz Design has co-designed and developed 250+ products generating almost $2 Billion in revenue for her retail clients. Tracy has had products in all major e-commerce and mass-market retailers; office superstores; electronics boutiques; and wholesale clubs, including the best-selling mesh office chair at Costco. For over 25 years, she has worked with design-leading brands like Martha Stewart Living and Herman Miller and is an expert in product, furniture, color and materials.

Tracy develops products from around the world and travels to Asia frequently to inspect, certify and work with her clients’ factory suppliers. Along with her partner, Tom Hazzard, she holds over 37 utility and design patents with an unprecedented 86% commercialization rate. Their intentional invention process has been the key to keeping her clients knock-off proof and is the foundation for her new invite-only collaborative community built to rev up original retail product sales with the right things in the right order with the right resources.

Besides being featured in Harvard Business Review, Forbes, Wired and CNN Money, and writing as an Innovation Columnist for Inc., Tracy is a contributor to ThriveGlobal and the best-selling author of Guerrilla Patent Tactics and Successful Launching By Design. Tracy co-hosts the weekly top-ranked podcasts Product Launch Hazzards, Feed Your Brand, and WTFFF?! 3D Printing: the 3D Start Point for disruptive technology.
