
Artificial Intelligence (AI) isn’t just something from the movies anymore. It is a tool we use every day. 

Now, Black communities face a big question: Will AI help us get ahead, or will it be another way we get left behind?

AI is already at work in our schools, hospitals, and job offices. It helps decide who gets hired, who gets a loan, and what news you see on your phone. It can even be used to replace workers. But there is a problem: AI learns from the past. 


If the past was unfair to Black people, the AI will be, too.

“The danger is the history,” said Onyx Impact founder Esosa Sosa, whose company studies artificial intelligence. “If we teach AI using old data from a time when Black people were kept out of hospitals and banks, the AI will keep doing the same thing.”

The jobs question: Who gets left behind?

The stakes are high for Black workers. According to McKinsey, the Black unemployment rate is about twice the white rate, and many Black workers hold jobs that machines can easily do. And the stakes extend far beyond individual jobs.

Studies show that many Black workers are in office, food service, and factory jobs. These are the jobs most likely to be changed or replaced by AI. Some experts say nearly one in four Black workers could soon see their job tasks handled by a computer.

“We can’t just be afraid,” said AI engineer Wavee Rose. “We have to take our power back. That means learning how to build and use AI, not just being scared of it.”

Knowledge is power

The real gap isn’t just about computers; it’s about who knows how to use them.

Tomayai Colvin is an entrepreneur and educator who helps Black business owners use AI. She says learning this technology is a way to stand up for yourself.

“We can’t just take what they give us,” Colvin said. “AI is a tool. It shouldn’t replace your talent, but you have to learn how to drive it. If you don’t, you won’t have a seat at the table.”

Colvin also says it’s okay to keep a steady job while you learn. She says using your paycheck to pay for AI tools and training is a smart move. It helps you get ready for the future.

“It’s about efficiency,” Colvin said. “But it’s also about agency.”

A new civil rights fight

Black Americans capture only a fraction of the new household wealth created in the U.S., despite representing more than 13 percent of the population. If current trends continue, experts warn that the wealth generated by generative AI could widen the racial wealth gap by tens of billions of dollars annually over the next two decades.

But that outcome is not inevitable.

By automating customer service, marketing, scheduling, and data analysis, AI can enable Black-owned businesses to operate with the efficiency of companies that once required large teams—if owners understand when and how to use these tools.

“It’s about having control,” Colvin said.

What we must do now

To make AI fair, we need better laws and more honesty from tech companies. But for our community, the goal is simple:

  1. Get involved early.
  2. Learn as much as you can.
  3. Build your own tools.

“We cannot afford to be left behind. In this new era, learning about AI is a civil rights issue. The world is changing fast, and it’s time to get on board,” Sosa said. 

AI is making choices for you

AI is already making big decisions about your life. Often, we don’t even know it’s happening.

1. Healthcare 

In busy hospitals, AI helps doctors determine which patients require the most urgent care and predict which treatments are most effective for specific diseases.

  • The Action: AI determines your “risk score,” which dictates whether you get a follow-up call from a specialist, a spot in a clinical trial, or immediate access to a bed.
  • The Impact: If the AI uses “total healthcare spending” as a proxy for “health needs,” it will favor patients who have historically had the money to see doctors. This often results in Black patients being ranked as “less sick” than white patients with the same symptoms, simply because they spent less on healthcare in the past.
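For readers who want to see the mechanics, here is a minimal sketch of that proxy problem, with made-up numbers and a made-up scoring rule (not any real hospital's model):

```python
# Illustrative sketch (hypothetical data): a risk model that uses past
# healthcare spending as a stand-in for health need will under-rank
# patients who spent less, even when their underlying illness is the same.

def risk_score_by_spending(past_spending_usd):
    # Hypothetical scoring rule: higher past spending -> higher "need".
    return past_spending_usd / 1000.0

patient_a = {"chronic_conditions": 3, "past_spending_usd": 9000}  # had access to care
patient_b = {"chronic_conditions": 3, "past_spending_usd": 3000}  # faced cost barriers

score_a = risk_score_by_spending(patient_a["past_spending_usd"])  # 9.0
score_b = risk_score_by_spending(patient_b["past_spending_usd"])  # 3.0

# Same illness burden, but the spending proxy ranks patient B as "less sick",
# so B is less likely to get the follow-up call or the specialist referral.
assert patient_a["chronic_conditions"] == patient_b["chronic_conditions"]
assert score_b < score_a
```

The bias here never mentions race at all; it enters entirely through the choice of proxy.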

2. Hiring and recruitment

Companies use Automated Employment Decision Tools (AEDTs) to filter through thousands of resumes or to analyze video interviews for “personality traits.”

  • The Action: The AI decides if your resume ever reaches a human recruiter’s desk or if you are “rejected” within seconds of clicking “submit.”
  • The Impact: These programs often look for patterns found in previous “top performers.” If a company’s history consists mostly of men from specific zip codes, AI may penalize women or people from different neighborhoods, even if they have the exact skills required for the job.

3. Banking and lending

When you apply for a credit card, a car loan, or a mortgage, AI models analyze your financial history to predict if you will pay the money back.

  • The Action: The AI sets your interest rate or decides if you are “creditworthy” enough to receive a loan at all.
  • The Impact: Because AI can analyze thousands of variables, it can uncover “hidden” indicators of race or class (like where you shop or your social network). This can lead to “digital redlining,” where certain groups are charged higher interest rates or denied loans based on factors unrelated to their ability to pay.
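A small sketch, again with invented applicants and an invented "learned" penalty, shows how digital redlining can happen even when race is never an input:

```python
# Illustrative sketch (hypothetical data): even when a lending model never
# sees race directly, a correlated input like zip code can act as a proxy,
# reproducing the same disparate outcome ("digital redlining").

# Two hypothetical applicants with identical finances, different zip codes.
applicants = [
    {"name": "A", "income": 60000, "on_time_payments": 0.95, "zip": "75201"},
    {"name": "B", "income": 60000, "on_time_payments": 0.95, "zip": "75215"},
]

# Hypothetical penalty per zip code, "learned" from biased historical
# defaults rather than anything about these applicants themselves.
zip_penalty = {"75201": 0.0, "75215": 1.5}

def interest_rate(applicant, base_rate=5.0):
    return base_rate + zip_penalty[applicant["zip"]]

rates = {a["name"]: interest_rate(a) for a in applicants}
# Identical repayment records, different prices: {'A': 5.0, 'B': 6.5}
```

Because the model can point to a "neutral" variable, the discrimination is harder to spot and harder to challenge than an explicit rule would be.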

4. Social media and information flow

Algorithms decide what news you read, what videos you watch, and which friends’ updates you see.

  • The Action: Instead of a chronological feed, AI predicts what will keep you on the app longest.
  • The Impact: This can create “filter bubbles,” where you only see information that confirms your existing beliefs, making it harder to see diverse perspectives or objective truths.

5. Insurance premiums

When you apply for car or life insurance, AI models analyze thousands of data points to determine how “risky” you are.

  • The Action: It sets your monthly price or denies you coverage entirely.
  • The Impact: In some cases, AI has used “proxy” data, like your zip code or even your shopping habits, to infer things about your health or driving ability, which can unfairly penalize people living in lower-income neighborhoods.

6. Digital policing and sentencing

In the justice system, AI tools are used to predict crime “hot spots” or to calculate the likelihood that a defendant will reoffend (recidivism scores).

  • The Action: These scores influence whether a judge grants bail and how long a prison sentence should be.
  • The Impact: If the historical data shows more arrests in certain neighborhoods due to over-policing, the AI will “learn” to target those same areas, creating a feedback loop of bias.
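That feedback loop can be sketched in a few lines, using invented arrest counts and the simplifying assumption that recorded arrests rise with patrol presence rather than with actual crime:

```python
# Illustrative sketch (hypothetical numbers): a "hot spot" model trained on
# arrest counts sends more patrols where arrests were recorded, which
# produces more recorded arrests there, which the next round of training
# then reinforces.

arrests = {"neighborhood_a": 30, "neighborhood_b": 10}  # skewed by past over-policing

def patrol_allocation(arrest_counts, total_patrols=10):
    # Allocate patrols in proportion to recorded arrests.
    total = sum(arrest_counts.values())
    return {area: round(total_patrols * n / total) for area, n in arrest_counts.items()}

for _ in range(3):  # three retraining cycles
    patrols = patrol_allocation(arrests)
    # Assume recorded arrests scale with patrol presence, not with crime rates.
    arrests = {area: arrests[area] + 5 * patrols[area] for area in arrests}

print(arrests)  # {'neighborhood_a': 150, 'neighborhood_b': 40}
```

The initial 3-to-1 gap in recorded arrests grows with every cycle, even though nothing in the sketch says one neighborhood has more crime than the other.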

7. Dynamic pricing (the “hidden” tax)

If you’ve ever noticed a flight price go up after you looked at it twice, or seen Uber prices spike during a rainstorm, you’ve encountered dynamic pricing.

  • The Action: The AI determines the maximum amount you are likely willing to pay at that exact moment.
  • The Impact: It can lead to price discrimination, where two people sitting next to each other on a plane pay vastly different prices for the same service based on their browsing history or device type.

8. Education and admissions

Large universities often use AI to sort through thousands of student applications.

  • The Action: AI predicts which students are most likely to graduate or make a donation to the school in the future.
  • The Impact: If the AI is trained on past “successful” students (who may have come from wealthy, privileged backgrounds), it may automatically de-prioritize brilliant students from underfunded schools who didn’t have the same extracurricular opportunities.

Most of these AI systems are “black boxes.” This means that even the people who own the software can’t always explain exactly why the AI made a specific decision. This lack of transparency makes it very difficult for an individual to challenge a “no” from a computer.
