AI and Overcoming Bias: How Does an App Attempting to Keep People Safe Overcome Racial Bias and Profiling? BeWize Founder Says It’s Taking the Necessary Precautions. inter-TECH-ion investigates…

(inter-TECH-ion intern Noah Chavarria contributed to this article.)

Let’s say you’re on a vacation or business trip and want to check out a restaurant or other establishment in a city you’re not familiar with. But you’ve never been to the neighborhood and aren’t sure you’d feel comfortable visiting it.

So, you download an app called BeWize, which was created with the purported goal of keeping people safer. It’s in beta testing right now, with an official launch planned for this spring.

BeWize Founder Jason Knickerbocker and his team mine crime data from the FBI, the public and other sources to rate cities on the app in terms of safety. He and his team intend to sell the safety data they gather to city governments, as well as sell the app to individual users.

But how is that done without the app’s AI displaying inherent bias toward minorities and without using racial profiling?

inter-TECH-ion asked founder Jason Knickerbocker to explain. Here’s what inter-TECH-ion found…

First, a little bit on Knickerbocker’s background. He’s a veteran of the U.S. Army and a lieutenant with the Manhattan Beach Police Department.

In late 2019, he and his team started collecting and organizing information from various sources, including first responders and FBI crime data.

This data is available to anyone, but hard to decipher unless you’re familiar with crime statistics and how to interpret them, he told inter-TECH-ion.

The FBI collects “factual” crime data from every city in the U.S. and publishes it for the public and government agencies to use, he explained.

BeWize created a program that extracts FBI data, along with other crime data, and automatically feeds it into the company’s AI algorithm.

This algorithm was created by a data scientist and takes into account which crimes are most frequent in any given city, thereby painting a picture of its safety quotient.
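The article doesn’t disclose the model’s internals, but the basic idea of turning per-city crime frequencies into a single safety rating can be sketched roughly as follows. The category names, weights, and 0-100 scale here are purely illustrative assumptions, not BeWize’s actual algorithm:

```python
# Illustrative sketch: weight per-city crime rates into a 0-100 safety score.
# Categories, weights, and numbers are assumptions, not BeWize's real model.

# Annual counts per 100,000 residents, UCR-style (made-up example figures)
city_rates = {
    "theft": 1200.0,
    "robbery": 80.0,
    "assault": 250.0,
    "vandalism": 400.0,
    "sex_crime": 40.0,
}

# Assumption: violent crimes weigh more heavily than property crimes
weights = {
    "theft": 1.0,
    "robbery": 3.0,
    "assault": 3.0,
    "vandalism": 0.5,
    "sex_crime": 4.0,
}

def safety_score(rates, weights, worst_case=10000.0):
    """Higher score = safer. The weighted crime rate is capped at
    worst_case, then inverted onto a 0-100 scale."""
    weighted = sum(rates[c] * weights[c] for c in rates)
    weighted = min(weighted, worst_case)
    return round(100.0 * (1.0 - weighted / worst_case), 1)

print(safety_score(city_rates, weights))  # → 74.5
```

A real system would also normalize for population, reporting gaps between jurisdictions, and year-over-year trends; this sketch only shows the weighting step the article describes.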

BeWize then publishes this data in a visual way, which Knickerbocker claims is an “easy-to-understand, intuitive, safety map.”

The goal is for users to access this map to see the “relative safety” of any given city within the country.

BeWize’s Effort to Avoid Bias

Knickerbocker told inter-TECH-ion that BeWize works to minimize racial biases by “first having a very diverse team who, themselves, know first-hand how it can feel to be discriminated against.”

Also, the quantitative crime data the company “scrapes” is reported to the FBI every year; this data is “factual,” he said, and does not include things like race, sex, gender identity or expression, age, or arrest/court records.

“It’s mathematical crime statistics of various types of crimes that cities must report to the FBI every year,” he explained, adding that FBI UCR (Uniform Crime Reporting) statistics are the only crime stats reported uniformly at the national level.

Another way BeWize works to avoid racial discrimination is that the app does not allow any sort of back-and-forth dialogue.

Its data scientist looked at various criteria that make people feel safe, and BeWize drew on those basic needs to create the five categories it asks for public input on, such as “lighting and/or visibility to others” and “upkeep.”

He envisions the app being used by individuals and families, as well as groups and companies, to plan activities, increase situational awareness, increase overall safety and “infuse resources into the areas that need them most.”

In terms of “infusing resources,” that refers to BeWize’s plans to let cities know when a large number of users have rated that city as “dangerous.” It would then be up to that city’s council or governmental representatives to do something about it.

“When someone doesn’t feel safe and they report that to their respective representative or city council, oftentimes those concerns are not catalogued or even dealt with, because city government is busy and usually understaffed to deal with single complaints, especially in large cities with high crime rates such as Los Angeles or Detroit,” Knickerbocker said. “However, with BeWize, users will be able to drop a pin and rate a location. If we get one or two pins, it may just be an anomaly, but if we get 200 or 1,000 pins saying an area is dangerous, cities can be made aware of the safety concerns. Now, reasonable people can argue how best to deal with high crime areas — whether that is more police; more community programs; federal grants for housing or jobs; better lighting; graffiti abatement; or whatever they choose.”
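The pin-aggregation logic he describes (one or two pins may be an anomaly; 200 or 1,000 pins signal a real concern) can be sketched in a few lines. The threshold value and data shape below are assumptions for illustration, not BeWize’s implementation:

```python
# Illustrative sketch of pin aggregation: a handful of "dangerous" pins is
# treated as noise, but a large cluster flags the area for the city.
from collections import Counter

def areas_to_flag(pins, threshold=200):
    """pins: list of (area_id, rated_dangerous) tuples from users.
    Returns a dict of areas whose 'dangerous' pin count meets the
    threshold, mapped to that count."""
    counts = Counter(area for area, dangerous in pins if dangerous)
    return {area: n for area, n in counts.items() if n >= threshold}

# Example: 250 pins in one lot clear the bar; 2 elsewhere do not.
pins = [("pier_lot", True)] * 250 + [("main_st", True)] * 2
print(areas_to_flag(pins))  # → {'pier_lot': 250}
```

The interesting policy question, which the article raises, is what a city does once flagged; the aggregation itself is straightforward counting.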

The purpose of BeWize, he said, is to enable users to notice “dangerous areas” and then for the company to let those city governments know, so city officials can make an effort to enhance safety.

“The other choice is to piecemeal together public complaints and deal with them one-by-one, which is not effective,” he opined.

How does the app work?

Anyone can rate any city they visit. All they have to do is be on the app, enable their location to be identified, verify themselves, and confirm they’re human, not robots — like most apps and websites require.

Whether the rater/user is a law enforcement officer or civilian, they are only able to rate within these five categories:

  • Overall, I feel safe in this area
  • Area is well-kept and clean
  • Good lighting and/or visibility to others
  • No suspicious activity
  • No loitering or vagrancy

The user then picks a score of 1-5 for each question, and cannot make any comments.
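Based on the article’s description of the form (five fixed statements, each scored 1 to 5, with no free-text field), a minimal validation sketch might look like the following. The function name and the simple per-location mean are illustrative assumptions:

```python
# Minimal sketch of a fixed-category rating form: five set statements,
# each scored 1-5, no comments — per the article's description.
CATEGORIES = [
    "Overall, I feel safe in this area",
    "Area is well-kept and clean",
    "Good lighting and/or visibility to others",
    "No suspicious activity",
    "No loitering or vagrancy",
]

def validate_rating(scores):
    """scores: dict mapping each category to an integer 1-5.
    Raises ValueError on missing categories or out-of-range values;
    returns a simple mean score for the location."""
    if set(scores) != set(CATEGORIES):
        raise ValueError("must score exactly the five fixed categories")
    for cat, s in scores.items():
        if not (isinstance(s, int) and 1 <= s <= 5):
            raise ValueError(f"score for {cat!r} must be an integer 1-5")
    return sum(scores.values()) / len(scores)
```

Constraining input to closed-ended numeric scores is itself the bias-mitigation choice the article describes: there is no free text at this stage for discriminatory language to enter.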

One could reasonably deduce that the statement, “I feel safe in this area,” is highly subjective, depending on factors such as the color of one’s skin and the kind of neighborhood one grew up in.

Knickerbocker said the five statements/categories were selected in consultation with data scientists and psychologists.

Then the app asks the user if they have been a victim of a crime at their specific location.

Knickerbocker said he does not think that is a presumptive question.

“In the city I work in (Manhattan Beach), we have an area where surfers’ cars are frequently broken into,” he said. “(BeWize) users could indicate their car was broken into, so others are reminded not to leave valuables in their car. And…the city can address the issue, whether that means better lighting, a parking attendant, more police, signs, etc.”

If a user clicks “Yes,” meaning that they have been the victim of a crime at that location, a type-of-crime menu pops up and they can then click on the specific crime. There are five crimes to choose from, each with a brief definition. The crimes are:

  • Theft
  • Robbery
  • Assault
  • Vandalism
  • Sex Crime

The app then asks for user comments, but the comments are hidden, for now, from other users. The main point of collecting comments is so BeWize can vet the raters.

Knickerbocker said if he or his team see any undertones of racism in the comments, that user’s account will get terminated.

BeWize has hand-picked every one of the 30 beta testers to date. From these testers, the company has collected more than 1,500 ratings of specific cities and neighborhoods.

Knickerbocker acknowledged that “racism does exist and we will encounter issues,” but said the company will have “a zero tolerance policy when we identify someone using inappropriate language.”

And, before its official launch this spring, the BeWize team “will make sure to have strong content moderation in place” in the section where users can view other users’ comments, he said.

How BeWize Intends to Help Brick-and-Mortar Establishments

The hope of the BeWize team is that the app will encourage users to patronize businesses in areas they’re not familiar with and may have otherwise kept off their itinerary.

He used the example of an OC resident driving up to Inglewood to have dinner near the new football stadium.

“First, they are not familiar with the area, and if they go off news reports and some social media, they know Inglewood does have some high crime areas,” Knickerbocker said. “If users can rate the safer areas of Inglewood, and show others some great and safe places to go, then they can eat at a nice restaurant, and give the business more customers. This would be a win for everyone.”

In terms of feedback from the law enforcement community, Knickerbocker said it’s been “good feedback” and “long overdue.”

Selling Its Data to Businesses

Knickerbocker claims the app does not currently provide any data to government.

But it will sell data to city governments.

BeWize Team and Financing

In addition to Knickerbocker, the team includes Lesly Combs, as co-founder and COO. There’s also an army veteran as CTO, a Marine Corps veteran as lead designer and an attorney as chief legal officer, who’s married to someone in law enforcement. A current police officer is the community outreach director.

Knickerbocker and Combs bootstrapped the business with $36,000 of their own money. Then, they raised a friends-and-family round of $125,000.

The company is not currently seeking funding because it’s in the application process for an SBIR (Small Business Innovation Research) grant from the National Science Foundation. It has passed the first screening.

“Whether it receives the grant or not, the reviewers have seen enough credibility in our team, as well as the potential positive impacts that our way of displaying safety data could improve the world,” Knickerbocker said.

BeWize will find out by this summer if it has been selected for the grant; if it isn’t, the team will resume conversations with investors who have shown interest, he added.

 

About The Author

Deirdre Newman is a long-time journalist, who's covered OC startups for a few years.
