Machine learning technology and artificial intelligence (AI) applications in the leasing process have brought a wealth of benefits to property managers and renters alike. Simplifying this process—from the initial apartment search to a signed lease—frees managers to take on other critical duties and offers residents more immediate application results. Thanks largely to the COVID-19 pandemic, such digital rental solutions are growing considerably in popularity.
According to a joint investigation by The Markup and The New York Times, more than 90% of property owners or managers rely on digital screening reports of potential residents when leasing units.
But, as recent headlines have pointed out, with such a simplified process also comes good reason for caution. Some of the algorithms used to generate these applicant reports can unintentionally introduce bias into the process—something every DEI-minded property manager is trying to eradicate.
The simpler the algorithm, the greater the possibility of unintended bias. For example, software relying primarily on credit scoring can disproportionately discriminate against low-income minorities, immigrants, younger people without a strong credit history, or others who choose not to use the credit system, says Sipho Simela, founder and CEO of Matrix Rental Solutions. And without more individual assessments or communication from the property manager or owner, these applicants don’t get a chance to provide a more complete picture or flag erroneous data.
In addition, AI biases raise many fair housing questions. Property managers must be aware that they could be held liable in the event of any accusations of discrimination, as they are ultimately the final decision-makers—not the software.
The pursuit of fairness
Fortunately, some software companies in the market are working to ensure that machine learning is more fair and balanced. Matrix Rental Solutions, founded in 2021, aims to be the “first universal rental application built to help both renters and landlords.”
Simela says the combined effects of the COVID-19 pandemic, the CARES Act of 2020, and the growing popularity of leasing software were the impetus for the creation of Matrix.
He says that Matrix’s multi-factor model gives a broader and clearer financial profile of would-be residents. Connecting to 16,000 banks and 130,000 employers across the U.S., Matrix can pull real-time data regarding a prospective resident’s cash flow, income, historical tax returns, and other payroll information.
“We then overlay several macroeconomic conditions in addition to some of the traditional inputs like credit score,” Simela says. “From an inclusion perspective, we’ve found that many people who don’t have the credit scoring alone to qualify are more fairly included in the application process because many of them do, in fact, have sufficient ability to pay the monthly rent. That’s really what we want to drive in this market.”
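To make the idea concrete: a multi-factor approach like the one Simela describes weighs ability-to-pay signals—income, cash flow—alongside credit score rather than relying on credit alone. The sketch below is a hypothetical illustration of that blending, not Matrix’s actual model; the factor names, formulas, and weights are all assumptions.

```python
# Hypothetical multi-factor applicant score. The factors and weights are
# illustrative assumptions, not Matrix Rental Solutions' actual model.
def affordability_score(monthly_income, monthly_rent, avg_bank_balance,
                        credit_score=None):
    """Blend ability-to-pay signals with an optional credit score."""
    # Rent-to-income ratio: the core ability-to-pay signal.
    rent_burden = monthly_rent / monthly_income if monthly_income else 1.0
    income_factor = max(0.0, 1.0 - rent_burden)  # 1.0 = rent is a trivial share

    # Cash-flow cushion: months of rent covered by the average balance.
    cushion = min(avg_bank_balance / monthly_rent, 6.0) / 6.0  # cap at 6 months

    # Credit is one input among several -- applicants without a score
    # are not automatically excluded (they get a neutral factor).
    credit_factor = (credit_score - 300) / 550 if credit_score else 0.5

    # Weighted blend; the weights are illustrative assumptions.
    return round(0.45 * income_factor + 0.30 * cushion + 0.25 * credit_factor, 3)

# An applicant with solid income and savings but no credit history
# can still score competitively.
print(affordability_score(monthly_income=5400, monthly_rent=1800,
                          avg_bank_balance=7200))  # 0.625
```

The point of the sketch is the inclusion argument in Simela’s quote: because credit is only a quarter of the weight, an applicant with no credit history but a clear ability to pay is not screened out by that single input.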
This model also strives to be inclusive with its mobile-responsive technology, which makes the process easier for people without access to a printer. It also allows applicants to upload necessary documents in alternative ways.
“Not every employer has a payroll system, and some people don’t feel comfortable providing that information, so we utilize computer-vision models and extract data from documents, such as a PDF of a tax document or a picture of a pay stub. Our models are well-trained to extract the right data and plug it into our scoring model,” Simela says.
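In pipelines like the one Simela describes, the computer-vision step typically produces raw text from the uploaded image or PDF, and a parsing step then maps fields into the scoring model. Below is a minimal stand-in for that second step only, assuming OCR has already run; the field names and patterns are assumptions for illustration.

```python
import re

# Once OCR/computer vision has turned a pay-stub image into raw text,
# a parsing step extracts the fields the scoring model needs. This is a
# simplified stand-in; field names and patterns are assumptions.
def parse_pay_stub(ocr_text):
    """Extract gross pay and pay period from OCR'd pay-stub text."""
    fields = {}
    pay = re.search(r"Gross Pay:?\s*\$?([\d,]+\.\d{2})", ocr_text)
    if pay:
        # Strip thousands separators before converting to a number.
        fields["gross_pay"] = float(pay.group(1).replace(",", ""))
    period = re.search(r"Pay Period:?\s*([\w/\- ]+)", ocr_text)
    if period:
        fields["pay_period"] = period.group(1).strip()
    return fields

sample = "Employee: J. Doe\nPay Period: 03/01 - 03/15\nGross Pay: $2,450.00"
print(parse_pay_stub(sample))
# {'gross_pay': 2450.0, 'pay_period': '03/01 - 03/15'}
```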
These models also aim to capture a complete employment picture for workers in the growing gig economy, and they even have built-in algorithms to flag cases of identity theft.
Matrix was accepted to the 2022 REACH program, sponsored by Second Century Ventures, the strategic investment arm of the National Association of REALTORS®. The REACH programs represent “the top new companies in property technology,” offering curricula on education, mentorship, networking opportunities, and exposure to the global real estate marketplace.
Property managers should be very conscientious when choosing the AI solution they plan to implement for their buildings.
Alexandra Goldthwaite, CPM®, MPM, a regional vice president in Sacramento with HomeRiver Group, says the company assembled a task force to carefully evaluate the various software applications available. Ultimately, the team chose an option that uses multiple algorithms to assess an applicant’s income, assets, and previous rental payment track record.
“During the rental process, you don’t see the people, and you might not even talk to them,” says Goldthwaite, an IREM Regional Vice President. “The AI just considers, ‘Do you meet the criteria or not?’ The applications are also time-stamped, so it’s first-come, first-served.”
To ensure that the AI doesn’t introduce bias, her company has completed side-by-side comparisons of applications to see how the AI results differ from the more traditional paper-and-people process.
Along with confirming that AI was much quicker, Goldthwaite’s team found more bias in the human process. “There’s more room for the grey area in the human process,” she says. “With AI, the algorithms determine the renter’s risk, and it stays true to the process.”
Simela says that the best way for property managers to create fairness in decision-making is to be transparent and consistent—and find a platform that aligns with those values. “When we formulated our model score, we aimed for that consistency, and what we give on the back end is the transparency into those inputs. Those two things combined—transparency and consistency—give property managers a leg up in the critical discussion surrounding fair housing and DEI in real estate.”
Goldthwaite says that having clear renter criteria, such as requiring applicants to make three times the monthly rent, can also help show that bias isn’t the cause of an applicant’s rejection.
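The criteria Goldthwaite describes—a published income threshold applied identically to everyone, with time-stamped, first-come-first-served ordering—can be sketched in a few lines. The 3x-rent rule and time-stamping come from the article; the data layout is an assumption for illustration.

```python
from datetime import datetime

# Illustrative sketch of consistent, time-stamped screening. The 3x-rent
# rule comes from the article; the record structure is an assumption.
def meets_criteria(monthly_income, monthly_rent):
    """Apply the same published rule to every applicant."""
    return monthly_income >= 3 * monthly_rent

def order_applicants(applications):
    """First-come, first-served: qualified applicants sorted by time stamp."""
    qualified = [a for a in applications
                 if meets_criteria(a["monthly_income"], a["monthly_rent"])]
    return sorted(qualified, key=lambda a: a["submitted_at"])

applications = [
    {"id": "A-102", "monthly_income": 6100, "monthly_rent": 1900,
     "submitted_at": datetime(2023, 3, 1, 14, 5)},
    {"id": "A-101", "monthly_income": 6000, "monthly_rent": 1900,
     "submitted_at": datetime(2023, 3, 1, 9, 30)},
    {"id": "A-103", "monthly_income": 4500, "monthly_rent": 1900,
     "submitted_at": datetime(2023, 3, 1, 8, 0)},  # below 3x rent
]
for app in order_applicants(applications):
    print(app["id"])  # A-101, then A-102; A-103 does not qualify
```

Because the rule is explicit and the ordering is mechanical, the same inputs always produce the same outcome—the transparency and consistency Simela points to above.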
With the human process, some degree of bias is inevitable; what matters is how that bias is handled, Goldthwaite says. Her company has yet to encounter bias with its AI process, but it remains on the lookout. “You also have to develop a standardized process and consistently check in with that AI process,” Goldthwaite says. “As a property manager, I want to see that renters can integrate easily with the software and that the AI abides by the Red Flags Rule with regard to identity theft. And when it comes to DEI, you want to ensure above all else that there is no bias.”