Tenant Application Screening

Landlords and property managers are increasingly using AI tools to help identify suitable tenants. Application screening tools can analyse a variety of data, such as a tenant’s credit score, previous rental history and criminal background. These signals are used to flag potential risks to a prospective tenant’s reliability and behaviour, helping to reduce problems like late payments or property damage.

Some of the main ways AI is being used include:

  • Verifying tenant information by cross-checking with online databases (e.g., employment, income and rental history), as sketched after this list.
  • Analysing tenant applications to assess reliability.
  • Performing background checks by screening public records and feedback.
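
As a purely illustrative sketch, the verification step in the first bullet might look something like the Python below. The record source, field names and the 10% income tolerance are all assumptions made for this example; real screening products rely on their own proprietary checks and data sources.

    from dataclasses import dataclass

    @dataclass
    class Application:
        declared_employer: str
        declared_monthly_income: float

    def verify_against_records(app: Application, records: dict) -> list[str]:
        """Cross-check declared details against an external record (hypothetical schema)."""
        discrepancies = []
        if app.declared_employer.lower() != records.get("employer", "").lower():
            discrepancies.append("employer does not match records")
        recorded_income = records.get("monthly_income", 0.0)
        # Allow a small tolerance for rounding or recent pay changes (assumed 10% threshold).
        if recorded_income and abs(app.declared_monthly_income - recorded_income) / recorded_income > 0.10:
            discrepancies.append("declared income differs from records by more than 10%")
        return discrepancies

    app = Application("Acme Ltd", 2600.0)
    print(verify_against_records(app, {"employer": "Acme Ltd", "monthly_income": 2500.0}))
    # -> [] (the declared income is within the assumed 10% tolerance)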

The output from these tools isn’t usually a simple yes or no. Depending on how the system is set up, most will provide a risk score or a traffic-light-style indicator, or will highlight specific areas of concern. For example, someone might be flagged as “medium risk” due to a gap in their rental history, or “low risk” with no issues at all. It is then up to the property manager or landlord to interpret that result and decide what to do next.
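
To make the traffic-light idea concrete, the following is a minimal sketch of how such an indicator could be derived from a handful of flags. The input signals, weights and thresholds are invented purely for illustration; real screening products combine far more signals, usually in proprietary ways.

    def risk_indicator(missed_payments: int, rental_gap_months: int,
                       adverse_records: int) -> tuple[str, list[str]]:
        """Return a traffic-light label plus the specific concerns that drove it (illustrative only)."""
        concerns = []
        score = 0
        if missed_payments > 0:
            score += 2 * missed_payments
            concerns.append(f"{missed_payments} missed payment(s) on file")
        if rental_gap_months > 6:
            score += 1
            concerns.append(f"gap of {rental_gap_months} months in rental history")
        if adverse_records > 0:
            score += 3
            concerns.append("adverse public records found")
        label = "low risk" if score == 0 else "medium risk" if score <= 3 else "high risk"
        return label, concerns

    # Mirrors the example in the text: a rental-history gap alone yields "medium risk".
    print(risk_indicator(0, 9, 0))  # ('medium risk', ['gap of 9 months in rental history'])
    print(risk_indicator(0, 0, 0))  # ('low risk', [])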

While there are clear benefits to using screening tools, there are potential risks around unfair and biased treatment. The relationship between a landlord or property manager and a tenant is personal and built on trust. The use of AI in this context raises concerns, as these systems operate on pre-programmed algorithms and often lack the context or sensitivity needed to understand individual circumstances, which can lead to unfair outcomes. This use of AI also raises issues around data protection and inferred data, which need to be handled carefully.

As many AI models are proprietary, it is often unclear how decisions are made or what data has been used, making it difficult to trust the outcomes. Additionally, these models are often trained on historical data, which can reflect and even amplify existing inequalities in housing, income and credit, disproportionately affecting groups defined by characteristics such as age and ethnicity. For example, applicants with a rental history in lower-income postcodes might be scored lower despite their financial stability. Relying solely on these outputs could result in unfair treatment, posing both ethical and legal risks.
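
The postcode example can be made concrete with a toy model. In the sketch below, a linear score has “learned” a negative weight for lower-income postcodes from biased historical data; every figure is fabricated to show the mechanism, not taken from any real system.

    # Toy linear score: weights "learned" from biased historical data (fabricated numbers).
    WEIGHTS = {"income_to_rent_ratio": 2.0, "lower_income_postcode": -1.5}

    def toy_score(income_to_rent_ratio: float, lower_income_postcode: bool) -> float:
        return (WEIGHTS["income_to_rent_ratio"] * income_to_rent_ratio
                + WEIGHTS["lower_income_postcode"] * float(lower_income_postcode))

    # Two applicants with identical financial stability; only the postcode differs.
    print(toy_score(1.5, False))  # 3.0
    print(toy_score(1.5, True))   # 1.5, penalised purely for location

Because postcode often correlates with characteristics such as ethnicity and age, a weight like this lets a model discriminate indirectly even though no protected attribute is an explicit input.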

When using AI to screen tenants, it is still essential to apply human, professional judgement. While AI can help identify potential risks, landlords and property managers must perform their roles professionally, using their experience and judgement to make fair, well-informed decisions that consider individual circumstances and comply with legal requirements.

Principal author: Aaliyah Pollock