Can rent tech be made ‘fair by design’?

In late 2022, Guardian Australia reported that Snug, a rental application platform, appeared to be awarding higher ‘match scores’ to applicants who bid above the advertised price. According to the report, offer to pay the asking price of $490 for a two-bedroom in Brunswick and your score might hover around 74. Offer just $10 more, and your score would rise to 77. The point of the match scores, according to the Snug website, is to help ‘a renter and owner find the best fit for a rental property’. In practice, the score seemed to be encouraging applicants to bid higher rents.

Today, a few months after the report, a higher bid no longer appears to affect your Snug ‘match score’. Instead, if you enter a rental price above the advertised amount, a pop-up box appears. The feature doesn’t prevent the applicant from entering a high bid. Rather, the applicant must declare that the agent has not solicited the bid. This adds friction, which might deter some applicants. But the most direct effect of the feature is protection for the platform and agent: a shield against accusations of conduct which is illegal in some Australian States and Territories.

Figure 1. A pop-up window on the Snug platform requiring a declaration that an agent has not solicited rent bidding.

It’s widely agreed that rent bidding is pernicious, driving up prices in times of scarcity. Some Australian legislatures consider it harmful enough to warrant legal deterrence, prohibiting owners and agents from inviting higher bids (but not the making or acceptance of higher bids). However, the law is silent on the responsibilities of digital ‘rent tech’ intermediaries like Snug.

In these circumstances, what should Snug, or competitors Ignite and 2Apply, be required to do?

As gateways to large sections of the private rental market, these platforms are uniquely positioned to promote norms and enforce standards in rental allocation. At a time of low vacancy rates, slowing supply, and rising housing stress, power imbalances between renters and property owners are acute. Through choices about design, features and business models, platforms can reinforce or exploit these inequities. Or, they could try to work against them.

Fairness-by-Design

Fairness-by-design is about building safeguards into the architecture of a system to prevent unfair processes and outcomes, rather than simply rectifying them after the event.

The idea of fairness-by-design in machine learning has received extensive treatment by academic researchers searching for methods to counter algorithmic bias. A similar concept has caught on in regulatory discussions. Concerned about the effects of online choice architecture on consumer behaviour and competition, the UK Competition & Markets Authority (CMA) proposed a ‘fairness-by-design duty’ for online platforms with ‘strategic market status’.

Broadly, the fair-by-design paradigm reflects two salient themes in current ethics and law reform discussions about digital technologies and AI. First, a focus on ex ante, systemic intervention (in contrast to individual rights and ex post remedy). Second, a shift in emphasis from mere transparency to substantive fairness.

Constructing a definition of fairness and transposing it into concrete contexts is an ambitious project (which has animated innumerable thinkers across diverse disciplines). The rent tech industry, which mediates the distribution of an essential and scarce yet privately held resource, seems like a prime candidate for fairness-by-design. But what would this entail in practice? What would a fair-by-design rental application platform look like?

1. Fairness is multi-dimensional.

Let’s again consider the rent bidding example.

On 2Apply and Ignite, two rental application services, entering a higher rent doesn’t prompt immediate feedback to the renter. The design is ostensibly neutral, with no apparent reward or penalty signalled to renters. We might say the design is fair to the individual user because it neither pressures nor incentivises them to act against their own interests. Yet, by allowing users to enter higher bids without friction, the platform also signals that the practice is permissible, in contrast to other actions which are restricted on the platform (such as entering a price of $0 or lodging an application if under a certain age). The design may not incentivise renters to engage in rent bidding, but it does allow it to continue. A design which encourages fairness in rental allocation practices would arguably prevent or discourage rent bidding.
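To show how small these design choices can be at the level of code, here is a minimal, hypothetical sketch of a bid check a platform could run when a renter enters an offered rent. Every name, rule and message below is an assumption made for illustration; it does not describe Snug’s, 2Apply’s or Ignite’s actual systems.

```python
# Illustrative sketch only: a hypothetical check a rental application platform
# could run when a renter enters an offered rent. Names, rules and messages are
# invented for illustration and do not describe any real platform's code.

from dataclasses import dataclass


@dataclass
class BidCheckResult:
    accepted: bool
    requires_declaration: bool  # e.g. "the agent did not solicit this bid"
    message: str


def check_offered_rent(advertised_rent: float, offered_rent: float,
                       block_above_asking: bool = False) -> BidCheckResult:
    """Validate an offered rent against the advertised price.

    block_above_asking=False mirrors the 'friction only' design described above:
    the bid is allowed, but the renter must declare it was unsolicited. Setting
    it to True models a stricter design that refuses above-asking bids outright.
    """
    if offered_rent <= 0:
        return BidCheckResult(False, False, "Offered rent must be greater than $0.")
    if offered_rent <= advertised_rent:
        return BidCheckResult(True, False, "Offer recorded at or below the advertised rent.")
    if block_above_asking:
        return BidCheckResult(False, False,
                              "This platform does not accept offers above the advertised rent.")
    return BidCheckResult(True, True,
                          "Please confirm the agent or owner did not invite this higher offer.")


# Example: the Brunswick listing from the Guardian report.
print(check_offered_rent(advertised_rent=490, offered_rent=500))
```

In this toy framing, the gap between ‘friction only’ (the declaration pop-up) and a design that actively discourages rent bidding is a single configuration choice; the substantive question is which choice a platform makes, and in whose interests.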

This example highlights two possible dimensions of fairness-by-design: 1) platforms treating individual users fairly; and 2) platforms taking steps to ensure fairness in the relationships they mediate. Two quite different concepts.

The first resembles existing notions of ‘unfairness’ in consumer and privacy law: obligations not to mislead or deceive consumers, or to act outside a privacy subject’s reasonable expectations, adapted and extended to take in insights about the manipulative effects of online choice architecture. Such a fairness principle would probably involve refraining from encouraging renters to pay for deep background checks to prove they’re a ‘good tenant’ and ‘reliable and trustworthy’. It might also constrain the use of dark patterns to suggest such features are compulsory, and of ‘sludge’ when a renter tries to delete a profile.

Figure 2. A screen advertising a paid background check to tenants on the 2Apply platform.

The first principle would require a shift in certain platform logics, away from exploiting vulnerability to extract data and redistribute processing costs to renters. It’s the scope of the second, however, which is likely to be more contentious.

2. Fairness is contested and context-sensitive.

Designing to counter systemic unfairness, and to promote fairness, in private rental allocation is a complex undertaking. Private rental allocation in Australia is riddled with problems: discrimination on the basis of race, gender, age, class, disability, family composition and criminal record; low-income earners competing against higher-income earners for limited low-price rental stock; and tenants being pressured not to assert their rights for fear of jeopardising their tenure and competitiveness in the ‘market’.

Some simple design approaches could limit opportunities for bias and discrimination. For example, platforms could restrict the ability of agents to seek unnecessary and discriminatory information from renters through form customisation, and prompt referees to provide factual and fair responses, as well as evidence to accompany negative appraisals.

Figure 3. Customised questions on a 2Apply form asking renters to disclose any convictions (including spent convictions).

Figure 4. An option provided to referees who receive a reference request from the Snug platform.
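To make the first of these approaches (restricting form customisation) concrete, here is a minimal, hypothetical sketch of how a platform might screen agent-drafted questions against a blocklist of topics that are irrelevant or likely to facilitate discrimination. The topics and keywords are illustrative assumptions, not any platform’s actual policy.

```python
# Illustrative sketch only: screening agent-customised application questions.
# The blocked topics and keyword lists are invented for illustration and are
# not any platform's actual policy.

BLOCKED_TOPICS = {
    "criminal_record": ["conviction", "criminal record", "criminal history"],
    "protected_attributes": ["religion", "nationality", "marital status", "pregnan"],
}


def review_custom_question(question_text: str) -> tuple[bool, list[str]]:
    """Return (allowed, matched_topics) for an agent-drafted question."""
    text = question_text.lower()
    matched = [topic for topic, keywords in BLOCKED_TOPICS.items()
               if any(keyword in text for keyword in keywords)]
    return (len(matched) == 0, matched)


# Example: the question shown in Figure 3 would be flagged for review rather
# than passed straight through to renters.
allowed, topics = review_custom_question(
    "Do you have any convictions, including spent convictions?")
print(allowed, topics)  # False ['criminal_record']
```

A real implementation would need far more care (context, human review, appeal paths), but even this crude filter shows that a question like the one in Figure 3 can be caught by design rather than left to agent discretion.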

These approaches nudge fair behaviour by limiting access to sensitive information and deterring bad behaviour through visibility. However, simply excluding information about protected characteristics doesn’t guarantee fair outcomes. First, basic details like a surname can act as proxies for protected traits. Second, discrimination is often hidden in the discretion of agents and owners.

In response to the Guardian report, Snug argued that its ‘innovations are helping to reduce discrimination in the rental market’ and it ‘believes an objective, data-driven approach can help eliminate discrimination and subconscious bias in the rental sector.’ The company claims that the objective of its ‘match score’ is to ‘enable applicants on lower incomes with complete profiles [to] top the selection list’. Whereas price is often a determining factor, Snug purports to encourage agents to consider other factors.

A scoring system which is opaque but seems to reward higher rent bids obviously fails to achieve that objective. So too does a system which rewards renters who pay the platform for advanced background checks. Generally speaking, decision-support mechanisms which reiterate historically biased notions of ‘risk’, reframed to suit the platform’s commercial logics, will fall short.

It is still worth asking whether it is possible to design decision-support features which increase transparency and fairness. A key hurdle will be settling on the principles which should guide the fair allocation of private rental housing. Unlawful discrimination should certainly not affect decisions; let’s also assume that the willingness and ability to outbid other renters on price should not either. What variables should be weighted favourably?

Dominant market-based approaches treat private property rights as paramount, viewing rental allocation through the lens of ‘risk assessment’. Renters are ‘generators of risk, reward and value’ for investors, who are entitled to conduct thorough due diligence and audits on renters, with no more explanation than the need to guard against ‘investment risk’. Within this framework, a fairness principle might intervene minimally to counter unlawful discrimination, for example, the use of race or religion as proxies for ‘risk’. A more just approach to distribution, on the other hand, might emphasise the overall economic and social well-being of renters, taking into account factors such as accessibility, affordability and the availability of alternatives. It might also reflect fairness measures, metrics and values raised by renters and other community stakeholders, not only the interests of the property owner and agent in ‘managing their risk’.
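To illustrate the difference in concrete terms, here is a heavily simplified, hypothetical sketch of a renter-centred, explainable suitability indicator. Every factor, weight and threshold is an assumption made for illustration; it does not reflect Snug’s match score (whose workings are not public) or any other platform’s method. The point is structural: the inputs exclude offered rent, paid background checks and protected attributes, and the output is an itemised breakdown rather than a bare number.

```python
# Illustrative sketch only: a transparent, explainable suitability indicator.
# All factors, weights and thresholds are hypothetical; they do not reflect
# Snug's match score or any other platform's method. Notably absent as inputs:
# offered rent, paid background checks and protected attributes.

def affordability_score(weekly_rent: float, weekly_household_income: float) -> float:
    """Score 1.0 when rent is at or below 30% of income, tapering to 0.0 at 50%."""
    ratio = weekly_rent / weekly_household_income
    if ratio <= 0.30:
        return 1.0
    if ratio >= 0.50:
        return 0.0
    return (0.50 - ratio) / 0.20


def suitability_report(weekly_rent: float, weekly_household_income: float,
                       profile_complete: bool, household_size: int,
                       bedrooms: int) -> dict:
    """Return an itemised, human-readable breakdown rather than one opaque score."""
    factors = {
        "affordability (rent vs income)": round(
            affordability_score(weekly_rent, weekly_household_income), 2),
        "profile completeness": 1.0 if profile_complete else 0.0,
        "household size fits property": 1.0 if household_size <= 2 * bedrooms else 0.5,
    }
    factors["overall (equal weights)"] = round(
        sum(factors.values()) / len(factors), 2)
    return factors


# Example: a two-person household applying for the $490 two-bedroom listing.
print(suitability_report(weekly_rent=490, weekly_household_income=1400,
                         profile_complete=True, household_size=2, bedrooms=2))
```

Even a sketch this crude makes the contested value judgements visible: someone has to decide which factors count, how they are weighted, and whether the renter can see and contest the result.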

There are reasons to be sceptical and cautious about adopting algorithmic decision-support in rental allocation. A mounting body of research warns of predictive tools replicating and masking historical bias, substituting for human judgement, and creating perverse incentives to game systems. Such tools must only be introduced where they make rental allocation more, not less, explainable and accountable.

At minimum, we should expect and require platforms not to worsen or take advantage of renter vulnerability. Beyond this, it’s worth exploring the possibilities for embedding fairness in rental allocation by platform design. 

Table 1. Proposed dimensions of a fairness-by-design principle for rental application platforms
