Introduction: The U.S. Department of Housing and Urban Development’s Office of Fair Housing and Equal Opportunity has released crucial guidance on how the Fair Housing Act applies to digital advertising in the housing and real estate sectors. With the increasing use of automated systems and artificial intelligence (AI) in ad targeting and delivery, it is essential to understand the potential risks of discriminatory practices and take proactive measures to ensure fairness.
Understanding the Risks: Advertisers and platforms need to understand how modern targeting technologies let them reach specific audiences while excluding others, which can lead to discrimination in housing-related ads. The Fair Housing Act prohibits discrimination based on protected characteristics such as race, color, religion, sex, national origin, familial status, or disability. Understanding these risks makes it possible to ensure fairness proactively and avoid legal exposure.
Discriminatory ad targeting can manifest in various ways, including denying information about housing opportunities, targeting vulnerable consumers for predatory products, discouraging potential consumers, and steering home-seekers to specific neighborhoods.
Audience Targeting Tools: Ad platforms offer audience categorization tools that segment potential audiences by characteristics such as age, income, location, and gender. While useful, these tools can facilitate discrimination when they exclude or single out groups defined by protected characteristics. Advertisers and platforms should be cautious when using these tools for housing-related ads and avoid segmenting audiences on protected characteristics or close proxies for them.
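As a minimal sketch of what such caution could look like in practice (not something prescribed by the HUD guidance), a platform might screen a housing campaign's targeting criteria against a blocklist of protected characteristics and common proxies. The attribute names below are hypothetical, not any real platform's API:

```python
# Hypothetical sketch: screen targeting criteria for a housing-related ad
# campaign against protected characteristics and common proxy attributes.
# All attribute names here are illustrative assumptions.

PROTECTED_ATTRIBUTES = {"race", "color", "religion", "sex", "national_origin",
                        "familial_status", "disability"}
# Attributes that can correlate strongly with protected classes (illustrative).
PROXY_ATTRIBUTES = {"zip_code", "language", "parental_status"}

def screen_targeting(criteria: dict) -> list[str]:
    """Return the targeting keys that should be disallowed for housing ads."""
    return [key for key in criteria
            if key in PROTECTED_ATTRIBUTES or key in PROXY_ATTRIBUTES]

campaign = {"age_range": (25, 55), "zip_code": ["10001"], "interests": ["real estate"]}
print(screen_targeting(campaign))  # ['zip_code']
```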
Custom and Mirror Audience Tools: Custom and mirror (lookalike) audience tools let advertisers target a specific existing audience, or find similar audiences, based on its data. If the source audience is skewed along protected characteristics, however, these tools can reproduce and amplify that skew. Advertisers and platforms should analyze the composition of source lists and ensure they are not unjustifiably limited by protected characteristics. Regular audits of ad delivery outcomes can help identify and correct discriminatory patterns.
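One simple way such an audit could work, offered here as a sketch under assumed inputs rather than a method from the guidance, is to compare the demographic composition of a source list against a reference population and flag groups that are substantially underrepresented. The group labels and the 20% tolerance are illustrative assumptions:

```python
# Hypothetical audit sketch: compare the demographic composition of a custom
# audience source list against a reference population and flag large gaps.
from collections import Counter

def composition(groups: list[str]) -> dict[str, float]:
    counts = Counter(groups)
    total = sum(counts.values())
    return {g: c / total for g, c in counts.items()}

def audit_source_list(source_groups, reference_groups, tolerance=0.20):
    src = composition(source_groups)
    ref = composition(reference_groups)
    flags = {}
    for group, ref_share in ref.items():
        src_share = src.get(group, 0.0)
        # Flag groups whose share of the source list falls well below
        # their share of the reference population.
        if ref_share > 0 and src_share < ref_share * (1 - tolerance):
            flags[group] = (src_share, ref_share)
    return flags

skew = audit_source_list(
    source_groups=["a"] * 90 + ["b"] * 10,
    reference_groups=["a"] * 60 + ["b"] * 40,
)
print(skew)  # {'b': (0.1, 0.4)}
```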
Algorithmic Delivery Functions: Ad platforms use machine learning and AI to decide which consumers see which ads, and these delivery algorithms can themselves produce discriminatory outcomes. Platforms should ensure their algorithms do not route housing-related ads on the basis of protected characteristics, which can amount to steering, pricing discrimination, or other prohibited practices. Regular testing and adjustment are needed to minimize disparities and ensure fairness.
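A basic delivery-outcome test, sketched below under assumed data, might compare the rate at which a housing ad was actually shown to eligible users in each group. The 0.8 ("four-fifths") ratio threshold is borrowed from employment-selection practice as one possible yardstick; it is an assumption here, not part of the HUD guidance:

```python
# Hypothetical delivery-outcome test: compare each group's impression rate
# against the best-served group and flag large relative shortfalls.

def delivery_disparity(impressions: dict[str, int], eligible: dict[str, int]):
    rates = {g: impressions[g] / eligible[g] for g in eligible if eligible[g]}
    best = max(rates.values())
    # Groups whose delivery rate is under 80% of the best-served group's
    # warrant investigation and possible model adjustment.
    return {g: rate / best for g, rate in rates.items() if rate / best < 0.8}

impressions = {"group_a": 4_200, "group_b": 2_100}
eligible = {"group_a": 10_000, "group_b": 10_000}
print(delivery_disparity(impressions, eligible))  # {'group_b': 0.5}
```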
Recommendations for Advertisers and Platforms: Advertisers should use platforms that actively manage the risk of discriminatory ad delivery. Ad platforms carry much of the responsibility here: they should create separate processes for housing-related ads, remove targeting options based on protected characteristics, test delivery outcomes regularly, adopt less discriminatory alternatives for their AI models, ensure fair pricing practices, and maintain transparency through documentation and auditing.
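To illustrate the "separate process" idea, here is one hedged sketch of how a platform might route housing-related campaigns through a restricted configuration; the category names and option flags are hypothetical, and real platforms implement this differently:

```python
# Hypothetical routing sketch: housing-related ads get a restricted
# configuration that disables higher-risk audience tools and enables auditing.

HOUSING_CATEGORIES = {"housing", "real_estate", "rental", "mortgage"}

def campaign_options(category: str) -> dict:
    default = {"custom_audiences": True, "mirror_audiences": True,
               "detailed_targeting": True, "audit_log": False}
    if category in HOUSING_CATEGORIES:
        # Separate process for housing ads: disable custom/mirror audience
        # tools and detailed demographic targeting; log for later auditing.
        return {"custom_audiences": False, "mirror_audiences": False,
                "detailed_targeting": False, "audit_log": True}
    return default

print(campaign_options("rental")["detailed_targeting"])  # False
```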
Conclusion: By following the U.S. Department of Housing and Urban Development's guidance, advertisers and ad platforms can support fair housing and avoid perpetuating discrimination through digital advertising. Doing so both ensures compliance with the law and fosters a more inclusive, equitable digital advertising landscape. Knowing the risks, implementing proactive safeguards, and working toward non-discriminatory ad delivery help ensure equal access to housing opportunities for all.