Artificial intelligence use cases are increasing across many sectors, including housing. Potential deployment of AI can involve a wide range of uses, including AI-enabled customer service, rental analytics and internal operations efficiency, among others.
As adoption continues, and regulators and legislators focus their attention on AI, the regulatory landscape involving privacy and housing-related issues is becoming increasingly complex. AI tools can raise a wide range of privacy and security concerns.
Multifamily owners with a working knowledge of AI-related federal and state policy issues will be better positioned to maximize the benefits of this emerging technology.
Federal Landscape
While there is currently no comprehensive federal law governing AI, existing federal laws can still apply. For example, the U.S. Department of Housing and Urban Development (HUD) issued two pieces of guidance under President Biden.
These policy statements emphasized that the Fair Housing Act (FHA) governs housing decisions regardless of whether those decisions are made by people or by AI. Housing providers can be held liable if AI-powered tools produce discriminatory outcomes, even when there is no intent to discriminate, particularly where tenant screening is concerned.
On Dec. 11, President Trump signed an executive order on “Ensuring a National Policy Framework for Artificial Intelligence.” The executive order directs the Federal Trade Commission and the Federal Communications Commission to address AI issues and directs the development of legislative recommendations for Congress on a national AI framework.
Congress has been considering legislation but has not yet acted. The executive order also directs the administration to evaluate and challenge state laws that are inconsistent with a minimally burdensome framework for AI legislation, as well as announces an intention to withhold certain federal funding from states that pass laws the administration determines to be burdensome.
State Landscape
Without a federal AI law, states have been active in filling in the gaps through standalone AI laws and consumer privacy laws that also cover AI decision-making.
Colorado was the first state to enact an AI law that applies to multiple sectors, regulating how high-risk AI systems are designed, used and monitored. In the housing context, high-risk uses include tenant screening, application approvals or denials, fraud detection and automated eviction-related decisions.
Since then, states such as Texas have enacted standalone AI legislation, and other states — including California — have begun to regulate AI chatbots, requiring disclosures to consumers that they are interacting with AI.
Many others, such as California and Connecticut, have included or sought to include automated decision-making technology provisions in their comprehensive consumer privacy laws. The president’s executive order suggests that at least some of these laws may face legal challenges from the Trump administration.
Notably, these state AI and privacy laws may have consequences for housing providers and their vendors. Under most state laws that address such issues, automated decision-making that produces bias or discrimination in the provision or denial of housing would come under scrutiny — and could potentially lead to steep penalties imposed by state regulators.
Additionally, states are beginning to introduce sector-specific AI laws that would affect the housing industry. New York’s Assembly Bill A3125A would directly regulate and impose requirements on providers that use AI and other automated decision-making tools to screen applicants for housing.
The bill would require transparency from owners that use AI-driven screening tools, including notifying applicants that such tools are being used. The bill also would require housing providers to evaluate whether those tools could produce unfair or discriminatory outcomes and to maintain a basic understanding of how the technology functions and influences decisions.
State AI legislation is expected to continue to grow unless Congress passes a moratorium on state AI legislation or the Trump administration’s attempts to restrict state efforts in this area are successful.
Key Takeaways for Affordable Housing Operators
While the legal framework for the use of AI continues to evolve, affordable housing providers and their vendors should be aware of the shifting landscape for AI and privacy regulation. Despite the legal complexities and nuance, certain best practices can help housing providers with implementation of AI tools.
- Conduct thorough vendor diligence, including the company’s reputation, its experience in the housing space and its use of AI tools, including the resources used to develop an AI system or the source of the AI license.
- Conduct thorough AI diligence for each product and AI use case, including information about how the product is trained, evolves and makes decisions. Pay particular attention to how it addresses potential bias or discrimination and whether it was developed specifically for the housing industry.
- Draft and include contractual terms to protect against potential liability for a vendor’s use of AI tools, such as parameters surrounding data use for AI training; terms addressing the vendor’s potential use of third-party products; bias detection and monitoring; AI error performance metrics; and indemnification for claims arising out of AI error or discrimination.
- Draft and include contractual terms that address privacy concerns, including data use permissions and limitations, data subject request assistance provisions, third-party service provider treatment of data and audit rights.
- Keep up-to-date documentation related to AI tool usage and privacy practices, particularly where required by applicable law.
Kevin Coy is a partner at law firm Arnall Golden Gregory and co-chair of the firm’s Privacy & Cybersecurity practice. He can be reached at [email protected]. Kelley Chandler, an associate at Arnall Golden Gregory, advises clients across various industry sectors on data privacy matters. She can be reached at [email protected].