While this is our first published article regarding artificial intelligence (AI), we are sure it will not be the last. AI usage continues to grow in countless areas of life, and homeowners associations (HOAs) are no exception. Not only might HOA boards be tempted to use AI to resolve questions of document interpretation or other legal issues, but boards are also regularly encountering challenges from owners who have themselves consulted AI. This introductory article is not intended to be a comprehensive summary of AI's impact on HOAs; rather, it offers an initial word of caution regarding reliance by HOA boards, or homeowners, on AI when evaluating a community's governing documents or legal issues.
It is easy to see the temptation for HOA boards to use AI to resolve legal questions, interpret their governing documents, or otherwise assist in administering the HOA.
After all, board service is time-consuming, and volunteer board members do not have limitless free time. However, while AI can appear to be a godsend of a timesaver, it can also lead to more headaches for the board in the long run due to errors, inaccuracies, or a failure to capture the "full picture" of the issue being addressed.
We have noticed several incidents in which board members or homeowners used AI to answer questions about their governing documents and were misinformed.
One concerning "quirk" is AI's habit of "hallucinating" provisions in the governing documents that do not actually exist. The phenomenon of AI "hallucinations" is well-documented and has landed even lawyers in hot water. There have been numerous instances in which lawyers used AI to prepare court submissions, only to discover that the AI had fabricated nonexistent court cases. In fact, one report identified at least thirteen different Pennsylvania cases containing filings with confirmed or implied AI hallucinations.[1] Other states have also seen lawyers sanctioned after AI hallucinations were discovered in their submissions.[2] When a board asks AI to interpret its governing documents or to answer a legal question, there is a significant risk that the AI will cite nonexistent governing document provisions or even imagined legal cases. It cannot be stressed enough that this is not just a hypothetical risk, but something our clients have actually encountered. We have also noticed a tendency for AI to mistakenly rely on "laws" that were never enacted, but are in fact unpassed bills still visible in search results.
Setting aside the significant risk of AI hallucinating nonexistent laws or governing document provisions, AI generally lacks the context that comes from legal practice and courtroom experience, which helps lawyers evaluate how a provision or issue would actually be interpreted or play out in real life. When evaluating a case for an HOA, there are often trends or themes prevalent at the circuit court level that are not necessarily memorialized in court opinions or statutes but are just as important to consider in assessing the HOA's legal position, and of which AI is simply unaware. Moreover, AI will often confidently answer the question as framed in its prompt, but it lacks the experience to recognize what additional context it has not been given. As lawyers, we often find that something a client did not initially think worth mentioning turns out to be surprisingly relevant to the legal analysis.
It is also important for HOA boards to realize that consulting with AI is not a substitute for consulting with professionals or experts.
The South Carolina Nonprofit Corporation Act, in setting forth the general standards for directors, provides that a director is entitled to rely on information and opinions prepared by, among others, legal counsel as to matters the director reasonably believes are within the professional's competence.[3] As a result, decisions made after consulting AI rather than the appropriate real-life professionals could expose board members to significant liability and deprive them of the ordinary protections of the "business judgment rule," which is often a key shield from liability for HOA board members. Moreover, legally sensitive information submitted to AI is unlikely to receive the attorney-client privilege protections that apply to communications with an attorney. In short, reliance on AI in the wrong situation can lead to liability for breach of fiduciary duties or other exposure for HOA boards.
We suspect that any HOA board member reading this article has at some point received a message from a homeowner that was written by AI.
It is increasingly common for our clients to receive legal challenges from homeowners who have obviously used AI. These messages generally include the same types of errors discussed above: citations to nonexistent laws, provisions, or cases, and misapplications of those that do exist. It can be difficult to change the mind of someone whose position has been incorrectly reinforced by AI, and this seems to be a "feature rather than a bug" of many AI platforms. AI systems often "learn," or are programmed, to be agreeable and to reinforce the user's positions in order to encourage engagement, naturally increasing the risk of mistaken reliance on incorrect information.[4][5] Not only is this important to keep in mind when confronted by a challenging homeowner relying on AI, but it also demonstrates the hazards of relying on AI for the important decisions and questions facing your HOA.
Again, this will not be our last article on AI and its impact on HOAs; it is simply an introductory word of caution regarding HOA reliance on this increasingly prevalent technology. This article is not intended to be an exhaustive discussion of AI use by HOAs, nor any guarantee of the outcome of any litigation regarding the same. Our attorneys at McCabe, Trotter & Beverly, P.C. are experienced and well-equipped to answer questions you may have regarding this topic. Please contact us at 803-724-5000 for further information.
Christian Saville
McCabe, Trotter & Beverly, P.C. blogs and other content are for educational and informational purposes only. This is not legal advice and does not create an attorney/client relationship between McCabe, Trotter & Beverly, P.C. and readers. Readers should consult an attorney to understand how this information relates to their personal situation and circumstances. You should not use McCabe, Trotter & Beverly, P.C. blogs or content as a substitute for legal advice from a licensed attorney.
[1] https://www.spotlightpa.org/news/2026/01/pennsylvania-commonwealth-court-ai-hallucinations-allegations-justice-system/
[2] https://www.thomsonreuters.com/en-us/posts/technology/genai-hallucinations/
[3] S.C. Code Ann. § 33-31-830.
[4] https://www.psychologytoday.com/us/blog/the-algorithmic-mind/202508/ai-always-agrees-with-your-kid-thats-a-problem
[5] https://www.law.georgetown.edu/tech-institute/insights/tech-brief-ai-sycophancy-openai-2/

