Mr. Haynes Explains: You’re Going to Replace Your Lawyer With AI... and Then Your AI With a Lawyer
Mar 8, 2026 • Legal Practice • Artificial Intelligence
Artificial intelligence is here.
It’s not going away.
It’s not a fad.
And it is absolutely changing the practice of law.
But here’s what I predict is going to happen over the next several years:
People are going to fire their lawyers and replace them with AI.
And then, after things go sideways, they’re going to fire their AI and hire a lawyer to clean up the mess.
And that second lawyer is going to cost them a lot more money than the first one would have.
The Illusion of Empowerment
AI tools are incredibly impressive.
They can:
- Draft motions
- Summarize cases
- Suggest legal arguments
- Sound extremely confident
For someone facing legal trouble, that feels empowering.
You start thinking:
“Why am I paying a lawyer thousands of dollars?”
“This app just wrote a brief in 30 seconds.”
“Maybe I can handle this myself.”
I understand the temptation.
But here’s the hard truth:
Artificial intelligence is not a lawyer.
It does not:
- Stand in front of a judge
- Understand courtroom dynamics
- Read the room
- Assess how a specific local judge will react
- Strategically negotiate with opposing counsel
- Know when an argument is legally sound but practically disastrous
And most importantly...
It doesn’t know when it’s wrong.
AI Hallucinates. Judges Don’t Care.
AI systems can and do hallucinate case law.
That means they confidently cite cases that do not exist.
This is not theoretical.
I was recently on the other side of a wrongful death case where opposing counsel filed a brief citing multiple cases that simply did not exist.
They were completely fabricated.
The court does not accept the excuse, “ChatGPT wrote it.”
When a lawyer signs a pleading and files it with the court, they are representing that what is in that document is accurate and supported by real law.
Judges do not care whether AI drafted it.
That lawyer could have faced serious professional discipline.
When I pointed out the fabricated authorities, something interesting happened.
Settlement talks suddenly became urgent.
What a surprise.
I won’t name the attorney or firm. That’s not my purpose here. This isn’t about embarrassing anyone. It’s about understanding the risk.
And the irony? That same attorney later taught a seminar on using AI in legal practice. When I saw the announcement, I had to laugh.
The Real Cost of “Cheaping Out”
Here’s what I’m seeing more and more:
People try to cheap out by letting AI draft their motions or by representing themselves with AI assistance.
They walk into court confident.
Then reality hits.
Now the lawyer they eventually hire isn’t building a case from scratch.
They’re doing damage control.
Fixing procedural mistakes.
Undoing admissions.
Repairing credibility problems.
Explaining away filings that made no strategic sense.
Cleaning up a mess costs more than doing it right the first time.
Every single time.
AI Tells You What You Want to Hear
Here’s another danger that doesn’t get enough attention.
AI is incredibly good at reinforcing your existing beliefs.
If you feed it a weak argument, it will often:
- Dress it up
- Strengthen it rhetorically
- Tell you it’s persuasive
- Provide supportive “analysis”
That feedback loop is intoxicating.
It feels like validation.
But feeling validated is not the same thing as being legally correct.
I have even seen trained lawyers fall into this trap.
Understanding the limits of today’s AI tools is the key to using them correctly.
AI Is Not the Enemy
Let me be clear.
Artificial intelligence in the practice of law is not a bad thing.
On the contrary, it is a wonderful tool.
I regularly use AI in my practice.
But I use it to enhance my work — not control it.
A trained lawyer knows:
- When AI is helpful
- When it’s incomplete
- When it’s hallucinating
- When an argument looks good on paper but will get crushed in court
AI can assist legal judgment.
It cannot replace it.
Why You’ll Replace AI With a Lawyer
Eventually, many people will learn this the hard way.
They’ll try to represent themselves with AI.
They’ll file motions.
They’ll make arguments.
They’ll rely on “research” that hasn’t been properly verified.
And when the case starts unraveling, they’ll call a lawyer.
By then:
- Deadlines may have passed
- Harmful admissions may be on record
- The judge may already be skeptical
- The opposing party may have leverage
And now the lawyer is trying to salvage what’s left.
That is always harder.
And always more expensive.
Final Thought
Artificial intelligence is a remarkable tool.
But it is not a licensed professional.
It is not accountable to a court.
It does not have ethical obligations.
It does not carry malpractice insurance.
It does not stand next to you when the judge starts asking hard questions.
The judge isn’t going to let you pull out your phone in court and ask ChatGPT what the answer is. You are just going to look like an idiot.
You can replace your lawyer with AI.
For a little while.
But if your case matters — if your freedom matters, if your family matters, if your financial future matters — eventually, you’re going to want a real lawyer.
The smart move is hiring one before the damage-control phase begins.
This post is for general information only and not legal advice. Reading this does not create an attorney-client relationship.