Most in-house legal teams are one or two people carrying the workload of ten. Generative AI doesn’t replace the lawyer’s judgment — it replaces the hours of work that came before the judgment started.
James is head of legal at a 300-person logistics company in Birmingham. He has one qualified solicitor working under him, a part-time paralegal, and somewhere in the region of four hundred live commercial relationships to manage at any one time. His Monday mornings start with a stack of contract redlines from procurement, a question from HR about a disciplinary procedure, and an email from the CFO asking whether a clause in their main haulage agreement actually means what the finance team thinks it means.
He isn’t unusual. The in-house legal function at most mid-sized UK businesses looks something like this: a small team, significant exposure, constant pressure to clear the queue. What varies is how long each task takes — and that’s precisely where generative AI is beginning to make a meaningful difference.
The most time-consuming part of in-house legal work often isn’t the hard thinking. It’s the scaffolding that has to happen before the hard thinking can begin. Reading a 45-page supplier agreement to flag the clauses that matter. Producing a first draft of an NDA from scratch. Pulling together the research memo that lets you give an opinion on a regulatory question you haven’t encountered before. Translating a board resolution into the right board minutes format.
None of this requires a qualified solicitor in the way that a nuanced commercial negotiation does. But it consumes qualified-solicitor hours at a significant rate.
A well-deployed generative AI system can perform a first-pass review of a standard commercial contract in minutes, flagging non-standard clauses, identifying missing provisions, and summarising the document’s key obligations. The lawyer still makes every call — but they’re making those calls from an annotated starting position rather than a blank read-through. The research memo begins as the AI’s synthesis of the relevant material; the lawyer checks it, adjusts it, and applies their judgment to the specific facts. The NDA starts at ninety percent rather than zero.
According to the Association of Corporate Counsel’s latest survey, 91% of legal professionals who have adopted generative AI cite efficiency as its primary benefit. That figure isn’t surprising — but what’s notable is how quickly adoption has accelerated. In 2024, 23% of in-house counsel reported active use of GenAI tools. By 2025, that figure had more than doubled to 52%. Something shifted: the tools became good enough that the professional case for using them was difficult to ignore.
There’s a particular kind of question that in-house teams regularly face: specific enough to need real legal analysis, general enough that it shouldn’t require a three-day brief from external counsel at £350 per hour.
Employment law queries are a good example. So are data protection questions, supply chain contract disputes, and questions about implied terms in service agreements. These are areas where a competent in-house lawyer can give sound advice — but producing that advice historically required either significant research time or a call to an expensive specialist.
Generative AI changes the economics of this considerably. A system trained on the relevant legal material — UK employment law, GDPR guidance, commercial contract principles — can surface the applicable framework, the key cases, and the relevant regulatory positions in the time it takes to write the question. The lawyer still provides the legal advice; the AI provides the reference architecture that would previously have taken an afternoon to assemble.
The ACC survey found that 64% of in-house counsel expect generative AI to reduce their reliance on external law firms. That’s not wishful thinking — it reflects a real shift in what an in-house team of two or three people can credibly handle without outsourcing.
Here is where the choice of tool becomes important in a way that isn’t always obvious.
The most powerful use case for in-house legal AI isn’t general research or document drafting. It’s a system that knows your contracts. Your standard agreements, your precedent bank, your historical advice, your GDPR documentation, your board minutes. A generative AI deployed over your own document library can answer the CFO’s question about that haulage clause in seconds — not because it’s searched the internet, but because it’s read every version of that agreement you’ve ever signed.
That institutional knowledge retrieval is qualitatively different from anything a general-purpose AI tool provides. It’s also, for obvious reasons, something that requires complete confidence in where that data goes. The document library of a corporate legal function is among the most sensitive information a company holds: M&A materials, employment disputes, commercial negotiations, regulatory exposures. Feeding that into a public AI service — one whose terms permit training on inputs, or disclosure to third parties — is a risk that grows with the sophistication of the use case.
James, at his logistics company, could begin using a general AI tool to draft NDAs today. What he couldn’t do without a private deployment is ask it to cross-reference every active supplier contract for indemnification terms and flag the three that create the most exposure. That requires a system that works only with his data, on infrastructure he controls.
That’s the version worth building toward.
JD Fortress AI builds private, on-premises AI for in-house legal teams. If you’re interested in what a system trained on your own documents and precedents could look like — get in touch for a no-obligation conversation.
Get in Touch →