Your Bandwidth Problem Won’t (Totally) Go Away
Talk to anyone running innovation at a law firm right now and you will hear the same thing: We have no time.
Too many tools! Too many rollouts! Too many training sessions!
Chasing lawyers to log in, try the workflow, complete the module, report back… A few firms told us at LegalWeek that they haven't done horizon scanning in months. They are drowning just trying to push the tools they already have.
We thought the oxygen would come back after the firms had selected their Gen AI platforms… but it hasn't! The bandwidth problem has gotten worse.
Adoption is not ROI
Most firms have finally bought the tools. Now they want (and need) people to use them.
Most people in legal innovation already know that a lawyer logging into an AI platform is not ROI. It is only a precondition for the possibility of ROI… but actual ROI is hard to measure, so adoption is what gets reported as the proxy.
"80% WAU, with 80% of those as MAU." Those numbers aren't really that meaningful.
We know from talking to lawyers that a lot of the supposed ROI for Gen AI is based on "the feels".
The best tools are invisible
In the past, the question of ROI wasn't so fuzzy. Adoption used to mean success. Past waves of technology could be adopted and then disappear into the background — and invisibility was how we knew they were working.
Email. Mobile. Cloud. They all followed the same pattern — resistance, adoption, then invisibility.
Gen AI isn't following that pattern. Every major technology shift before this one was deterministic — same input, same output, every time. Training was straightforward, and innovation teams didn't need to run endless training sessions, because the technology earned trust on its own. Nothing needed to be explained away with a handwavy "oh, that's just probabilistic behavior."
With Gen AI, the same prompt produces different outputs on different occasions. Error rates improve as models get better, but they do not reach zero. Trust has to be earned task by task, context by context. The verification overhead is costly, persistent, and realized too late.
This is why the bandwidth problem is not resolving.
Different tools, in different contexts, have fundamentally different paths to earning trust. Most firms are managing all of them the same way — with more training.
A framework for measuring ROI
We can't solve the non-determinism problem. Nobody can. But we can recommend an ROI framework that can help you stop wasting attention on the wrong things.
This framework shows you where and how ROI can be measured, where Gen AI can go invisible, and where it will likely always need training. It is a 2×3 matrix of use case categories, each with a different definition of success.
On one axis, you have front office versus back office:
Front office covers client-facing use cases, which align with the practice of law.
Back office covers administrative use cases, which align with the business of law.
On the other axis, how Gen AI impacts these use cases:
Acceleration — where AI speeds up work a human still does. A task that used to take five hours now takes two. The right question to ask to measure ROI is how much time is saved overall. Success means the work gets done faster at the same or better quality.
Full automation — where AI eliminates the task entirely. Whether it is data extraction or automation with agents, the right ROI question is what error rate is acceptable for this specific task, and whether the tool meets that threshold consistently enough that human oversight can actually be reduced.
Capability enhancement — where AI makes possible something that wasn't practical before. Things like predictive analytics and regulatory monitoring at scale fall into this category. The right question is whether clients find the new capability valuable. Strategic value, not operational value, should be measured.
Each cell has a different ROI measure and a different definition of success. Running a single adoption program across all six cells treats fundamentally different problems identically. A back office automation tool with near-perfect task elimination is delivering more ROI than a front office acceleration tool with high login rates — but login-based metrics show the opposite.
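The matrix can be sketched as a small data model. Everything below — the example use case, the run counts, the function and class names — is illustrative, not drawn from any firm's data; only the two axes, the three impact categories, the per-cell ROI questions, and the five-hours-to-two example come from the framework described above.

```python
# A minimal sketch of the 2x3 ROI matrix: two offices x three impact
# categories, each cell carrying its own ROI question.
from dataclasses import dataclass
from enum import Enum


class Office(Enum):
    FRONT = "front office (practice of law)"
    BACK = "back office (business of law)"


class Impact(Enum):
    ACCELERATION = "acceleration"
    FULL_AUTOMATION = "full automation"
    CAPABILITY = "capability enhancement"


# The right ROI question depends on the impact category, not on logins.
ROI_QUESTION = {
    Impact.ACCELERATION: "How much time is saved at the same or better quality?",
    Impact.FULL_AUTOMATION: "Is the error rate within the acceptable threshold "
                            "for this task, consistently enough to reduce oversight?",
    Impact.CAPABILITY: "Do clients find the new capability valuable?",
}


@dataclass
class UseCase:
    name: str
    office: Office
    impact: Impact

    def roi_question(self) -> str:
        return ROI_QUESTION[self.impact]


def acceleration_hours_saved(before_hours: float, after_hours: float,
                             runs_per_month: int) -> float:
    """ROI proxy for acceleration cells: total hours saved per month."""
    return (before_hours - after_hours) * runs_per_month


# Hypothetical example: the five-hours-to-two task from the text,
# performed ten times a month.
drafting = UseCase("first-draft research memo", Office.FRONT, Impact.ACCELERATION)
print(drafting.roi_question())
print(acceleration_hours_saved(5.0, 2.0, 10))  # (5 - 2) * 10 = 30 hours/month
```

The point of the model is that a front office acceleration cell and a back office automation cell never share a success metric: the first is scored in hours saved, the second in error rate versus an agreed threshold, and the capability cells in client-reported value rather than any operational number.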
Getting some time back
Training is not the enemy of success, but training without knowing what to measure wastes time and consumes the most valuable resource in this new economy — attention.
Each use case in this matrix has a different adoption methodology. Some will go invisible faster than you think. Some will take longer. A few may never get there.
The sooner we can measure ROI — use case by use case, cell by cell — the sooner some of these tools earn the right to become invisible. Every use case that becomes invisible is one less thing that sucks the oxygen out of the room.
