Corporate adoption of AI has moved fast. Firms actively using GenAI nearly doubled in a year, from 12% in 2024 to 22% in 2025, with a further 50% either planning or considering adoption, according to Thomson Reuters. Research by Ari Kaplan Advisors found that the following applications of AI were considered most impactful in litigation: document analysis (100% agreed), chronology creation (87%), and case strategy (77%). The same research found that 87% of litigation support directors consider AI-assisted case management a competitive advantage.
Data around disputes tells a different story. Ari Kaplan Advisors also found that 93% of litigation support directors report data volumes per case are rising, while the number of documents actually used at trial has stayed broadly flat. AI adoption is up, but so are case duration and cost.
Those two trends coexist because general legal AI and litigation AI are solving different problems.
What general legal AI does well
Tools like Harvey, CoCounsel, and Legora accelerate legal work: faster research, drafting, document summaries, responsive query-answering across datasets. For most of a firm's work, that's exactly what's needed.
Thomson Reuters found organisations with a clear AI strategy are 3.5 times more likely to see meaningful AI benefits than those without one. In disputes, deploying general AI for dispute-specific tasks without litigation AI is precisely that kind of ad-hoc approach: the factual record stays fragmented, is stress-tested late, and is held in the heads of individuals rather than in a structured, interrogable model.
General AI tools aren't inadequate; they're built to do everything. Litigation demands something narrower: understanding the facts, then executing specific workflows in great detail.
Read more: You don’t need documents. You need facts
Why litigation requires a different approach
A complex dispute is a process of fact discovery where each finding changes the meaning of the next, across months and hundreds of thousands of documents. A fact that proves intent may sit across three emails, a board minute, and a witness statement written two years later. No document-level tool surfaces that connection reliably at scale.
Chronologies built in a general AI tool are static snapshots, rebuilt manually each time disclosure expands. Contradictions between a witness statement and a document reviewed weeks earlier require a lawyer to hold both in mind simultaneously. Case theories calcify around assumptions about what the evidence shows, tested at the end of disclosure rather than the beginning.
Read our whitepaper A matter of facts: Fact Intelligence in modern litigation
What Wexler adds
Wexler operates at the fact level rather than the document level. Every factual assertion ties to its underlying source. Key players are mapped automatically across the evidential record. Chronologies update continuously as new material arrives. Contradictions surface across the full dataset, not just within individual documents, and case theories can be tested against the complete record at any point in the matter.
Where procedure permits, Wexler checks live hearing transcripts against the documentary record in real time. Inconsistencies between oral testimony and prior written evidence become visible while the hearing is still running, a capability with no equivalent in general legal AI tools.
For multi-step factual tasks, Kim, Wexler's AI assistant, executes complex workflows across the document set: extracting data, running sequential analyses, and producing chronologies, memos, or reports in a single run.
The numbers reflect the difference in approach. Wexler identifies relevant facts with up to 95% accuracy against a human benchmark of around 78%. Early case assessment time drops by up to 75%. At a processing speed of 20,000 pages per hour, supporting datasets of a million documents and beyond, the platform operates at the volumes that define modern disputes.
How leading firms are separating AI tools
Clifford Chance uses Wexler across multi-jurisdictional matters to establish a complete evidential picture before strategy is set. Goodwin uses it during early case assessment to surface conflicting evidence that manual review would catch far later. Herbert Smith Freehills Kramer positions it as the disputes-specific layer of a broader AI programme. Burges Salmon uses it for large-scale investigations where the record must be continuously maintained as material accumulates.
Most firms now have legal AI. The ones gaining ground in disputes have gone a step further: they separate the tools by function, using general AI for general tasks and Wexler for specific litigation workflows. In a major dispute, the side that understands the evidence first shapes how the case is fought, and that advantage compounds across every stage from early case assessment to cross-examination.
