The Accountability Gap
The Wingate case is particularly troubling because it suggests active concealment. Backdating a corrected order and removing the original from public access does more than hide a mistake; it destroys the evidence of how the mistake occurred. If judges can sanitize the docket after AI-generated errors come to light, the public and litigants lose any ability to verify the integrity of judicial decision-making.
This creates an accountability asymmetry. When lawyers use AI recklessly, the evidence remains in the court record, exposing them to sanctions. When judges do the same, they control the docket and can potentially erase the proof.
The New Jersey case offers a different lesson: delegation without supervision. If the explanation provided to the media is accurate—that a temporary assistant used AI to draft an opinion later issued without adequate review—it reveals a systemic breakdown in judicial chambers. Federal judges don't draft every word themselves; they rely on law clerks and staff. But ultimate responsibility for accuracy and legal reasoning cannot be delegated, least of all to an algorithm.