The institution of adjudication is in a state of great upheaval today. Mounting case backlogs and the litigation challenge posed by mass torts are pressuring Congress and courts to experiment with novel adjudication techniques. Some of the results are well known: case tracking, alternative dispute resolution, greater reliance on settlement, and tighter pretrial screening of cases. Taken together, these changes foreshadow a major transformation in the practice and theory of adjudication.
This Article focuses on one particularly remarkable proposal for handling large-scale litigation: adjudication by sampling. This approach uses statistical methods to adjudicate a large population of similarly situated cases. Rather than decide each individual case separately, the court aggregates all the cases and selects a random sample. The court then adjudicates each sample case and statistically combines the sample outcomes to yield results for all cases in the larger population. The sampling procedure is nicely illustrated by the most recent chapter in Judge Robert Parker's struggle with asbestos litigation, Cimino v. Raymark Industries, Inc. After certifying a class action and adjudicating liability, Judge Parker faced the daunting prospect of 2298 hotly contested damage trials. Settlement negotiations had broken down, and defendants made credible threats to contest each case vigorously. Judge Parker worried about the consequences in Cimino as well as in the thousands of pending and future cases that would have to be tried individually at the damages stage unless some aggregative procedure could be devised.
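The mechanics of the sampling procedure described above can be sketched in a few lines of code. The sketch below is purely illustrative and rests on assumptions not in the Article: the damage awards are simulated from an arbitrary distribution, and the sample size of 160 is hypothetical (it echoes the scale of the sample groups used in Cimino only loosely). The point is the three steps: draw a random sample from the case population, "adjudicate" each sample case, and combine the sample outcomes into an estimate applied to the whole population.

```python
import random
import statistics

# Hypothetical population of damage claims (dollar values). In Cimino the
# population was 2298 cases, each of which would otherwise require its own
# damages trial; the lognormal shape here is an illustrative assumption.
random.seed(0)
population = [random.lognormvariate(11, 0.8) for _ in range(2298)]

# Step 1: draw a random sample of cases for full adjudication.
sample_size = 160  # hypothetical sample size, chosen for illustration
sample = random.sample(population, sample_size)

# Step 2: "adjudicate" each sample case. In this simulation that just means
# observing its damage value; in practice each sample case gets a real trial.
# Step 3: statistically combine the sample outcomes into population results.
mean_award = statistics.mean(sample)
stdev = statistics.stdev(sample)

# Approximate 95% confidence interval for the population mean award,
# using the normal approximation to the sampling distribution.
margin = 1.96 * stdev / sample_size ** 0.5

# The per-case estimate extrapolated to the full population.
total_estimate = mean_award * len(population)
print(f"estimated mean award: ${mean_award:,.0f} +/- ${margin:,.0f}")
print(f"implied aggregate damages: ${total_estimate:,.0f}")
```

The confidence interval is what gives the procedure its claim to accuracy in the aggregate: the estimate of total damages can be made quite tight even though no individual non-sample case is ever tried.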
Robert G. Bone, Statistical Adjudication: Rights, Justice, and Utility in a World of Process Scarcity, 46 Vanderbilt Law Review.
Available at: https://scholarship.law.vanderbilt.edu/vlr/vol46/iss3/2