In re Broiler Chicken - Federal Court Order Sets TAR Protocol


April 26, 2018

This past January, Magistrate Judge Jeffrey Gilbert and Special Master Maura Grossman issued an order setting a protocol for technology-assisted review (TAR). See In re Broiler Chicken Antitrust Litig., No. 1:16-cv-08637 (N.D. Ill. Jan. 3, 2018). While many court decisions have approved the use of TAR in discovery, few courts have given detailed instructions on how TAR is to be performed. This matter involves 30 different defendants, a fact that may have prompted the court to set the protocol.

 

The Order Regarding Search Methodology for Electronically Stored Information addresses document source disclosures, search methods, and means of validating the process. Here's a list of key aspects of the protocol:

 

1. De-duplication by hash values across all custodians is required.
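The hash-based de-duplication the order requires can be sketched as follows. This is a minimal illustration, not the protocol's specified implementation; the function names and the choice of SHA-256 are assumptions.

```python
import hashlib
from pathlib import Path

def file_hash(path: Path) -> str:
    """Compute a SHA-256 digest of a file's contents (hash choice is illustrative)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def dedupe(paths):
    """Keep the first file seen for each unique hash value.

    Run over the combined collection so duplicates are removed
    across all custodians, as the order requires.
    """
    seen = {}
    for p in paths:
        digest = file_hash(p)
        if digest not in seen:
            seen[digest] = p
    return list(seen.values())
```

Because the digest is computed from file contents, identical documents held by different custodians collapse to a single copy regardless of filename or location.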

 

2. Parties may choose to conduct email threading and only produce inclusive emails. 

 

3. Categories of documents to be produced in their entirety (e.g., a network folder containing agricultural reports) are to be excluded from data sets used to test search terms.

 

4. Processing exceptions that are not system files must be disclosed; encrypted files are one example.

 

5. If a party uses TAR (or continuous active learning), it must disclose the vendor and software; a general description of the training algorithm; the quality control measures; and the categories of documents included in or excluded from TAR. Requesting parties have seven days to propose exemplars for training or alternate keyword search strings.

 

6. Search software must also be disclosed, along with the stop words the program uses and other information about its capabilities.

 

7. In addition to keywords, false positives can be proposed, but contextual examples must also be provided.

 

8. The QC sample must include 500 random documents responsive to at least one request for production; 500 documents marked non-responsive by a human reviewer if they were selected for review by TAR, or 2,500 documents marked non-responsive by a human reviewer if they were selected by manual review; and 2,000 documents excluded from manual review by the TAR process. This validation sample is then reviewed by a subject matter expert.
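For a TAR workflow, drawing the three-part validation sample described above might look like the following sketch. The sample sizes (500 / 500 / 2,000) come from the order; the function name, document-ID pools, and use of Python's `random` module are assumptions for illustration.

```python
import random

def validation_sample(responsive, tar_nonresponsive, tar_excluded, seed=None):
    """Draw the QC sample for a TAR workflow per the order:
    500 responsive docs, 500 human-coded non-responsive docs selected
    for review by TAR, and 2,000 docs the TAR process excluded from
    manual review. Inputs are lists of document IDs (placeholders).
    """
    rng = random.Random(seed)  # seed allows a reproducible draw
    return (rng.sample(responsive, 500)
            + rng.sample(tar_nonresponsive, 500)
            + rng.sample(tar_excluded, 2000))
```

The combined 3,000-document sample would then go to a subject matter expert for review, per the protocol.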

 

9. The parties must meet and confer on whether the review of the sample indicates that a substantial number of responsive documents is being identified.

 

10. A recall of 70-80% is consistent with, but not a sole indicator of, an adequate review. The recall estimation method for TAR is:

 

Recall = # of Responsive Docs Found ÷ (# of Responsive Docs Found + # of Responsive Docs Coded Wrong + # of Responsive Docs Not Reviewed)

 

 

The recall estimation method for Manual Review is:

 

Recall = # of Responsive Docs Found ÷ (# of Responsive Docs Found + # of Responsive Docs Coded Wrong)
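The two estimates above can be worked through with a short sketch. The document counts here are hypothetical, not taken from the order; only the formulas themselves follow the protocol.

```python
def recall_tar(found, coded_wrong, not_reviewed):
    """Recall estimate for a TAR workflow: the denominator counts
    responsive docs the process missed, whether miscoded by a
    reviewer or never reviewed at all."""
    return found / (found + coded_wrong + not_reviewed)

def recall_manual(found, coded_wrong):
    """Recall estimate for manual review: no 'not reviewed' term,
    since manual review examines every document in the set."""
    return found / (found + coded_wrong)

# Hypothetical counts estimated from a validation sample:
# recall_tar(750, 150, 100) -> 0.75, within the 70-80% range noted above.
# recall_manual(800, 200)   -> 0.80.
```

Note that for the same number of documents found, the TAR estimate can only be lower than or equal to the manual-review estimate, because its denominator adds the responsive documents TAR excluded from review.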

 

 


Contact Me With Your Litigation Support Questions:

seankevinoshea@hotmail.com


© 2015 by Sean O'Shea.