Sep 30, 2020

eBrevia's contract management software uses AI to analyze provisions in a large set of contracts an organization is party to. It will find different categories of information in contracts (price, term, renewal clause, etc.) and can export the data in spreadsheet form for further review.




 
 

Police across the country are increasingly making use of facial recognition software to identify suspects in crimes recorded on surveillance cameras. There have been reports of innocent people being arrested because police relied on facial recognition software. One example is the case of Robert Williams, who was held in custody by the Detroit Police and later released. See Sarah Rahal and Mark Hicks, Detroit police work to expunge record of man wrongfully accused with facial recognition, The Detroit News, June 26, 2020.


The National Institute of Standards and Technology conducted a study on the efficacy of facial recognition algorithms. See Patrick Grother, Mei Ngan, and Kayee Hanaoka, NISTIR 8280, Face Recognition Vendor Test (FRVT) Part 3: Demographic Effects, December 2019, available at https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf. The study reviewed more than 18 million photos drawn from mugshots, photos submitted with applications for immigration benefits, and border crossing photographs. It concluded that the accuracy of the algorithms varies widely from developer to developer, and that demographic differentials in false positive rates were far larger than those in false negative rates, with the size and direction of the bias varying by algorithm.


"Our main result is that false positive differentials are much larger than those related to false negatives and exist broadly, across many, but not all, algorithms tested. . . . With domestic law enforcement images, the highest false positives are in American Indians, with elevated rates in African American and Asian populations; the relative ordering depends on sex and varies with algorithm. We found false positives to be higher in women than men, and this is consistent across algorithms and datasets. This effect is smaller than that due to race." Id. at 2.
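The distinction the study draws, false positive rates (the software wrongly matches two different people) versus false negative rates (it fails to match two photos of the same person), and the "differential" between demographic groups, can be illustrated with a minimal sketch. The counts below are hypothetical, not NIST data:

```python
# Hypothetical verification outcomes for two demographic groups (not NIST data).
# fp: different people wrongly matched      tn: different people correctly rejected
# fn: same person wrongly rejected          tp: same person correctly matched
counts = {
    "group_a": {"fp": 30, "tn": 9970, "fn": 20, "tp": 980},
    "group_b": {"fp": 5,  "tn": 9995, "fn": 18, "tp": 982},
}

def rates(c):
    fpr = c["fp"] / (c["fp"] + c["tn"])  # false positive rate
    fnr = c["fn"] / (c["fn"] + c["tp"])  # false negative rate
    return fpr, fnr

for group, c in counts.items():
    fpr, fnr = rates(c)
    print(f"{group}: FPR={fpr:.4f}  FNR={fnr:.4f}")

# The demographic differential is the ratio of a rate across groups.
fpr_a, fnr_a = rates(counts["group_a"])
fpr_b, fnr_b = rates(counts["group_b"])
print(f"FPR differential: {fpr_a / fpr_b:.1f}x")  # 6.0x
print(f"FNR differential: {fnr_a / fnr_b:.1f}x")  # 1.1x
```

With these invented numbers, the two groups have nearly identical false negative rates but a sixfold gap in false positive rates, the pattern of disparity the NIST study reports.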




 
 

In 2016, the Supreme Court of Wisconsin rejected a defendant's claim that the use of risk assessment software at sentencing violated his right to due process. See State v. Loomis, 881 N.W.2d 749 (Wis. 2016). The software at issue was COMPAS (Correctional Offender Management Profiling for Alternative Sanctions). Loomis pled guilty for his role as the driver in a drive-by shooting. The Wisconsin Department of Corrections prepared a report which included bar charts generated with COMPAS showing the defendant's risk of pretrial, general, and violent recidivism. The lower court referenced the COMPAS assessment at Loomis's sentencing and sentenced him to six years in prison and five years of extended supervision.

The defendant's expert contended that because COMPAS was not designed to assist with sentencing, its use would lead a court to overlook a defendant's unique circumstances. COMPAS was designed to determine whether individuals could remain in their communities rather than be incarcerated. The software's developer does not disclose how it evaluates a person's risk of recidivism, and the COMPAS training manuals state that it is not to be used for sentencing.

The Wisconsin Court of Appeals certified two questions to the Supreme Court of Wisconsin:

1. Did the use of COMPAS at sentencing violate due process because the proprietary nature of the software prevents the Defendant from challenging its scientific validity?

2. Did the use of COMPAS at sentencing violate due process because the software took gender into account?

The Supreme Court of Wisconsin held that COMPAS risk scores could be used at sentencing because they were not the determinative factor in deciding whether Loomis could be supervised in the community and because the defense had the opportunity to challenge the scores. The court also found that Loomis did not meet his burden of showing that the sentencing court relied on his gender in issuing its sentence.

The Supreme Court of Wisconsin did, however, hold that a court cannot rely on COMPAS to determine whether a defendant is incarcerated or to decide the length of his sentence, and that a court must explain the factors other than the COMPAS risk assessment that support the sentence it imposes. A presentence investigation report should inform the court that: how COMPAS weighs certain factors is proprietary; the scores rely on group data; studies of COMPAS suggest it may disproportionately classify minority offenders as having a higher risk of recidivism; COMPAS relies on a national sample and has not been validated in a Wisconsin-specific study; and COMPAS was not developed for use at sentencing.

The Supreme Court of the United States denied a petition for a writ of certiorari. See Loomis v. Wisconsin, 137 S. Ct. 2290 (2017).


 
 

Sean O'Shea has more than 20 years of experience in the litigation support field with major law firms in New York and San Francisco. He is an ACEDS Certified eDiscovery Specialist and a Relativity Certified Administrator.

The views expressed in this blog are those of the owner and do not reflect the views or opinions of the owner’s employer.



© 2015 by Sean O'Shea.
