TruthMovement: an internet research-guide for students and scholars


Friday, July 3, 2015

Computer Forensics: an approach to evidence in cyberspace (www.digitalevidencepro.com/Resources/Approach.pdf)

COMPUTER EVIDENCE V. DAUBERT: THE COMING CONFLICT by Christopher V. Marsico, Center for Education and Research in Information Assurance and Security, Purdue University, West Lafayette, IN 47907-2086

Challenges in Forensic Computing by Rebecca T. Mercuri



Security Watch
Challenges in Forensic Computing

Rebecca T. Mercuri
Communications of the ACM
Volume 48, Number 12 (2005), Pages 17-21



The ever-changing nature of technology contributes to the problems encountered by experts when collecting and preparing digital evidence for courtroom presentation.


In recent years, forensic computing has evolved from an ad hoc pseudoscience into a recognized discipline with certified practitioners and guidelines pertaining to the conduct of their activities. With the ubiquity of computer-based devices in everyday use, forensic techniques are increasingly being applied to a broad range of digital media and equipment, posing many challenges for experts as well as for those who make use of their skills.

According to Computer Forensics World (www.computerforensicsworld.com), the field primarily involves the "use of analytical and investigative techniques to identify, collect, examine, and preserve evidence/information which is magnetically stored or encoded." Forensic investigations may also address the analysis and reporting of digital evidence after an incident has occurred, with the goal of preparing "legally acceptable" materials for courtroom purposes (see www.aic.gov.au). Matters may involve computer security breaches, computers used in committing illegal deeds, criminal activity that had a computer as its target, or computer-based devices that inadvertently collected information pertinent to a crime or dispute (see www.forensics.nl).

Forensic computing experts can be deployed in a broad range of criminal, municipal, and civil arenas. I have worked on a variety of cases in this capacity, including:
Investigation of a law firm's accounting information by a state Office of Attorney Ethics to determine whether escrowed funds had been misused;
Reconstruction of thousands of deleted text and image files in a murder case, in order to gather information about the activities of the victim and various suspects;
Examination of source code used in the construction of an MPEG decoder chip set, to see if patents had been violated;
Evaluation of the contents of a database to determine the cost of its production, as mitigating evidence in a large financial disagreement between business partners;
Consideration of possible foul play by a former company employee, in the damage of computer records;
Mathematical analysis of photographs to see if they had been digitally altered; and
Preparation of explanations for an abnormally high missed vote rate exhibited by certain self-auditing electronic election equipment.

Many forensic matters (including some of those mentioned here) do not go to trial, especially in the business arena, where a convincing set of data often suffices to induce an out-of-court settlement, or where investigative techniques are applied on a "need-to-know" basis, such as to determine whether internal or external corporate espionage or malicious activity has occurred. Experts may assist in preparing legal briefs; they can be requested to provide sworn testimony and opinions in city, state, and federal hearings conducted by legislative bodies and their commissions or task forces; and they frequently work hand-in-hand with computer security teams to develop procedural, policy, and control techniques that help prevent (or mitigate) losses. Investigations can be completed in a few hours or days for simple matters, or can persist over the course of years for complex cases. Although some experts are engaged for the full range of investigative and testimonial tasks, those who are valued for their highly persuasive verbal skills, and who can react well to on-the-spot challenges, may only review and present (or rebut) evidence prepared by other forensic computing specialists.

Unless appointed by the court to provide a neutral interpretation of findings from all sides of a dispute, forensic experts tend (unofficially, of course) to be looked upon or identify themselves as either "black hats" (typically those working for defense teams) or "white hats" (those allied with plaintiff or prosecution teams). Law enforcement officers are usually branded as being "white hats" since their experts are often used as witnesses by the district attorneys in criminal cases. But, in practice, since digital media evidence is typically impounded in the custody of state and local police, or similar federal agencies, such officers must necessarily also cooperate with the defense team in allowing their experts to access the data for discovery and case-preparation purposes.

Given the adversarial nature of this process, and since the caveat of "possession being nine-tenths of the law" applies to the ease with which computer-based data can be (often undetectably and/or non-recoverably) modified during its collection, impounding, and analysis, certain new "rules of evidence" have evolved from the more general (non-computer) codes of practice. These rules address the chain of custody that must be authenticated when digital evidence is introduced. An example of such procedures concerns the use of materials that have been duplicated. In general, according to the U.S. House Advisory Committee on Rules, with regard to its Rule 1003 (Admissibility of Duplicates) in the Federal Rules of Evidence, "a counterpart serves equally as well as the original, if the counterpart is the product of a method which insures accuracy and genuineness." It should be noted that although these Rules of Evidence (see www.law.cornell.edu/rules/fre/overview.html) are required only for Federal court proceedings, many state codes are modeled after them, and thus are fairly consistent. Determination of violation of these rules may be the focus of efforts by defense experts in order to dismiss or raise suspicion about the authenticity or accuracy of the digital materials.
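
In practice, the "accuracy and genuineness" of a duplicate is commonly demonstrated with cryptographic hashes, although the article does not prescribe a method. Below is a minimal sketch of that idea, assuming hypothetical file paths for an impounded disk image and the examiner's working copy:

import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical paths: the impounded original and the working duplicate.
original_hash = sha256_of("/evidence/original.dd")
copy_hash = sha256_of("/evidence/working_copy.dd")

if original_hash == copy_hash:
    print("Duplicate verified; SHA-256 =", original_hash)
else:
    raise ValueError("working copy does not match the impounded original")

Matching digests show, to a very high degree of confidence, that the copy is bit-for-bit identical to the original, which is the kind of showing Rule 1003 contemplates.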

Because of the particular care that must be taken with digital media, forensic investigation efforts can involve many (or all) of the following steps (see www.itsecurity.com):
Securing materials via appropriate chain of custody;
Making full (or mirror) copies of digital information from impounded sources;
Following procedures to prevent alteration of data and files;
Using software and hardware tools to ensure that the original media is not damaged or compromised in any way, and that the copies do not contain extraneous material (such as residual data that may be introduced through prior uses of the medium now holding the mirror copy);
Maintaining any data that resides in "free space," including restoration of deleted information on the original devices, using the mirror copies; 
Keeping a complete and comprehensive audit trail of steps performed in the preceding processes (see the logging sketch after this list); and
Ensuring that client-attorney privileges and other privacy issues related to the digital evidence are not breached by the experts who have examined the data.
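
The audit-trail step lends itself to a short illustration. The following is a minimal sketch only, assuming a JSON-lines log file and illustrative field names; it is not a standard format or any tool named in the article:

import hashlib
import json
from datetime import datetime, timezone

def log_action(logfile: str, examiner: str, action: str, evidence_path: str) -> None:
    """Append one audit-trail entry recording who did what, when, and the
    evidence hash at that moment."""
    with open(evidence_path, "rb") as f:
        sha256 = hashlib.sha256(f.read()).hexdigest()
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "examiner": examiner,
        "action": action,
        "evidence": evidence_path,
        "sha256": sha256,
    }
    # Append-only: earlier entries are never rewritten, so the log itself
    # can later be reviewed for gaps or tampering.
    with open(logfile, "a") as log:
        log.write(json.dumps(entry) + "\n")

log_action("custody.jsonl", "examiner_01", "opened working copy",
           "/evidence/working_copy.dd")

Recording the evidence hash alongside each action ties every step of the examination back to the verified duplicate.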

Certain digital information, beyond the contents of the data itself, may be pertinent to case development. This information can include file time and date stamps, folder structure hierarchies, and message transmission tags. Real-time data collection efforts may need to address surveillance legalities and privileges, and avoid inadvertent damage claims (such as may occur when a server is made inaccessible for a period of time). Things to be wary of include alterations to the digital media that could occur when the electronic device is turned on or off, and inadvertent activation of Trojan horse or time-bomb malware that was left behind to corrupt data and confound forensic efforts. One caveat is that "you should only find what is actually there," but ensuring this is so, may involve the development and implementation of collection, blocking, prevention, and tracking techniques. This is where evidence collection kits, containing software and hardware tools, can be usefully applied.
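
File time and date stamps are easy to read programmatically, and just as easy to disturb, which is one reason they are read from a write-protected copy rather than the original device. A minimal sketch, with a hypothetical path on a mounted working copy:

import os
from datetime import datetime, timezone

def file_times(path: str) -> dict:
    """Read the stat timestamps for a file, reported in UTC."""
    st = os.stat(path)
    iso = lambda ts: datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()
    return {
        "modified": iso(st.st_mtime),  # last content change
        "accessed": iso(st.st_atime),  # last read; easily disturbed
        "changed": iso(st.st_ctime),   # inode change on Unix, creation on Windows
    }

# Hypothetical path on a mounted working copy, never the original media.
print(file_times("/mnt/working_copy/report.doc"))

Simply opening a file on an unprotected system can rewrite its access time, which is exactly the kind of alteration the procedures above are designed to prevent.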

The forensic examiner's bag of tricks generally includes operating system utilities (for backups, disk manipulation, string searches, and so forth), data recovery software (to thwart file deletion attempts), file viewers and Hex editors (to perform Win/Mac data conversions and reveal information contents and patterns), and commercial firewalls (for network sniffing and port scanning during investigations). There are also packages that provide turnkey assistance for forensic examinations, complete with case management tracking for procedures, reports, and billing. Experts may build their own scripts and tools in order to provide specialized investigations, or to gain an edge over firms providing similar services.
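
The data-recovery entry in this toolkit can be illustrated with the signature-scanning idea behind file carving. The sketch below is not any particular product's method (real carvers handle fragmentation, validation, and many formats); it assumes a hypothetical image path and simply looks for JPEG start/end markers:

# Signature markers for JPEG files.
JPEG_SOI = b"\xff\xd8\xff"  # start-of-image
JPEG_EOI = b"\xff\xd9"      # end-of-image

def carve_jpegs(image_path: str, max_size: int = 10 * 1024 * 1024) -> list:
    """Scan a raw disk image for JPEG start/end markers and return the
    byte ranges between them as candidate recovered files."""
    with open(image_path, "rb") as f:
        data = f.read()  # a real tool would stream; fine for a sketch
    carved = []
    start = data.find(JPEG_SOI)
    while start != -1:
        end = data.find(JPEG_EOI, start)
        if end != -1 and (end + 2) - start <= max_size:
            carved.append(data[start:end + 2])
        start = data.find(JPEG_SOI, start + len(JPEG_SOI))
    return carved

recovered = carve_jpegs("/evidence/working_copy.dd")  # hypothetical image
print(f"carved {len(recovered)} candidate JPEGs")

Because carving works on raw bytes rather than the file system, it can recover images whose directory entries were deleted, as in the murder case described earlier.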

Some useful lists of forensic products are maintained by Danny Mares and Company at www.dmares.com/maresware/linksto_forensic_tools.htm, by the Computer Crime Research Center at www.crime-research.org/library/resource.htm, and by the University of Western Sydney's School of Computing and Information Technology at www.cit.uws.edu.au/compsci/computerforensics/Software/. Although a considerable amount of this software is freely downloadable (and yes, used by hackers as well as trackers), generally you get what you pay for—namely, some of these free offerings can be a bit of a kludge. Some of the most user-friendly commercial products are sold only to law-enforcement agencies or are priced prohibitively for defense teams, so justice may not necessarily be even-handedly served with regard to examination capabilities.

If access to digital evidence is not forthcoming from an impounding agency, court orders may be necessary to obtain the data as well as use of the extraction tools, in order to determine whether protocols had been appropriately applied. Conversely, a prosecution or defense team may wish to suppress evidence from discovery, if they believe it could be damaging to the case. Here is where the time-consuming aspects of the forensic examination may come into play. Typically it is not possible to perform a comprehensive decomposition and logging of all materials (such as the contents of every sector of a terabyte hard drive, or thousands of hours of digital video from a surveillance camera), so a "scratch-and-sniff" approach might be used to yield promising information. Even though cost-effective, tactical decisions to proceed with only a partial investigation may be regretted in hindsight if a post-mortem comprehensive analysis shows that an alternative outcome might have prevailed.

In response to the need to analyze, preserve, protect, and defend forensic evidence, an initiative was begun in 1999 (prior to the homeland security era), in San Diego, CA, to construct and staff Regional Computer Forensic Laboratories (RCFLs). This was done through the Federal Bureau of Investigation in cooperation with local and state law enforcement [1]. By year's end, 13 of these RCFLs will be available for use by more than 1,000 agencies, spanning 15 states (see www.rcfl.gov). I had the opportunity to tour the RCFL in Hamilton Township, NJ that is a part of the newly constructed Forensic Science Technology Center administered by the FBI, the NJ Office of the Attorney General, the NJ Division of Criminal Justice, and the NJ State Police. In addition to the RCFL, the $2.2 million, 200,000-square-foot facility houses laboratories for ballistics, DNA, drug analysis and toxicology, crime scene investigation, and forensic anthropology and photography. The RCFL section contains bulletproof windows and walls, examination bays, a classroom, and a state-of-the-art digital evidence room (the modern-day equivalent of a Faraday cage) to shield impounded materials that could be sensitive to radio-frequency signals (such as cellular telephones, PDAs, and wireless-equipped computers).

The New Jersey RCFL (www.njrcfl.org) provides free digital forensic training services for law enforcement investigators and analysts, who can also receive FBI digital forensic examiner certification through participation in a 12–18 month sequence that includes coursework facilitated by the lab, and on-the-job training. Even though the unit's 21 examiners successfully handled hundreds of cases in the lab's first year of operation, they still must balance and leverage constraints of time, budget, and capacity. Toward this end, they prioritize requests into five levels: 1) immediate threats to property or people; 2) similar but potential threats; 3) general criminal investigations, such as fraud and child endangerment/pornography; 4) administrative inquiries; and 5) digital forensic research and development.

NJRCFL's laboratory director, FBI Supervisory Special Agent Larry Depew, noted that continual changes in digital technology pose far more complex challenges than those involving other "traditional" forensic disciplines, since some of the latter are, relatively speaking, performed on a "fixed box" of information. He believes the value of the computer forensic laboratory is seen not only in its investigative and archival functions, but also includes its continual improvements in process methodology. Depew views this with respect to the importance of determining "not only what I know, but what I know that isn't so." Security expert Rebecca Bace [3] also identifies this as a key challenge in forensic computing—the application of inductive reasoning on the data to determine "what is or was" as well as deductive thinking in order to intuit "what is not or was not." What adds to the complexity, she says, is that "often there is little symmetry between the inductive and deductive aspects of a particular case."

Another problem encountered by forensic examiners (especially those unaided by RCFL facilities) is that they must seek out and provide for their own training on an ongoing basis. This is a confusing matter, as it has only been since 2003 that forensic computing has been recognized as a laboratory discipline. The NJRCFL, for example, is applying for accreditation by the Board of the American Society of Crime Laboratory Directors (ASCLD; www.ascld.org). Although ASCLD approval in the category of Digital and Multimedia Evidence is available for labs meeting its standards for any or all of four subdisciplines (computer forensics, forensic audio, video analysis, and image analysis), it does not presently certify the examiners who work in these labs. Many examiner certifications are new and their relative merits may be dubious, especially as compared to the broader knowledge, flexibility, and skills of well-trained and experienced computer scientists, engineers, or IT professionals. The CompuForensics Web site (www.compuforensics.com/training-faq.htm) even mentions that the field "is as yet not regulated by any credible centralized certification authority." Some of these credentials (see examples in the table here) could be obtained via short courses taken by a computer-savvy high school graduate, although an FBI or police background check may also be required.

While a few community colleges and universities have begun to feature forensic computing specializations, there is not yet any consensus on curriculum requirements, although as the field evolves there will likely be further course offerings and some standardization. Trends seem to suggest these topics are primarily hosted by IT departments, whose graduates would typically deal with front-line defense and incident response against activities that potentially require forensic investigation [2].

Most certainly, forensic computing is an exciting profession that can be both elating and frustrating for its practitioners. Even if it were somehow possible to eradicate nefarious intent, equipment failures will continue to provide a market for investigative and reconstructive services as with any engineering endeavor (like the Space Shuttle and the power grid). The continuing maturity of this field will invariably bring some stabilization in best practices, training, certification, and toolsets, but new challenges will always emerge because of the dynamic nature of the technology at its root.

References

1. Garrison, D. Regional computer forensic laboratories. Evidence Technology Magazine 1, 4 (Nov./Dec. 2003); www.evidencemagazine.com/issues/novDec03/RCFL.htm.

2. Kruse, W. and Heiser, J. Computer Forensics—Incident Response Essentials. Addison-Wesley, 2002.

3. Smith, F. and Bace, R. A Guide to Forensic Testimony. Addison-Wesley, 2003.

Author

Rebecca Mercuri (mercuri@acm.org) recently completed a fellowship with the Radcliffe Institute for Advanced Study at Harvard University, and has resumed her expert witness and forensic computing work at Notable Software, Inc.

Tables


Table. Forensic computing examiner certifications.


©2005 ACM 0001-0782/05/1200 $5.00



Challenging the Admission of Forensic Evidence by Amelia L. Bizzaro

Challenging the Admission of Forensic Evidence
by Amelia L. Bizzaro
http://www.wisbar.org/newspublications/wisconsinlawyer/pages/article.aspx?Volume=83&Issue=9&ArticleID=1892
A national study of the nation's crime labs questions the scientific validity of several commonly used forensic science disciplines. It makes sweeping recommendations for reform and calls for standardization, accreditation, and independence for the nation's crime labs. Attorneys must continue to press trial courts to be more active gatekeepers by challenging the underlying premises of forensic evidence, because the science often does not support analysts' testimony.

Not only does forensic science play a role in most criminal cases, it also is now part of mainstream American culture, thanks in part to the prolific and popular “CSI” television series and several other programs that depict crime solving. Prosecutors may be faced with jurors who expect forensic evidence to be as clear-cut as it is on television, where it is easy to understand, interesting, and 100 percent accurate. Jurors faced with lengthy expert testimony discussing complex scientific principles may disbelieve the evidence, holding prosecutors to an impossible standard. At the same time, it also is possible that given their familiarity with some forensic science terms, jurors may believe the evidence to be more accurate than it really is, aligning it with their television experiences in which the evidence is always infallible. As a result, lawyers on both sides of cases are faced with the daunting task of convincing jurors that art does not always imitate life.


The task of painting an accurate picture of forensic science evidence, while still difficult, has become somewhat easier since a committee of the National Academies of Science (NAS) National Research Council published its report, Strengthening Forensic Science in the United States: A Path Forward (hereinafter, the NAS Report). The NAS Report, created by a diverse group of scientists, academics, and legal scholars, called into question the basis for several different forensic science disciplines, made sweeping recommendations for reform, and called for standardization, accreditation, and independence for the nation’s crime labs.


The report questioned the reliability of most forensic science disciplines, with the exception of DNA-evidence research, noting that such research is the only discipline that “has been rigorously shown to have the capacity to consistently and with a high degree of certainty support conclusions about individualization (more commonly known as matching of an unknown item of evidence to a specific known source).”1 Compared to DNA-evidence research, the report opines, several forensic science disciplines fall woefully short. “The simple reality is that the interpretation of forensic evidence is not always based on scientific studies to determine its validity.”2


On a national scale, the report's findings and recommendations are slowly being implemented through legislation. The Senate Judiciary Committee made public a draft outline for legislation in response to the NAS Report. The legislation calls for the creation of a Forensic Science Commission (FSC), which would be made up of members appointed by the President based on recommendations from the NAS and the American Academy of Forensic Sciences. If created, the FSC will be responsible for setting "rigorous standards for accreditation," determining which disciplines require certification and the standards for such certification, developing a "comprehensive strategy for increasing and improving peer-reviewed scientific research related to the forensic science disciplines, including research addressing issues of accuracy, reliability, and validity in the various disciplines," and establishing "standard protocols, methods, practices, quality assurance standards and reporting terminology for each applicable forensic science discipline in order to ensure the quality and integrity of the data generated."3


Although no such legislation is pending in Wisconsin, the report is still a helpful tool for any lawyer seeking to admit or challenge forensic science evidence. Use of the report, however, depends largely on the forum, the case, and the type of evidence.
Admission of Scientific Evidence


The relationship between science and law has long been tumultuous. The legal system usually relies on the adversarial system to root out the truth, while the scientific community uses empirical analysis. Law and science often collide in the courtroom, where lawyers often find themselves exploring scientific concepts they may not fully understand. Similarly, judges, who also often lack a scientific background, are put in the untenable position of deciding whether evidence is relevant, and at least to some extent, reliable. Perhaps the Hon. Harry T. Edwards, cochair of the committee that authored the NAS Report, said it best: “I started the NAS project with no skepticism regarding the forensic science community. Rather, I assumed, as I suspect many of my judicial colleagues do, that the forensic disciplines are well grounded in scientific methodology and that crime laboratories and forensic practitioners follow proven practices that ensure the validity and reliability of forensic evidence offered in court. I was surprisingly mistaken in what I assumed.”4


Whether in state court or federal court, there are at least two underlying issues for every piece of forensic evidence offered for admission: 1) the extent to which the particular forensic discipline is based on reliable scientific methodology, and 2) the extent to which the expert’s conclusion depends on his or her own interpretation, which may be colored by error or bias and may lack operational and performance standards.5 The less science is involved, the more subjective the conclusion.


However, the impact of these issues depends largely on the forum. In federal court, the standard for the admission of scientific evidence has evolved. It began with the landmark case Frye v. United States,6 in which the court held that evidence was not admissible unless it was generally accepted. Some 50 years later the test changed with the adoption of Federal Rule of Evidence 702, which required only that the evidence "assist the trier of fact." What the promulgation of this rule meant for Frye was hotly debated until 1993, when the U.S. Supreme Court decided the landmark case Daubert v. Merrell Dow Pharmaceuticals Inc. The Court held that Rule 702, not Frye, controlled, but that the "trial judge must ensure that any and all scientific testimony or evidence admitted is not only relevant, but reliable."7 The Daubert court emphasized that evidentiary reliability must be based on scientific validity and provided a list of factors to consider, including whether the scientific theory or technique had been tested, subjected to peer review, and accepted.


The evolution of requirements for the admission of evidence did not end with Daubert, however. In 2000, Rule 702 was amended to permit the admission of expert testimony so long as the testimony is based on sufficient facts or data, the testimony is the product of reliable principles and methods, and the witness has applied those principles and methods to the facts of the case.


Wisconsin courts have adopted the version of Rule 702 that was in place before Daubert as the test for the admissibility of expert testimony. Thus, so long as the expert is qualified and the testimony will assist the trier of fact and is relevant, it is admissible. Unlike federal courts, which permit the holding of pre-trial evidentiary hearings to determine the reliability of evidence before ruling on its admissibility, Wisconsin courts rely on juries to distinguish charlatans from scientists. Circuit court judges have “considerable discretion in determining the admissibility of expert testimony.”8 “First, the expert’s principles, methods, and tests must be ‘reliable enough to be probative’; that is, a reasonable jury must be able to find them reliable (the standard of conditional relevancy). Second, the trial judge has discretion as a ‘limited gatekeeper’ to limit or exclude expert testimony based on a number of factors, including the consumption of time and the degree to which it assists the trier of fact.”9


Because of the immense amount of discretion vested in both federal and state courts to admit or exclude expert testimony, appellate court decisions are confined to determining whether the lower courts abused discretion, a highly deferential standard difficult to overcome. As the NAS Report acknowledged, it is difficult to know just what is happening at the trial court level because district courts do not routinely publish decisions. “Reported opinions in criminal cases indicate that trial judges sometimes exclude or restrict expert testimony offered by prosecutors; reported opinions also indicate that appellate courts routinely deny appeals contesting trial court decisions admitting forensic evidence against criminal defendants.”10 The same is not true in civil cases. In civil cases, the NAS Report found, the parties are more equally matched in terms of their ability to introduce forensic evidence. “And, ironically, the appellate courts appear to be more willing to second-guess trial court judgments on the admissibility of purported scientific evidence in civil cases than in criminal cases.”11


While federal court appellate decisions concerning the scientific underpinning of evidence are hard to come by, relevant state court decisions are virtually nonexistent. Appellate courts are not in a position to decide whether evidence from a specific forensic science discipline is admissible, given that their review is almost always limited to whether there was an erroneous exercise of discretion. That is not to say, however, that appellate courts have never considered the admissibility of scientific evidence.


Federal courts have specifically rejected comparative lead-bullet-analysis evidence, which parties used in attempts to match recovered bullets to a particular box of ammunition. Declared unreliable across the board, such evidence is per se inadmissible. Wisconsin appellate courts have directly addressed two types of expert evidence to date: polygraph evidence and psychiatric testimony of a defendant’s ability to form intent. The Wisconsin Supreme Court rejected polygraph evidence, in part because “the legal and scientific communities remain significantly divided on the reliability and the usefulness of the polygraph in a criminal case.”12 Despite recognizing that the polygraph had some degree of validity and reliability, the court held the evidence inadmissible because it relied too much on the examiner’s subjective evaluation and on factors that could not be reliably quantified.13


Similarly, the Wisconsin Supreme Court held that expert psychiatric testimony regarding a defendant's capacity to form intent was inadmissible when based on the defendant's mental health history. "There is substantial doubt whether evidence such as was sought to be introduced here is scientifically sound, and there is substantial legal doubt that it is probative on the point for which it was asserted in this case."14


On the trial court level, several federal district courts have begun to question whether firearm and toolmark identification evidence, which seeks to match recovered bullets or casings to a particular firearm, meets the test for the admissibility of expert testimony. The Wisconsin Court of Appeals may also soon weigh in on this particular topic when it decides a pending case, State v. Jones.15 Simply because the appellate courts have not reached many decisions on the admissibility of specific forensic science disciplines does not mean that trial lawyers should avoid actively litigating the issues in the circuit and district courts. If nothing else is clear, it is clear that the trial court level is the only place to actively litigate forensic science issues, given the high standard of review.

The NAS Report


The NAS Report is a powerful resource concerning the admissibility of forensic science evidence. The U.S. Supreme Court, for example, cited the prepublication version with approval, noting that it is not “evident that what respondent calls ‘neutral scientific testing’ is as neutral or as reliable as respondent suggests.”16 Similarly, the American Academy of Forensic Sciences17 and the Board of Directors of the American Statistical Association18 have adopted the report.


The most valuable part of the report is its detailed evaluation of several forensic science disciplines. The report authors pored over journal articles and studies, solicited and listened to direct testimony, and conducted independent research. The authors concluded that the majority of the disciplines they evaluated call for the examiner to declare a match using subjective methodology completely lacking in scientific validity. As a result, the disciplines have similar problems: the conclusions reached often are prone to confirmation bias (an analyst’s predisposition to confirm that the evidence supplied by law enforcement matches the identified suspect) and cannot be replicated from one examiner to the next, sometimes not even by the examiner who declared the match. The NAS Report criticized analysts for lacking supporting documentation detailing their evaluation, given their willingness to testify to a zero-error rate.


The three most frequently used types of forensic evidence are friction-ridge analysis (fingerprints, palm prints, and sole prints), pattern/impression evidence (encompassing anything that can leave an impression of a pattern, like shoeprints and tire tracks), and firearm and toolmark identification evidence. All three disciplines involve comparing a recovered item (for example, a fingerprint from a glass door) to a known sample (the suspect’s fingerprint taken by law enforcement). Analysts view the recovered sample and compare it to the known sample, determining whether a match in fact exists. Unlike DNA analysis, few tools other than a microscope and the analysts’ own vision are used.


Overall, the NAS Report’s criticisms of each of these three disciplines were similar: the methodology is subjective; analysts cannot consistently replicate the results, in part because of a lack of documentation; analysts often improperly embellish the accuracy of their findings; and there is no valid, independent research supporting the methodology.


Amelia L. Bizzaro, Marquette 2003, is the principal at Bizzaro Law LLC, Milwaukee, and practices appellate law. She is on the board of directors of the Wisconsin Association of Criminal Defense Lawyers and is chair of its Oct. 8-9 seminar, Whatever Happened to the Science in Forensic Science. She is a member of the State Bar's Appellate Practice Section and is cochair of the Milwaukee Bar Association's Bench/Bar Court of Appeals Committee. The Wisconsin Law Journal recently named her one of 2010's up-and-coming lawyers. She can be reached at abizzaro@bizzarolaw.com.


The method for identifying fingerprints, for example, “does not guard against bias; is too broad to ensure repeatability and transparency; and does not guarantee that two analysts following it will obtain the same results.”19 Fingerprint experts often testify that their methodology, when done correctly, has a zero-error rate. Such a conclusion, the NAS Report concluded, “is unrealistic, and, moreover, it does not lead to a process of method improvement.”20


When it comes to pattern/impression evidence, the NAS Report concluded, “there is no consensus regarding the number of individual characteristics needed to make a positive identification.”21 There is no research about “the persistence of individual characteristics, the rarity of certain characteristic types, and the appropriate statistical standards to apply to the significance of individual characteristics.”22


The NAS Report, like several recent federal district court decisions, was perhaps most critical of firearm and toolmark identification evidence. “A fundamental problem with toolmark and firearms analysis is the lack of a precisely defined process.”23 The controlling authority for declaring a firearm identification match lies with the Association of Firearm and Toolmark Examiners (AFTE), which advises examiners to declare a match when there is “sufficient agreement” between two sets of marks.24 The AFTE’s definition “does not even consider, let alone address, questions regarding variability, reliability, repeatability, or the number of correlations needed to achieve a given degree of confidence.”25


Because the majority of forensic science consists merely of having an analyst look at the evidence, prosecutors should exercise caution in admitting it, defense lawyers should be prepared to challenge it, and the courts should carefully consider whether such visual examinations, with nothing more, are really helpful to the trier of fact.
Challenging the Admissibility of Forensic Evidence and Limiting Experts’ Opinions


Given Wisconsin’s assist-the-trier-of-fact standard for the admission of expert testimony, litigators must challenge forensic evidence by demonstrating that the evidence in question is not reliable enough to be probative and is not helpful to the trier of fact. If, as is true within some disciplines, the experts cannot agree with one another about what constitutes a match, or even the terminology for describing a match, then experts within those disciplines cannot possibly help jurors figure it out. Certainly, when declaration of a match depends on what the expert sees, then jurors ought to be able to also see the match for themselves with the help of pictures. Simply declaring a match is not enough. Forensic evidence from disciplines that depend on the examiners’ subjective opinion to know a match when they see it should be inadmissible for the same reasons that polygraph and defense psychological testimony on intent are inadmissible.


Although it remains to be seen whether this argument will persuade state court judges, federal district courts have begun to limit the admissibility of some types of forensic evidence. More than one federal district court has limited the type of opinion experts can express to the jury in firearm and toolmark identification cases. In one case, the court concluded that allowing the examiner “to testify that he had matched a bullet or casing to a particular gun ‘to a reasonable degree of ballistic certainty’ would seriously mislead the jury.” As a result, it permitted the examiner to state his opinion “in terms of ‘more likely than not,’ but nothing more.”26 Another court noted, “there is no reliable statistical or scientific methodology which will currently permit the expert to testify that it is a ‘match’ to an absolute certainty, or to an arbitrary degree of statistical certainty.”27 The court refused to allow the expert to “assert any degree of statistical certainty, 100 percent or otherwise, as to a match.”28


Experts to Discuss Nuts and Bolts of Forensic Science at WACDL Seminar for Defense Attorneys 


The Wisconsin Association of Criminal Defense Lawyers (WACDL) is hosting a two-day seminar about forensic science Oct. 8-9 at the Great Wolf Lodge in Wisconsin Dells. Join nationally renowned legal and scientific experts who will discuss the nuts and bolts of several important forensic science fields, paying special attention to successfully and effectively cross-examining analysts. Topics include trace evidence, fingerprints, firearm and toolmark identification, DNA, and blood testing.


The WACDL is committed to promoting the proper administration of criminal justice; fostering and maintaining the integrity, independence, and expertise of the defense lawyers in criminal cases; and encouraging an unyielding concern for the protection of individual rights and due process. The seminar is limited to attorneys who share this commitment. For more information and to register for the seminar, please visit www.wacdl.com or call (608) 223-1275.


Federal courts, while not refusing to allow experts to testify, are at least limiting the conclusions of the experts. These limitations, while certainly a step in the right direction, are not enough. Attorneys must vigorously cross-examine the experts on their conclusions, particularly relating to the two main concerns expressed by the NAS Report: confirmation bias and the lack of documentation accompanying test results.


Analysts carelessly use terms like “match,” “consistent with,” “identical,” “similar in all respects tested,” and “cannot be excluded as the source of” without any agreement or consensus within the discipline about the meaning of these terms.29 “The use of such terms can have a profound effect on how the trier of fact in a criminal or civil matter perceives and evaluates the evidence.”30 Attorneys should force the analysts to define these nebulous terms on the stand. If an analyst does not know what those terms mean, how can a jury rely on his or her conclusions to any degree?


Similarly, analysts should not be permitted to simply supply a one-sentence report declaring a match without explaining more. Reports should, at a minimum, describe “methods and materials, procedures, results, and conclusions, and they should identify, as appropriate, the sources of uncertainty in the procedures and conclusions along with estimates of their scale (to indicate the level of confidence in the results).”31 Analysts who omit these critical details should be forced to explain themselves on the stand. Again, an inability to do so will show the jury how little science is really involved.


Although most lab reports do not reveal it, most analysts also have notes from their evaluation, in addition to the report. These notes may explain what the reports do not (but also may be nothing more than the same one-line sentence declaring a match). Arguably, the state is required to provide these materials in response to a discovery demand because they are statements of a witness, and failure to do so is reversible error unless the state can prove harmless error.32 The notes may reveal further avenues of cross examination, not for what the notes say but for what they do not, particularly involving disciplines in which the analyst declares a match, like friction-ridge analysis, pattern/impression evidence, and firearm and toolmark identification evidence. If the analyst’s notes do not describe the supposedly unique marks, where they are located, or what makes them unique, then the analyst certainly will not be able to testify to those things several months later at trial.


In addition to the state’s obligation to provide relevant materials, attorneys have an obligation to become familiar with the problems within a particular discipline and to seek to challenge admissibility of the evidence and the analysts’ opinion about it. Attorneys must avail themselves of readily available information calling into question the reliability of a particular forensic discipline.
Conclusion


The NAS Report is a valuable tool that looked closely at the state of our nation’s crime labs and the scientific validity of several frequently used forensic science disciplines. Although the law often lags behind developments in science, attorneys must continue to challenge trial courts to be more active gatekeepers by challenging the underlying premise of forensic evidence, despite the fact that the evidence may have been generally accepted in the past. Such challenges are primarily fact-based and achieved through cross-examination, because very few appellate court decisions have addressed specific disciplines.
Endnotes




1National Academies of Science, National Research Council, Committee on Identifying the Needs of the Forensic Science Community, Strengthening Forensic Science in the United States: A Path Forward 87 (final publication 2009) (hereinafter NAS Report). Available for purchase at www.nap.edu/catalog.php?record_id=12589.


2Id. at 8.


3Senate Judiciary Committee, “Draft Outline of Forensic Report Legislation” (May 5, 2010), available at http://www.theiai.org/current_affairs/20100505_Draft_Outline_of_Forensic_Reform_Legislation.pdf.


4Harry T. Edwards, cochair, Committee on Identifying the Needs of the Forensic Science Community, National Academies of Science, presentation at the Superior Court of the District of Columbia, Conference on The Role of the Court in an Age of Developing Science and Technology, Washington, D.C., May 6, 2010, The National Academy of Sciences Report on Forensic Sciences: What It Means for the Bench and Bar 2.


5NAS Report, supra note 1, at 9.


6Frye v. United States, 293 F. 1013 (D.C. Cir. 1923).


7Daubert v. Merrell Dow Pharm. Inc., 509 U.S. 579, 589 (1993).


8Daniel Blinka, Wisconsin Practice Series § 702.1, at 572 (3d ed. 2008).


9Id.


10NAS Report, supra note 1, at 97 (citations omitted).


11Id. at 98 (citations omitted).


12State v. Dean, 103 Wis. 2d 228, 234-35, 307 N.W.2d 628 (1981).


13Id.


14State v. Steele, 97 Wis. 2d 72, 97, 294 N.W.2d 2 (1980).


15State v. Jones, Appeal No. 2009AP2835-CR. The briefs, including an amicus brief from the Innocence Network by attorney Jerome Buting, are available at http://wscca.wicourts.gov/index.xsl.


16Melendez-Diaz v. Massachusetts, 129 S. Ct. 2527, 2536 (2009).


17Science in Court, 464 Nature 325, 325 (2010).


18American Statistical Association Statement on Strengthening Forensic Science (April 17, 2010), available at http://www.amstat.org/outreach/pdfs/Forensic_Science_Endorsement.pdf.


19NAS Report, supra note 1, at 142.


20Id.


21Id. at 149.


22Id. at 150.


23Id. at 155.


24Id. The AFTE does not define any of the terms it uses for declaring a match, although its definition is considered “the best guidance available for the field of toolmark identification[.]” Id.


25Id.


26United States v. Glynn, 578 F. Supp. 2d 567, 569-70, 575 (S.D.N.Y. 2008).


27United States v. Monteiro, 407 F. Supp. 2d 351, 372 (D. Mass. 2006).


28Id. at 373. Other courts have reached similar conclusions. See United States v. Green, 405 F. Supp. 2d 124 (D. Mass. 2005) (court did not allow examiner to testify that match excluded "all other guns"); United States v. Taylor, 663 F. Supp. 2d 1170, 1179 (D.N.M. 2009) (precluding ballistics examiner from "stating that he can conclude that there is a match to the exclusion, either practical or absolute, of all other guns"). Firearm and toolmark identification is not the only discipline in which a court has limited an expert's opinion. In Commonwealth v. Patterson, 840 N.E.2d 12, 15 (Mass. 2005), the court held that the ACE-V (analysis, comparison, evaluation, and verification) methodology is sufficiently reliable to admit expert testimony; however, general reliability was not enough for the Commonwealth to introduce evidence that "fingerprint identification could be applied reliably to simultaneous impressions not capable of being individually matched to any of the fingers that supposedly made them."


29NAS Report, supra note 1, at 185.


30Id.


31Id. at 186.


32Wis. Stat. § 971.23 (1)(e); State v. Lettice, 205 Wis. 2d 347, 352, 556 N.W.2d 376 (1996).