Crime Bomb

Pub date April 20, 2010

Editor’s note: This story was originally published May 31, 2001.

They found Virginia Lowery lying in the garage of her Excelsior home, an electrical cord around her throat, an ice pick jammed through her skull — in one ear and out the other. For the next 11 years San Francisco homicide detectives made no progress on the case. Promising leads turned into dead ends. Theories collapsed. The cops assigned to the case retired. It looked like Lowery’s 1987 slaying would never be solved.

Then in April 1998, by pure chance, police found Robert C. Nawi. Or rather, they found his fingertips.

When Nawi, a 57-year-old carpenter, got in a shouting match in a North Beach watering hole, he was picked up by the cops on misdemeanor charges and shuttled to county jail, where he was fingerprinted and booked. The computer spat out some interesting news: Nawi’s digits, according to the database, resembled a fingerprint found at the scene of Lowery’s slaying.

Soon thereafter, police evidence analyst Wendy Chong made a positive print match, and the new suspect found himself facing murder charges and life in a cage.

Nawi’s fate, to be decided at trial next year, rests largely on police readings of his fingerprints, as well as some DNA gathered by the coroner. Which raises some questions: How, exactly, did the cops and their computers analyze the evidence? Did they get it right? Is anybody checking their work?


Making a match between the distinguishing ridges and whorls, often microscopic, of two fresh fingerprints is a relatively simple task for a print expert. However, cases like Nawi’s aren’t so clear-cut: the print collected in Lowery’s garage is faint, smudged, and missing in patches.

Michael Burt, the resident forensic-science guru at the San Francisco Public Defender’s Office, shows me an 8-by-10-inch enlargement of the print discovered at the murder scene; it’s blurry, grainy, and only about 60 percent complete. To my layperson’s eye, it bears little resemblance to the clear, fresh mark left by Nawi at his booking. “The one print is so washed out you can’t see anything,” says Burt, who is representing Nawi. “This is not science at all; it’s subjective and shouldn’t be allowed.”

Burt, a 22-year veteran defense lawyer known around the Hall of Justice for his trademark cart full of documents, has plenty of cause to doubt the cops’ evidence. Despite what you may have seen on Law and Order, fingerprint examiners can — and often do — get it wrong. Last year 141 of America’s top forensic labs were tested to see if they could accurately match two fingerprints: 39 percent failed; 11 labs made false IDs. San Francisco analysts are rarely, if ever, graded for accuracy.

Jim Norris, head of the San Francisco Police Department’s forensics division, argues that new computer imaging tools are making it possible to match even sketchy, partial prints. “When somebody shows a print that was originally collected at the crime scene, and it looks very difficult to deal with, what they’re not looking at is the image that has been [digitally] enhanced,” Norris explains. “It’s a lot easier to deal with.” Norris admits that the department has seldom tested its print examiners for accuracy, but he says their work is constantly checked by superiors.
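To give a sense of what the simplest form of such digital enhancement involves, here is an illustrative sketch — not the SFPD’s actual software — of linear contrast stretching, which rescales a washed-out grayscale image so its faint ridge detail spans the full tonal range. The pixel values are invented for illustration:

```python
# Illustrative sketch of basic digital enhancement: linear contrast
# stretching. Grayscale values run 0 (black) to 255 (white); a faint,
# washed-out print occupies only a narrow slice of that range.

def stretch_contrast(pixels):
    """Rescale grayscale values so the faintest detail spans 0-255."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        return pixels[:]  # flat image: nothing to stretch
    return [round((p - lo) * 255 / (hi - lo)) for p in pixels]

faint_print = [118, 120, 125, 119, 131, 122]  # low-contrast sample row
print(stretch_contrast(faint_print))  # → [0, 39, 137, 20, 255, 78]
```

Real enhancement software layers far more on top of this (noise filtering, ridge-frequency analysis), but the underlying idea is the same: the examiner works from a mathematically transformed image, not the raw crime scene print.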

According to Burt, in this particular instance analysts didn’t turn to computers but simply enlarged the prints before making the call. The district attorney’s DNA evidence against Nawi is equally flawed, he says. When coroner Boyd Stephens autopsied the corpse, he — per routine — snipped the woman’s fingernails with a household nail clipper and stuck them in an envelope. Unrefrigerated, the clippings slowly rotted for more than a decade, until, in the wake of Nawi’s arrest, prosecutor John Farrell had them tested for DNA.

When the crime lab got the evidence, in 1998, DNA analyst Alan Keel scraped all 10 nails with a single cotton swab, combined the scrapings into one tiny pile, and dropped them into a genetic-typing device. According to standard forensic procedure, each nail should’ve been swabbed and tested separately.

Now, Burt contends, the sample has deteriorated because of a lack of refrigeration and has been contaminated with the DNA of more than one person. “[Keel] says there are three, possibly four different individuals underneath her fingernails,” the lawyer says. “He’s trying to grab my client out of that mixture. There’s no scientific way to do that.”

Norris disagrees: “There are ways to deal with [DNA] mixtures; it’s not a common problem luckily, but it’s something that comes up — for example, in rape cases where there are multiple assailants. There are ways to deal with it.”

I run down the scenario for Dr. Simon Ford, a Ph.D. biochemist and DNA expert who heads up San Francisco–based Lexigen Science and Law Consultants. “That’s not good,” Ford tells me. “You should deal with each hand separately, at least, and probably each nail separately. I don’t think combining all the nails together is a good idea.”
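For the curious, one standard statistic for DNA mixtures — the “combined probability of inclusion,” the chance that a random person could not be excluded from the mixture — can be sketched in a few lines. The allele frequencies below are invented for illustration and have nothing to do with the Nawi case:

```python
# Sketch of the "combined probability of inclusion" (CPI) statistic for
# a DNA mixture. At each locus, a random person is "included" if both of
# their alleles appear among those observed in the mixture; assuming
# independent loci, the per-locus probabilities multiply. All allele
# frequencies here are hypothetical.

mixture = {
    # locus -> population frequencies of each allele seen in the mixture
    "locus_A": [0.12, 0.20, 0.08, 0.15],  # four alleles: two-plus donors
    "locus_B": [0.25, 0.10, 0.18],
}

cpi = 1.0
for locus, freqs in mixture.items():
    p_inclusion = sum(freqs) ** 2  # both alleles drawn from observed set
    cpi *= p_inclusion

print(f"Chance a random person could not be excluded: {cpi:.4f}")
```

The more alleles a degraded, multi-person mixture contains, the larger each per-locus sum becomes and the weaker the statistic gets — which is the mathematical core of Burt’s objection to pulling one contributor out of a three- or four-person mixture.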


The dispassionate examination of crime scene evidence — narcotics, fingerprints, hair and fibers, genetic material, firearms, and everything else — is a cornerstone of the American justice system. The work, which can mean the difference between life and death for a suspect, is carried out by more than 500 labs nationwide, most of them run by law enforcement agencies.

In the public imagination — as shaped by endless cops-and-lawyers TV shows — forensic science is a perfectly impartial arbiter of justice. Eyewitnesses get confused. Police may be corrupt. Lawyers can corkscrew facts. Juries, not always composed of the brightest lights, can be swayed by mob dynamics. But science doesn’t lie. If the analyst says the bullet came from the suspect’s gun, then it must have.

It’s a comforting thought.

There’s just one problem: All forensic science is performed by humans, and all people make blunders. They mislabel samples. They use malfunctioning equipment. They inadvertently drop a flake of skin in a vial of blood, thus adding their own DNA to the sample.

Subjectivity, too, plays a starring role in forensic science, much of which depends on human-made comparisons. In one case heard last year by San Francisco Superior Court Judge Robert Dondero, two DNA experts couldn’t agree on the meaning of a genetic sample.

In addition to honest mistakes born of incompetence and overwork, there are continuously uncovered examples of fraud: the lab analyst, believing that the verdict justifies the means, willing to lie on the stand or fake test results. While the scientific question of DNA accuracy has been hashed out extensively in courtrooms and the media, the issue of police crime lab accuracy has gone largely ignored by both the press and government regulators.

Each year California cops make 1.5 million arrests. Each of the state’s 19 local crime labs — run by sheriffs, prosecutors, and cops — performs thousands of analyses annually. Each of those tests, if faulty, could put an innocent person behind bars, or set a guilty soul free.

And in the wild world of forensics there are precious few safeguards against human bias and error: Crime labs are almost entirely unregulated. There are virtually no federal laws governing their operation; no law that says, “Bullet comparisons must be done using the best, most accurate techniques”; no law that says, “DNA examiners must meet these basic educational criteria”; no requirement that crime labs be audited and inspected. In California only DUI-testing procedures are regulated by state law.

“There’s more regulation in whether some clinical lab can give a test for strep throat than there is on whether you can use a test to put somebody in the gas chamber,” public defender Burt says. “That to me seems backwards. The stakes are the highest in the criminal justice system. These people are deciding who lives or dies.”

The ramifications spread beyond individual cases. While billions of dollars have been poured into police departments and prisons over the past two decades, pols and badge wearers have shown little interest in adequately funding or regulating crime labs. California’s facilities need hundreds of millions of dollars in repairs and equipment upgrades. The idea of public oversight is off the radar entirely.

The nonprofit American Society of Crime Laboratory Directors (ASCLD) is the closest thing forensics has to a regulatory agency. Created in the early 1970s to “improve the quality of laboratory services provided to the criminal justice system,” the group runs a voluntary accreditation program for forensic facilities. To get the society’s stamp of approval, a facility must pass a 149-point inspection. (Sample question: “Are the procedures used generally accepted in the field or supported by data gathered in a scientific manner?”) To maintain the certification, a lab must be tested annually and be reinspected every five years.

Of the approximately 500 labs in the United States, a mere 187 are accredited by the ASCLD. Only 11 of California’s 19 local crime labs have the group’s seal of approval. The San Francisco police facility isn’t one of them. Neither is the Contra Costa sheriff’s lab. Nor the San Mateo sheriff’s forensic unit.


“Got dope?” asks the white-coated woman who opens the locked door to the SFPD crime lab. She’s expecting cops bearing drug-filled baggies, to be weighed and tested and filed away until the courtroom beckons. Crime lab chief Martha “Marty” Blake steps out of her windowless office to greet me.

A few months back, Blake and her 18-person team traded overstuffed quarters in the city’s central cop shop at Eighth Street and Bryant for expansive new $1.5 million digs out in the asphalt wastes of the Hunters Point shipyard. “I’m getting ready to apply for accreditation, hopefully by next spring,” she says, pointing to a file cabinet emblazoned with the ASCLD seal. “We couldn’t get accredited in that facility when we were downtown at the Hall of Justice. It was too cramped. There was no way we could guarantee there would never be any chance for any contamination of the evidence when we had four people crammed into a little room trying to look at clothing, for example.”

Blake’s operation has taken its lumps over the years. In 1994 analyst Allison Lancaster was canned after she was videotaped faking drug tests. Last year Superior Court Judge Dondero slammed the lab’s lead DNA expert for “engaging in shortcuts,” “performing missteps,” and harboring a questionable “degree of bias” against defendants. Defense lawyers like Burt continue to hammer the lab for its lack of credentials.

With her eyeglasses and graying hair Blake looks more like a schoolteacher than a cop. She pulls a xeroxed sheet of paper out of a drawer and eagerly places it in front of me. “We just switched to a new case review process. This is the sort of thing we have to implement for accreditation. Every case we produce has to go through a review by a supervisor,” she explains. “This wasn’t happening before; a review happened before, but you’d just glance over [the work] and say, ‘Hmm, looks good to me,’ and initial it. It was sort of lightweight.” Bolstered by an increased budget and a growing staff, the lab’s procedures are improving across the board, according to Blake.

Why should forensic labs, which can land someone on death row, go without government oversight? “I’d like to think we can do this ourselves,” Blake replies, noting that the state’s management of the DUI testing program has been less than stellar. “I’m a little nervous about other agencies getting involved in regulation,” she says, because they don’t “really know the science.”

Nationally, the accountability vacuum is producing a steady stream of scandals, raising unsettling questions about the way we administer justice in this locked-down nation. A small sampling:
• Let’s start with the trial of the century, wherein O.J.’s defense team put the “unacceptable sloppiness” of the Los Angeles Police Department’s forensic work on display, pointing out a dozen major instances of possible evidence contamination. After losing the Simpson trial, the lab promptly began a thorough overhaul.
• In 1993 the West Virginia Supreme Court found a police blood expert guilty of fabricating or misrepresenting evidence in a staggering 134 cases. The man, one Fred Zain — employed by the state cops during the 1980s — was put on trial for perjury, while the state freed several unjustly imprisoned death row inmates and paid out millions to people who had been wrongfully convicted. Bexar County, Texas, where Zain worked in the early ’90s, also prosecuted him for perjury.
• A few years later, in 1997, the reputation of the Federal Bureau of Investigation crime lab — at the time widely regarded as the pinnacle of forensic science — was shredded by the allegations of a whistle-blowing scientist. The bureau’s lab practiced shoddy science and regularly presented inaccurate, pro-prosecution testimony, charged Dr. Frederic Whitehurst, one of the agency’s top explosives experts. The FBI denied the allegations and tried to discredit Whitehurst, but a scathing 517-page report by the Justice Department’s inspector general corroborated many of the scientist’s major claims and recommended disciplinary action against five agents.

• An April 1997 front-page story in the Wall Street Journal brought more unflattering publicity to the FBI lab, scrutinizing the track record of agent Michael Malone, a hair and fiber analyst. The paper quoted three well-known forensic scientists who challenged Malone’s analyses (one labeled him a “fraud”), illustrated numerous cases where the agent seemed to be fudging the evidence — and noted that courts were busy overturning convictions obtained with his testimony. “The guy’s a total liar,” one defense lawyer told the Wall Street Journal.
• In 1998 San Diego jurors convicted a top county police DNA expert of embezzling $8,100 in cash seized as evidence in murder cases. That same year the San Diego Police Department embarked on a 10-month internal investigation into charges of sloppy work and missing evidence at its crime lab, and it admitted that it had lost crucial evidence in an unsolved homicide case.
• Last year a crime lab chemist in Prince George’s County, Md., claimed that the police department was using improperly calibrated drug analysis equipment. Defense lawyers promptly challenged some 100 pending drug cases.

California is one of the few states that has actually scoped the inner workings of its local crime labs. The results of that onetime review, performed in 1998 by the state auditor’s office, are disturbing. Quality control was lacking at most of the facilities. Many of the labs were using “outdated and improperly working equipment.” As in San Francisco, many didn’t make their scientists undergo regular proficiency testing.

Without quality assurance measures — minimal at 13 of the 19 labs — the potential for error shoots through the roof. California auditor Elaine Howle says the study raised serious questions. “There are several issues,” she says. “Is the evidence being handled appropriately so there’s no potential for contamination?” Labs, according to Howle, should “make sure they are consistently applying the methodology so one forensic examiner isn’t using one technique and someone is using a different technique to conduct the same type of testing. That ties back to the credibility of the results.”

Ten of the outfits were relying on “outmoded” technology that needed replacement. At the Huntington Beach Police Department lab, staffers worked up a Rube Goldberg–esque scheme to revive a broken arson analysis gadget. Sort of. “Because the laboratory does not have the funds to replace this equipment, staff found a creative way to cool the [machine] using hoses rigged to a faucet,” auditors found. But, they noted, “this method could negatively affect the analysis of the evidence processed by this instrument.”

Then there was the question of whether the analysts themselves were up to par. “We think forensic examiners need to be tested every year to make sure they’re maintaining competence in their ability to perform the forensic examinations they’re doing,” Howle tells me. Eight of the labs had no proficiency testing for their staffers.

“It helped us put our operation in perspective to the rest of the state,” says S.F. lab chief Blake, who thinks the audit was fair. “We did look like we were swamped. It helped us get our additional staff.”

Whitehurst, the former top explosives expert at the FBI, doesn’t like the term “whistle-blower.” “We’re simply scientists, and we disagree with the type of science that’s being practiced — because it’s not science,” he told me. “Our forensic labs are dictating truth; they’re not discovering it.” Whitehurst says he constantly hears from irate crime lab scientists claiming their operations are riddled with improprieties.

The Ph.D. chemist spent eight years at the bureau combing the rubble of bomb blasts for clues. And complaining. During his tenure with the bureau, he made 237 written complaints concerning what he saw as a pattern of bunk science and bogus testimony on the part of his colleagues. The charges spurred an 18-month probe by the Justice Department, the phone-book-size results of which were made public in 1997, undoubtedly marking one of the FBI’s worst public embarrassments.

The special-inspection team, an international panel of renowned forensic scientists, had few kind words for the lab, finding “significant instances of testimonial errors, substandard analytical work, and deficient practices” in numerous investigations, including the Unabomber, Oklahoma City, and World Trade Center bombings. Among the skeletons in the bureau’s closet: “scientifically flawed reports”; examiners devoid of the “requisite scientific qualifications”; and five agents who couldn’t be trusted.

Whitehurst’s experiences have led him to believe that crime labs should be overseen by federal or state authorities, rather than by ASCLD and its voluntary certification program. “It’s a foregone conclusion; there’s no question in my mind in five years forensic labs will be regulated, and they will be audited,” said Whitehurst, who now lives in Bethel, N.C., and acts as an expert witness in criminal trials. “There’s too much discovery happening.”

Lab directors argue that their work is constantly reviewed by the courts — juries don’t have to believe a forensic expert; judges can overturn verdicts based on forensic evidence — making their profession among the most scrutinized.

Whitehurst disagrees, saying juries, defense lawyers, and judges are often baffled by the science presented to them. “Listen to this phrase: pyrolysis–gas chromatography/mass spectrometry,” he says. “Do you know what that is? Let’s try this one: Fourier transform infrared spectrometry. I’ve got a doctorate in chemistry and a juris doctorate also. What I’m saying to you are completely foreign concepts. When I try to explain how an ultraviolet spectrophotometer works, or how a microspectrophotometer works, just saying the words begins the glass-over of the eyes.”

The Alameda County Sheriff’s crime lab is housed in a two-story building in the foothills just off 150th Avenue in San Leandro. On the second floor, in a series of linoleum-tiled rooms connected by a cluttered hallway, the lab’s technicians scope the physical remnants of crime, putting bullets beneath microscopes, lifting latent fingerprints from knife handles, culling DNA strands from splattered blood.

Each year the operation, which analyzes evidence for most of the county’s police forces, handles some 200 “major” investigations, most of them murders and rapes. But drug cases (1,800 to 2,000) and DUIs (more than 4,700) make up the bulk of the work. There are only eight lab technicians to handle the massive load.

“Every analytical report has to be right on the mark,” said lab director Tony Sprague, who has worked at the facility for 30 years. “We have a huge responsibility to make sure all the results are accurate.”

Sprague guides me through the building, showing me a single lead particle, as magnified 10,000 times by a monstrous, $270,000 scanning electron microscope. Next door a white-coated technician sits glued to a conventional microscope, studying a handgun cartridge. Across the hall are the analysts’ personal workstations: on one of the wide-topped tables sit the innards of an auto; on another lie sheets of paper covered with boot prints.

Sprague is an amiable gearhead and explains in detail how each of the machines works. The gas chromatograph/mass spectrometer, an ovenlike slab of a machine, can detect the presence of gasoline or kerosene in air samples collected at the scene of a suspected arson fire. Another device uses infrared light to determine the chemical composition of a given substance — a bag of white powder for instance.

The lab’s ASCLD accreditation in June 1999 was a huge undertaking, according to Sprague. “It took us about two years [to get certified],” he says. “It was costly from the standpoint that you have to take dedicated staff time away from analytical work to get the paperwork done for the accreditation process. In our case we really didn’t change our ways of doing forensic science to meet accreditation standards. There was really no issue about doing things differently — the thing we had to do, we had to document all the policies, the procedures, all of our quality assurance records had to be brought up to a little bit higher level.”

Voluntary reviews by the nonprofit ASCLD are enough regulation for Sprague, who views government oversight as a losing proposition. “Some mandated federal program? I don’t know that that’s really the answer,” he says. “That would involve a huge bureaucracy. It would be a very difficult situation.”

Ralph Keaton, executive director of ASCLD’s accrediting board, agrees. “I think crime laboratories should have some kind of program to review the quality of the work being produced by the laboratory — and that’s the reason we came into existence,” he tells me via telephone from the organization’s headquarters in Garner, N.C. “It’s my opinion that no one can evaluate the type of work being done better than the actual practitioners of that discipline. Just like the oversight of the medical profession is best done by the doctors themselves.”

Speaking to me in his office library, Sprague tells me he is proud of the work his team does, proud to be acknowledged by his peers. But he admits to a certain frustration, saying that his lab is seriously short-staffed: “We’re about one-third the strength we should be at for what we’re doing.”