One of Scottsdale’s DUI-conviction-machines has some serious problems. I first wrote about it in 2012 after a Scottsdale City Court judge prevented me from telling a jury about the problems. The judge demanded an offer of proof before he was willing to admit evidence of anything calling into question the city’s malfunctioning piece of equipment. Instead of making the state bear the burden of proving the test was accurate and admitting all of the information about its problems, he presumed the results were accurate and precluded any information to the contrary.
I wrote about it again in 2013, when a Maricopa County Superior Court judge finally ruled that blood test results from the machine in several cases were inadmissible pursuant to Rule 702 of the Arizona Rules of Evidence. The rule deals with witnesses who are qualified as experts by knowledge, skill, experience, training, or education, and its last subsection requires that the expert reliably apply the underlying principles and methods; the superior court felt the Scottsdale Crime Lab had not done so with regard to the malfunctioning machine.
I was thrilled that a judge finally did something about it. On top of that, he didn’t just allow defense attorneys to point out that the machine didn’t work right a lot of the time. Based on the very evidence at least one city court judge wouldn’t even let defense attorneys mention, he decided that results from the broken machine shouldn’t be introduced at all. Unfortunately, I was skeptical about whether the Court of Appeals of Arizona would affirm a ruling so clearly helpful to DUI defendants. Sometimes, I hate being right.
Before I tell you just what the appellate court did, it’s worth explaining some background about the machine. Scientists would call a real one a “gas chromatograph,” but I don’t know if the messed-up one Scottsdale has deserves such a scientific-sounding title. Here’s the first problem the superior court described:
There have been a number of dropped data instances. At first, the lab thought that the data drops were due to loose caps. They later altered their conclusion.
A malfunctioning instrument is fine in a lab that’s isolated from the justice system. The results are scrapped, and the problems are fixed. When people’s lives are on the line, like in a criminal case, it seems to me that we should be a little more careful. Scottsdale couldn’t care less, of course, because Scottsdale never cares when it comes to convicting people. They didn’t even think it was worthwhile to hire someone qualified to look for the root of the problem. It’s just a bunch of no-good drunk drivers getting sent away by the machine, after all.
The criminalist who seemed in charge of dealing with the issue had no experience installing software. This is the same person who thought the problem was a loose cap in August of 2009 and just had the lab re-prepare, recap and reanalyze samples before calling it good when the re-runs checked out “okay.” Seeing how dropped data sounds an awful lot like something software-related, you’d think that might bother someone in Scottsdale. But then again, this is Scottsdale. That same unqualified criminalist was still working on the problem at the end of 2010, when she changed her mind about it being a loose cap. Here’s why:
Perkin Elmer, the manufacturer of the “2003 instrument”, advised it was a communication error. Evidence at the hearing indicated that a completely loose cap results in a flat line, not a shut down.
The manufacturer, which probably does have some experience with software, eventually installed a software patch to address these problems with the machine. Had Scottsdale put someone who knew something about software in charge of the software problem, perhaps they might have tried to fix the software sooner. Either way, though, it probably wouldn’t have worked. After they tried to fix the communication error, there were still problems. From the superior court:
However, even after this repair was completed, the same issues of dropped data recurred on more than one occasion.
Being Scottsdale and not caring the slightest bit about anything that might stand in the way of convictions, they never took the instrument out of service. They just kept plugging away. The software engineer at the manufacturer said they did “a root cause analysis,” but he didn’t really know much about the mis-numbered vials. He thought it “could be a consequence” of the problem and “not an issue that was fixed specifically.” Here are some more of his thoughts:
I said that this rapid remedy was not specifically about fixing a problem with misnumbered vials. I’m saying that this is a possible symptom of this problem and that we fixed the underlying problem, which was this timing issue between the instrument and the software. And fixing that problem would make the symptom of the misnumbered vials go away.
The engineer went on to explain:
We understood the problem to be the instrument did not start sometimes. And when we investigated the part of the software in the system where that occurs, we knew exactly the area of the software that was affected, and the patch that we put in place addressed that issue. The testing that we performed would have been surrounding the application of that patch in that limited functionality, and if that problem was resolved, it should eliminate the problem of the misnumbered vials.
It “should” eliminate the problem. Comforting, huh? It gets worse:
During his deposition, [the engineer] further testified that the problem identified was that the instrument sometimes didn’t start and collect data. According to the testimony of [the criminalist], [the manufacturer] wanted to install a second patch at a later time, but the Crime Lab declined. [The criminalist] felt that the dropped data was an inconvenience since it occurred so infrequently. [The criminalist] also testified that after the patch was installed by [the engineer], the problem recurred on May 12, 2011, in a batch that was run by [another criminalist]. The detector shut off at vial number 64. A non-conforming memo was prepared. No reason was given for the shutoff.
The people who don’t know anything about software declined to fix the software even though the people who know about software and made the machine felt differently. In Scottsdale, it seems they can’t be bothered with things like knowledge and expertise. At least they’re consistent, as the superior court noted that, “[i]nterestingly,” their lab manager was not even qualified to read chromatographs.
Defense experts who actually have some relevant expertise all agreed that the machine should have been taken out of service. They said there should have been an investigation to avoid a recurrence, to determine what caused the problem, and to understand its scope. They said data drops create non-conforming work and that the cause for the data drops has never been fixed. The software engineer the defense called opined that, “from his perspective, the findings are not scientifically reliable if there are data drops, mislabeling, and if the machine stops securing data.” Seems obvious enough, right?
The superior court in its opinion listed one problem after another. One time, a run involving 28 subjects was rerun because the original run didn’t contain sufficient information required for the labeling of the vials. There wasn’t even a departmental report number, subject name, or vial number in some instances. Other times, there were communication failures at certain vials. After a point, the vials would have the correct number but incorrect names. Another time, it mislabeled all the vials with the same number. It also missed the injection of the quality control sample once and skipped vials, labeled the “B” column as an “A” column in runs, and ceased to collect data during analysis, yet it was never taken out of service. In several other instances, there were jams that resulted in samples not being analyzed, vials that were mislabeled in various other ways, and failures to analyze data completely.

A few problems only came to light thanks to defense attorneys. One run had two peaks while another had only one. The city’s analyst hadn’t noticed the problem, confessing to a defense lawyer in an interview that he missed it. He didn’t and probably still doesn’t know why the error occurred because, just like with every other problem with the machine, the batch was rerun but no meaningful root cause analysis was done.
Judges let the city get away with this stuff, and people at the lab knew about the problems and their incompetence and even joked about it. Here’s an email from the lab manager to her supervisor:
Let me ponder this one if I can…how long can we live with one GC instrument? I am inclined to say a full validation in light of the Court issues. At a minimum sensitivity, reproducibility, and precision. [employee of the manufacturer] + [our criminalist] does not equal super confidence…
The supervisor had wanted to run a validation of their other machine using the broken one. In the email, she also said this:
The only problem I see in comparing one to other is if it came up we are now comparing 2002 to 2003 which was the problem instrument.
As the superior court noted, the emails were never disclosed; the press just happened to discover them after filing a freedom-of-information request. Here’s another undisclosed gem, this one from the manufacturer to the criminalist:
It was very nice meeting you today at CAC! Thank you very much for sharing your concerns about our HS-GC you have in your lab. Let’s work on resolving the issue quickly so you don’t have to face defense attorney’s challenges in Court related to our malfunctioning instrument.
Scottsdale’s lab employees didn’t think that was worth disclosing. In all fairness, though, courts have trained them not to worry about disclosing things they should.
When a superior court judge ordered Scottsdale to disclose all subject test chromatograms and notes for 2011, the appellate court quickly vacated the order:
As discussed above, [the defendant] has failed to explain how evidence that the [chromatograph] periodically experiences data drops and thereby fails to produce test results undermines the accuracy and reliability of her test results. In the absence of such an explanation, there is no basis to conclude that the subject test data contains exculpatory evidence.
No one doubts that the machine frequently doesn’t work, and no one knows just what the root of the problem is or what other problems might occur as a result. Yet because the defense can’t undermine the accuracy and reliability of a malfunctioning machine using the incomplete information the city has provided from its malfunctioning machine, the city shouldn’t have to disclose complete information about the machine. I have no doubt the appellate court would’ve vacated an order telling Scottsdale to disclose internal emails about the machines too, as the defense couldn’t prove the emails would contain exculpatory evidence. See how that works? Does it seem fair to you?
It should come as no surprise to you at this point that the appellate court vacated the superior court’s ruling on the admissibility of evidence from the broken machine. Essentially, the court thought the state had shown the results were scientifically valid and the defense hadn’t presented anything suggesting the results for these particular defendants were inaccurate. Like I said before, even though nobody really knows the root of the problem with the malfunctioning machine or its extent even now, the defendants couldn’t show the malfunctioning machine malfunctioned in their cases. Thanks to the same court, the defense also didn’t get complete information about the malfunctioning machine’s various malfunctions because they didn’t have enough evidence to show it malfunctioned in their cases.
It’s a remarkably unfair situation and an abuse of powerful scientific evidence, but I’d expect nothing less from Scottsdale. The appellate court’s continued approval is what’s really disturbing.