THE HONORABLE DEBORAH J. DANIELS
ASSISTANT ATTORNEY GENERAL
OFFICE OF JUSTICE PROGRAMS
ANNUAL CONFERENCE ON CRIMINAL JUSTICE
RESEARCH AND EVALUATION
MONDAY, JULY 19, 2004
Thank you, Sarah. And thanks to your staff for their customary stellar job in organizing this event.
As usual, this conference promises to cover just about everything the research and evaluation community is doing in the area of criminal justice - from crime mapping to illegal gun markets to the presentation of DNA evidence in the courtroom. For eleven years running, it has been the forum for sharing research findings and discussing emerging trends in criminal justice policy and practice.
It is also at least partly responsible for bridging the divide between analysis and application, and for helping the various criminal justice disciplines appreciate the implications of research for their work. In years past, law enforcement practitioners sometimes turned a deaf ear to suggestions from the research community that they base policy determinations on sound research. But when studies in Kansas City, Birmingham, and San Diego in the 1970s began to demonstrate the efficacy of beat-profiling, problem-oriented policing, and tailored patrolling, the evidence was just too compelling to ignore. And thus was born the concept of community-oriented policing, an approach that has been attended by significant drops in crime rates.
In recent years, not only law enforcement, but also prosecutors, corrections officials, probation and parole officers, and service providers have looked to researchers for guidance in solving the problems posed by crime and delinquency.
And what is perhaps more remarkable - and particularly germane to this year's conference - is that these professionals haven't been content to rely on findings that preceded their own efforts. They want to know if what they're doing is, in fact, working in their own communities. They recognize the necessity of measuring success at the local level, rather than relying solely on a study, conducted in a different community, suggesting that a particular approach may work. One might say that the work you do has created an appetite for evaluation.
But why evaluation? In this era of limited and increasingly coveted resources, why are we so focused on measuring results? Shouldn't we be using our scarce funds to give criminal justice professionals the tools they need to do their jobs, not to tell them how many firearms crimes were committed in Jackson, Mississippi, three years ago? Isn't all this data collection and analysis just a distraction from the real work of fighting crime?
These are not merely rhetorical concerns. They are very genuine and valid questions, and unless we want to be labeled as hypocrites, they deserve serious answers.
Let me frame the response with an example. Many of you are familiar with the Drug Abuse Resistance Education, or DARE, program. DARE is the most widely used school-based drug abuse prevention program in the nation, operating in 75% of U.S. school districts to the tune of $200 million annually. Launched in the early 1980s, it has become part of our human services landscape.
Well, much to our dismay and in spite of the noble efforts of its instructors, DARE was found in randomized controlled trials to have little or no effect on participants' drug use. To their credit, the good folks who run DARE didn't become defensive. They recognized the value of the studies in advancing their goals and altered the curriculum to make it more effective.
But anyone would agree that $200 million is a great deal of money for something that yields a negligible return. And the obvious collateral effect is the loss of investment opportunities in programs that do work. To put it in bottom-line terms, we simply cannot afford not to evaluate.
I often use a personal example - one having more to do with the cost of human life than the cost of programs - to describe the origins of my commitment to social science research in the criminal justice arena: the groundbreaking research on the handling of domestic violence cases conducted by Dr. David Ford of Indiana University back in the early 1980s, in collaboration with the prosecutor's office in Indianapolis.
We learned some valuable, and in many cases counterintuitive, lessons through that research - such as the fact that "no-drop" policies do not necessarily have the positive effect in protecting the victim that we once thought they did. In fact, we found that combining an arrest of the perpetrator with a choice on the part of the victim as to whether to pursue the case had the highest likelihood of protecting her from further violence - not forcing her to go through with the case whether she wanted to or not.
We in OJP and at the Department of Justice feel the urgency of this charge to evaluate the effectiveness of policy options, and are working to meet it.
We have worked closely in the past two years with the Coalition for Evidence-Based Policy, an offshoot of the Council for Excellence in Government, to further the cause of research to inform policy across the federal government. Just last month, the Office of Justice Programs sponsored a National Forum on Evidence-Based Crime and Substance Abuse Policy, here in Washington.
We invited researchers and policymakers from various agencies of government, including Congress as well as the executive branch, to discuss the critical need for more research and evaluation of various approaches to crime and drug abuse reduction.
Further, we promoted the concept that randomized controlled trials - often believed by both human services and law enforcement professionals to have no application to their work - can be a powerful tool for measuring the effectiveness of the policy approaches these professionals take.
We intend to continue encouraging the use of this "gold standard" whenever possible in criminal justice, delinquency prevention, and drug abuse policymaking. And I'm happy to report that there is a growing recognition across the federal government that we must embrace randomized controlled trials for research on government-funded programs, and seek to come as close as possible to that standard in the research we fund.
We have also developed a "what works" web site, to help inform practitioners and community leaders regarding effective approaches. We will be very discerning about what we determine rises to that level. And we continue to fund efforts to identify those approaches that are truly effective.
Those of you who attend and present at this conference are on the cutting edge, not just in this area, but in several others. I'm delighted to see that the lead-off plenary is on a subject to which I have devoted quite a bit of my personal time: information-led policing. Your excellent panel on this subject will explain how we are seeking to identify new technologies and make those available to policing agencies in order to make them far more effective - both in the traditional law enforcement sense and in the new frontier of homeland security.
I'm glad, too, that you will be discussing the issue of partnerships between law enforcement agencies and researchers, and the tremendous benefits that can be realized by bringing together the policymakers with the data analysts.
You'll benefit from state-of-the-art research in areas ranging from early intervention systems to prevent delinquency and drug abuse, to controlling illegal gun markets, to offender re-entry, to police misconduct, to terrorism prevention and response.
The agenda is a full one, and one that should energize each of you, no matter from what perspective you approach the conference - as a researcher, a data analyst, a policymaker, or a feet-on-the-street implementer of policy.
I commend you for your individual and collective commitment to criminal justice research and evaluation. I know that, when you depart this conference, you will be better equipped to improve our nationwide approach to both crime and terrorism. Best wishes for a most successful conference.