NCJ Number
85338
Date Published
1981
Length
19 pages
Annotation
The main premise of this essay is that certain idiosyncrasies of crime data, if exploited, allow more stringent tests of crime models than are usually applied.
Abstract
The discussion centers on the possibility of obtaining, through direct probabilistic reasoning, fairly specific estimates of the size of the natural random fluctuations in various kinds of crime data (i.e., variations over time or space unrelated to systematic factors). Once the portion of a model's prediction errors that can reasonably be attributed to chance has been estimated, the remainder indicates how much of the error stems from the model's own imperfections. The study examines regression models for homicide levels and robbery levels in light of such random fluctuations. The analysis rests on two conjectures: that through direct probabilistic reasoning and careful attention to the data, one can reliably estimate the amount of random noise in the recorded levels of various crimes, and that such estimates can generate additional tests of the validity of deterrence models, tests geared directly to their particular purposes. Some evidence is offered in favor of both conjectures, but neither is proven, and they may hold only under highly restricted circumstances. Additional research is required to gauge the power of the suggested approach. A discussion of random fluctuations in the recorded levels of robbery is appended. Mathematical equations, tabular data, and four references are provided.
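The abstract does not reproduce the paper's equations, so the following is only a minimal sketch of the kind of direct probabilistic reasoning described. It assumes, purely for illustration, that recorded crime counts behave approximately like Poisson variables, so a count near n carries a chance standard deviation near the square root of n; the rate, sample size, and model error figures below are hypothetical, not taken from the paper.

    # Illustrative sketch only; the Poisson assumption and all numbers
    # are supplied here for illustration, not drawn from the paper.
    import numpy as np

    rng = np.random.default_rng(0)

    true_rate = 400.0                      # hypothetical annual homicide count
    counts = rng.poisson(true_rate, 1000)  # simulated recorded counts

    # Noise floor implied by the Poisson assumption: SD ~ sqrt(mean count).
    chance_sd = np.sqrt(counts.mean())
    print(f"chance SD ~ {chance_sd:.1f} on a mean count of {counts.mean():.0f}")

    # A regression model's prediction error can then be compared with this
    # floor: error near chance_sd suggests the residual error is mostly
    # random noise, while a much larger error points to model imperfection.
    model_rmse = 35.0                      # hypothetical model prediction error
    share_chance = (chance_sd / model_rmse) ** 2
    print(f"share of error variance attributable to chance ~ {share_chance:.0%}")

On these assumed numbers, chance alone would account for roughly a third of the error variance, leaving the rest attributable to the model's own imperfections, which is the style of test the essay proposes.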