Executive Summary

State and Federal Corrections Information Systems:
An Inventory of Data Elements and an Assessment of Reporting Capabilities

A joint project:
Association of State Correctional Administrators
Corrections Program Office, OJP
Bureau of Justice Statistics
National Institute of Justice

September 1998, NCJ 171686

Acknowledgments

This report was prepared by the Urban Institute under the supervision of Allen Beck, Ph.D., of the Bureau of Justice Statistics. Laura Maruschak was the project monitor at BJS. The project was sponsored by the following:

Corrections Program Office, Larry Meachum, Director
Bureau of Justice Statistics, Jan Chaiken, Ph.D., Director
National Institute of Justice, Jeremy Travis, Director

Project assistance was provided by the Association of State Correctional Administrators (ASCA). The project was supported by BJS grant number 97-MU-MU-K007.

Principal staff for the project at the Urban Institute were William J. Sabol, Ph.D., Barbara Parthasarathy, Katherine Rosich, Mary Spence, and Mark Braza. Charles Friel, Ph.D., served as consultant to the project. Dave Williams and O. Jay Arwood designed and produced the report. Tom Hester of BJS provided editorial review. Marilyn Marbrook of BJS administered final publication.

The project staff acknowledges the cooperation and support of the ASCA State-Federal Committee, especially the contributions its members made at the Committee meeting of June 16, 1998, in St. Louis, Missouri. Committee members and other meeting participants, including Chairman Joseph Lehman, Kathy Hawk Sawyer, Robert Bayer, Mike Sullivan, Harold Clarke, Morris Thigpen, Ari Zavaras, Richard Lanham, Elaine Little, Larry Meachum, Phil Merkle, Camille Camp, and George Camp, provided excellent comments after reviewing the draft report.
The project thanks the members of the project's Advisory Committee for their assistance and thanks all of the staff of the corrections departments who responded to our surveys and participated in interviews.

The Corrections Program Office, the Bureau of Justice Statistics, and the National Institute of Justice are components of the Office of Justice Programs, U.S. Department of Justice. The Association of State Correctional Administrators is a nonprofit membership organization dedicated to the improvement of correctional services and practices. The contents of this report do not necessarily reflect the views or policies of these organizations.

Table of contents

Executive summary
  The Inventory asks departments what data they have and how they are able to use those data
  How the Inventory was conducted
  Corrections departments collect a common core of data elements
  Staffing and software pose obstacles to providing statistical information
  Using the Inventory report
  Next steps

Tables
  Table 1. Availability of corrections data in the six high-priority areas
  Table 2a. Percent of full availability of core data elements from departments that collect information on released offenders
  Table 2b. Percent of full availability of core data elements from departments that do not collect information on released offenders
  Table 3. Severity of problems in departments' information systems

Executive summary

In a series of meetings, members of the State-Federal Committee of the Association of State Correctional Administrators and representatives from the Corrections Program Office (CPO), the Bureau of Justice Statistics (BJS), the National Institute of Justice (NIJ), the National Institute of Corrections, and the Federal Bureau of Prisons identified the need to assess the current status of offender-based information systems in corrections.
Correctional administrators wanted to move toward a set of performance indicators that could be used to describe, measure, and compare management outcomes among departments of corrections. Administrators also noted that they often lack basic information needed to formulate new policies or to defend existing practices. Researchers highlighted the difficulty of conducting comparative studies given the absence of basic agreement on concepts and definitions and the diversity in the quality and coverage of data elements in these systems.

In response, CPO, BJS, and NIJ sponsored a project to conduct an inventory and assessment of more than 200 data elements in State and Federal corrections information systems. Following a competitive process, the Urban Institute in Washington, DC, was awarded a cooperative agreement to conduct the inventory. An advisory committee, including representatives of the State-Federal Committee, other corrections officials, corrections researchers, and representatives of the sponsoring agencies and the Urban Institute, was formed to guide the design of the inventory and to identify priority information areas for attention.

The results of these efforts are highlighted in this Executive Summary. The report that this summary describes, State and Federal Corrections Information Systems: An Inventory of Data Elements and an Assessment of Reporting Capabilities, presents the detailed findings in full. The report, now under review, will be available in the fall of 1998.

The Inventory asks departments what data they have and how they are able to use those data

The Inventory of State and Federal Corrections Information Systems is built around six priority information areas: offender profile, internal order, program effectiveness, public safety, recidivism, and operational costs (table 1).
The Inventory addresses two questions: What data on most adult sentenced prisoners do departments collect and maintain in electronic form in their information systems? And to what extent can departments use these data to respond to requests for statistical information about groups of offenders?

In answer to the first question, most of the 52 departments of corrections collect and maintain a common core of data elements that measure many key events in and outcomes of the corrections system. These data elements can be used to describe and profile offenders, to measure recidivism in terms of returns to prison, and to measure aspects of public safety related to offender registry requirements. However, not all departments define and collect these data in the same way, and 12 departments do not collect any data about released offenders. In several other important areas, including internal order, program effectiveness, and operational costs, departments do not maintain core sets of data.

Table 1. Availability of corrections data in the six high-priority areas

Profiling offenders
  Who are they?
    In the common core: Demographic characteristics
    Not in the common core: Family characteristics; socio-economic status
  What have they done? (Offenses and criminal history)
    In the common core: Conviction offenses
    Not in the common core: Criminal history; criminal incident
  Where are they?
    (Sentences)
      In the common core: Sentences imposed
    (Commitments)
      In the common core: Current commitments; expected time to be served
    (Assessments)
      In the common core: Risk assessment; needs assessments; classification decisions; confinement characteristics
    (Releases)
      In the common core: Post-commitment movements; good time and other adjustments; releases from custody

Internal order
  Not in the common core: Misconduct and infractions; responses to misconduct; legal proceedings

Program effectiveness
  Not in the common core: Program participation; drug testing; medical testing

Public safety
  In the common core: Offender registry
  Not in the common core: Crimes committed after release; information about victims; employment and residence

Recidivism
  In the common core: Violations while on release; responses to violations and returns to prison

Operational costs
  Not in the common core: Program management; medical services; facility management

Most of the data elements in the survey were offender based, and nearly half make up the common core. Most of the core data elements relate to offenders' characteristics, offenses, sentences, how long they can expect to stay in prison, their security risk, and where they were confined. Additional data elements describe release requirements, offenders' behaviors after release, criminal justice system responses to those behaviors, and returns to prison for violations of conditions of supervision.

Corrections departments have encountered obstacles to using their information systems to generate statistical information about groups of offenders. Staffing and software present severe or critical obstacles in up to 28 departments and moderate obstacles in up to 20 others. Conversely, hardware presents few or no obstacles for most (40) departments. Data availability and quality are severe or critical obstacles in 12 departments and moderate obstacles in 22 others. Few departments maintain all of the core data elements in electronic form for the vast majority of offenders. Nine departments rate 88% or above out of a possible 100% on these criteria, while 32 departments rate less than 75%.
Many departments may be able to construct critical measures of corrections outcomes that rely on smaller sets of data. For example, if recidivism is measured by the number of offenders who return to prison, then as many as 46 departments may be able to provide electronic data on this issue. These departments have some essential elements of a measure of recidivism (including the date of release from prison, date and type of commitment, and type of offense); they maintain online or archived records of prior commitments to and releases from prison; and they can link these records electronically.

How the Inventory was conducted

During January 1998 an Inventory questionnaire and an Obstacles survey were mailed to information officers in departments of corrections in the 50 States, the District of Columbia, and the Federal Bureau of Prisons. Additional telephone interviews were conducted to obtain background about systems architecture and capabilities.

The Inventory contained 242 questions about data elements and capacities of information systems. Of these, 207 were questions about offender-based data elements, 15 were about facilities, and 20 were about capacities to link data. For each of the 207 questions about offender-based elements, the questionnaire asked officials whether they maintained the element. Those departments that maintained a data element were asked how it is stored, in electronic or paper form, and for what percentage of offenders it is collected.

The Obstacles survey collected information on the barriers information officials encounter in producing statistical information in response to queries about offenders. The survey was organized into five categories: institutional and legal, staffing, software, hardware, and data. Officials in all 52 departments responded to the Inventory and telephone interviews, and 51 returned the Obstacles survey.
Corrections departments collect a common core of data elements

In the information systems they use to manage adult, sentenced prisoners, most departments of corrections maintain a common core of data elements. These elements are generally maintained electronically for the large majority of offenders, and they relate primarily to the areas of profiling offenders and, to a lesser extent, public safety and recidivism.

The majority of core data elements describe who enters prison; what they have done; why they entered prison; how long they can expect to stay there; their risk, needs, and confinement characteristics; their post-commitment movements; and how and to whom they are released from prison. The common core also measures the behaviors of offenders after release, including violations of the conditions of their supervision and whether they returned to prison. Twelve departments do not collect such post-release data; instead, another department (for example, probation or parole) or another information system does.

In other areas there are fewer, if any, core data elements. For example, there is relatively little common information regarding program participation, drug testing results, medical services, misconduct, infractions, responses to misconduct, the crimes and victims of crimes committed by offenders on release in the community, or offenders' connections with mainstream institutions, such as labor markets.

Departments most commonly collect data about demographic characteristics

Offender profiles describe offenders' demographic characteristics and give an indication of their ties to mainstream institutions outside of prison. These institutions include families, schools, the military, and labor markets. The 11 data elements related to the demographic characteristics of offenders are more commonly collected than the other elements in the offender profile. Data maintained electronically for more than 75% of offenders are characterized as high availability.
Fifty-one of 52 departments report that they maintain high-availability data elements in the areas of race and sex, and 50 do so for an offender's date of birth. Only 29 maintain high-availability data about offenders' education, 23 about their military service, and 17 about employment prior to incarceration. While 35 departments maintain high-availability data on the marital status of offenders, only 16 maintain data on the number of offenders' dependents.

Departments commonly collect conviction, sentencing, and commitment data

Data on conviction offenses, sentencing decisions, and assessment, classification, and confinement decisions also belong to the offender profile. In general, corrections departments maintain data in these areas with high availability. All departments maintain data elements electronically for the type and date of commitment to prison, and 51 maintain data on the length of sentences. At least 49 maintain electronically several detailed data elements that describe offenders' conviction offenses and their expected dates of release from prison. At least 46 departments also maintain high-availability data elements for the total length of sentences, the dates sentences were imposed, and whether sentences run concurrently or consecutively.

Departments maintain more data elements related to criminal history than to the particular crime for which an offender was convicted. Thirty-six departments can report electronically on the criminal justice status of most offenders entering prison, and up to 29 can do so for an offender's prior arrests, convictions, and the severity of those offenses. Only 21 departments maintain high-availability data on the date of a particular incident; 13 can report on whether a weapon was involved; and 6 can describe the location of the incident and the number of victims involved. However, other departments maintain data on criminal incidents in paper form.
For example, up to 13 have some data elements on paper that describe criminal incidents. Most departments (between 37 and 48) collect data that describe offenders' needs, their security classifications, risk assessments, and the units in which they are housed. Moreover, most departments that maintain these data elements do so electronically. Only 21 departments record the results of drug tests, and only 12 of these maintain the data electronically.

Most departments can describe prisoner release information, but fewer maintain data elements on programs and internal security

Departments commonly collect data elements regarding post-commitment movements (including transfers and releases from prison), changes in expected release dates (such as good-time or other adjustments), and offender registry. For example, all 52 maintain high-availability data on transfers and methods of release from prison, 41 maintain such data on good-time adjustments, and 32 maintain data about victim notification requirements. Other data elements related to movements and releases are also collected by a majority of departments. These include the reasons for movements, the reasons for changes in sentences or good time, and time served in custody.

Data elements that pertain to program participation and outcomes, drug testing, and medical treatment are less commonly collected and maintained in electronic format than those describing offender movements. For example, while up to 42 departments collect data about program participation, only 22 to 32 collect that information in electronic form; the others either do not collect these data elements or collect them in paper form. Similarly, data elements that describe misconduct and infractions are collected by a majority of departments but often in paper form. For example, while 47 departments maintain data about the most recent instance of misconduct, only 33 maintain them electronically and for a majority of offenders.
Even fewer departments maintain data describing the event itself: between 20 and 24 departments do not record whether someone was injured or whether drugs or alcohol were involved.

Some departments do not collect data on public safety or recidivism

Data elements about public safety are beyond the scope of 14 information systems, and 12 departments do not collect any data about the behavior of offenders after they are released from prison. Most of the 40 departments that do gather data on recidivism focus on data elements that describe the nature of the violation and the criminal justice response. For example, nearly all of the 40 departments can describe the type of supervision and the reason for termination, and 34 or 35 maintain the data electronically. However, if offenders commit crimes while on release, relatively few of the departments can describe the nature of the crime or the victims. Between 17 and 27 departments do not collect such data, and the few that do maintain most of them on paper. Of the 40 departments that collect data about released offenders, most do not collect data about whether offenders are employed, whether their employers were notified of their status, or where they live.

Limited measures of recidivism are widely collected

Despite the limited scope of the information systems in 12 departments, many departments can provide some data on recidivism measured as a return to prison. Many departments maintain archived records of offenders' commitments to prison and the reasons for those commitments, and many can link the records of offenders who return to prison repeatedly. These departments may be able to count the number of times a person returns to prison and the length of time between each stay. Fewer departments are able to provide data on recidivism as measured by rearrest or reconviction.
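Counting returns to prison from linked commitment and release records can be sketched as follows. This is a minimal illustration with a hypothetical record layout; the report does not prescribe any particular schema, and the field names and sample data here are invented.

```python
from datetime import date

# Hypothetical record layout (not from the report): each event carries an
# offender ID and a date; commitment records also carry a commitment type.
commitments = [
    {"offender_id": "A001", "date": date(1990, 3, 1), "type": "new court commitment"},
    {"offender_id": "A001", "date": date(1995, 6, 15), "type": "parole violation"},
    {"offender_id": "B002", "date": date(1992, 1, 10), "type": "new court commitment"},
]
releases = [
    {"offender_id": "A001", "date": date(1994, 9, 30)},
    {"offender_id": "B002", "date": date(1996, 5, 20)},
]

def returns_to_prison(commitments, releases):
    """Return the set of offender IDs recommitted after a recorded release."""
    # Find each offender's earliest recorded release date.
    earliest_release = {}
    for r in releases:
        d = earliest_release.get(r["offender_id"])
        if d is None or r["date"] < d:
            earliest_release[r["offender_id"]] = r["date"]
    # Any commitment dated after that release counts as a return.
    returned = set()
    for c in commitments:
        first = earliest_release.get(c["offender_id"])
        if first is not None and c["date"] > first:
            returned.add(c["offender_id"])
    return returned

print(returns_to_prison(commitments, releases))  # {'A001'}
```

The sketch depends on exactly the capabilities the report describes: release dates, commitment dates and types, and the ability to link an offender's records across stays.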
Staffing and software pose obstacles to providing statistical information

Corrections departments encounter several barriers to using data to produce statistical information about offenders. These include the availability of data in electronic form; the staffing, software, and hardware available to the information system; and institutional and legal restrictions on the information system.

A few departments maintain all or most core data elements electronically

No department maintains all the data elements about offenders in electronic format. However, several maintain all or most of the core data elements electronically for the majority of offenders. Such departments score at or near 100% on an availability index. For example, Colorado's department of corrections scored 100% for all core data elements, and six other State departments scored at or above 80% (table 2a). A third tier of departments rates fairly high on the availability index for the core data: these seven departments score above 70% in at least three areas and not below 60% in any area.

Among profiles of offenders, nine departments received a perfect score for the demographic data elements, and two did so for the data on commitment to prison. There were two perfect availability scores in the area of public safety and eight in recidivism. The Federal Bureau of Prisons scores above 85% in the three areas for which it maintains data (table 2b). Several other departments (Georgia, New Hampshire, and Rhode Island) score above 75% in these areas.

Table 2a.
Percent of full availability of core data elements from departments that collect information on released offenders

                                            Releases    Recidivism and
                      Demo-    Offenses and and public  violations under
Department            graphics commitments  safety      supervision
Alabama                 79%      94%          92%          95%
Alaska                  52       25           19           14
Arizona                100       98           96          100
Arkansas                52       67           60           52
California              82       74           48           79
Colorado               100      100          100          100
Delaware                73       55           59           39
District of Columbia    73       46           24            0
Florida                 82       98           80           95
Idaho                   73       54           43           48
Illinois                91       79           73           95
Indiana                 85       77           75           77
North Dakota            91       67           60           68
Ohio                    61       82          100           59
Oklahoma                76       71           80           71
Oregon                  70       83           56           64
South Carolina          97       84           91           38
South Dakota            79       83           80           55
Tennessee               82       91           56          100
Texas                   82       89           92           94
Utah                    88       93           57          100
Vermont                 61       58           33           27
Virginia                82       75           65           55
Washington              91       81           76           44
Wisconsin               64       75           77           68
Wyoming                 94       65           68            3

Table 2b. Percent of full availability of core data elements from departments that do not collect information on released offenders

                                                 Releases    Recidivism and
                           Demo-    Offenses and and public  violations under
Department                 graphics commitments  safety      supervision
Federal Bureau of Prisons   100%     94%           89%         N/A
Connecticut                  64      82            48          N/A
Georgia                      76      87            89          N/A
Hawaii                       82      69            36          N/A
Maine                        85      67            76          N/A
Maryland                     48      68            59          N/A
Nevada                       82      71            59          N/A
New Hampshire                94      77            79          N/A
New Jersey                   88       8            69          N/A
Pennsylvania                100      72            63          N/A
Rhode Island                100      78            89          N/A
West Virginia                97      58            61          N/A

Most departments have some capacity to link and retrieve archived data electronically

Most departments archive records of repeated events, such as commitments to prison, behavior in custody, and releases from custody. Most of these departments are able to link the records and retrieve them electronically. For example, 46 departments maintain an online history of an offender's commitments to prison. Thirty-one also archive these records, and of those 31, 28 can retrieve and link them electronically.
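The availability scores reported in tables 2a and 2b rest on whether each core element is stored electronically and for what share of offenders it is collected. A minimal sketch of such a score, using the report's high-availability threshold of more than 75% of offenders (the field names and sample data are hypothetical, not the report's), might look like this:

```python
def availability_index(elements):
    """Percent (0-100) of core elements that are fully available.

    An element counts as fully available when it is maintained
    electronically and covers more than 75% of offenders, mirroring
    the report's "high availability" threshold (an assumption about
    how the index was computed).
    """
    if not elements:
        return 0.0
    full = sum(1 for e in elements
               if e["electronic"] and e["pct_offenders"] > 75)
    return round(100.0 * full / len(elements), 1)

# Hypothetical per-element records for one department's demographics area.
core_demographics = [
    {"name": "race", "electronic": True, "pct_offenders": 100},
    {"name": "sex", "electronic": True, "pct_offenders": 100},
    {"name": "date of birth", "electronic": True, "pct_offenders": 98},
    {"name": "prior employment", "electronic": False, "pct_offenders": 40},
]
print(availability_index(core_demographics))  # 75.0
```

Three of the four sketched elements meet the threshold, so the department would score 75% in that area, comparable in form to the percentages shown in the tables.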
Lack of experienced programming staff is a severe problem

Departments face several other obstacles to providing comparable statistical information about offenders, the most severe of which arise from staffing and software. Twenty-eight departments report critical or very severe staffing problems, including a lack of experienced programmers and a lack of resources to train them. In addition, 14 reported critical or very severe software problems, such as poor query capabilities.

Staffing and software problems are interrelated. Having sophisticated statistical software without enough trained staff to operate it does not eliminate the software obstacles. Similarly, having staff but not providing them with adequate software means that customized programs may have to be written for every new query. The reported difficulties related to staffing and software suggest that a lack of adequate resources for operating a corrections information system may be the major obstacle to overcome.

Eight departments rate staffing problems as "critical" (table 3). The 14 departments that rate software factors as "very severe" to "critical" problems point to the ability to integrate data, data file structure, and the capability of statistical utilities and packages. Hardware is the least severe difficulty for most departments, and no department rates it as critical. Data storage, system reliability, and the ability to access historical data are among the obstacles included in the hardware category.

Table 3. Severity of problems in departments' information systems

                       Number of departments with problems described as--
Problem area       Critical  Very severe  Moderate  Very little  None
Staffing              8         20           18          6         0
Data                  4          8           22         16         2
Software              2         12           20         13         5
Legislative and
  institutional       0          5           25         19         3
Hardware              0          1           11         26        14

Using the Inventory report

The purpose of the Inventory and Obstacles surveys was to provide a basis for improving the quality of corrections data and enhancing the electronic sharing of information.
The report identifies the capacity of corrections departments to provide comparable data for performance measures and for cross-jurisdictional research. The approach taken has been to describe existing information systems rather than to recommend a model system for all departments or to develop a strategy for future actions. The report identifies a common core of data elements that most or all departments collect; describes and analyzes the obstacles departments face in responding to statistical inquiries; and describes departments' capacities for sharing and linking data internally and externally. The Inventory and Obstacles surveys may be used--

* by departments to expand data collection. The report identifies a common core of data elements in six high-priority information areas and provides a listing by department of data availability. Departments may use this information in developing priorities for adding data elements and improving availability.

* by departments to assist in ongoing information system redesign. Departments currently modifying their information systems may use the report to identify commonly collected data elements and to understand how departments differ in their capacities to maintain data in electronic form.

* by research directors and other corrections researchers to determine the availability of data elements for cross-jurisdictional studies. In designing comparative studies, researchers may use the report to identify the reporting capabilities of participating departments.

* by ASCA members to develop strategies for establishing performance measures. ASCA members may use the report to develop more specific priorities for measuring corrections performance. The report may be used to identify indicators based on commonly collected data elements and to decide what information to expand for performance measures.

Next steps

The report may be viewed as a step in the broader effort to develop comparative corrections data and performance measures.
To undertake this effort, several interrelated steps are required. These include identifying priority measures of corrections performance, enhancing information systems to provide the data elements needed to create the measures, and eliminating the barriers to reporting statistical information about offenders.

Identifying priority measures of corrections performance

Corrections performance may be measured in many ways, as indicators can be developed for many different events and outcomes of corrections processing. To provide reliable measures of corrections performance, and to do so in a manner that permits valid comparisons between departments, work needs to be done in identifying, prioritizing, and precisely defining measures. Priorities may be based on the data elements that departments currently collect, or they may be based on a normative framework for measuring performance. Once selected, measures need to be defined. Definitions must be precise and should prescribe the specific data elements to use in their creation. Finally, a framework for interpreting measures needs to be specified. To make knowledgeable comparisons, care must be taken to identify legitimate differences between departments.

Enhancing the capacities of information systems

After measures have been prioritized, enhancements may be considered from the point of view of developing comparative performance indicators. Enhancements in the ability to share and link data electronically can occur in several areas, including increasing the availability of data elements maintained electronically; obtaining data elements from other agencies that record events beyond the coverage of a department's information system; and sharing data with agencies that request information. Longer-run objectives in enhancing the ability to share data electronically extend to the development of offender tracking systems and integrated information systems.
Exploring ways to overcome barriers to reporting statistical information

In addition to a lack of data availability, insufficient staffing and inadequate software may form barriers to reporting statistical information. Departments may have the data needed to generate measures but not the staff or software to do so easily. Problems that arise in creating and reporting statistical information about groups of offenders need to be better understood according to the class of problem they present. For example, problems related to data availability and quality may require different resources than those associated with the processing and analysis needed to generate the measures. Methods for producing and reporting common measures should be explored. Finally, the World Wide Web should be considered as a tool for searching databases and disseminating data.

For each step in the broader effort to develop comparative corrections data, basic questions like the following need to be addressed.

Identifying priority measures

How should performance be measured? How are the priority information areas linked to these measures? What are the appropriate intermediate and long-term outcomes for performance-based measures?

Enhancing information systems

How should common definitions be developed? Which noncore data elements should have a high priority for moving into the core? How can data quality be improved? Who should be involved in improving data quality? How can the systems be enlarged to cover all stages, including tracking post-release behavior? How have departments successfully integrated their systems with those of other departments?

Exploring barriers to reporting statistical information

How can electronic sharing of information be promoted? What new technologies can increase speed and reduce costs? How can a department optimize resources to overcome obstacles?