Organizations eligible to participate in the IPA Mobility Program include—
- state and local governments
- accredited domestic colleges and universities
- Indian tribal governments
- federally funded research and development centers
- other eligible organizations
Under the revised IPA Mobility Program regulations (5 CFR part 334), OJP is responsible for certifying the eligibility of “other organizations” for participation in the IPA Mobility Program.
Other eligible organizations include—
- national, regional, state-wide, area-wide, or metropolitan organizations representing member state or local governments
- associations of state or local public officials
- nonprofit organizations that, as one of their principal functions, offer professional advisory, research, educational, or development services to governments or universities concerned with public management
- federally funded research and development centers.
An employee of a nonfederal organization must be employed by that organization in a career position for at least 90 days before entering into an IPA agreement. The Department of Justice requires that participants be U.S. citizens. Individuals excluded from participating include—
- federal, state, or local government employees serving under noncareer, excepted service, noncompetitive, time-limited, temporary, or term appointments
- elected federal, state, or local government officials
- members of the uniformed military services and the Commissioned Corps of the Public Health Service and the National Oceanic and Atmospheric Administration
- students employed in research, graduate or teaching assistant positions, or similar temporary positions.
Intergovernmental Personnel Act (IPA) participants are selected by the BJS director or a designee based on the employee’s qualifications and interests, BJS needs, and the mutual benefits to BJS and the organization employing the candidate. IPA projects should focus on improving one or more of BJS’s statistical programs. BJS’s statistical programs and data collections are described on the BJS website at https://bjs.ojp.gov/data-collections.
Recent BJS efforts to improve its statistical infrastructure have focused on survey design, the use of administrative records for statistical purposes, data quality assessments, and record linkage. There is a wide range of opportunities for an IPA within these general areas. For example, the survey design area includes sampling for continuous administration of establishment surveys and alternative sample designs for the National Crime Victimization Survey that address within-place explicit stratification to increase the number of victims who respond to the survey.
BJS is interested in developing survey instruments for new topical areas and aligning BJS’s surveys with other national survey instruments to facilitate comparisons. Within the area of using administrative records, BJS and the federal statistical system have interests in developing an administrative records analogue to the total survey error model for sample surveys.
Examples of other areas of interest include—
- National Crime Victimization Survey—
- -Small area estimation, using both sample data collected directly within states and model-based approaches
- -Interviewing juveniles and persons younger than age 12 on sensitive topics related to criminal victimization.
- Law enforcement statistics—
- -Demonstrating the utility of incident-based crime statistics (such as the National Incident-Based Reporting System) for statistical, research, and evaluation purposes.
- Recidivism statistics—
- -Assessing the quality of criminal history records (also known as records of arrest and prosecution, or RAP sheets) for completeness and operational and statistical uses
- -Imputation for item nonresponse
- -Research designs for comparing the recidivism outcomes for groups of offenders.
- Federal justice statistics—
- -Assessing the quality of imputation and using the “dyad link” file in the Federal Justice Statistics Program.
- Indian country statistics—
- -Designing and conducting surveys of criminal justice systems in Indian country with assistance from experts on Indian country issues.
- Juvenile justice statistics—
- -Using existing BJS statistical program data to develop statistics on juvenile victims and offenders, and on contact of juveniles with adult criminal justice agencies.
Because the range of potential topics is wide, individuals interested in an IPA arrangement should discuss their ideas with BJS staff, in particular the unit chief responsible for a statistical program area. Discussions can be facilitated by sending an email to [email protected] with the subject line “IPA” and a brief statement of interest.
BJS accepts, on an ongoing basis, ideas and concept papers for projects from potential applicants for an IPA mobility assignment. Individuals may submit concept papers to BJS at any time by emailing the documents to [email protected] with the subject line “IPA.” Papers will be routed to the appropriate BJS staff for follow-up. Concept papers should be no more than 3 pages in length and should discuss the nature of a proposed project. BJS will contact all persons who submit concept papers for IPAs.
BJS may also solicit concept papers from individuals who have been identified as qualified for a particular project.
Applicants to the IPA Mobility Program must demonstrate that they have the requisite skills, capabilities, and experience to conduct a project under the program. The skill level should be commensurate with the scope, content, and focus of a project. Applicants may demonstrate skills and capabilities by submitting a resume or curriculum vitae (CV). Letters of reference may also be required if a resume or CV does not sufficiently demonstrate these skills.
When a potential candidate has been identified, BJS will contact the management of the candidate’s organization to confirm that an IPA mobility assignment is feasible and in the interests of the organization.
The projects performed under an IPA will focus on a specific BJS statistical program. However, when several programs are closely related, a project may address issues common to several programs.
The level of support depends on the nature of the project and the skills and expertise of the IPA participant. Under the IPA, BJS will provide support for salary, benefits, and travel costs associated with the project.
The duration of an IPA is variable, depending on the nature of the project. It may last up to 2 years, and an extension for an additional 2 years may be granted if the extension benefits both BJS and the IPA participant’s home agency. An IPA can be full-time, part-time, or intermittent.
It is not necessary that an IPA participant relocate to the Washington, D.C., area. However, if a participant is off-site, routine travel to BJS will be required to discuss the project and meet with BJS staff. If an IPA is structured so that a person relocates to Washington, D.C., to work at BJS, BJS will not pay relocation expenses.
BJS periodically collects data on pregnancy at admission and maternal health services received since admission in its Survey of Prison Inmates and Survey of Inmates in Local Jails. BJS released a prison and jail report on this topic. BJS also publishes an annual report based on data collected from the Federal Bureau of Prisons under the First Step Act. Additionally, BJS conducted a study to assess the feasibility of collecting maternal health data at the state, federal, local, and tribal levels and released a report.
BJS does not hold copyrights on this information; it may be freely distributed, copied, or reprinted. We encourage use of the appropriate citation: U.S. Department of Justice, Office of Justice Programs, Bureau of Justice Statistics. If the data were acquired from a published report, please provide the report title, NCJ number, and release date. If acquired from the website, please provide the correct URL: https://bjs.ojp.gov/.
When national estimates are derived from a sample, as with the NCVS, caution must be used when comparing one estimate to another estimate or when comparing estimates over time. Although one estimate may be larger than another, estimates based on a sample have some degree of sampling error. The sampling error of an estimate depends on several factors, including the amount of variation in the responses and the size of the sample. When the sampling error around an estimate is taken into account, the estimates that appear different may not be statistically different.
One measure of the sampling error associated with an estimate is the standard error. The standard error can vary from one estimate to the next. Generally, an estimate with a small standard error provides a more reliable approximation of the true value than an estimate with a large standard error. Estimates with relatively large standard errors are associated with less precision and reliability and should be interpreted with caution.
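One common use of standard errors is to check whether two estimates differ by more than their combined sampling error. Below is a minimal sketch of that check, assuming the two estimates come from independent samples and using the 95% significance level; the function name and all figures are illustrative, not actual NCVS values.

```python
import math

def estimates_differ(est1: float, se1: float, est2: float, se2: float,
                     critical_value: float = 1.96) -> bool:
    """Return True if two independent estimates differ at the 95% level.

    The standard error of the difference of two independent estimates is
    sqrt(se1**2 + se2**2); the difference is significant when it exceeds
    critical_value times that combined standard error.
    """
    se_diff = math.sqrt(se1**2 + se2**2)
    return abs(est1 - est2) > critical_value * se_diff

# Two rates that look different (21.0 vs. 18.5) but whose gap (2.5) is
# smaller than 1.96 times the combined standard error (about 4.0):
print(estimates_differ(21.0, 1.5, 18.5, 1.4))  # False
```

Note that NCVS estimates compared over time may not be fully independent (overlapping samples), so the combined-standard-error formula above is a simplification.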
Data users can use the estimates and the standard errors provided in NCVS reports to generate a confidence interval around an estimate as a measure of the margin of error. The 95% confidence interval around an estimate is the estimate ± (the standard error × 1.96), where 1.96 is the critical value from the standard normal distribution that excludes 2.5% at either end of the distribution. In other words, if different samples using the same procedures were taken from the U.S. population, 95% of the time an interval constructed this way would contain the true value. See the NCVS Methodology for an example.
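The confidence-interval calculation described above can be sketched in a few lines; the function name and the estimate and standard-error values below are illustrative, not actual NCVS figures.

```python
def confidence_interval_95(estimate: float, standard_error: float) -> tuple[float, float]:
    """Return the (lower, upper) bounds of the 95% confidence interval:
    estimate ± (standard error × 1.96)."""
    margin_of_error = 1.96 * standard_error
    return estimate - margin_of_error, estimate + margin_of_error

# Illustrative victimization count with a standard error of 50,000:
lower, upper = confidence_interval_95(estimate=1_200_000, standard_error=50_000)
print(f"95% CI: ({lower:,.0f}, {upper:,.0f})")  # 95% CI: (1,102,000, 1,298,000)
```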