This appendix documents the technical steps that support the brief An Assessment of Probation Sentencing Reform. In the report, we use individual-level data on people under community supervision from state criminal justice agencies as well as a review of policy documents and interviews with stakeholders. In this appendix, we detail the data sources and methodology used in our analysis.
This policy assessment relied on public documents, interviews with stakeholders, and administrative data. We collected documents, policies, and reports related to the sentencing reforms. Stakeholders in both Louisiana and Georgia participated in interviews with project staff. The administrative data used in our analysis come from two sources—the Louisiana Department of Public Safety and Corrections (DPSC) and the Georgia Department of Community Supervision (DCS). Both states provided data on people on community supervision before and after the implementation of their respective sentencing policies.
Policy documents. The authors reviewed Louisiana state statutes to understand the details of the reform as written as well as other related statutes. This included reviewing changes in the statute since it was enacted. Additionally, we reviewed agency-level policies related to the length of probation sentences. We also reviewed publicly available data and reports on reform efforts in the state.
Interview data. Based on information learned through reviews of statute and policy, the authors created Louisiana-specific interview guides and used them for stakeholder groups participating in individual and small-group virtual interviews. Through guidance from DPSC, we scheduled interviews with probation officers and supervisors from each of three regions for a total of six interviews, ensuring geographic diversity for a holistic understanding of the policy’s application in urban and rural areas throughout the state. We also met with the three regional administrators and judges from multiple regions. At the beginning of each interview, we explained the purpose of the study, identified our funder, made clear that participation was voluntary, and received verbal consent from each participant. Through these interviews, the authors gained a better understanding of how the policy has been applied, what the implementation process has consisted of, and stakeholders’ perspectives on the reform. Sixteen people were interviewed to contextualize our quantitative findings.
Administrative data. Urban received administrative data from the DPSC on all probation terms served from 2013 to 2019. The data were at the level of individual supervision terms and contained information about people’s demographic characteristics, sentencing, supervision type, revocations, and completion types. Urban worked with DPSC data analysts to understand the data and create variables necessary for analysis.
Policy documents. The Urban and CJI team reviewed Georgia statutes and publicly available reports to understand the details of the policy reforms and the history of justice reform in the state.
Interview data. This information allowed us to develop Georgia-specific interview guides for virtual focus groups and individual interviews with different stakeholders. To determine whom to include in the focus groups, we examined the court structure and determined that district-level focus groups would best allow us to include people from different regions (including urban and rural areas) and counties of different sizes. County use of BIDs and early termination also factored into our selection of sites for focus groups. This allowed us to gain perspectives from frequent and infrequent users of the policies. People participating in the focus groups identified judges for the Urban and CJI team to interview, and we were able to interview two judges. At the beginning of each interview and focus group, we explained the purpose of the study, identified the funder, made clear that participation was voluntary, and received verbal consent from each participant. The information gleaned from the 30 people interviewed in Georgia helped us understand local implementation of the policy reforms and contextualized findings from our quantitative analysis.
Administrative data. Urban received administrative data from DCS on all probation and parole terms for people on supervision from January 2017 through January 2021. For the sentencing analyses, the data were limited to probation terms beginning in or after 2017. The data were at the level of individual supervision terms, but sometimes contained multiple docket terms per person, meaning someone could have multiple dockets open during a single supervision term. The data contained information about demographic characteristics, sentencing, supervision type, risk level, sanctions, violations, eligibility information and petitions related to the policy, and revocations. The research team worked closely with the DCS research team to understand the data structure, variables, and how the policy was implemented in the state’s record management system.
In both states, we processed the administrative data into analysis-ready files. This included structuring the data into a single observation for each unique supervision period for each person and creating variables related to the sentence, the supervision term, and, when available, violations, sanctions, and recidivism. After the data were processed, descriptive analyses were conducted to understand the sample, policy implementation, and trends over time. We were unable to conduct outcome analyses with a quasi-experimental design; the reasons are described for each state below. The data processing and descriptive analyses were conducted in Stata 16.
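The actual processing was done in Stata 16; purely as illustration, the collapse of docket-level records into a single observation per supervision period can be sketched in Python with pandas. All column names and values here are hypothetical, not the states' actual fields.

```python
import pandas as pd

# Hypothetical raw extract: one row per docket record, so a single
# supervision period can appear on multiple rows.
raw = pd.DataFrame({
    "person_id":   [1, 1, 2, 2, 2],
    "term_start":  ["2017-03-01", "2017-03-01", "2018-06-15", "2018-06-15", "2018-06-15"],
    "term_end":    ["2019-02-28", "2019-02-28", "2020-01-10", "2020-01-10", "2020-01-10"],
    "sentence_mo": [24, 24, 36, 36, 36],
    "violation":   [0, 1, 0, 0, 1],
})
for col in ("term_start", "term_end"):
    raw[col] = pd.to_datetime(raw[col])

# Collapse to one observation per unique supervision period per person,
# deriving a term length and an any-violation indicator.
terms = (
    raw.groupby(["person_id", "term_start"], as_index=False)
       .agg(term_end=("term_end", "first"),
            sentence_mo=("sentence_mo", "first"),
            any_violation=("violation", "max"))
)
terms["days_served"] = (terms["term_end"] - terms["term_start"]).dt.days

print(len(terms))  # 2 supervision periods remain
```

The same collapse in Stata would typically use `collapse (max) violation (first) term_end sentence_mo, by(person_id term_start)`.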
Urban conducted original analysis of the data extract. ANOVA, chi-square, and t-tests were conducted to determine the significance of the observed differences between groups pre- and postreform. Owing to the limited time between the implementation of the reform and Urban’s analysis, it was difficult to assess the impact of the reform on time served for eligible people. Further, our eligibility definition was restricted based on the data, defined only as being on supervision for offenses that were nonviolent and non-sex offenses. This definition does not incorporate criminal history in terms of whether the offense was a first, second, or third conviction. The policy may also allow for a few violent offenses, but it was not possible to isolate those specific offenses in the data. Because of these limitations, Urban focused on implementation and preliminary outcomes of the reform, such as overall revocation and successful completion rates. Individual-level analyses of revocations and recidivism were also limited. Given that the reform was implemented in 2017, most people who started supervision after the reform had not had enough follow-up at the time of analysis to observe supervision and recidivism outcomes.
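To make the significance testing concrete, a chi-square test on a pre/post completion table and a two-sample t-test on time served can be sketched with SciPy. The counts and months below are invented for illustration only; they are not the study's results.

```python
from scipy import stats

# Hypothetical 2x2 table: successful completions for pre- and
# post-reform cohorts (rows), completed vs. not completed (columns).
pre_post = [[420, 180],   # pre-reform cohort
            [470, 130]]   # post-reform cohort
chi2, chi_p, dof, _ = stats.chi2_contingency(pre_post)

# Hypothetical months served for small pre/post samples.
pre_months  = [22, 30, 18, 26, 24, 28]
post_months = [16, 20, 14, 22, 18, 19]
t_stat, t_p = stats.ttest_ind(pre_months, post_months)

print(dof)  # a 2x2 table has 1 degree of freedom
```

With more than two comparison groups (e.g., risk levels), `stats.f_oneway` provides the corresponding one-way ANOVA.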
Urban conducted original analysis of the provided data. Chi-square and t-tests were conducted to determine the significance of the observed differences between groups pre- and postreform. Owing to the limited time between the implementation of the reform and Urban’s analysis, it was difficult to assess the impact of the reform on time served for eligible people. BID eligibility data were provided by DCS, and the early termination eligibility criteria included a qualified offense, a sentence of three or more years, a minimum of three years served, and a sentence that started or ended after the reform. We were not able to include whether restitution and conditions requirements were met because of data limitations. We did have information on violations and sanctions, and while they are generally associated with noncompliance, we could not distinguish whether they factored into failing to meet restitution and supervision conditions. Many people who received BIDs and early termination had sanctions, so sanctions could not be used as a proxy for eligibility. Furthermore, very few cases had been granted early termination or BIDs, which prevented an analysis of postsupervision recidivism outcomes given that few cases were available and there had been only a short amount of follow-up time for those cases. Instead, we focused on implementation and preliminary outcomes of the reform—more specifically, on approval outcomes for BIDs and decisionmaking around early termination. Lastly, Georgia had data only for supervision cases from January 2017 onward, so all prereform analyses were limited to the six months before the reform, i.e., January through June 2017.
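The early termination screen described above can be expressed as a simple eligibility flag. This is an illustrative sketch only: the reform date below is a placeholder rather than the statute's actual effective date, and, as in the analysis, restitution and condition compliance are omitted.

```python
from dataclasses import dataclass
from datetime import date

REFORM_DATE = date(2017, 7, 1)  # placeholder effective date, for illustration

@dataclass
class ProbationTerm:
    qualified_offense: bool  # nonviolent/non-sex per the data-driven definition
    sentence_years: float
    years_served: float
    sentence_start: date
    sentence_end: date

def early_termination_eligible(t: ProbationTerm) -> bool:
    """Approximate screen: qualified offense, a sentence of three or
    more years, at least three years served, and a sentence that
    started or ended after the reform date."""
    return (t.qualified_offense
            and t.sentence_years >= 3
            and t.years_served >= 3
            and (t.sentence_start >= REFORM_DATE or t.sentence_end >= REFORM_DATE))

# Example: a five-year term with three years served and a qualifying offense.
term = ProbationTerm(True, 5, 3, date(2016, 1, 1), date(2021, 1, 1))
print(early_termination_eligible(term))  # True
```

In practice this flag would be computed per supervision term in the collapsed analysis file and cross-tabulated against actual petitions and grants.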