
Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes


Sexual Risk Avoidance Education (SRAE) Performance Measures


OMB Information Collection Request

0970-0536





Supporting Statement

Part B


Type of Request: Revision


December 2025








Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers:

Caryn Blitz, Selma Caal



Part B


B1. Objectives

Study Objectives

The objective of the Sexual Risk Avoidance Education (SRAE) Performance Measures is to document how SRAE-funded programs are operationalized in the field and to assess program outcomes. The Family and Youth Services Bureau (FYSB) and the Office of Planning, Research, and Evaluation (OPRE) in the Administration for Children and Families (ACF), at the U.S. Department of Health and Human Services (HHS), seek approval of a temporary extension of currently approved information collection activities and of revisions to be implemented in 2026 for the SRAE Performance Measures. The continued collection of performance measures (PM) data will allow ACF to (1) continue to monitor the extent to which the programs meet SRAE implementation objectives and advance toward expected outcomes; (2) inform program improvement efforts; and (3) update ACF, grant recipients, and others on the program’s status and progress.


Generalizability of Results

This study is intended to present an internally valid description of the SRAE Program, not to promote statistical generalization to other programs or populations. The study will continue to include information on all SRAE grant recipients, subrecipient program providers, and participants who respond to data collection.


Appropriateness of Study Design and Methods for Planned Uses

The SRAE PM are designed to describe the implementation and outcomes of the SRAE Program. The PM data collected through this effort will continue to provide necessary information to ACF and grant recipients to effectively manage the programs. Entry and exit surveys of youth participating in SRAE are necessary to collect information on the demographic and behavioral characteristics of program participants and their perceptions of program effects. Administrative data from grant recipients and their subrecipient program providers are needed to understand the structure and features of SRAE programs, participant numbers, implementation supports, and staff perceptions of quality challenges and needs for technical assistance. Because these are PM, data are required on the universe of grant recipients, programs, and participants. As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions.


B2. Methods and Design

Target Population

The target population for the SRAE PM includes all recipients of SRAE grants (estimated at 190, based on the most recent round of PM data), their subrecipient program providers, and youth participants. The 190 grants include 108 General Departmental SRAE (GDSRAE), 38 State SRAE (SSRAE), and 44 Competitive SRAE (CSRAE) grants. Based on PM data from the most recent round of data collection, the number of subrecipients is estimated at 490 across all grants. Grant recipients are expected to serve an average of about 325,000 participants per year.1 Program participants are youth ages 10–20. The data collection instruments to be used by each target population are as follows:

  • Instrument 1: Participant Entry Survey – youth participants

  • Instrument 2: Participant Exit Survey – youth participants

  • Instrument 3: Performance Reporting System Data Entry Form – grant recipients

  • Instrument 4: Subrecipient Data Collection and Reporting Form – subrecipient program providers

Sampling and Site Selection

The SRAE PM will continue to include all SRAE grant recipients, subrecipient program providers, and participants. ACF will use the PM data to monitor and report on progress in implementing SRAE programs. In addition, the information will be used to support continuous quality improvement by (1) the program office, to improve the SRAE Program overall, and (2) grant recipients, to improve their own program(s). All SRAE grant recipients must be included in the study so that (1) the measures reflect the complete scope of the SRAE Program and (2) the data can be used to promote program improvement among all grant recipients.


B3. Design of Data Collection Instruments

Development of Data Collection Instruments

The SRAE PM and data collection instruments were developed through a deliberative process over two years and subsequently revised after data collection began. Several documents informed the initial development of the data collection instruments, including: (1) data collection instruments used to collect PM information for other adolescent pregnancy prevention grant programs; (2) performance progress reports used to monitor previous FYSB grant programs that promoted refraining from nonmarital sex; and (3) documentation of measures used in other relevant data collections, including survey items from the Youth Risk Behavior Survey, the National Longitudinal Study of Adolescent to Adult Health, and the National Youth in Transition Database. In addition, ACF consulted with FYSB program staff (grant recipients’ project officers), select SRAE grant recipients, and ACF leadership to obtain their feedback on the proposed measures, processes, and instruments. Finally, the participant entry and exit surveys were pretested with youth.


During the initial years of SRAE PM data collection, the instruments were revised several times to (1) address comments from SRAE grant recipients on the data collection instruments, (2) streamline the instruments to ask only those questions necessary to achieve the objectives of the SRAE PM, and (3) include—and later revise—measures related to effects of the COVID-19 pandemic on SRAE program operations. Some of these revisions resulted in the creation of alternate versions of the entry and exit survey instruments to meet the needs of different types of grant recipients and populations of youth.2 These changes were previously reviewed and approved by OMB.


ACF is now requesting a temporary extension of the currently approved instruments, through June 2026, followed by proposed revisions to be implemented in July 2026. The proposed revisions are designed to (1) address feedback from grant recipients to simplify and clarify the participant surveys, (2) align race and ethnicity measures with current OMB guidance, and (3) ensure the measures meet FYSB data needs by removing items related to COVID-19 and other items not widely used and replacing them with other measures of interest. To inform the most recent round of potential improvements to the PM, an ACF contractor gathered input from SRAE and PREP grant recipients, federal project officers (FPOs), and FYSB leadership through listening sessions and written feedback documents. The contractor also conducted cognitive pretests with 33 youth participating in SRAE programs to collect feedback on the proposed revisions to the Participant Entry and Exit Surveys (Instruments 1 and 2) and made additional revisions to address their comments. This pretest effort was approved by OMB under Administration for Children and Families Youth Programs – Youth Participant Entry and Exit Survey Pretest, OMB Control # 0970-0355.3 The cognitive pretest sample included youth ages 11 to 17, both males and females, as well as youth from a mix of racial and ethnic backgrounds.


Each of the four data collection instruments addresses the study objectives described in Section B1 above. Instruments 1 and 2 capture information on the characteristics of the youth participating in the program and their perceptions of program effects, and Instruments 3 and 4 capture information on grant recipients’ and subrecipients’ implementation of SRAE programs. Both types of data will be used to monitor program implementation and outcomes, inform program improvement, and provide status and progress updates.


B4. Collection of Data and Quality Control

Instruments 1–2: Participant Entry and Exit Surveys. As in previous rounds of SRAE PM data collection, each grant recipient and its subrecipients will make decisions regarding procedures for collecting the participant entry and exit surveys. Some grant recipients may elect to work with local evaluators who will administer the PM surveys. Grant recipients and local evaluators may choose to use paper-and-pencil or web-based surveys in group or individual settings. Grant recipients will inform their program participants that participation is voluntary and that they may refuse to answer any or all of the questions in the entry and exit questionnaires.

Instruments 3–4: Performance Reporting System Data Entry Form and Subrecipient Data Collection and Reporting Form. Grant recipients will continue to submit data separately on participant attendance, reach, and dosage. Data on these measures will continue to be collected by subrecipient program providers (Instrument 4). Administrative data on program features and structure, allocation of funds, and staff perceptions of quality challenges will continue to be collected by grant recipients and subrecipients (Instruments 3 and 4). Grant recipients will continue to prepare and submit their final data to ACF through the SRAE Performance Measures Data Portal. The Performance Reporting System Data Entry Form (Instrument 3) lists all of the data elements grant recipients will submit, compiled from their subrecipients.

The timing of participant survey data collections will be customized for each site depending upon the start and end dates of each cohort of participants. Administrative performance measurement data will continue to be submitted to ACF once a year, and participant information will continue to be submitted to ACF twice a year.


Experiences to Date

OMB originally approved the information collection in October 2019, and grant recipients have been collecting SRAE PM data since January 2020, including participant entry and exit surveys since September 2020. Since the initial approval, there have been several rounds of revisions to the measures, as summarized in the previous section. One-year extensions were granted by OMB in January 2024 and December 2024.


In the most recently completed program year (2023–2024), PM data were submitted for 190 grants and 490 program providers, including entry survey data for 187,482 youth participants and exit survey data for 148,203 participants.


Because PM reporting is a requirement of the funding award process, this information collection provides uniform data of great utility to the government. ACF’s contractor will continue to provide training and technical assistance to ensure that grant recipients and program providers understand the measures, instruments, and data collection processes.


B5. Response Rates and Potential Nonresponse Bias

Response Rates

The data collected through the SRAE PM are not designed to produce statistically generalizable findings. Although SRAE grant recipients are required to collect grant-, provider-, and program-level measures and to administer the entry and exit surveys to youth, response to the surveys is at each youth participant’s discretion. Response rates will not be calculated or reported.

As participants will not be randomly sampled and findings are not intended to be representative, nonresponse bias will not be calculated. Respondent demographics for the participant entry and exit surveys will be documented in PM reports and other documents.

Table B5.1 describes the respondents and expected number of responses associated with the PM data collection.

Instruments 1–2: Participant Entry and Exit Surveys. We anticipate that 209,212 youth will complete the participant entry survey each year and that approximately 163,708 participants will complete the participant exit survey each year. These estimates are based on the number of entry and exit surveys submitted by grant recipients in recent years.4

As in earlier rounds of SRAE PM data collection, the number of responses for participant surveys will be maximized by administering entry surveys to all participants at enrollment and exit surveys during final program sessions. Participants who are absent when the exit survey is administered at program exit will be given the survey as soon as possible afterward, either individually or in a small group.

Instruments 3–4: Performance Reporting System Data Entry Form and Subrecipient Data Collection and Reporting Form. Because the collection and submission of PM data is a funding requirement for all SRAE grants, we expect all grant recipients and subrecipients to report PM data.

To reduce grant recipient burden and maximize grant recipient response rates, ACF will continue to provide common data element definitions across program models and to obtain these data in a uniform manner from grant recipients through the SRAE Performance Measures Data Portal, developed for previous SRAE PM data submissions (see Instruments 3–4).

Table B5.1. Annual Respondent Universe and Expected Responses for the Study of Performance Measures

Data Collection | Type of Respondent | Annual Respondent Universe | Total Expected Responses
Instrument 1: Participant Entry Survey | Youth participant | 325,000 | 209,212
Instrument 2: Participant Exit Survey | Youth participant | 325,000 | 163,708
Instrument 3: Performance Reporting System Data Entry Form | Grant Administrator | 190 | 190
Instrument 4: Subrecipient Data Collection and Reporting Form | Subrecipient Administrator | 490 | 490


Nonresponse

As with previous rounds of SRAE PM data, analyses will be conducted based on respondents’ submission of data for a given measure, with no imputation or weight adjustments to address missing data. Because we expect high response rates to all components of the data collection, we do not plan any nonresponse bias analysis.


B6. Production of Estimates and Projections

The PM data will continue to be used primarily by ACF and SRAE grant recipients, but they will also be used to inform other stakeholders. For example, PM data will inform ACF’s annual reporting to OMB on the progress of the SRAE Program, and fact sheets and end-of-cohort reports will be made available to the public.

The PM data will continue to be submitted by all SRAE grant recipients, program providers, and youth participants. The analyses will include computation of statistics such as percentages and means based on the respondents; we will not produce estimates intended to apply to any broader population.


B7. Data Handling and Analysis

Data Handling

PM data will continue to be collected by grant recipients and their subrecipients. In some cases, grant recipients may engage local evaluators to assist them with data collection.

Grant recipients will continue to enter PM data into the SRAE Performance Measures Data Portal maintained by ACF’s contractor. The entry screens of the Portal include a series of automated validity checks to identify some types of errors as the data are entered. Error messages will continue to alert grant recipients to inconsistencies between data elements, values beyond the expected range, and similar issues, and to provide grant recipients an opportunity to correct such errors before the data are submitted. For example, an error message will appear if the number of facilitators reported to have received training or been observed for quality monitoring during the period exceeds the total number of facilitators reported for the period. The system will also continue to conduct automated checks to ensure that the full set of PM is entered (e.g., that the grant recipient entered survey data for each program that served youth).
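
To illustrate the kinds of automated checks described above, the sketch below shows how such validations might be expressed in Python. The field names (facilitators_total, facilitators_trained, facilitators_observed, youth_served, entry_survey_entered) and messages are hypothetical and do not reflect the actual Portal data elements or error text.

def check_facilitator_counts(record):
    """Flag records where trained or observed facilitators exceed the total reported."""
    errors = []
    if record["facilitators_trained"] > record["facilitators_total"]:
        errors.append("Facilitators trained exceeds total facilitators reported for the period.")
    if record["facilitators_observed"] > record["facilitators_total"]:
        errors.append("Facilitators observed for quality monitoring exceeds total facilitators reported.")
    return errors

def check_survey_completeness(programs):
    """Flag programs that served youth but have no entry survey data entered."""
    return [
        f"Program '{p['name']}' served youth but has no entry survey data entered."
        for p in programs
        if p["youth_served"] > 0 and not p["entry_survey_entered"]
    ]

# Example usage with made-up values
record = {"facilitators_total": 10, "facilitators_trained": 12, "facilitators_observed": 8}
programs = [{"name": "Program A", "youth_served": 40, "entry_survey_entered": False}]
print(check_facilitator_counts(record))
print(check_survey_completeness(programs))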

Once the data are submitted, ACF’s contractor will conduct additional quality checks to identify remaining issues, as with previous rounds of SRAE PM data. Cases with unresolved data issues may be omitted from analyses. If suspect data are included in any tabulations, caveats will be included alongside the reported data.


Data Analysis

The contractor will continue to analyze SRAE PM data to generate PM reports for ACF and other audiences. Core analyses will include computing means and sums of continuous numeric measures (such as the number of participants served) and producing frequency distributions of categorical and character variables (such as the program models implemented). Cross-tabulations will be used to explore potential relationships between variables, such as whether perceptions of program effects differ by participants’ age or other characteristics. We will examine changes in the PM data over time. Analyses will continue to be conducted separately for each of the SRAE Program’s three funding streams (GDSRAE, SSRAE, and CSRAE). Where feasible, we will obtain comparable measures for the youth population nationwide for comparison.
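
The core computations described above could be carried out with standard tabulation tools; the sketch below is a minimal illustration using Python and pandas, with hypothetical variable names (funding_stream, participants_served, program_model, age_group, perceived_effect) and made-up values rather than the actual PM data elements.

import pandas as pd

# Hypothetical grant-level PM records for illustration only
pm = pd.DataFrame({
    "funding_stream": ["GDSRAE", "GDSRAE", "SSRAE", "CSRAE"],
    "participants_served": [250, 400, 1200, 300],
    "program_model": ["Model X", "Model Y", "Model X", "Model Z"],
})

# Means and sums of a continuous measure, computed separately by funding stream
summary = pm.groupby("funding_stream")["participants_served"].agg(["mean", "sum"])

# Frequency distribution of a categorical measure
model_counts = pm["program_model"].value_counts()

# Cross-tabulation of hypothetical participant-level responses to explore
# whether perceptions of program effects differ by age group
youth = pd.DataFrame({
    "age_group": ["10-14", "15-17", "15-17", "18-20"],
    "perceived_effect": ["Agree", "Agree", "Disagree", "Agree"],
})
effects_by_age = pd.crosstab(youth["age_group"], youth["perceived_effect"])

print(summary, model_counts, effects_by_age, sep="\n\n")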


Data Use

ACF uses information from the PM to fulfill reporting requirements to OMB concerning the SRAE program. In addition, the PM data will be used to develop end-of-cohort reports and briefs, which will synthesize PM data across years. These reports will include summaries of the data collection and analysis methods as well as appropriate caveats and data limitations. Each year, the contractor will prepare a brief public-facing fact sheet that highlights key findings based on the PM for the previous reporting year.


B8. Contact Person(s)





Attachments

  • Instrument 1a (Current): Participant Entry Survey for high school and older youth

  • Instrument 1a (Implement 2026): Participant Entry Survey for high school and older youth

  • Instrument 1b (Current): Participant Entry Survey for middle school youth

  • Instrument 1b (Implement 2026): Participant Entry Survey for middle school youth

  • Instrument 1c (Current): Participant Entry Survey for high school and older youth in programs with impact evaluations

  • Instrument 1c (Implement 2026): Participant Entry Survey for high school and older youth in programs with impact evaluations

  • Instrument 1d (Current): Participant Entry Survey for middle school youth in programs with impact evaluations

  • Instrument 1d (Implement 2026): Participant Entry Survey for middle school youth in programs with impact evaluations

  • Instrument 2a (Current): Participant Exit Survey for high school and older youth

  • Instrument 2a (Implement 2026): Participant Exit Survey for high school and older youth

  • Instrument 2b (Current): Participant Exit Survey for middle school youth

  • Instrument 2b (Implement 2026): Participant Exit Survey for middle school youth

  • Instrument 3 (Current): Performance Reporting System Data Entry Form

  • Instrument 3 (Implement 2026): Performance Reporting System Data Entry Form

  • Instrument 4 (Current): Subrecipient Data Collection and Reporting Form

  • Instrument 4 (Implement 2026): Subrecipient Data Collection and Reporting Form


1 The number of youth listed as the target population (n = 325,000) reflects the number that grant recipients are expected to serve each year. This differs from the number of youth responses, as noted in the burden table in A12 of SSA, which reflects the number of youth expected to complete the entry and exit surveys each year (based on data reported by grant recipients in recent years).

2 Grant recipients use the original versions of the entry survey (Instrument 1a) and the exit survey (Instrument 2a) with high school and older youth. Alternative versions (Instruments 1b and 2b) were developed for use with middle school youth; these versions exclude some of the more sensitive items. Later, additional alternate versions of the entry survey were developed (Instrument 1c for high school and older youth and Instrument 1d for middle school youth) for use by grant recipients participating in impact evaluation studies; these versions include a limited number of questions to reduce burden, because such grant recipients participate in other studies that often include extensive surveys. A version of the exit survey for programs conducting impact studies previously existed, but its use was discontinued prior to the last OMB approval in 2022.

3 The OMB-approved pretest included both the PREP and SRAE performance measures surveys. Many of the survey questions are the same across the PREP and SRAE entry and exit surveys, allowing overlapping questions to be tested with a larger youth sample (78 youth).

4 In contrast, the number of youth listed as the Annual Respondent Universe in Table B5.1 reflects the number of youth that grant recipients are expected to serve each year, which differs from the estimates of responses to the surveys.

