
Integrated Postsecondary Education Data System (IPEDS) 2025-26 through 2026-27


Supporting Statement Part B




OMB No. 1850-0582 v. 33





Submitted by:

National Center for Education Statistics (NCES)

Institute of Education Sciences

U.S. Department of Education




February 2024

Revised May 2025

Revised December 2025

SECTION B. Description of Statistical Methodology


B.1. Respondent Universe


In 2022-23, IPEDS collected data from 5,983 Title IV postsecondary institutions in the United States and other U.S. jurisdictions. By law, all Title IV institutions are required to respond to IPEDS (Section 490 of the Higher Education Amendments of 1992 [P.L. 102-325]). IPEDS allows other (non-Title IV) institutions to participate on a voluntary basis; approximately 200 non-Title IV institutions elect to respond each year. Institution closures and mergers have led to a decrease in the number of institutions in the IPEDS universe over the past few years. Because of these fluctuations, combined with the addition of new institutions, NCES uses rounded estimates for the number of institutions in the respondent burden calculations for the upcoming years (an estimated 6,000 Title IV institutions plus 200 non-Title IV institutions, for a total of 6,200 institutions estimated to submit IPEDS data during the 2024-25 through 2026-27 IPEDS data collections).


Table 1 provides the number of institutions that submitted data during the 2022-23 IPEDS data collection and the number of institutions estimated to submit data during the 2024-25 through 2026-27 IPEDS data collections, disaggregated by the type of institution (Title IV institutions are disaggregated by highest level of offering: 4-year award or above, 2-year award, less than 2-year award). Note that based on the 2022-23 data collection, NCES has decreased the estimates for the number of institutions that are expected to report to IPEDS in the 2024-25 through 2026-27 data collections.


Table 1. Actual 2022-23 and Estimated 2024-25 through 2026-27 Number of Institutions Submitting IPEDS Data

Institution Type              2022-23 Institution Counts*    Estimates Used in Burden Calculations for the 2024-25 to 2026-27 Collections
Total                         6,183                          6,115
Title IV institutions         5,983                          5,935
   4-year                     2,757                          2,750
   2-year                     1,569                          1,560
   Less than 2-year           1,657                          1,625
Non-Title IV institutions       200                            180

* For Title IV institutions: U.S. Department of Education, National Center for Education Statistics, IPEDS, Fall 2022 Institutional Characteristics component (provisional data).


NCES expects that, after exemptions, about 1,660 four-year institutions will submit data for the new Admissions and Consumer Transparency Supplement (ACTS) survey component during each of the 2025-26 and 2026-27 IPEDS data collections. Only 4-year public, private for-profit, and private not-for-profit institutions that are baccalaureate granting or above are eligible to complete the new component. Eligible institutions may be exempted from completing ACTS in a survey year if they (1) do not award non-need-based aid and (2) were open admission or admitted 100 percent of applicants in that year.


Institutions that are excluded from the ACTS universe (e.g., 2-year colleges and 4-year institutions that are not primarily baccalaureate degree granting or above) will not be shown the ACTS component as one of the required Winter surveys when accessing the data collection system site. To support potentially in-scope institutions in determining their eligibility, those institutions will be shown the ACTS component, along with a screener that evaluates their eligibility to submit in each survey year. Questions regarding eligibility can be directed to the IPEDS Help Desk via the contact information provided on the data collection system site.
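As a simplified illustration of how the eligibility and exemption rules described above combine, the sketch below applies them to hypothetical institution-level flags. The flag names and this condensed logic are assumptions for illustration only and are not the actual screener implemented in the collection system.

```python
# Illustrative only: flag names and this simplified logic are hypothetical;
# the actual ACTS screener wording and rules are defined in the IPEDS
# data collection system.
def acts_required(is_four_year_bachelors_or_above: bool,
                  awards_non_need_based_aid: bool,
                  open_admission_or_admits_all: bool) -> bool:
    """Return True if an institution must complete ACTS for the survey year."""
    if not is_four_year_bachelors_or_above:
        return False  # out of scope: never shown the ACTS component
    if (not awards_non_need_based_aid) and open_admission_or_admits_all:
        return False  # in scope but exempt for this survey year
    return True

# Example: a selective bachelor's-granting institution that awards merit aid.
print(acts_required(True,
                    awards_non_need_based_aid=True,
                    open_admission_or_admits_all=False))  # True
```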


Table 2 provides the number of experienced and new keyholders that submitted data for a given IPEDS component during the 2022-23 IPEDS data collection, disaggregated by the type of institution. The experienced vs. new keyholder designation is drawn directly from self-reported data in the data collection system, where users indicate at registration whether they are submitting data for the first time.


Table 2. 2022-23 Counts of Experienced and New Keyholders Submitting IPEDS Data, by Institution Type and IPEDS Component

Survey         Total                    4-year institutions      2-year institutions      Less than 2-year institutions
component      Experienced    New       Experienced    New       Experienced    New       Experienced    New
IC             4,515          1,505     1,913          763       1,155          424       1,447          318
C              4,515          1,505     2,121          796       1,109          386       1,285          323
E12            4,507          1,502     1,906          820       1,102          389       1,499          293
SFA            4,416          1,472     1,889          779       1,093          386       1,434          307
OM             2,742          914       1,847          616       895            298       0              0
GR             4,019          1,339     1,611          663       1,031          396       1,377          280
GR200          3,762          1,254     1,403          593       1,017          383       1,342          278
ADM            1,496          499       1,339          443       94             20        63             36
EF             4,487          1,495     1,952          789       1,083          393       1,452          313
F              4,375          1,458     1,935          760       1,162          415       1,278          283
HR             4,484          1,494     1,976          779       1,118          396       1,390          319
AL             2,811          937       1,954          675       857            262       0              0

* Note: These counts do not match any published numbers because they include the non-Title IV institutions that voluntarily submit data to IPEDS.


Table 3 provides the actual response rates, by survey component and type of institution, for the 2022-23 IPEDS data collection. Because IPEDS is a mandated federal data collection and institutions can be fined for non-response, response rates for all components approach 100 percent.


Table 3. IPEDS 2022-23 Title IV Institutions Response Rates, by Institution Type and IPEDS Component

Survey component    4-year institutions    2-year institutions    Less than 2-year institutions
IC                  100.00%                100.00%                100.00%
C                   100.00%                99.94%                 99.82%
E12                 99.42%                 99.94%                 99.82%
SFA                 100.00%                99.94%                 99.88%
OM                  100.00%                100.00%                N/A
GR                  99.96%                 100.00%                99.81%
GR200               100.00%                100.00%                99.93%
ADM                 100.00%                99.84%                 100.00%
EF                  100.00%                99.94%                 99.76%
F                   99.96%                 99.87%                 99.82%
HR                  99.93%                 100.00%                99.88%
AL                  99.96%                 100.00%                N/A



B.2. Statistical Methodology


No sampling is utilized for any of the IPEDS survey components. Because of the institutional compliance requirements outlined in Part A, sections A.1 and A.2, of this submission, and based on extensive discussions at IPEDS Technical Review Panel meetings, with other offices within the Department of Education (including the Office for Civil Rights, the Office of Postsecondary Education, the Office of Federal Student Aid, and the Office of Vocational and Adult Education), and with other federal agencies such as the Census Bureau, the Bureau of Economic Analysis (BEA), and the U.S. Equal Employment Opportunity Commission (EEOC), IPEDS must collect data from the universe of Title IV institutions.


Methods specifically applicable to the ACTS component. Data collected as part of the ACTS component will be subject to statistical disclosure limitation (SDL) techniques. IPEDS data are not collected under a pledge of confidentiality. However, both NCES and its parent agency, the Institute of Education Sciences (IES), are required to “protect the confidentiality of persons in the collection, reporting, and publication” of data (see section 183 of the Education Sciences Reform Act of 2002 [ESRA]).

ACTS data are reported to IPEDS in the aggregate, with the fundamental unit of aggregation being a race/ethnicity × sex pair. Additional disaggregations vary, including by such characteristics as admissions test scores or family income. Although the general structure of the ACTS data is known a priori, the contents of an individual institution’s submission cannot be known in advance. Uncertainty arises both from the availability of data at the institution level (e.g., some institutions may not collect admissions test scores, precluding disaggregation by a derived variable) and from the distribution of students across the various reporting categories. As such, it is prudent to prepare for the circumstance in which small cell sizes may inadvertently create the opportunity for the identification of an individual and their circumstances. SDL techniques, as well as other approaches to reporting and dissemination, can mitigate that risk.


Prior to any ACTS-related reporting or dissemination (including the release of any ACTS data), the ACTS data will be reviewed by the IES Disclosure Review Board (DRB). The IES DRB, which includes representation from IES’s Office of Science, NCES, and the National Center for Education Evaluation (NCEE), routinely reviews record-level data arising from IES studies. The purpose of the DRB is to recommend or review SDL approaches to mitigate the risk associated with the release of data arising from an IES collection. The DRB also reviews the efficacy of SDL approaches after they are implemented to ensure that data utility and respondent confidentiality are optimized. The DRB may recommend or require that a variety of SDL approaches be applied to the ACTS data prior to release, including coarsening, top or bottom coding, or swapping.
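For illustration, the sketch below shows one way a coarsening rule of the kind the DRB might recommend could operate on aggregated race/ethnicity × sex cells: cells below a minimum size are pooled into a single category so that no small cell is published. The minimum cell size, labels, and function name are hypothetical and do not represent NCES or DRB policy.

```python
# Illustrative only: the minimum cell size, labels, and function name are
# hypothetical and do not represent NCES or DRB policy.
MIN_CELL_SIZE = 5

def coarsen_small_cells(cell_counts, min_cell=MIN_CELL_SIZE):
    """Collapse race/ethnicity x sex cells below a threshold into a single
    pooled cell, preserving the overall total (one form of coarsening)."""
    protected = {}
    pooled = 0
    for (race_ethnicity, sex), count in cell_counts.items():
        if count < min_cell:
            pooled += count  # fold small cells together
        else:
            protected[(race_ethnicity, sex)] = count
    if pooled:
        protected[("Suppressed/other", "All")] = pooled
    return protected

# Example: the two small cells are pooled so no published cell falls below 5.
example = {
    ("Hispanic or Latino", "Men"): 42,
    ("Hispanic or Latino", "Women"): 3,
    ("Asian", "Men"): 2,
    ("Asian", "Women"): 18,
}
print(coarsen_small_cells(example))
```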


The DRB may also recommend to the NCES Commissioner that ACTS data, even after SDL techniques have been applied, be available only as Restricted Use Files. In that instance, data will only be licensed to qualified researchers who (1) register a research plan with NCES that represents a bona fide statistical purpose, (2) agree to ESRA confidentiality provisions and acknowledge that any attempt on their part to reidentify an individual from the ACTS data is a felony, (3) agree to have any analyses arising from the ACTS data reviewed by IES for disclosure risk prior to sharing them with non-license holders, and (4) receive final approval from the NCES Commissioner to access the data.


Finally, all analyses of ACTS data, including those generated by approved ACTS data licensees, will also be reviewed by IES prior to their public release. The disclosure risk review (DRR) process, overseen by IES’s Office of Science, ensures that analyses (both individually and in combination) and accompanying text (e.g., descriptions of data and data files, discussions of methodological approaches) do not permit the inadvertent identification of individual respondents. Typical remedies arising from the DRR process include mandatory rounding to the nearest appropriate unit, the collapsing of small cells, and the insertion or deletion of language such that key analytic information is preserved but potentially disclosive details are obscured.



B.3. Methods to Maximize Response Rates


IPEDS response rates for institutions receiving federal financial aid are consistently 99.8% and higher. IPEDS targets Title IV institutions (others may respond, but no follow-up is done), and the web-based survey system incorporates an automated e-mail module that generates follow-up e-mails to “keyholders” (individuals appointed by their institutions’ CEOs as responsible for IPEDS data submission). As shown in Table 19 of Part A, section A.16, of this submission, frequent communications occur with the institutions over the course of the data collection to ensure compliance with this statutorily mandated collection. Follow-up e-mails are generated if an institution does not attempt to enter data or if, at two weeks and one week before closeout, the components are not locked. The CEOs of non-responding institutions are also contacted by standard mail, with follow-up phone calls if, two weeks prior to closeout, the institution has not entered any data. New institutions and institutions with new keyholders receive additional telephone and e-mail prompts. This approach has proven to be very successful in past years. In addition, the names of institutions that do not respond to the IPEDS surveys, along with a history of all regular contact with these institutions, are provided to the Office of Federal Student Aid for appropriate action. These methods to maximize response rates will also be applied to the four-year institutions with selective admissions that will be required to respond to the new Admissions and Consumer Transparency Supplement (ACTS).


B.4. Tests of Procedures and Methods


The bulk of the data collection procedures and data items described in this submission have been tested and refined over more than two decades. IPEDS’ web-based data collection method was tested in a successful pilot collection of Institutional Price and Student Financial Aid information in August 1999 and has been in full-scale implementation since the fall of 2000. Similarly, most of the data elements requested for all survey components—except the Admissions and Consumer Transparency Supplement (ACTS)—have already been collected in previous IPEDS surveys, and prior to that, similar data elements had been collected for over 20 years in the Higher Education General Information Survey (HEGIS), the predecessor to IPEDS.


Nonetheless, data quality is an overriding concern that NCES must continue to assess and evaluate. As a result, IPEDS implements a variety of strategies to promote data quality at the point of collection. One approach is to assess relevant data from different IPEDS components and from different survey years to evaluate the consistency and reliability of reported data. These interrelationships among surveys and over time are used to develop the automated tests used to edit each IPEDS data submission. Edit checks currently help to identify potential problems and provide opportunities to correct them early in the data collection. As the number of institutions that automate their responses to IPEDS, rather than manually entering IPEDS data into online forms, increases, it becomes increasingly difficult to validate their responses in real time. In response, NCES has required that schools using automation resolve errors prior to data submission. As a result, NCES has been gathering cleaner data in a timelier fashion. The web-based system still accommodates intermediate reporting units such as community college boards, state university systems offices, and corporate offices.
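As an illustration of the kind of automated edit check described above, the sketch below flags reported values that depart sharply from an institution’s prior-year submission so the keyholder can confirm or correct them. The tolerance and field names are hypothetical and are not the actual IPEDS edit specifications.

```python
# Illustrative only: the 25 percent tolerance and field names are hypothetical
# and do not reflect the actual IPEDS edit specifications.
def flag_year_over_year_changes(current, prior, tolerance=0.25):
    """Return edit-check messages for values that moved more than `tolerance`
    (as a fraction) relative to the prior year's submission."""
    flags = []
    for field, current_value in current.items():
        prior_value = prior.get(field)
        if not prior_value:  # no usable prior-year value to compare against
            continue
        change = abs(current_value - prior_value) / prior_value
        if change > tolerance:
            flags.append(
                f"{field}: reported {current_value} vs. {prior_value} last year "
                f"({change:.0%} change); please confirm or correct."
            )
    return flags

# Example: total enrollment jumped sharply, so the keyholder is asked to confirm.
print(flag_year_over_year_changes(
    current={"total_fall_enrollment": 5200, "total_completions": 910},
    prior={"total_fall_enrollment": 3100, "total_completions": 905},
))
```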


IPEDS also routinely seeks feedback from data providers and users in an effort to promote data quality and reduce burden. NCES routinely revises data collection items, definitions, and instructions based on the recommendations of IPEDS constituents, and following appropriate public comment periods.


The new ACTS survey represents a unique case, given the circumstances of its collection. In this case, NCES has created the data collection items, definitions, and instructions based on the elements outlined in the Secretary of Education’s Directive following the August 7, 2025, Ensuring Transparency in Higher Education Admissions Executive Memorandum. Survey items on the ACTS will be implemented for the first time during the Winter 2025-26 data collection window. Notably, most ACTS items draw upon data elements that are used to respond to other IPEDS surveys (e.g., Admissions, Student Financial Assistance, Cost, Completions, Graduation Rates) as well as those collected in other IES studies, including the National Postsecondary Student Aid Study (NPSAS).


ACTS also uses a novel data collection mechanism, in which (1) record-level data are aggregated and (2) the aggregated data are uploaded into the IPEDS collection system. Although this approach has not been used previously by IPEDS, the process of collecting student-level data via a template file has been used by thousands of colleges—including 4-year, 2-year, and less-than-2-year colleges—across numerous cycles of NPSAS. Through its experience with NPSAS, NCES and the IPEDS data collection contractor have developed expertise in designing easy-to-use templates, thorough documentation, and FAQ text that can support institutions in this type of response. NCES and its contractor also have significant expertise in designing online collection systems that allow for simple uploading of template files, automated error-checking systems with a web interface, and school-facing dashboards that clearly apprise users of the status of their data (e.g., submitted with errors, errors needing resolution, submitted without errors, accepted by RTI).
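As a simplified illustration of this mechanism, the sketch below aggregates hypothetical record-level admissions data into cells, writes them to a template file, and runs a basic pre-upload reconciliation check. The column names, categories, and file layout are assumptions for illustration; the actual ACTS template and upload process are defined by NCES and its contractor.

```python
from collections import Counter
import csv

# Illustrative only: the column names, categories, and CSV layout below are
# hypothetical; the actual ACTS template and upload format are defined by NCES.
def build_aggregate_rows(student_records):
    """Aggregate record-level admissions data into race/ethnicity x sex cells."""
    cells = Counter(
        (rec["race_ethnicity"], rec["sex"]) for rec in student_records
    )
    return [
        {"race_ethnicity": race, "sex": sex, "count": count}
        for (race, sex), count in sorted(cells.items())
    ]

def write_template(rows, path):
    """Write the aggregated cells to a CSV template and return the cell total
    so it can be reconciled against the record-level count before upload."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["race_ethnicity", "sex", "count"])
        writer.writeheader()
        writer.writerows(rows)
    return sum(row["count"] for row in rows)

records = [
    {"race_ethnicity": "White", "sex": "Women"},
    {"race_ethnicity": "White", "sex": "Men"},
    {"race_ethnicity": "White", "sex": "Women"},
]
rows = build_aggregate_rows(records)
assert write_template(rows, "acts_template.csv") == len(records)  # totals reconcile
```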


NCES meets weekly with its IPEDS data collection contractor during collection windows for the purpose of monitoring the quality and progress of the data collection. NCES anticipates maintaining this approach during the introduction of the ACTS survey, increasing its cadence as needed to respond to emergent concerns. Should it become necessary in its inaugural collection, NCES is prepared to extend the ACTS collection window past its currently planned end date (March 18, 2026) so that all parties have ample time to submit high-quality data in response to the requirements of the new component.


B.5. Reviewing Individuals


Listed below are individuals who have reviewed, in whole or in part, the IPEDS surveys, and/or participated in Technical Review Panel (TRP) meetings charged with revising and refining the surveys and data items collected.


Due to its implementation timeline, no TRP will be held to discuss the contents of the ACTS component. As such, the individuals below are not known to have reviewed, in whole or in part, the ACTS component.


Current Representatives from, or on behalf of, the National Center for Education Statistics (as of December 2025)

Matthew Soldner, Acting Commissioner, National Center for Education Statistics

Ross Santy, Chief Data Officer, U.S. Department of Education

Brian Fu, Office of the Chief Data Officer, U.S. Department of Education

Jason Delisle, Chief Economist, U.S. Department of Education

Cody Christensen, Office of the Chief Economist, U.S. Department of Education


Prior Representatives from the National Center for Education Statistics (prior to April 2025)

Aida Ali Akreyi

Samuel Barbett

Elise Christopher

Carrie Clarady

Christopher Cody

Moussa Ezzeddine

Tracy Hunt-White

Tara Lawley

Marie Marcum

Andrew Mary

Audrey Peek

Stacey Peterson

McCall Pitcher

Roman Ruiz

Jie Sun

Kelly Worthington


Representatives from Associations, Postsecondary Institutions/Systems, and Other Federal Offices – TRP 61

Maureen Amos, Northeastern Illinois University

Eric Atchison, Arkansas State University System

Eileen Brennan, Henry Ford College

Bryan Cook, The Association of Public and Land-grant Universities

Mary Ann Coughlin, Springfield College

Bill DeBaun, NCAN

Charlotte Etier, NASFAA

Meredith Fergus, Minnesota Office of Higher Education

Nancy Floyd, Minnesota State Colleges & Universities (MnSCU)

Donyell Francis, Board of Regents of the University System of Georgia

Brian Fu, U.S. Department of Education

Tanya Garcia, Georgetown University Center on Education and the Workforce

Luke Gentala, Liberty University

Emmanual Guillory, UNCF

Eric Hardy, U.S. Department of Education, FSA

Stephen Haworth, Adtalem Global Education

Nicholas Hillman, University of Wisconsin-Madison

Aaron Horn, MHEC

John Ingram, Community College of Allegheny County

Darby Kaikkonen, Washington State Board for Community and Technical Colleges

Christine Keller, Association for Institutional Research

Susan Lounsbury, Southern Regional Education Board (SREB)

Brent Madoo, U.S. Department of Education: Office of the Chief Data Officer

Patrick Perry, California Student Aid Commission

Kent Phillippe, American Association of Community Colleges

Sarah Pingel, Education Commission of the States

Jason Ramirez, National Association of Independent Colleges and Universities

Nerissa Rivera, Duke University

Mary Sommers, University of Nebraska Kearney

Jonathan Turk, American Council on Education (ACE)

Christina Whitfield, State Higher Education Executive Officers (SHEEO)


TRP 64

Eric Atchison, Arkansas State University System

Dianne Barker, National Alliance of Current Enrollment Partnerships (NACEP)

Eileen Brennan, Henry Ford College

Matthew Case, California State University, Office of the Chancellor

Melissa Clinedinst, National Association for College Admission Counseling

Bryan Cook, The Association of Public and Land-grant Universities

Mary Ann Coughlin, Springfield College

Alicia Crouch, Kentucky Community and Technical College System

Michael Flanigan, Virginia Commonwealth University

Nancy Floyd, Minnesota State Colleges and Universities

Kurt Gunnell, Western Governors University

Misty Haskamp, University of Missouri

Christine Keller, Association for Institutional Research

Wendy Kilgore, American Association of Collegiate Registrars and Admissions Officers (AACRAO)

Abby Miller, ASA Research

Joann Moore, ACT, Inc

Kent Phillippe, American Association of Community Colleges

Jason Pontius, Board of Regents State of Iowa

Jason Ramirez, National Association of Independent Colleges and Universities

Ashley Robinson-Spann, College Board

Christina Whitfield, State Higher Education Executive Officers (SHEEO)

Shaun Williams-Wyche, Midwestern Higher Education Compact


TRP 69

Kathryn Akers, Pennsylvania's State System of Higher Education

Eric Atchison, Arkansas State University System

Amy Ballagh, Georgia Southern University

Angella Bell, Board of Regents of University System of Georgia

Matthew Case, California State University, Office of the Chancellor

Nate Clark, Career College of Northern Nevada

Gloria Crisp, Oregon State University

Alicia Crouch, Kentucky Community and Technical College System

Nancy Dugan, Eastern Iowa Community Colleges

John Fink, Community College Research Center, Teachers College, Columbia University

Nancy Floyd, Minnesota State Colleges and Universities

Kurt Gunnell, Western Governors University

Misty Haskamp, University of Missouri

Michael Johnston, Pensacola State College

Jacob Kamer, Tennessee Higher Education Commission

Bryan Kelley, Education Commission of the States

Wendy Kilgore, American Association of Collegiate Registrars and Admissions Officers (AACRAO)

Bao Le, Association of Public and Land Grant Universities

Luis Maldonado, American Association of State Colleges and Universities

Tod Massa, State Council of Higher Education for Virginia

Carolyn Mata, Oglethorpe University

Hironao Okahana, American Council on Education

Kent Phillippe, American Association of Community Colleges

Kristina Powers, Institute for Effectiveness in Higher Education

Elena Quiroz-Livanis, Massachusetts Department of Higher Education

Jason Ramirez, National Association of Independent Colleges and Universities

Tracy Rhoades, University of Phoenix

Mikyung Ryu, National Student Clearinghouse

Bill Schneider, NC Community College System

Colby Spencer Cesaro, Michigan Independent Colleges and Universities

Adam Swanson, Mississippi Gulf Coast Community College

Loralyn Taylor, Ohio University

David Troutman, Texas Higher Education Coordinating Board

Mamie Voight, Institute for Higher Education Policy (IHEP)

Zach Waymer, Higher Learning Commission (HLC)

Christina Whitfield, State Higher Education Executive Officers (SHEEO)

Shaun Williams-Wyche, Midwestern Higher Education Compact




