Surveys and data collections are in flux in almost every respect, and response rates are declining across the federal statistical agencies. Modes of communication are changing, population composition is shifting as new populations of interest emerge, and attitudes toward participation in federal surveys are deteriorating. The National Center for Education Statistics (NCES) charged a panel of technical experts to examine the impact of the current recruitment process and recruitment materials on response rates. In particular, the panel was asked to consider the potential of four avenues for improving response and participation rates in NCES studies and surveys:
- Design of a recruitment strategy and monitoring its implementation;
- Materials design and text;
- Special issues in recruitment of hard-to-reach populations; and
- Overall strategy to address recruitment at all levels from administrative decision-makers to participants.
The panel met via teleconference and held an in-person meeting at NCES on 4-5 January 2018.
- Recruitment should be planned holistically with coherent elements integrated into a process that can be monitored and managed in real time.
- Recruitment strategies and materials should be developed to communicate effectively with the specific target population(s), based on understanding what motivates response from that population.
- Language should be direct and clear, written specifically for the target audience (i.e., not a reworking of language drafted for OMB); text should be limited to the essential communication, with required caveats and government terminology set apart and placed after that essential communication.
- NCES should take ownership of the recruitment process design, materials development, monitoring and analysis of effectiveness whether work is done in-house or is delegated to a contractor.
1. In development, shift attention from individual items to the full sequence of materials, the context, and the role of each item in that context. Make strategic decisions about modes of contact and modes of response.
2. Once in the implementation phase, monitor data collection assiduously to assess the efficacy of the messaging campaign and make adjustments as needed. Focus on whether recipients open the materials and act on them.
3. Engage professional expertise (within NCES and/or from contractor) in materials development, testing and monitoring. Allow time for adequate testing and consequent revision of materials.
4. Language throughout all text (materials and website information) must be intelligible to the target population, i.e., written in plain and compelling language, and the actions to be taken should be immediately clear.
5. NCES studies should prominently display "NCES" (or "National Center for Education Statistics") on all materials and should also make direct links to the NCES website prominent. This branding should be universal for all NCES studies and surveys; specifically, it should include NAEP.
6. Standards and expectations for new recruitment materials need to be spelled out both for NCES staff and for contractors (and included in the RFP), especially to avoid recycling materials that may have been developed for a different communication or a different mode of contact.
In addition to these specific findings, the panel is concerned that NCES faces two structural challenges in effecting the recommended changes. First, because NCES studies are not synchronized, NCES staff are grappling at any given time with different crucial issues arising at different stages of survey development, implementation, and data assembly; hence there is no natural time for staff to share knowledge about any particular stage. Second, the in-house technical expertise needed to sustain best practices is limited. While contractors do possess this expertise, a conflict of interest arises when the investment required for innovation is not in the contractor's own interest. The panel notes that NCES leadership is well aware of this issue and has been proactive in encouraging innovation.
Additional specifically focused recommendations are included in the final section of this report.
Edward Blair, Michael J. Cemo Professor of Marketing & Entrepreneurship; Chair of the Department of Marketing & Entrepreneurship, C. T. Bauer College of Business, University of Houston
Don A. Dillman, Regents Professor, Department of Sociology; Deputy Director for Research and Development, Social and Economic Sciences Research Center, Washington State University
Rachel Horwitz, Survey Methodologist, U.S. Census Bureau
Jennifer Godinez, Associate Director, Minnesota Education Equity Partnership (MnEEP)
Kristen Olson, Leland J. and Dorothy H. Olson Associate Professor of Sociology, University of Nebraska-Lincoln
Darby Steiger, Senior Survey Methodologist, Westat
Nell Sedransk, Director, National Institute of Statistical Sciences