
2022 Collaborative Midterm Survey

The 2022 Collaborative Midterm Survey aims to promote innovation, inform the future of collaborative survey research, and deepen understanding of the 2022 midterm election.

  • Collaborative Midterm Survey

    • Deadline has passed

      Researchers, survey organizations and vendors, individuals, and teams representing any combination of the above are invited to apply to provide data for the 2022 Collaborative Midterm Survey.

      Each selected researcher/organization/team will receive up to $225,000 (a total of $675,000 for the entire survey) to conduct their part of the survey. All data will be archived and made publicly available by the Roper Center for Public Opinion Research.

      To learn more before submitting your proposal, view the recording of our informational webinar or visit our FAQ page. You can also email questions to our team.

    • Questionnaire

      Each of the three survey versions will include the same set of approximately 25 questions on vote choice, policy preferences, racial attitudes, and feeling thermometers; approximately 20 existing demographic questions; and 5 to 10 questions that match the wording of other prominent midterm election surveys. The remaining 25 questions will differ across versions (selected teams will be invited to provide input on these 25 questions). The median interview time will be approximately 20 minutes.
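      As a quick arithmetic check, the questionnaire composition described above can be tallied in a few lines of Python. The counts come directly from the paragraph; the variable names are just illustrative labels, not official category names.

```python
# Tally of the questionnaire composition described above. The counts come
# from the announcement; the variable names are illustrative labels only.
common_core = 25        # shared questions: vote choice, policy, racial attitudes, thermometers
demographics = 20       # existing demographic questions
cross_survey_min, cross_survey_max = 5, 10  # items matching other prominent midterm surveys
team_specific = 25      # questions that differ across the three versions

total_min = common_core + demographics + cross_survey_min + team_specific
total_max = common_core + demographics + cross_survey_max + team_specific
print(f"Each version runs roughly {total_min}-{total_max} questions")  # → 75-80
```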


      Each researcher/organization/team will conduct at least 6,400 complete interviews (for a total sample size of more than 19,000). We encourage a hybrid sampling approach with a minimum of 1,200 interviews via a probability-based sample. The probability sample could use methods such as face-to-face interviews based on random address-based sampling, random-digit dialing (RDD), a probability-based panel, other approaches, or some combination of these. The remaining minimum of 5,200 interviews could be probability-based, non-probability-based, or a mix. Samples should be allocated to allow for national-level estimates as well as state-level estimates in California, Wisconsin, and Florida. We emphasize these three states because of their electoral importance, the diversity of their populations, and because their size allows for valid inferences given the proposed sample sizes.
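      The sampling targets above can be verified with back-of-the-envelope arithmetic; this short sketch simply restates the announced minimums (no assumptions beyond the numbers in the paragraph).

```python
# Back-of-the-envelope check on the announced sampling minimums.
teams = 3                     # up to three selected researchers/organizations/teams
per_team_total = 6_400        # minimum complete interviews per team
per_team_probability = 1_200  # minimum interviews from a probability-based sample

# The remainder may be probability-based, non-probability-based, or a mix.
per_team_flexible = per_team_total - per_team_probability
combined_total = teams * per_team_total

print(per_team_flexible)  # → 5200
print(combined_total)     # → 19200, i.e. "more than 19,000" overall
```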

      1. Help Understand the 2022 Midterm Election. From 1958 to 2002, the American National Election Study (ANES) conducted midterm surveys to provide insights into elections, voting behavior, and outcomes. While high-quality election surveys can always offer important and unique insights into voter preferences and behaviors, media and campaign effects, and political representation and democratic accountability, understanding these factors in the context of midterm elections has never been more important.
      2. Expand understanding of key segments of the electorate. Surveys are often too small and methodologically opaque to analyze variation within racial, geographic, partisan, and other important groups. The sample size of nearly 20,000 combined with methodological transparency and an emphasis on hard-to-reach populations will allow for analyzing state-level data as well as subgroups of the population. Further, we will partner with other election surveys to include some common questions and demographic variables, allowing the Collaborative Midterm Survey to be merged with these surveys, creating an unprecedented opportunity to understand the preferences, attitudes, and behaviors of groups that are impossible to analyze in traditional surveys.
      3. Promote innovative, collaborative, and cost-effective survey strategies. Most survey projects make decisions about sample size, sample type, and survey provider early in the process based on budget constraints and on what strategies have proved effective in the past. To encourage innovation, we are flipping this model and soliciting proposals that encourage methodological diversity and innovation. The budget will be large enough to allow for risk-taking and innovative strategies. At the same time, the competitive proposal process encourages cost-effective strategies. To ensure collaboration, up to three proposals will be selected to implement the Collaborative Midterm Survey. Proposals can come from any sector, including researchers, nonprofit organizations, survey organizations, tech firms, media organizations, or teams representing a combination of these or other areas.
      4. Develop a transparent, data-driven, and inclusive framework that allows direct assessment of the advantages and tradeoffs of various survey methods. Declining survey response rates amidst rapidly shifting survey methods and changing social conditions mean it is increasingly difficult to identify the most accurate and cost-effective survey strategies. These challenges are especially important to solve for the many government surveys designed to understand economic and business conditions, health outcomes, crime victimization, and many other areas. The combination of three collaborators that each use multiple methodological strategies to conduct the same survey during the same time period along with rigorous methodological disclosure will allow the 2022 Collaborative Midterm Survey to offer unprecedented insight into the advantages and tradeoffs of various survey methods. Recognizing tradeoffs is important, because it may be that some methods are better for reaching harder-to-reach populations, while other methods yield more accurate national or state-level data. Thus, the 2022 Collaborative Midterm Survey does not imply that a single best approach exists. Rather, the collaborative and multi-method approach is designed to offer a comprehensive and inclusive framework for identifying tradeoffs of various sampling and methodological strategies. The 2022 Collaborative Midterm Survey will include numerous indicators that can be compared to known population benchmarks at the state and national level. We envision this framework being used in future surveys, offering an ongoing transparent, data-driven, and inclusive framework to continually monitor the most effective survey methods for various goals.
      5. Rapid and public dissemination of results. All data and methodological documentation will be made publicly available through the Roper Center for Public Opinion Research and the project website. The data will be made available through an easy-to-use search interface and presented through intuitive data visualizations. Furthermore, a data launch and hackathon will take place on January 20, 2023. This event will be livestreamed to encourage broad attendance. Finally, methodological reports and recommendations written by the project team and others will be made accessible through the project website.
    • Proposals will be evaluated on numerous criteria, including strategies for: 

      • Generating national-level insights as well as state-level insights for at least California, Wisconsin, and Florida
      • Reaching traditionally hard-to-reach populations
      • Weighting
      • Quality controls
      • Cost-effectiveness (up to $225K per proposal)
      • Estimated data delivery time
      • Previous election polling and record of data transparency (if relevant)

      Proposals will be evaluated anonymously by the full project team (PIs and Senior Advisors).

    • Timeline

      September 7: Selections announced (up to 3 different researchers/organizations/teams)

      September 14: Final questionnaire provided to researchers/organizations/teams

      October 26-November 22: Midterm Data Collection

      November 23-December 20: Data delivery to project team

      January 20, 2023: Data launch and hackathon at Cornell Tech

      The Roper Center for Public Opinion Research will archive and make publicly available all topline and individual-level data.

    • Partner on Common Questions

      Whether you are involved with a data collection proposal or not, one of our goals is to include some common questions and demographics with other surveys conducted around the Midterm Election.

      We will merge these identically worded questions across surveys to facilitate additional analyses with larger sample sizes. If you are connected to a survey or surveys that will be conducted around the 2022 Midterm and are interested in partnering on common questions, please let us know.

      Questions about either opportunity?

      View the recording of our informational webinar, visit our FAQ page, or email our team.

  • NYC Hackathon and Data Launch

    • Data Launch

      On January 20, 2023, the Collaborative Midterm Survey Team is hosting a data launch and hackathon at Cornell Tech to showcase the survey results from this project. 


      Venue: Cornell Tech Verizon Space


    • Hackathon Agenda:

      January 20th, 2023

      • 9:30-10:00am: Continental Breakfast
      • 10:00-10:20am: Welcome and Vision of the 2022 Collaborative Midterm Survey (PIs: Peter Enns, Colleen Barry, and Jon Schuldt)
      • 10:20-11:00am: What We Learned (Project Senior Advisors & Roper Center: Jamie Druckman, David Wilson, Juliana Horowitz, Sergio Garcia-Rios, and Kathleen Weldon)
      • 11:15am-12:30pm: Hackathon 1: Midterm Findings 
      • 12:30-1:30pm: Mentorship Lunch
      • 1:30-2:45pm: Hackathon 2: Data Deep Dive 
      • 3:00-4:15pm: Survey Teams: What We Learned 
      • 4:15-4:35pm: Conclusion
      • 4:35-5:45pm: Reception 

About the Team

The 2022 Collaborative Midterm Survey is led by three PIs at Cornell University and four senior advisors.

Principal Investigators

  • Peter K. Enns, PI

    Professor of Government and Public Policy and the Robert S. Harrison Director of the Cornell Center for Social Sciences at Cornell University

  • Colleen L. Barry, co-PI

    Dean of the Brooks School of Public Policy at Cornell University

  • Jonathon P. Schuldt, co-PI

    Associate Professor of Communication and Executive Director of the Roper Center for Public Opinion Research at Cornell University


Senior Advisors

  • Jamie Druckman

    Payson S. Wild Professor of Political Science at Northwestern University

  • Sergio Garcia-Rios

    Assistant Professor and Associate Director for Research, Center for the Study of Race and Democracy at The University of Texas at Austin and the Election Polling Director at Univision Television Network

  • Juliana Horowitz

    Associate Director of Research at Pew Research Center

  • David C. Wilson

    Dean of the Goldman School and Professor of Public Policy at University of California–Berkeley


This project is funded by the National Science Foundation (Award: 2210129) with additional support from:

  • We'd love to hear your ideas, suggestions, or questions!
