
2022 Collaborative Midterm Survey

The 2022 CMS aims to understand the social and political climate, promote innovation, and inform the future of survey research. 

  • Lorrie Frasure, Mark Lopez, Jennifer Agiesta, Nate Cohn and Rahsaan Maxwell at the Hackathon.

Data Viz Tool

 

 

 

Data Innovation Award

Collaborative Midterm Survey 2024 Data Innovation Award

Nominate Here

The Collaborative Midterm Survey is pleased to introduce the CMS Data Innovation Award. This award recognizes innovative uses of CMS data by students and early-career professionals (within six years of their highest degree). Recipients will receive $750.

We are excited to recognize innovative uses of CMS data. This could include data used in a conference presentation, academic article, op-ed, dissertation/thesis, or other application of the data.

Innovation is broadly defined. Innovations can include: 

  • A novel insight from the data
  • An analysis of CMS data alongside other data
  • A creative data visualization
  • Use of the data to inform policy, benefit the public, support a community partner, or advance an entrepreneurial project
  • An enhancement of statistical or survey methods

We expect to award up to 10 CMS Data Innovation Awards in 2024. We encourage self-nominations and nominations of others. Nominations can be submitted at any time and are reviewed on a rolling basis. Nominate Here

 

Frequently Asked Questions:


Where are the submission guidelines?
Submission guidelines are detailed on the application itself. You will be asked to provide the title of your project, where it was presented/published, a brief description of how the data were used (250 words max), and a summary of key findings and/or innovations related to CMS data (300 words max). You will also be asked to upload your paper, presentation, or other related product on the application. We have no specific formatting guidelines for the product you submit. Please feel free to follow the formatting guidelines of the journal, conference, news outlet, etc. in which you intend to publish your work.


Is there a deadline? 
There is no deadline; submissions are reviewed on a rolling basis. Once you feel your submission is ready, feel free to submit at any time.


Can I collaborate with colleagues? 
Definitely! We fully welcome collaborative projects. Please list all authors involved in the project on the application.


Can I submit a working paper?
Yes. We welcome any type of submission that showcases CMS data in a public forum, such as conference presentations, academic articles, op-eds, dissertations/theses, and similar outlets of public dissemination. This provides flexibility to submit a paper for the award before formal publication.

Where can I find more information about accessing the data? 
An informational webinar covering the methodologies, results, and details of the 2022 CMS is available here.
 


Thought Summit

Thought Summit on The Future of Survey Science


Government, industry, and academia depend on surveys more than ever before. However, shifting social behaviors, technology, and public trust mean that the accuracy and reliability of surveys are in flux. This Thought Summit brings together experts in survey research with connections to large NSF-funded surveys, data science, and AI. Our goal is to identify strategies and infrastructure to support the most accurate and cost-effective surveys. To do this, the Thought Summit will focus on two overlapping areas.

 1.) Identifying ways government-funded surveys can collaborate to gain efficiencies and meet survey research goals.

2.) Identifying ways AI and data science can enhance these collaborations and inform strategies to solve contemporary challenges faced by survey research.

We are inviting up to five PhD students to attend the Thought Summit fully funded. More information about the call and how to apply can be found here. The deadline to apply is May 31, 2024.

The Thought Summit will be held between September 22 and September 25, 2024. The summit is supported by the Cornell Center for Data Science for Enterprise and Society.

More details to come. 

Hackathon & Data Launch


Participants joined us in-person or virtually on January 20, 2023 at Cornell Tech in NYC as experts from industry, academia, and media offered their perspectives on the innovations, methods, and data from the 2022 Collaborative Midterm Survey. We called this a hackathon because all panelists received the survey data in advance and incorporated their analyses into the presentations and comments. All data and methods were made publicly available at the time of the event.

The Collaborative Midterm Survey partnered with The Graduate Roosevelt Island Hotel to provide accommodations for conference attendees.

Panelists 

Panelist Headshots

 

Agenda


  • 9:15-10:00am: Continental Breakfast

  • 10:00-10:30am: Welcome and Vision of the 2022 Collaborative Midterm Survey

    • Peter Enns (Cornell University)

  • 10:35am-11:50am: Data Deep Dive Panel: Insights from the multi-mode approach of the Collaborative Midterm Survey

    • Sunshine Hillygus (Duke University), Cindy Kam (Vanderbilt University), G. Elliott Morris (The Economist), David Rothschild (Microsoft Research), Chaired by Jamie Druckman (Northwestern University) 

  • 11:50am-1:00pm: Mentorship Lunch

  • 1:00pm-2:00pm: Data Collection Panel: Collaborating to increase survey accuracy and representation

    • Kristen Conrad and Mickey Jackson (SSRS), Cory Manento (Gradient), Kevin Collins (Survey160), Julianna Pacheco, Caroline Tolbert (University of Iowa), Chaired by Juliana Horowitz (Pew Research Center) 

  • 2:05-3:20pm: Election Roundtable: Insights from the Collaborative Midterm Survey

    • Jennifer Agiesta (CNN), Nate Cohn (New York Times), Mark Lopez (Pew Research Center), Rahsaan Maxwell (NYU), Moderated by Lorrie Frasure (UCLA) 

  • 3:20-3:35pm: Coffee

  • 3:35-4:45pm: Roundtable on Innovation in Public Opinion and Survey Research

    • Project Senior Advisors: Jamie Druckman (Northwestern), Sergio Garcia-Rios (UT Austin & Univision News), Juliana Horowitz (Pew Research Center), David Wilson (UC Berkeley), Moderated by Colleen Barry (Cornell)

  • 4:50-5:00pm: Concluding Remarks

    • Jonathon Schuldt (Cornell University)

  • 5:00-6:00pm: Hosted Reception (Drinks and Hors d’Oeuvres) 

Graduate Student Travel Grant

With generous support from the National Science Foundation, we had a limited number of travel grants for graduate students to attend the Hackathon and Data Launch. Individuals selected through the competitive process received two nights of accommodation at The Graduate Hotel on Roosevelt Island and were reimbursed for up to $1,000 in travel costs. In addition to presentations from leading media, industry, and academic experts, the Hackathon also included a mentorship lunch for all interested students who attended in person.

Congratulations to the following students for being selected! 

Recordings of the Collaborative Midterm Survey Hackathon & Data Launch are now publicly available!

The Hackathon & Data Launch's virtual component was recorded via Zoom. To view each panel recording, click here.

Photos from the Collaborative Midterm Survey Hackathon and Data Launch on January 20th, 2023

Hackathon Collage

 

Meet the Data Teams

Following a competitive open call for data collection proposals, our principal investigators and senior advisors conducted a thorough review of all applications. From this broad, international search, the following three exceptional proposals were chosen to be part of the 2022 Collaborative Midterm Survey.

SSRS

SSRS is a full-service survey and market research firm known for innovative methodologies and optimized research designs.

This team planned to poll 3,100 respondents using a probability-based sample and 3,300 respondents using a non-probability sample from their panel partners. Their sample sizes were chosen to ensure an adequate number of completes in CA, FL, and WI, given those states' electoral importance, and to provide sufficient statistical power for analyses.

The probability-based sample included their in-house, TCPA-compliant probability panel, supplemented by an address-based sample (ABS) for WI. The non-probability sample came from their trusted panel partners, some of which are among the largest and highest-quality first-party non-probability panel providers in the world.

The full team includes: 

Gradient Metrics & Survey160

Gradient Metrics is a team of data scientists, programmers, and researchers who bring together traditional market research and data science to build statistical models. Survey160 is a software product designed specifically to conduct surveys via text-message (SMS) conversation.

This team planned to poll 1,600 respondents nationally and another 1,600 respondents in each of the three key states via probability-based sampling. They obtained the probability-based sample through a mixed-mode approach. The vast majority of their sample (N=5,500) came from SMS-to-web responses, while the remaining 900 came from mail-to-web (one-third of which were derived from address-based sampling, with the rest coming from registration-based sampling).

Their sampling methods allowed them to gather a larger probability-based sample at relatively low cost; in particular, the SMS-to-web format offered a cost-effective way to gather a random sample. By avoiding non-probability sources entirely, they aimed to maximize the sample in each of the three target populations.
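As a quick arithmetic check, the allocation described above can be sketched as follows (the numbers come from the text; the variable names are illustrative, not from the project's own materials):

```python
# Planned allocation for the Gradient Metrics & Survey160 team,
# using the figures quoted above (names are illustrative).
national = 1_600
per_state = 1_600
n_states = 3
total_target = national + per_state * n_states   # 6,400 completes

sms_to_web = 5_500
mail_to_web = 900
assert sms_to_web + mail_to_web == total_target  # modes sum to the target

# Mail-to-web split: one-third address-based sampling (ABS),
# the remainder registration-based sampling (RBS).
abs_n = mail_to_web // 3      # 300
rbs_n = mail_to_web - abs_n   # 600
print(total_target, abs_n, rbs_n)  # 6400 300 600
```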

The full team includes: 

University of Iowa 

The Iowa Social Science Research Center (ISRC)  is an interdisciplinary research center at the University of Iowa. The ISRC offers a variety of data collection services including consultation on survey project design and instruments, as well as full-service project management. They consult for clients nationwide on phone, web, mail, and mixed-mode data collection, as well as focus groups, data entry, data analysis, and other services that support researchers. 

This team expected to poll 1,200 respondents via a telephone random-digit-dial (RDD) sample and 5,200 via a web sample. They planned to use the RDD telephone sample to conduct computer-assisted telephone interviews through the ISRC call center.

The full team includes: 

Goals & Innovations

  1. Help Understand the 2022 Midterm Election. From 1958 to 2002, the American National Election Study (ANES) conducted midterm surveys to provide insights into elections, voting behavior, and outcomes. While high quality election surveys can always offer important and unique insights into voter preferences and behaviors, media and campaign effects, and political representation and democratic accountability, understanding these factors in the context of midterm elections has never been more important.
  2. Expand understanding of key segments of the electorate. Surveys are often too small and methodologically opaque to analyze variation within racial, geographic, partisan, and other important groups. The sample size of nearly 20,000 combined with methodological transparency and an emphasis on hard-to-reach populations will allow for analyzing state-level data as well as subgroups of the population. Further, we will partner with other election surveys to include some common questions and demographic variables, allowing the Collaborative Midterm Survey to be merged with these surveys, creating an unprecedented opportunity to understand the preferences, attitudes, and behaviors of groups that are impossible to analyze in traditional surveys.
  3. Promote innovative, collaborative, and cost-effective survey strategies. Most survey projects make decisions about sample size, sample type, and survey provider early in the process based on budget constraints and on what strategies have proved effective in the past. To encourage innovation, we are flipping this model and soliciting proposals that encourage methodological diversity and innovation. The budget will be large enough to allow risks and innovative strategies. At the same time, the competitive proposal process encourages cost effective strategies. To ensure collaboration, up to three proposals will be selected to implement the Collaborative Midterm Survey. Proposals can come from any sector, including researchers, nonprofit organizations, survey organizations, tech firms, media organizations, or teams representing a combination of these or other areas.
  4. Develop a transparent, data-driven, and inclusive framework that allows direct assessment of the advantages and tradeoffs of various survey methods. Declining survey response rates amidst rapidly shifting survey methods and changing social conditions mean it is increasingly difficult to identify the most accurate and cost-effective survey strategies. These challenges are especially important to solve for the many government surveys designed to understand economic and business conditions, health outcomes, crime victimization, and many other areas. The combination of three collaborators that each use multiple methodological strategies to conduct the same survey during the same time period along with rigorous methodological disclosure will allow the 2022 Collaborative Midterm Survey to offer unprecedented insight into the advantages and tradeoffs of various survey methods. Recognizing tradeoffs is important, because it may be that some methods are better for reaching harder-to-reach populations, while other methods yield more accurate national or state-level data. Thus, the 2022 Collaborative Midterm Survey does not imply that a single best approach exists. Rather, the collaborative and multi-method approach is designed to offer a comprehensive and inclusive framework for identifying tradeoffs of various sampling and methodological strategies. The 2022 Collaborative Midterm Survey will include numerous indicators that can be compared to known population benchmarks at the state and national level. We envision this framework being used in future surveys, offering an ongoing transparent, data-driven, and inclusive framework to continually monitor the most effective survey methods for various goals.
  5. Rapid and public dissemination of results. All data and methodological documentation will be made publicly available through the Roper Center for Public Opinion Research and the project website. The data will be made available through an easy-to-use search interface and presented through intuitive data visualizations. Furthermore, a data launch and hackathon will take place on January 20. This event will be livestreamed to encourage broad attendance. Finally, methodological reports and recommendations written by the project team and others will be made accessible through the project website.

Questionnaire

Each of the three survey versions included the same set of approximately 25 questions on vote choice, policy preferences, racial attitudes, and feeling thermometers; approximately 20 standard demographic questions; and 5 to 10 questions matching the wording of other prominent midterm election surveys. The remaining 25 questions differed across surveys (selected teams were invited to provide input on these questions). The median interview time was approximately 20 minutes.

Sample

Each researcher/organization/team conducted at least 6,400 complete interviews (for a total sample size of more than 19,000). We encouraged a hybrid sampling approach with a minimum of 1,200 interviews via a probability-based sample. The probability sample could have used methods such as face-to-face interviews based on random address-based sampling, RDD, a probability-based panel, other approaches, or some combination of these. The remaining minimum of 5,200 could have been probability-based, non-probability-based, or a mix. Samples were allocated to allow for national-level estimates as well as state-level estimates in California, Wisconsin, and Florida. We emphasized these three states because of their electoral importance, the diversity of their populations, and because the size of these states allows for valid inferences given the proposed sample sizes.

Timeline

September 7: Selections announced (up to 3 different researchers/organizations/teams)

September 14: Final questionnaire provided to researchers/organizations/teams

October 26-November 22: Midterm Data Collection

November 23-December 20: Data delivery to Project team

The Roper Center for Public Opinion Research archived and made publicly available all topline and individual-level data.

January 20: Data launch and hackathon at Cornell Tech

Researchers, survey organizations/vendors, and/or individuals, organizations, or teams representing some combination of the above were invited to apply to participate in providing data for the 2022 Collaborative Midterm Survey. 

Each researcher/organization/team received up to $225,000 (a total of $675,000 for the entire survey) to conduct its part of the survey. All data was archived and made publicly available by the Roper Center for Public Opinion Research.

To learn more about the proposal process, view the recording of our informational webinar or visit our FAQ page. You can also email questions to midtermsurvey@cornell.edu

Proposals were evaluated on numerous criteria, including strategies for: 

  • Generating national-level insights and insights related to at least the states of California, Wisconsin, and Florida
  • Reaching traditionally hard-to-reach populations
  • Weighting
  • Quality controls
  • Cost-effectiveness (Up to $225K per proposal)
  • Estimated data delivery time
  • Previous election polling and record of data transparency (if relevant)

Proposals were evaluated anonymously by the full project team (PIs and Senior Advisors).

Partner on Common Questions

Whether or not you were involved with a data collection proposal, one of our goals was to include common questions and demographics with other surveys conducted around the midterm election.

We merged these identically worded questions across surveys to facilitate additional analyses with larger sample sizes.
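For instance, identically worded questions can be pooled across survey files with a simple concatenation. This is a minimal pandas sketch of the idea; the column names and values are hypothetical, not the actual CMS codebook:

```python
import pandas as pd

# Two toy survey files that share identically worded common questions.
# Variable names are hypothetical, not the actual CMS variable names.
cms = pd.DataFrame({
    "vote_choice": ["D", "R", "D"],
    "party_id": [1, 5, 2],
})
cms["source"] = "CMS"

other = pd.DataFrame({
    "vote_choice": ["R", "D"],
    "party_id": [6, 3],
})
other["source"] = "OtherSurvey"

# Stack the files on their shared columns: the pooled data support
# analyses with a larger combined sample while tracking each record's origin.
common = ["source", "vote_choice", "party_id"]
pooled = pd.concat([cms[common], other[common]], ignore_index=True)
print(len(pooled))  # 5
```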

Questions?

View the recording of our informational webinar, visit our FAQ page, or email our team at midtermsurvey@cornell.edu.

See recent media coverage of the Collaborative Midterm Survey in the links below.

Interested in learning more about the Collaborative Midterm Survey or interviewing one of the PIs or Senior Advisors? Contact us at midtermsurvey@cornell.edu.

Hackathon Photos

Tagged Hackathon Photos

 

Media Coverage 

About the Team

The 2022 Collaborative Midterm Survey is led by three PIs at Cornell University, with guidance from four senior advisors.

Principal Investigators

  • Peter K. Enns, PI

    Professor of Government and Public Policy, the Robert S. Harrison Director of the Cornell Center for Social Sciences at Cornell University, and Co-Founder of Verasight

    Headshot of Peter Enns
  • Colleen L. Barry, co-PI

    Dean of the Brooks School of Public Policy at Cornell University

    Headshot of Colleen Barry
  • Jonathon P. Schuldt, co-PI

    Associate Professor of Communication and Executive Director of the Roper Center for Public Opinion Research at Cornell University

    Headshot of Jon Schuldt

Senior Advisors

  • Jamie Druckman

    Payson S. Wild Professor of Political Science at Northwestern University

    Headshot of Jamie Druckman
  • Sergio Garcia-Rios

    Assistant Professor and Associate Director for Research, Center for the Study of Race and Democracy at The University of Texas at Austin and the Election Polling Director at Univision Television Network

    Headshot of Sergio Garcia-Rios
  • Juliana Horowitz

    Associate Director of Research at Pew Research Center

    Headshot of Juliana Horowitz
  • David C. Wilson

    Dean of the Goldman School of Public Policy and Professor of Public Policy at the University of California, Berkeley

    Headshot of David C. Wilson

This project is funded by the National Science Foundation (Award: 2210129) with additional support from:
