
2022 Collaborative Midterm Survey

The 2022 CMS aims to understand the social and political climate, promote innovation, and inform the future of survey research. 

  • Lorrie Frasure, Mark Lopez, Jennifer Agiesta, Nate Cohn and Rahsaan Maxwell at the Hackathon.

Data Viz Tool

 

 

 

Data Innovation Award

Collaborative Midterm Survey 2024 Data Innovation Award

Nominate Here

The Collaborative Midterm Survey is pleased to introduce the CMS Data Innovation Award. This award recognizes innovative uses of CMS data by students and early career professionals (within six years of their highest degree). Recipients will receive $750.

We are excited to recognize innovative uses of CMS data. These could include data used in a conference presentation, academic article, op-ed, dissertation/thesis, or other applications of the data.

Innovation is broadly defined. Innovations can include: 

  • A novel insight from the data
  • An analysis of CMS data alongside other data
  • A creative data visualization
  • Use of the data to inform policy, benefit the public, support a community partner, or advance an entrepreneurial project
  • An enhancement of statistical or survey methods

We expect to award up to 10 CMS Data Innovation Awards in 2024. We encourage self-nominations and nominations of others. Nominations can be submitted at any time and are reviewed on a rolling basis. Nominate Here

 

Frequently Asked Questions:


Where are the submission guidelines?
Submission guidelines are detailed on the application itself. You will be asked to provide the title of your project, where it was presented/published, a brief description of how the data were used (250 words max), and a summary of key findings and/or innovations related to CMS data (300 words max). You will also be asked to upload your paper, presentation, or other related product with the application. We have no specific formatting requirements for the product you submit; feel free to follow the formatting guidelines of the journal, conference, news outlet, etc. in which you intend to publish your work.


Is there a deadline? 
There is no deadline. Submissions are continuously reviewed on a rolling basis. Once you feel that your submission is ready, feel free to submit at any time.


Can I collaborate with colleagues? 
Definitely! We fully welcome collaborative projects. Please be sure to list all authors involved in the project on the application.


Can I submit a working paper?
Yes. We welcome any type of submission that showcases CMS data in a public forum, such as conference presentations, academic articles, op-eds, dissertations/theses, and similar outlets of public dissemination. This provides flexibility to submit a paper for the award before formal publication.

Where can I find more information about accessing the data? 
We have an informational webinar covering the methodologies, results, and details of the 2022 CMS here.
 

Spring 2024 Awardees 

Congratulations to Eveline Dowling, a political science doctoral graduate from the University of California, Davis. Dowling used CMS feeling thermometer data in her dissertation chapter titled "Independents, Leaners, and Identity: Affective Polarization and Nonpartisans in the United States" to measure potential in-group favoritism among independents. This project was also presented at the Midwest Political Science Association in April 2023 in Chicago, IL.

Graphic of Eveline Dowling

 

 

Coll and Torres Graphic

Congratulations to Joseph Coll, an assistant professor of political science at The College of Wooster, and Rachel Torres, an assistant professor of political science at the University of Nevada, Las Vegas. They used CMS data in an academic paper titled "The Unqualified Voter: Racial Animus in Support for Voter Qualifications" to examine whether racial animus influences support for voter qualifications among white Americans.

 

 

Congratulations to Katie Worrall, a recent Master's graduate of the Political Science Department at the University of Illinois, Urbana-Champaign, and now a Master of Social Work research student at the University of Nevada, Reno. Worrall used CMS data in a conference presentation titled "Policing Protests: Does Repression, Surveillance, and Targeting of Civil Rights Protesters Cause Chilling Effects in the United States?" to create composite outcome variables measuring chilling effects, including political participation, trust, and perceptions of democracy. This work was presented at the Midwest Political Science Association in Chicago, IL, in April 2024.

Worrall Graphic

 

Thought Summit

Thought Summit on The Future of Survey Science: Sept. 2024


Government, industry, and academia depend on surveys more than ever before. However, shifting social behaviors, technology, and public trust mean that the accuracy and reliability of surveys are in flux. This Thought Summit brings together experts in survey research with connections to large NSF-funded surveys, data science, and AI. Our goal is to identify strategies and infrastructure to support the most accurate and cost-effective surveys. To do this, the Thought Summit will focus on two overlapping areas.

1.) Identifying ways government-funded surveys can collaborate to gain efficiencies and meet survey research goals.

2.) Identifying ways AI and data science can enhance these collaborations and inform strategies to solve contemporary challenges faced by survey research.

Cornell faculty, staff, and students are invited to attend all sessions.

To further promote young talent and ensure the participation of emerging scholars, funding enabled six PhD students, selected through an open call for applications, to attend the Summit.

The Summit is supported by the Cornell Center for Data Science for Enterprise and Society, the Cornell Center for Social Sciences, and the National Science Foundation (Award: 2431915).

Organizers:

Colleen Barry
Inaugural Dean, Brooks School of Public Policy

Peter Enns
Professor, Department of Government & Brooks School of Public Policy; Robert S. Harrison Director, Cornell Center for Social Sciences; Co-founder, Verasight.io

Thorsten Joachims
Professor, Department of Computer Science & Department of Information Science; Associate Dean for Research, Bowers College of Information Science

Jonathon P. Schuldt
Professor, Department of Communication; Executive Director of the Roper Center for Public Opinion Research

Monday 9/23: Survey Challenges and Opportunities


9:00am – Continental Breakfast (429 ILR Conference Center, <5min. walk from Statler Hotel)

9:30am – Session 1 (423 ILR Conference Center)

9:30-9:45: Welcome
9:45-10:45am – Doug Rivers (Stanford): Challenges (and perhaps solutions) facing surveys

10:45 – Break (15 minutes)

11:00am – Session 2: NSF Surveys: Overview, Challenges, and Opportunities (423 ILR Conference Center)

Moderator: Colleen Barry (Cornell)

ANES: Nick Valentino (University of Michigan) and Sunshine Hillygus (Duke)

GSS: René Bautista (NORC) and Pamela Herd (Georgetown)

PSID: Esther Friedman (University of Michigan)

CES: Jeremy Pope (BYU)

CMPS: Lorrie Frasure (UCLA) and Natalie Masuoka (UCLA)

CHIP50: David Lazer (Northeastern) and Jamie Druckman (Rochester)

CMS: Peter Enns (Cornell) and Jon Schuldt (Cornell)

TESS: Jamie Druckman (Rochester) and Maureen Craig (Duke)

Presenters will share one challenge or opportunity related to their NSF-funded survey. Short project overviews (3 slides) will be circulated to all attendees in advance.

12:30pm – Lunch Break

2:15pm – Session 3: Planning Session on Collaboration, Innovation, and Infrastructure (423 ILR Conference Center)

Moderator: David Wilson (Berkeley)

Conversation Starters: Frauke Kreuter (University of Maryland) and David Lazer (Northeastern)

Our vision is to generate ideas for collaboration, innovation, and infrastructure that would benefit existing and future government supported surveys as well as survey practitioners in business, research, and public policy more generally.

3:30  – Break (15 minutes)

3:45-5:15pm – Session 4: PhD Student Presentations (423 ILR Conference Center)

Moderators: Brady West (University of Michigan) and Cameron McPhee (SSRS)

Alice Malmberg (UC Davis)

Jennifer Lin (Northwestern)

Saskia Bartholomäus (GESIS)

Brianna Zichettella (University of Michigan)

Isabela Coelho (University of Maryland)

Hansol Lee (Stanford University)

10-minute presentations followed by Q&A. We thank our “presentation consultants” for connecting with these PhD students in advance of the Summit: Trent Buskirk (Old Dominion University), Sunshine Hillygus (Duke), Natalie Masuoka (UCLA), Jeremy Pope (BYU), Melissa Sands (LSE), and Nick Valentino (Michigan).

Tuesday 9/24: The Future of Survey Research


9:00am – Continental Breakfast (429 ILR Conference Center, <5min. walk from Statler Hotel)

9:30am – Session 1: AI and Survey Research (423 ILR Conference Center)

Celestine Mendler-Dünner (Tübingen AI Center and Max Planck Institute for Intelligent Systems)

10:30am – Break (15 minutes)

10:45am – Session 2: AI and other Innovations in Survey Research (423 ILR Conference Center)

Moderator: Thorsten Joachims (Cornell)

Byungkyu Lee (NYU)

Neil Malhotra (Stanford)

Paige Bollen (Ohio State) and Melissa Sands (LSE)

David Rothschild (Microsoft Research)

15-minute presentations followed by Q&A

12:15 – Break for Lunch, Reconvene on Wednesday

Wednesday 9/25: Takeaways and Action Items


9:00am – Continental Breakfast (Taylor Room, Statler Hotel, 2nd Floor)

9:30am – Session 1: Opportunities (Taylor Room, Statler Hotel, 2nd Floor)
danah boyd (Microsoft Research)

10:30 – Break (15 minutes)

10:45am – Final Session: Takeaways and Action Items (Taylor Room, Statler Hotel, 2nd Floor)

Moderator: Neil Lewis, Jr. (Cornell)

Beth McGinty (Weill Cornell Medicine)

Lee Walker (NSF)

Jamie Druckman (Rochester)

René Bautista (NORC)

12pm – Lunch (There will be “to-go” options for those who need to leave right at noon.)

External Participants

 

Alice Malmberg, Ph.D. Candidate in Political Science at UC Davis

Brianna Zichettella, Ph.D. candidate in the Department of Communication and Media at the University of Michigan

Brady West, Research Professor in the Survey Methodology and Data Science Program at the University of Michigan’s Survey Research Center, Fellow of the American Statistical Association

Byungkyu Lee, Assistant Professor of Sociology at New York University, and co-Director of the Networks in Context lab

Cameron McPhee, SSRS Chief Methodologist and former Principal Researcher and Survey Methodologist at the American Institutes for Research

Celestine Mendler-Dünner, Principal Investigator at the ELLIS Institute in Tübingen, co-affiliated with the Max Planck Institute for Intelligent Systems and the Tübingen AI Center

Cheryl Eavey, NSF Program Director, Directorate for Social, Behavioral and Economic Sciences (SBE) Division of Social and Economic Sciences (SES) Methodology, Measurement, and Statistics (MMS)

danah boyd, Partner Researcher at Microsoft Research, Visiting Professor at Cornell University, and founder of Data & Society.

David Lazer, Professor of Political Science and Computer Science at Northeastern University, Co-PI of the CHIP50 project, co-founder of Volunteer Science, co-founder of the National Internet Observatory, Visiting Scholar at the Institute for Quantitative Social Science at Harvard

David Rothschild, Economist at Microsoft Research and Co-PI on Penn Media Accountability Project

Doug Rivers, Professor of Political Science and Senior Fellow at the Hoover Institution, Stanford University, Chief Scientist at YouGov, CEO of Crunch.io

David C. Wilson, Professor of Public Policy and Political Science, and Dean of the Goldman School of Public Policy at the University of California, Berkeley

Esther Friedman, Research Associate Professor at the Survey Research Center at the University of Michigan, Associate Director of the Panel Study of Income Dynamics (PSID)

Frauke Kreuter, Co-Director of the Social Data Science Center and faculty member in the Joint Program in Survey Methodology at the University of Maryland, USA; and Professor of Statistics and Data Science at the Ludwig-Maximilians-University of Munich

Hansol Lee, PhD student in Education Data Science at Stanford University

Isabela Coelho, PhD student at the University of Maryland and Graduate Research Assistant at the Social Data Science Center

James Druckman, Professor of Political Science at the University of Rochester, Co-PI of the CHIP50 Project, Co-PI of Time-Sharing Experiments for the Social Sciences (TESS)

Jennifer Lin, Ph.D. Candidate at the Department of Political Science at Northwestern University

Jeremy Pope, Professor of Political Science at Brigham Young University, Co-PI of the Cooperative Election Study, Co-PI of the American Family Survey; Constitutional Government Fellow at the Wheatley Institution, Senior Scholar with the Center for the Study of Elections and Democracy

Lee Walker, NSF Program Director, Research Infrastructure in the Social and Behavioral Sciences (RISBS), Accountable Institutions and Behavior (AIB)

Lorrie Frasure, Inaugural Ralph J. Bunche Endowed Chair and Professor of Political Science and African American Studies at UCLA, Co-PI of the Collaborative Multiracial Post-Election Survey (CMPS), Director of the Ralph J. Bunche Center for African American Studies

Maureen Craig, Associate Professor of Psychology & Neuroscience, Duke University; Co-PI of Time Series Experiments for the Social Sciences (TESS)

Melissa Sands, Assistant Professor of Politics and Data Science, London School of Economics

Natalie Masuoka, Associate Professor of Political Science and Asian American Studies at UCLA, Faculty Director of the Asian American Policy Initiative, Co-PI of the Collaborative Multiracial Post-Election Survey

Neil Malhotra, Edith M. Cornell Professor of Political Economy at Stanford Graduate School of Business

Nicholas Valentino, Donald R. Kinder Collegiate Professor of Political Science and Research Professor in the Center for Political Studies at the University of Michigan, PI of the American National Election Studies (ANES)

Paige Bollen, Assistant Professor of Political Science, the Ohio State University

Pamela Herd, Carol Kakalec Kohn Professor of Social Policy, Ford School of Public Policy, University of Michigan, Co-PI of the General Social Survey (GSS)

René Bautista, Associate Director of the Methodology and Quantitative Social Sciences Department at NORC at the University of Chicago, Co-PI and Director of the General Social Survey (GSS)

Saskia Bartholomäus, PhD student in Data and Research on Society, GESIS

Sunshine Hillygus, Professor of Political Science and Public Policy at Duke University, Director of the Duke Initiative on Survey Methodology, Associate PI of the American National Election Studies (ANES)

Trent Buskirk, Professor and Provost Fellow, School of Data Science, Old Dominion University

Hackathon & Data Launch

Hackathon & Data Launch

Participants joined us in-person or virtually on January 20, 2023 at Cornell Tech in NYC as experts from industry, academia, and media offered their perspectives on the innovations, methods, and data from the 2022 Collaborative Midterm Survey. We called this a hackathon because all panelists received the survey data in advance and incorporated their analyses into the presentations and comments. All data and methods were made publicly available at the time of the event.

The Collaborative Midterm Survey partnered with The Graduate Roosevelt Island Hotel for conference attendees. 

Panelists 

Panelist Headshots

 

Agenda


  • 9:15-10:00am: Continental Breakfast
  • 10:00-10:30am: Welcome and Vision of the 2022 Collaborative Midterm Survey
    • Peter Enns (Cornell University)
  • 10:35am-11:50am: Data Deep Dive Panel: Insights from the multi-mode approach of the Collaborative Midterm Survey
    • Sunshine Hillygus (Duke University), Cindy Kam (Vanderbilt University), G. Elliott Morris (The Economist), David Rothschild (Microsoft Research), Chaired by Jamie Druckman (Northwestern University) 
  • 11:50am-1:00pm: Mentorship Lunch
  • 1:00pm-2:00pm: Data Collection Panel: Collaborating to increase survey accuracy and representation
    • Kristen Conrad and Mickey Jackson (SSRS), Cory Manento (Gradient), Kevin Collins (Survey160), Julianna Pacheco, Caroline Tolbert (University of Iowa), Chaired by Juliana Horowitz (Pew Research Center) 
  • 2:05-3:20pm: Election Roundtable: Insights from the Collaborative Midterm Survey
    • Jennifer Agiesta (CNN), Nate Cohn (New York Times), Mark Lopez (Pew Research Center), Rahsaan Maxwell (NYU), Moderated by Lorrie Frasure (UCLA) 
  • 3:20-3:35pm Coffee
  • 3:35-4:45pm: Roundtable on Innovation in Public Opinion and Survey Research
    • Project Senior Advisors: Jamie Druckman (Northwestern), Sergio Garcia-Rios (UT Austin & Univision News), Juliana Horowitz (Pew Research Center), David Wilson (UC Berkeley), Moderated by Colleen Barry (Cornell)
  • 4:50-5:00pm: Concluding Remarks
    • Jonathon Schuldt (Cornell University)
  • 5:00-6:00pm: Hosted Reception (Drinks and Hors d’Oeuvres) 

Graduate Student Travel Grant

With generous support from the National Science Foundation, we offered a limited number of travel grants for graduate students to attend the Hackathon and Data Launch. Individuals selected through the competitive process received two nights of accommodation at The Graduate Hotel on Roosevelt Island and were reimbursed for up to $1,000 in travel costs. In addition to presentations from leading media, industry, and academic experts, the Hackathon also included a mentorship lunch for all interested students who attended in person.

Congratulations to the following students for being selected! 

Recordings of the Collaborative Midterm Survey Hackathon & Data Launch are now publicly available!

The Hackathon & Data Launch's virtual component was recorded via Zoom. To view each panel recording, click here.

Photos from the Collaborative Midterm Survey Hackathon and Data Launch on January 20th, 2023

Hackathon Collage

 

Meet the Data Teams

Following a competitive open call for data collection proposals, our principal investigators and senior advisors conducted a thorough review of all applicants. From this broad, international search, three exceptional proposals were chosen to be part of the 2022 Collaborative Midterm Survey.

SSRS

SSRS is a full-service survey and market research firm known for innovative methodologies and optimized research designs.

This team planned to poll 3,100 respondents using a probability-based sample and 3,300 respondents using a non-probability sample from their panel partners. Their sample size was chosen to ensure an adequate number of completes in CA, FL, and WI due to their electoral importance and to allow for statistical power for analyses.

The probability-based sample drew on their in-house TCPA-compliant probability panel, supplemented by an address-based sample (ABS) for WI. The non-probability sample came from one of their trusted panel partners; these partners include some of the largest and highest-quality first-party non-probability panel providers in the world.

The full team includes: 

Gradient Metrics & Survey 160 

Gradient Metrics is a team of data scientists, programmers, and researchers that brings together traditional market research and data science to build statistical models. Survey 160 is a software product designed specifically to conduct surveys via text message (SMS) conversations.

This team planned to poll 1,600 respondents nationally and another 1,600 respondents in each of the three key states via probability-based sampling. They obtained the probability-based sample through a mixed-mode approach: the vast majority of the sample (N = 5,500) came from SMS-to-web responses, while the remaining 900 came from mail-to-web (one-third derived from address-based sampling, the rest from registration-based sampling).

Their sampling methods allowed them to gather more probability-based samples at a relatively low cost. In particular, the SMS-to-web format provided a cost-effective means of gathering a random sample. Simply put, by avoiding non-probability sources altogether, they aimed to maximize the probability-based sample in each of the three target populations.

The full team includes: 

University of Iowa 

The Iowa Social Science Research Center (ISRC) is an interdisciplinary research center at the University of Iowa. The ISRC offers a variety of data collection services, including consultation on survey project design and instruments, as well as full-service project management. They consult for clients nationwide on phone, web, mail, and mixed-mode data collection, as well as focus groups, data entry, data analysis, and other services that support researchers.

This team expected to poll 1,200 respondents via a telephone random-digit-dial (RDD) sample and 5,200 via a web sample. They planned to use the RDD telephone sample to conduct computer-assisted telephone interviews using data collection resources from the Iowa Social Science Research Center (ISRC) call center.

The full team includes: 

Goals & Innovations

  1. Help understand the 2022 midterm election. From 1958 to 2002, the American National Election Study (ANES) conducted midterm surveys to provide insights into elections, voting behavior, and outcomes. High-quality election surveys can always offer important and unique insights into voter preferences and behaviors, media and campaign effects, and political representation and democratic accountability, and understanding these factors in the context of midterm elections has never been more important.
  2. Expand understanding of key segments of the electorate. Surveys are often too small and methodologically opaque to analyze variation within racial, geographic, partisan, and other important groups. The sample size of nearly 20,000 combined with methodological transparency and an emphasis on hard-to-reach populations will allow for analyzing state-level data as well as subgroups of the population. Further, we will partner with other election surveys to include some common questions and demographic variables, allowing the Collaborative Midterm Survey to be merged with these surveys, creating an unprecedented opportunity to understand the preferences, attitudes, and behaviors of groups that are impossible to analyze in traditional surveys.
  3. Promote innovative, collaborative, and cost-effective survey strategies. Most survey projects make decisions about sample size, sample type, and survey provider early in the process based on budget constraints and on what strategies have proved effective in the past. To encourage innovation, we are flipping this model and soliciting proposals that encourage methodological diversity and innovation. The budget will be large enough to allow risks and innovative strategies. At the same time, the competitive proposal process encourages cost effective strategies. To ensure collaboration, up to three proposals will be selected to implement the Collaborative Midterm Survey. Proposals can come from any sector, including researchers, nonprofit organizations, survey organizations, tech firms, media organizations, or teams representing a combination of these or other areas.
  4. Develop a transparent, data-driven, and inclusive framework that allows direct assessment of the advantages and tradeoffs of various survey methods. Declining survey response rates amidst rapidly shifting survey methods and changing social conditions mean it is increasingly difficult to identify the most accurate and cost-effective survey strategies. These challenges are especially important to solve for the many government surveys designed to understand economic and business conditions, health outcomes, crime victimization, and many other areas. The combination of three collaborators that each use multiple methodological strategies to conduct the same survey during the same time period along with rigorous methodological disclosure will allow the 2022 Collaborative Midterm Survey to offer unprecedented insight into the advantages and tradeoffs of various survey methods. Recognizing tradeoffs is important, because it may be that some methods are better for reaching harder-to-reach populations, while other methods yield more accurate national or state-level data. Thus, the 2022 Collaborative Midterm Survey does not imply that a single best approach exists. Rather, the collaborative and multi-method approach is designed to offer a comprehensive and inclusive framework for identifying tradeoffs of various sampling and methodological strategies. The 2022 Collaborative Midterm Survey will include numerous indicators that can be compared to known population benchmarks at the state and national level. We envision this framework being used in future surveys, offering an ongoing transparent, data-driven, and inclusive framework to continually monitor the most effective survey methods for various goals.
  5. Rapid and public dissemination of results. All data and methodological documentation will be made publicly available through the Roper Center for Public Opinion Research and the project website. The data will be made available through an easy-to-use search interface and presented through intuitive data visualizations. Furthermore, a data launch and hackathon will take place on January 20. This event will be livestreamed to encourage broad attendance. Finally, methodological reports and recommendations written by the project team and others will be made accessible through the project website.

Questionnaire

Each of the three survey versions included the same approximately 25 questions on vote choice, policy preferences, racial attitudes, and feeling thermometers; approximately 20 standard demographic questions; and 5 to 10 questions that match the wording of other prominent midterm election surveys. The remaining 25 questions differed across surveys (selected teams were invited to provide input on these questions). The median interview time was approximately 20 minutes.

Sample

Each researcher/organization/team conducted at least 6,400 complete interviews (for a total sample size of more than 19,000). We encouraged a hybrid sampling approach with a minimum of 1,200 interviews via a probability-based sample. The probability sample could use methods such as face-to-face interviews based on random address-based sampling, RDD, a probability-based panel, other approaches, or some combination of these. The remaining minimum of 5,200 could be probability-based, non-probability-based, or a mix. Samples were allocated to allow for national-level estimates as well as state-level estimates in California, Wisconsin, and Florida. We emphasized these three states because of their electoral importance, the diversity of their populations, and because the size of these states allows for valid inferences given the proposed sample sizes.

Timeline

September 7: Selections announced (up to 3 different researchers/organizations/teams)

September 14: Final questionnaire provided to researchers/organizations/teams

October 26-November 22: Midterm Data Collection

November 23-December 20: Data delivery to Project team

The Roper Center for Public Opinion Research archived and made publicly available all topline and individual-level data.

January 20: Data launch and hackathon at Cornell Tech

Researchers, survey organizations/vendors, and/or individuals, organizations, or teams representing some combination of the above were invited to apply to participate in providing data for the 2022 Collaborative Midterm Survey. 

Each researcher/organization/team received up to $225,000 (a total of $675,000 for the entire survey) to conduct their part of the survey. All data was archived and made publicly available by the Roper Center for Public Opinion Research.

To learn more about the proposal process, view the recording of our informational webinar or visit our FAQ page. You can also email questions to midtermsurvey@cornell.edu.

Proposals were evaluated on numerous criteria, including strategies for: 

  • Generating national-level insights and insights related to at least the states of California, Wisconsin, and Florida.
  • Reaching traditionally hard-to-reach populations
  • Weighting
  • Quality controls
  • Cost-effectiveness (Up to $225K per proposal)
  • Estimated data delivery time
  • Previous election polling and record of data transparency (if relevant)

Proposals were evaluated anonymously by the full project team (PIs and Senior Advisors).

Partner on Common Questions

Whether or not you were involved with a data collection proposal, one of our goals was to include some common questions and demographics with other surveys conducted around the midterm election.

We merged these identically worded questions across surveys to facilitate additional analyses with larger sample sizes.

Questions?

View the recording of our informational webinar, visit our FAQ page, or email our team at midtermsurvey@cornell.edu.

See recent media coverage of the Collaborative Midterm Survey in the links below.

Interested in learning more about the Collaborative Midterm Survey or interviewing one of the PIs or Senior Advisors? Contact us at midtermsurvey@cornell.edu.

Hackathon Photos

Tagged Hackathon Photos

 

Media Coverage 

About the Team

The 2022 Collaborative Midterm Survey is led by three PIs at Cornell University and four senior advisors.

Principal Investigators

  • Peter K. Enns, PI

    Professor of Government and Public Policy and the Robert S. Harrison Director of the Cornell Center for Social Sciences at Cornell University, Co-Founder of Verasight

    Headshot of Peter Enns
  • Colleen L. Barry, co-PI

    Dean of the Brooks School of Public Policy at Cornell University

    Colleen
  • Jonathon P. Schuldt, co-PI

    Associate Professor of Communication and Executive Director of the Roper Center for Public Opinion Research at Cornell University

    Headshot of Jon Schuldt

Senior Advisors

  • Jamie Druckman

    Payson S. Wild Professor of Political Science at Northwestern University

    Headshot of Jamie Druckman
  • Sergio Garcia-Rios

    Assistant Professor and Associate Director for Research, Center for the Study of Race and Democracy at The University of Texas at Austin and the Election Polling Director at Univision Television Network

    Headshot of Sergio Garcia-Rios
  • Juliana Horowitz

    Associate Director of Research at Pew Research Center

    Headshot of Juliana Horowitz
  • David C. Wilson

    Dean of the Goldman School and Professor of Public Policy at University of California–Berkeley

    Headshot of David C. Wilson

This project is funded by the National Science Foundation (Award: 2210129) with additional support from:
