de Jong, M., Fox, J. and Steenkamp, J. (2015). Quantifying Under- and Overreporting in Surveys Through a Dual-Questioning-Technique Design. Journal of Marketing Research, 52(6):737–753.


In recent years, marketing researchers have become increasingly interested in under- and overreporting. However, there are few suitable approaches for operationalizing deviations from the truth, particularly in behavioral domains in which self-reports are usually the only viable way to measure behavior or attitudes. An especially difficult situation arises when some people underreport while others overreport. This article proposes a Bayesian item response theory model to quantify under- and overreporting in surveys. The method uses within-person differences between answers obtained under direct questioning (no privacy protection) and randomized-response questioning (which ensures item-level privacy protection). It incorporates behavioral response-mode effects (e.g., privacy loss when switching from direct to randomized-response questioning, response-mode inertia) and allows the direction of bias to differ across respondents. The authors provide an empirical application to excessive alcohol consumption involving 1,408 respondents from a commercial web panel. The results show that respondents are averse to decreases in privacy and that randomized response is less effective if respondents provide biased responses to earlier direct questions.
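The paper's Bayesian item response theory model is not reproduced here, but the randomized-response mechanism it builds on can be illustrated with a minimal simulation. The sketch below assumes a forced-response randomized-response design: with some probability the respondent answers truthfully, otherwise a randomizing device forces a "yes" or a "no". The design probabilities and the 25% true prevalence are illustrative assumptions, not figures from the paper; only the sample size of 1,408 echoes the study's web panel.

```python
import numpy as np

rng = np.random.default_rng(0)

# Forced-response design (assumed for illustration): with probability
# p_truth the respondent answers truthfully; otherwise the randomizing
# device forces a "yes" (p_yes) or a "no" (p_no).
p_truth, p_yes, p_no = 0.70, 0.20, 0.10
assert abs(p_truth + p_yes + p_no - 1.0) < 1e-12

def simulate_rr(true_status, rng):
    """Return one randomized response given the true 0/1 status."""
    u = rng.random()
    if u < p_truth:
        return true_status       # truthful answer
    elif u < p_truth + p_yes:
        return 1                  # forced "yes"
    else:
        return 0                  # forced "no"

# Hypothetical population: 25% engage in the sensitive behavior.
n = 1408                          # sample size matching the paper's web panel
true_prev = 0.25                  # assumed, for illustration only
truth = rng.random(n) < true_prev
answers = np.array([simulate_rr(int(t), rng) for t in truth])

# Method-of-moments estimator: E[answer] = p_truth * pi + p_yes,
# so pi_hat = (mean answer - p_yes) / p_truth.
pi_hat = (answers.mean() - p_yes) / p_truth
print(f"estimated prevalence: {pi_hat:.3f} (true: {true_prev})")
```

The sketch shows why item-level privacy protection works: any single "yes" can be attributed to the randomizing device, yet the prevalence of the sensitive behavior is still identified in aggregate. The paper's contribution goes further, exploiting within-person differences between direct and randomized-response answers to identify the direction of bias for each respondent.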