Response Bias & Indian Opinion Polls

The International Social Science Journal published a volume on Opinion Surveys in Developing Countries as far back as 1963. In it, Emily Jones wrote a fairly elementary paper titled Courtesy Bias in South East Asian Surveys. Sadly, it still hasn’t been improved upon much in the Indian context, despite all the time that has elapsed. The basic problem can be paraphrased as, “surveying the underlings of a feudal society results in them telling the interviewer what they think the interviewer wants to hear.”

The response bias in this case is not unique to feudal societies, though the context is. A well-written paper on the subject considers examples such as drug use and welfare receipt in a Western society. However, the model it proposes cannot simply be imported here.

Let’s explore the problem at hand.

Many surveys in India, including those conducted by CSDS, use college students for the field work. As it happens, colleges across India are over-represented with Upper Caste students. The outcome isn’t hard to guess: most likely, an Upper Caste male student will end up asking many lower caste people whom they will be voting for. Concretely, a UP Brahmin is likely to walk up to a Dalit in that state and ask a question about voting preference. The probability that the Dalit either doesn’t answer or lies outright appears non-negligible.

These conjectures are borne out by the following piece of evidence: every single Opinion Poll in Uttar Pradesh in the past 10 years has consistently underestimated the BSP. In 2004, a simple average of the BSP’s Opinion Poll numbers was 9 percentage points lower than their actual result. In 2009, the polls came much closer to the actual result; one wonders if this was because Ms Mayawati was the Chief Minister then and the fear/courtesy transformed into pride. But now we are back to 2004 in terms of who’s in power in Lucknow. A piece of data from the CNN-IBN/CSDS Poll that stands out in this regard: only 6% of UP Muslims claim to support the BSP, while 13% claim to support the BJP.

The only reasonable solution is more polling data and a detailed study of this phenomenon, enough to model the bias properly. Perhaps some clever student can even come up with an adjustment factor. Until then, polling data in India can only be benchmarked against its own past results: trends mean something, while absolute numbers can safely be discounted.
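
To make the “adjustment factor” idea a little more concrete, here is a minimal sketch of what such a correction might look like, assuming one already has a party’s polled and actual vote shares from past elections. The function names and all numbers below are purely illustrative, not real poll figures.

```python
# A minimal sketch of the "adjustment factor" idea: estimate a party's
# average historical polling error and add it back onto a fresh poll
# reading. All figures below are placeholders, not real poll data.

def average_poll_bias(polled, actual):
    """Mean (actual - polled) vote share across past elections, in points."""
    errors = [a - p for p, a in zip(polled, actual)]
    return sum(errors) / len(errors)

def adjusted_estimate(new_poll, bias):
    """Shift a new poll reading by the historical bias."""
    return new_poll + bias

if __name__ == "__main__":
    polled = [21.0, 24.5, 19.0]   # hypothetical polled vote shares (%)
    actual = [30.0, 27.0, 26.0]   # hypothetical actual vote shares (%)
    bias = average_poll_bias(polled, actual)
    print(round(adjusted_estimate(22.0, bias), 1))   # 22.0 + ~6.2 -> 28.2
```

The obvious catch is that this treats the bias as stable from one election to the next, which is exactly what the 2004 versus 2009 contrast above calls into question.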

5 thoughts on “Response Bias & Indian Opinion Polls”

  1. Jerin Francis

    Another defect that we ignore is the fudging of data. I remember reading a Reuters article illuminating how data fudging by postmen affects the RBI’s inflation measurements. It is far easier to make up responses that seem to be in accordance with the popular mood than to do the grunt work of visiting remote places, interviewing people and asking them questions that they would rather not answer. It would take incredible naivete to assume that the responses in these opinion polls are mostly from real people. I think this, together with the response bias, explains the illogical swings and trends in these surveys that defy common sense, and why they never seem to get better at forecasting.

    Interesting read: How postmen, and their wives, may help set India’s monetary policy
    http://www.reuters.com/article/2014/01/28/us-india-economy-cpi-idUSBREA0R0DB20140128

  2. Gaurav

    Response bias is a very real problem on many issues, but in this context, I am not sure. I am not saying it is impossible; in fact, it is probably non-negligible, as you say. But I doubt its magnitude is enough to play a significant role in skewing the polls. With decades of discourse and activism about caste equality and the success of many Dalit and OBC politicians and parties, I am not sure the underlying response bias (or even social desirability bias) is going to be large enough to matter.

    I think the bigger problem is sample selection. No sample, even in the West, is close to truly random, but reputable firms take steps to make sure it is at least quasi-random. Of course, it is easier in the West (or at least in the US), where society is nowhere near as segmented as India’s. But still, when I look at whatever little detail Indian polling firms give on methodology, I am struck by the problems with the samples, which don’t seem representative of the Indian electorate on a lot of demographic variables.

    And going beyond the electorate itself, I am also not sure how much attention Indian polling firms (or franchises of foreign polling firms operating in India) pay to the “likely voter” aspect. I suspect that’s where a huge chunk of the error creeps in. These firms are either unwilling or unable to identify who the likely voters are, so their samples probably tend to over-represent voters who just won’t go out and vote (and under-represent those who will).

    One final point. You’ve studied this in way more detail than I have, so you might know. Do polling firms in India make any statistical corrections to account for their sample selection issues? Even something as basic as a Heckman correction? (A rough sketch of what that would look like is appended after the comments.)

  3. Pingback: Sampling Errors & Response Bias vs Human Stubbornness | Puram

Comments are closed.
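
Postscript: since the Heckman correction comes up in the comments, here is a rough sketch of the classic two-step version, purely for illustration. The data frame, the column names (responded, supports_party) and the choice of covariates are all hypothetical; this says nothing about what any Indian polling firm actually does.

```python
# A rough two-step Heckman-style selection correction, sketched under
# strong assumptions: `df` has one row per person approached, a 0/1
# `responded` flag, and a 0/1 `supports_party` outcome observed only
# for those who responded. Column names and covariates are hypothetical.
import pandas as pd
import statsmodels.api as sm
from scipy.stats import norm

def heckman_two_step(df, selection_vars, outcome_vars):
    # Step 1: probit for the probability that a person responds at all.
    Z = sm.add_constant(df[selection_vars])
    selection = sm.Probit(df["responded"], Z).fit(disp=0)
    index = Z @ selection.params                      # linear predictor
    mills = pd.Series(norm.pdf(index) / norm.cdf(index), index=df.index)

    # Step 2: outcome regression on responders only, with the inverse
    # Mills ratio added to absorb the correlation between the decision
    # to respond and the outcome (a linear probability model, kept
    # deliberately simple).
    resp = df["responded"] == 1
    X = sm.add_constant(df.loc[resp, outcome_vars])
    X["mills"] = mills[resp]
    outcome = sm.OLS(df.loc[resp, "supports_party"], X).fit()
    return selection, outcome
```

For this to be credibly identified in practice, the selection equation needs at least one variable that can be excluded from the outcome equation, and finding one in a caste-segmented electorate is arguably the hardest part.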