Raushni Bhagia

Points of View: Bridge Over Troubled Water

Whenever the measurement process for any medium is overhauled, it leads to an uproar. The Indian Readership Survey is only the latest example. Isn't there some way of easing the transition from one system to another?

Any change in the common currency comes with the risk of widespread discontent. Recently, the research agency in charge of the Indian Readership Survey rolled out its new-and-improved technology and embraced a new methodology. The rumpus that followed was there for all to see: not everyone in the industry welcomed the results in equal measure. It is rare to see 18 major players unanimously denounce the currency as inaccurate and inept.

Something similar happened when TAM, the television audience measurement system, was boycotted by several broadcasters, all of them significant players, after it updated its universe to include LC1 (less than class one) towns.

Likewise, whenever RAM, the radio audience measurement system, changed its frequency or expanded its sample, a huge hue and cry followed.

BARC's initial sets of data might draw a similar reaction from the industry, at least till the dust settles.

So, how does one shift from one system of ratings to another with minimum anxiety? Is there a way of ensuring a smooth transition? Here's what our respondents had to say. Edited excerpts.

Apurva Purohit, CEO, Radio City 91.1FM

Measurement changes can be either methodology related or execution related. The industry has been through both, and the issues that emanate from each are diverse; each has a dramatically different impact on the industry.

Regarding methodology, consider the television industry's change of measure from TVRs/GRPs to TVT. It gave the advertiser insight into the total reach of a channel or show and, therefore, a better ROI. In this case, the research house had the responsibility of correlating the results with those published previously so that all stakeholders had a clear perspective.

Now, consider the issues with the current IRS: this is not so much a methodology-related challenge as an execution-related problem. When reputed research agencies report numbers that defy any trend or contain gross anomalies, the responsibility lies squarely with the research agency. It should have validated the data before publishing the results.

Indranil Roy, president, Outlook Group

Firstly, there is a need for more involved co-ordination between the publishers/broadcasters and the respective research body, right at the outset, that is, when the methodology is conceived.

Secondly, the research body must validate the data over and over before publishing it. It must be understood by all that many businesses run on these numbers. At any given point, someone is in an advantageous position and someone is not; that will always happen. But applying common sense is very important, and that is exactly what went wrong in the readership survey.

Highly inflated numbers came out of a few markets, while a few other markets weren't recorded in the research at all. For example, Outlook's readership has tripled in Bihar. Now, the question is: why is it going up in that market? What have I done to produce this result? The researchers need to speak to the on-ground personnel and validate the data, time and again.

Also, there is a lack of responsibility on the part of the research agencies. It's strange that about 144 magazines weren't listed in the new IRS.

Ashish Sehgal, chief sales officer, ZEEL

Whenever changes are to be made, whether to the technology or to the parameters used to calculate the currency, the research agency needs to communicate them to the industry as they happen. Instead, the agency starts explaining the process only after the data is out, and often only after discontent has been expressed.

The players who have gained make their peace with it, but the ones who lose protest, and there is chaos. Though in the case of the readership survey the methodology and technology changes were communicated to a few in the industry, I doubt the complete process was explained to them beforehand. It is important that the players who are going to use the data are taken into confidence.

Nitin Chaudhary, business head, HT Mumbai

The preparation starts with convincing ourselves beyond doubt that the change is indeed for the better. All the bodies involved (the RSCI, the MRUC and the Tech Comm) have representation from all three stakeholders: advertisers, publishers and agencies. It is pretty democratic.

One can only question the process, not the results. The process was well discussed, debated and accepted by the industry as a whole; the study wouldn't have gone ahead without that. There will probably be a couple of anomalies in the data, considering the scale.

This type of data (a currency) shouldn't be seen as point-in-time data; it is a series of reports. Hence, it is best to wait for a couple of rounds before comparing. We call these anomalies because they don't fit the earlier framework, the data set we were used to.

Note that the earlier set of data is over eighteen months old, and that the new research takes the 2011 census data into account. A lot has changed: the number of new urban centers has grown considerably, and conversely, some cities have seen a reduction in population.

On the publishers' side, too, some have invested in building readership in certain markets, while others have wound down their operations. If, with all these changes, and with the new technology and methodology that come with the new study, the expectation is that the data will be the same as last time, then there was no point in changing the system in the first place.
