Changing the Way We Deliver Analysis and Reporting

The market research industry doesn’t spend much time reflecting on how data processing and reporting are carried out. Perhaps it is time we stopped and reflected.

I would be giving my age away if I mentioned that I can remember data analysis being carried out using counter sorters, so let’s pass beyond that era of technology. For those who are curious, just Google ‘counter sorter punch cards’ or visit a museum and you will learn more.

Until somewhere around 1995, data processing was usually carried out by teams of people in the ‘DP department’. It was something of a joke that senior management didn’t really know what these people did as they worked away in dark offices in the basement. More accurately, perhaps, senior management often didn’t know how they did their work or what their problems were.

I recall a paper by Mark Katz in 1999 called ‘The Real Barriers to Technology’. He described a world where data processing teams were always under pressure, both in terms of time and money, with no time to think and no time to consider doing things more efficiently. He described a scenario where management would not spend UKP250 (USD400) on software that would improve efficiency by 10–20% – perhaps because no one had time to evaluate the tool, or because no one was in a position to make the decision, or capable of making it.

And then the industry was saved – or, perhaps more accurately, changed. Data processing warehouses started to form, mainly in India. Now senior managers in the bigger economies could both save money and distance themselves from the onerous task of running a data processing team. At the same time, people leaving university were far more technically savvy, so things that had been the sole domain of the DP people were more open to question by newer recruits to our industry.

At the risk of making a sweeping statement, the result was that the larger research agencies in the stronger economies sent their work to India, the smaller agencies in those economies bought a new breed of ‘easy to use’ (but sometimes limited) software products to handle DP themselves, while the weaker economies continued with DP teams.

Whether the rush to send work to India was a success is a multi-faceted question. The accountants would have one answer, the buyers and suppliers would have other answers.

My view is that sending work far from home works well when it is clear what you want. Once a lot of communication is needed because the work is difficult or very detailed, the benefits quickly fall away. A ‘one size fits all’ approach to data processing seldom works – small, fast-turnround, simple projects and big, complex projects need different strategies from more standard projects.

But there is an even bigger change happening now: the realisation that however data processing is managed, researchers need to stay close to their data. They need to understand their data, they need to present their data and, what is more, the average researcher comes from an age when using software is not a challenge. All this is happening at a time when market research is being asked to deliver more quickly. The advent of web surveys means that some projects can be commissioned one day with results available the next. Buyers are not necessarily sympathetic to projects where delivery takes a month or more.

Alongside this change, the days of providing research buyers with large volumes of tables and charts are thankfully ending. I clearly recall a research buyers’ panel at the 2011 ESOMAR APAC Conference where the call was for quality, not quantity. As a software person, I interpreted this as well-presented insights, not mass production from the DP department.

So, what does this mean? It means that the best use of outsource companies is to carry out the labour-intensive or more predictable parts of the process. Rather than data processing, what is needed is data management – cleaning data, building variables, entering texts and so on. Complex DP still needs experts to handle it. The tools for detailed analysis and reporting, however, need to be at the fingertips of the researchers, who should be giving clients what they demand – properly presented insights.

This has manifested itself in the software and services company that I run. The services we sell are now focused far more on the expert end of the spectrum. Most companies, large and small, can handle simpler projects efficiently, but should seek help for complex trackers, repetitive requirements or projects that are just very difficult or unusual in some way.

We now sell software that lets researchers take control of the valuable data they own and produce tables, reports and charts in the form they want – to tell the story of their project, with software there to help them. It may sound like a subtle shift, but it is the start of a big movement: data processors should be preparing data for researchers to carry out in-depth analysis and construct meaningful reports.

We have come a long way since the badly organised ways of the 1990s, but we must continue to steer our production of results and deliverables in efficient directions. This means using proven systems where ‘systems’ are the key to success, and experts where expertise is the key to success. Through all of this, however, the researcher should have their data for reporting at their fingertips – not in a dark basement.