It’s that time of year again: Our latest Quality Ratings report, which examines and analyses the quality of long-form thought leadership produced by the world’s leading consulting firms, has now been published.
Reflecting on the changes we have seen since working on our 2020 report, one of the most obvious is the increase in average score for resilience—the key measure for the credibility of a firm’s content. Here we’re looking for evidence that an expert on the topic has authored the content, that the publishing firm has gathered and analysed data to develop and support its point of view, and that it has clearly explained how it has gone about doing this.
The improvement in resilience score is a change we had expected to see going into 2021. As firms left behind the extreme time constraints of the pandemic, which had produced a greater-than-usual volume of opinion- and action-focused pieces lacking a solid basis in research, they returned to a more “normal” approach to thought leadership production. However, with clients repeatedly telling us that data and analysis rank very high on their lists of thought leadership content preferences, there is a clear imperative for firms to achieve even more.
Of all the quality pillars, resilience exhibits the greatest variability in score, with a full 3.0 points separating the lowest- and highest-scoring reports, compared to a spread of 2.4 points for appeal and 2.0 for differentiation and prompting action. We’ve outlined how to achieve a better-than-average score of 3/5 for resilience below:
| Have you achieved the following? | What you need to do to score 3/5 | Pitfalls to avoid |
| --- | --- | --- |
| Is it clear who is delivering these views and why they are worth paying attention to? | Authors or experts are named, and basic information about their credentials (e.g., role) is obvious | Providing a list of “contacts” or “contributors” rather than authors or experts can create ambiguity about whose views are being represented. Clearly identifying the person behind the content and explaining how their experience relates to the topic of the report would achieve a score of four: surely the quickest of quick wins. |
| Is the approach to generating insights/recommendations credible and clearly explained? | Audience is very likely to understand the principal approach used; the approach is credible; most sources are referenced | Referencing the source of a complex chart or model as “x firm analysis” is not going to help your audience understand your approach. Your audience wants to know how much weight to put on your insights, and to do so, they’ll need to understand how these have been developed. Just like algebra class, you need to show your work. |
| Has the firm collected or created relevant data? | Firm has collected or created a relevant body of data | This can come from secondary research or primary sources, but timeliness is part of relevance. Make sure your sources are fairly recent, as this is what audiences are looking for in this fast-paced world. |
| How good is the analysis of this data? | Basic approach to analysis leads to relevant insights (e.g., simple segmentation) | To draw out interesting insights, consider how you can maximise your investment in research by digging into the detail of your data, perhaps by comparing and contrasting responses from different survey cohorts and assessing the root cause of those differences. |