Media analysis has developed as rapidly as the media with which it works. It began almost immediately after the invention of the printing press, and the types of media analysis used today trace their history to the Gutenberg Bible, though the concept has changed alongside the media being studied. A rigorous approach to content analysis dates to 1927, with Harold Lasswell’s research on propaganda.

Beginning as a task of mere listening, media analysis has evolved to include identifying, evaluating and predicting as well. As media has become less linear, the ways in which media is analyzed have followed suit.

We are in the fourth generation of media analytics, in which intelligence and insight are paramount. The evolution of the field has provided substantial proof of the value of such insights in developing actionable strategy. The growth of predictive analysis and AI across all leading analytics platforms and services has been made possible by this integrated, cumulative work.

Understanding how media analytics has evolved is important to understanding the direction in which it is currently heading.

Generation 1: The Listening Generation (2002)

The first generation of media analytics was focused on listening. During this most linear of generations, a brand would plan, implement and wait for coverage, then review what did and did not make an impact. Interest centered on:

  • Volumes
  • Reach
  • Key media

The objects of measurement were websites and traditional media, with some cutting-edge platforms attempting to provide insights on chat room content. The leaders in the field at the time emerged from the academic, legal and business worlds, such as LexisNexis, Factiva and JSTOR. Back then, real-time monitoring was a well-conceived yet poorly implemented notion. The internet was beginning to challenge the daily news cycle, but most traditional sources were still transitioning. Thus, daily analysis had the greatest value simply because it was the most frequent available.

The first online measurement tools emerged as hit counter services, with Enterprise, Omniture, WebSideStory, Nedstat and Unica being the most popular.

Generation 2: The Response Strategy Generation (2003-2007)

The second generation of media analysis emerged following the U.S.’s initial intervention in Iraq. Military forces, human rights organizations, private operators, contractors and support services were activated in an environment where embedded journalists were beginning to provide close, real-time updates, and funds were available for robust investigative reporting.

Simultaneously, parent companies like Tribune, Fox and Sinclair were moving towards an integrated media model. Print, television and radio reporters faced a new expectation of cross-publication. Websites were emerging as viable sources for comprehensive news.

Within the following two years, Hurricane Katrina would cement the importance of media monitoring as more than listening. Brands like Walmart, Waffle House, Home Depot, Tide and the Red Cross realized that media listening could inform strategic action, greatly benefiting a brand. Media analysis highlighted that large brands could activate their supply chains, providing highly visible help to those in need. With the goal of providing continuous listening services, online companies such as Google increased their investment in analytics capabilities like Urchin and XiTi.

During the same period, social media sites like MySpace, Friendster and, in 2006, Facebook began to grow in popularity. This provided an opportunity for the public to listen, respond and, most importantly, react to stimuli in real time. In response, Radian6 and Scout Labs launched, enabling companies to measure, analyze and report on social media activity.

All the analytics brands understood that their tools were response-based. The tools were used for real-time and post-hoc response strategy, not for any kind of predictive work.

Generation 3: The Cyclical Generation (2008-2015)

The shift from the Response Strategy Generation to the Cyclical Generation was subtler than the previous evolution. As the internet continued to mature as a unifying platform, the needs and expectations for media analytics evolved with it. Brands created news and content, published that content across multiple platforms and expected meaningful analysis; many incorporated spokespeople or mascots. With a resurgence of what had been coined ‘cool hunting’ in the 1990s, micro-targeting, a.k.a. influencer marketing, was viewed as a potentially lucrative form of engagement.

These trends drove a higher standard in social media analytics. Key requirements included:

  • Real-time and historical trend analysis
  • Trend alerts
  • Online traffic analysis
  • Multi-platform monitoring
  • Sentiment
  • Influencer identification
  • Audience profiling

Analysis was now responsible for integrating quantitative and qualitative activity across multiple social and traditional media platforms. Media analysis evaluated a brand’s general activity (and that of its competitors), but consumer interaction was key: who the audience was, what they cared about and to whom they listened. Media analysis was no longer devoted solely to response; it now informed strategy and action while still providing response strategy and listening.
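As a loose illustration of what “influencer identification” meant in practice, the sketch below ranks accounts by a simple engagement-per-follower score. The account names, fields and weights are hypothetical assumptions for illustration only, not any specific platform’s method, which would combine many more signals.

```python
from collections import defaultdict

# Hypothetical post records: author, reach and engagement counts.
posts = [
    {"author": "@gadget_fan", "followers": 12000, "shares": 340, "replies": 85},
    {"author": "@daily_deals", "followers": 250000, "shares": 120, "replies": 30},
    {"author": "@gadget_fan", "followers": 12000, "shares": 410, "replies": 112},
    {"author": "@tech_mom", "followers": 58000, "shares": 290, "replies": 200},
]

def influence_scores(posts, share_weight=2.0, reply_weight=1.0):
    """Aggregate a crude engagement-per-follower score for each author.

    The weights are illustrative assumptions; real tools also factor in
    topical relevance, audience overlap, posting cadence and more.
    """
    totals = defaultdict(float)
    for p in posts:
        engagement = share_weight * p["shares"] + reply_weight * p["replies"]
        totals[p["author"]] += engagement / max(p["followers"], 1)
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

for author, score in influence_scores(posts):
    print(f"{author}: {score:.4f}")
```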

This reflected the nature of media itself. The message-sender/message-receiver model originally drove communication analysis, but the process was now more of an unending cycle. Each component was as important as the others, with each stage informing the potential growth or development of the next.

This created opportunity, because analytics platforms could now focus on doing one thing well: monitoring platforms like NetBase, Nuvi and Crimson Hexagon; influencer identification and micro-targeting options like Traackr and Klout; and real-time analysis and social media trend-alerting sites like TrendSpottr, NewsWhip, Brandwatch and Talkwalker.

Distinctive to this period, feedback became as important as coverage and content. One part informed the rest, reflecting the media environment’s complete non-linearity. The audience was a key stakeholder in the process, not just the outcome. Another emerging challenge was the volume of information and media options available to brands and consumers; to be heard above the din, brands had to be responsive to the right audience at the right time in the right way.

Generation 4 (2016-present)

The cacophony of the media environment demanded a more precise, powerful approach to communications. Technological advances in several fields rose to this challenge, meeting emerging analytical needs. Artificial intelligence and predictive analysis homed in on key topics, audiences, platforms, influencers and listening, filtering out much of the noise from both the media and the analysis.

Building on cyclical analytics, firms integrated lessons learned with big data to develop predictive and proactive analysis. Platforms like Quid leveraged big data to conduct the kind of topical network analysis of news previously relegated to academics. Platforms like Affinio used similar social network analysis to segment audiences into tribes of interest, which surfaced sub-audiences, influencers and insight for targeted relationships. Marketo, Shareablee, PeoplePattern and Spredfast used big data to model and grow audience databases, developing deeper-level profiles and insights.
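To make the “tribes of interest” idea concrete, here is a minimal sketch of the general technique, community detection over an interaction graph, using the open-source networkx library. The accounts and edges are invented, and commercial platforms such as Affinio rely on far richer, proprietary models.

```python
import networkx as nx
from networkx.algorithms import community

# Hypothetical interaction graph: an edge means two accounts frequently
# engage with the same content or with each other.
edges = [
    ("@runner_jo", "@trail_tips"), ("@trail_tips", "@marathon_meg"),
    ("@runner_jo", "@marathon_meg"),
    ("@sneaker_drops", "@street_style"), ("@street_style", "@kicks_daily"),
    ("@sneaker_drops", "@kicks_daily"),
    ("@marathon_meg", "@sneaker_drops"),  # a bridge between the two clusters
]
G = nx.Graph(edges)

# Community detection groups accounts into densely connected clusters,
# a rough stand-in for "tribes of interest".
tribes = community.greedy_modularity_communities(G)
for i, tribe in enumerate(tribes, start=1):
    print(f"Tribe {i}: {sorted(tribe)}")
```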

The trend has been to integrate social and consumer data to develop a more complete audience profile first, then back-map a history of content and coverage to inform or predict future interests, activities or insights. This is light-years beyond Stanford University’s 2011 SNAP project for social media predictive modeling.

The use of such powerful tools has increased the importance of the analysts handling the data. While the first, second and third generations focused on ease of use for even the least experienced individual, the complexity of the new platforms places a stronger emphasis on expert interpretation of the data. There is little distinction between qualitative and quantitative analysis, as one provides a weighted value to the other. A binary sentiment score means little on its own, especially once it is evaluated against the actual text. While AI may be doing much of the heavy lifting, calculating otherwise impossible analyses, the results are more complex, and interpretation and translation are then required for laypeople.
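To illustrate why a bare binary sentiment score needs qualitative context, the toy scorer below uses an entirely hypothetical word list (not any production model) and labels a clearly negative, negated review as positive; an analyst reading the text would catch this immediately.

```python
POSITIVE = {"great", "love", "fast", "recommend"}
NEGATIVE = {"slow", "broken", "hate", "refund"}

def binary_sentiment(text):
    """Naive bag-of-words scorer: +1 per positive word, -1 per negative word."""
    words = text.lower().replace(",", "").replace(".", "").split()
    score = sum((w in POSITIVE) - (w in NEGATIVE) for w in words)
    return "positive" if score >= 0 else "negative"

review = "I would not recommend this, it was not fast and not great."
# Every sentiment-bearing word here is in the positive list, so the naive
# score comes out positive even though negation makes the review negative.
print(binary_sentiment(review))  # -> "positive"
```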

All the data available still requires human analysis to identify truly meaningful findings. As this generation advances, we expect to see continued leveraging of extreme volumes of data, further development of predictive and adaptive technology and continued creativity in the design of analytic solutions. These advancements will require continued expertise in data analysis and interpretation; a human eye on what really matters is still needed.

By: Kate LaVail