Big Data: Does size matter?

The Big Data Pandemic by Ken Roberts, CEO of Forethought, is an interesting and thought-provoking read. The article sets the stage by predicting that Big Data will explode at an alarming rate (projected growth in 2013 was USD 18.1 billion) and completely negate the need for Small Data such as hypotheses, surveys and sampling. Anyone working in Small Data, on reading that, would have pricked up their ears and exclaimed “hang on a minute”, which is precisely the path Roberts then takes by offering the other side of the argument and concluding that the jury is still out on whether Big Data will outperform Small Data.

Although there are many advocates for Big Data and its contribution to the marketing environment, issues with the stability and integrity of the data, as well as basing decisions purely on clusters of behaviours without understanding what is driving them, have left opinions somewhat divided. Couple that with the inability of businesses in general to actually implement Big Data programs (many can’t even get to grips with Small Data, so how can they be expected to embrace Big Data?), and we are probably still a long way from Big Data making a significant impact on, or change to, business-as-usual approaches.


As many of you know, I’m a huge advocate of Big Data. I’m totally fascinated by the information you can gain by using it, and I think it can provide valuable insights into identifying “what” is happening.

At the Bureau of Meteorology we use Big Data and computer models to provide weather forecasts. The Big Data generated by our observation networks, coupled with the predictive output from computer modelling, can tell us with a high degree of certainty exactly what is happening, when it is happening and where it is happening. What it doesn’t necessarily do, though, is tell us “why” it is happening. For that we default back to our forecasters and observers on the ground, who analyse what the models are outputting to help explain the “why”. In simple terms, what I am talking about here is quantitative and qualitative data.

In the retail environment it is equally complex. Big Data may well be able to provide valuable information on what people are buying, where they are buying it from, future trends and so on, but is it really able to accurately predict how people feel about what they purchase, the decision-making process they went through, and whether they were happy with their purchase and would do it again? In a 2008 article, Chris Anderson argues that why people do something, or how they feel about it, isn’t really that important as long as they do it; he is certainly more convinced of that than I am.


And that was my key takeaway from the Roberts article. Big Data can be a great source of quantitative data, but would you really want to leave it all in the hands of computer models and artificial intelligence and then base your most important business decisions on just that?

To obtain a really good understanding, I think it’s still important to qualify those results by understanding the “why”, and for that you need some human intervention. As Roberts notes, “perhaps it is not about big data versus small data but rather, big data and small data combining to produce synergistic insight”.

My money is on Roberts. What do you think?


Analytics and metrics: It’s all in the interpretation

The backdrop to this week’s post is a paper by Germann et al. (2013): Performance implications of deploying marketing analytics. The paper gives a sobering account of the use (or not) of analytics in marketing and the impact this can have on the overall decision-making process, not just for marketing practitioners but also for customers and audiences.

A couple of standouts for me from the article: in a recent study of 587 executives from large international companies, only 10% of the companies regularly deployed marketing analytics. The biggest pushback on using analytics was that it slowed down the business and caused “analysis paralysis”, the inability to make decisions while waiting for data (Peters and Waterman 1982).

For me, the term “analysis paralysis” takes on a completely different form. It is not so much about being unable to make a decision while waiting on data; rather, similar to consumer choice paralysis, it is about being unable to make a decision because of having too much information. The bigger issue, therefore, is not what to report on, but finding the balance between having the right data to tell the story and management’s ability to actually process the information and then make decisions.

Image source: Lab Bratz – It’s all in the way you look at it

One of my key deliverables at work is to compile reports on what is happening across our social media landscape. High-level results (vanity metrics) provide the management team with a quick snapshot of how we are travelling. But what do these figures actually mean, and are they even useful?

Understanding what to report, and how to report it, is key here. All communication, including Social Media campaigns, should be based around some form of strategy. As Avinash Kaushik notes in his article Digital Marketing and Measurement Model, the strategy should contain clear goals and objectives, and the reporting should reflect and align with these. Some reporting you can cover with vanity metrics, but in most cases you have to dig deeper, much deeper, to fully understand and convey the results.

Simple charts, high-level results and key metrics will get the message across much better than complicated spreadsheets full of data, and often a simple infographic can serve as a summary. Adding a narrative to help explain the results is very important, and if your report contains terms that people may not understand (impressions, mentions and so on), a simple glossary can work wonders.

Social Media Analytics

To help explain how Social Media analytics and metrics work, I have broken them down into three specific levels of information.

  • High-Level Reporting (what the result was)
  • Analysis (establishing what is happening in the data)
  • Interpretation (determining what the results actually mean and any implications)

High-Level Reporting (Vanity metrics)

This is high-level information provided in the form of metrics for specific attributes such as impressions, engagement, number of likes, shares, retweets, video views and so on. Management love vanity metrics. They are easy to understand and they provide some indication that “something” is happening. Without additional information, though, they can be very misleading. For example, “the results for Facebook comments this week are 200% better than last week”. What a great result, right?? Well, that really depends on whether everyone is commenting to say they think your brand sucks, or whether last week was simply your worst ever and this one isn’t much better, just better than last week. It’s easy to get caught up in big numbers and positive results without really understanding what they mean. That’s where analysis comes in.
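
To make that concrete, here is a minimal sketch in Python, with made-up numbers, of how a headline week-over-week jump can hide a result that is still well below a longer-run benchmark.

```python
# A minimal sketch (hypothetical numbers) of why a raw week-over-week
# percentage change can flatter a result that is still poor in context.

def pct_change(previous, current):
    """Change from the previous period, expressed as a percentage."""
    return (current - previous) / previous * 100

last_week_comments = 10       # an unusually bad week
this_week_comments = 30       # "200% better than last week!"
six_month_weekly_avg = 250    # the longer-run benchmark

print(f"Week-over-week change: {pct_change(last_week_comments, this_week_comments):+.0f}%")
print(f"Against the 6-month average: {pct_change(six_month_weekly_avg, this_week_comments):+.0f}%")
# Prints +200% week-over-week, but -88% against the benchmark:
# the headline metric hides that both weeks were well below normal.
```

The point is simply that a percentage change is only as meaningful as the baseline it is measured against.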


Analysis (examining the data)

This means breaking down the metrics to look at things such as trends over time, performance against benchmarks, performance against the competition, platform comparisons and so on. In simple terms, it is looking at the data at a deeper level to identify what is happening within it. Analysis is an important step in identifying “what” is happening with your Social Media activity, but it often lacks the “why”, and this is where interpretation of the results comes in.
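
As an illustration only (the data below is invented, not our actual reporting), a few lines of pandas cover most of what this level involves: a trend over time, a platform comparison and a check against a benchmark.

```python
# A rough sketch of this level of analysis using pandas, with invented
# data: a weekly trend, a platform comparison and a benchmark check.
import pandas as pd

weekly = pd.DataFrame({
    "week":        [1, 1, 2, 2, 3, 3, 4, 4],
    "platform":    ["Facebook", "Twitter"] * 4,
    "engagements": [120, 80, 150, 70, 90, 95, 160, 110],
})

# Trend over time: total engagements per week, smoothed with a 2-week rolling mean.
trend = weekly.groupby("week")["engagements"].sum()
print(trend.rolling(window=2).mean())

# Platform comparison: average weekly engagements per platform.
print(weekly.groupby("platform")["engagements"].mean())

# Performance against a benchmark (say, last quarter's weekly average).
benchmark = 200
print((trend - benchmark) / benchmark * 100)   # % above or below the benchmark each week
```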

Positively trending charts make people happy, but what do they actually mean?

Interpretation (deep-level analysis of the data and a narrative explaining the results)

Data interpretation is a specific skill set that is seldom used or sufficiently resourced, yet it is one of the most critical stages of the whole data analysis process. It is much more than looking at the data and picking out trends or movements; it is about understanding the results and their implications. It means identifying not only what we know from the data but also what we don’t know, filling that gap with factual data, knowledge and experience to identify opportunities, and then communicating these findings as a narrative to aid the decision-making process.


In the section where I work we don’t have a dedicated resource for marketing analysis and interpretation. Yet the interpretation stage is where a good analyst will start questioning the results, and will prove (or disprove) theories or assumptions to provide an evidence-based narrative. Without interpretation of the results by someone who understands the data and the situation it represents, it is left to the audience to work out what is happening, and that can be a recipe for disaster.

For example, during a recent campaign an announcement was made that a video we had released on social media was our most successful yet. The statement was based entirely on vanity stats (the number of times the video was viewed). When the data was analysed further, the trend did in fact show that the number of video views was higher than for any other video in the previous six-month period. High-five everyone, break out the champagne!!!

When it came to interpreting the results, however, it wasn’t such a success. The objective of the campaign was to encourage viewers to watch the video until the end. The previous campaign had been run on YouTube, while the most recent one was an embedded video on Facebook. It also transpired that the view count included Facebook auto-plays, and a deeper dig into Facebook Insights showed that 95% of those who viewed the video did so for less than 10 seconds. In terms of the objective, this was actually one of the most underperforming campaigns to date, which highlights why interpretation of the results is such a critical step. Without it, the next campaign would have simply followed the same course.
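
A simplified sketch of that re-reading of the numbers (the figures below are hypothetical, apart from the 95% drop-off noted above) shows how quickly the story changes once views are measured against the actual objective.

```python
# A simplified sketch (figures are hypothetical, apart from the 95% drop-off
# noted above) of judging the video against its actual objective: completed views.

total_views = 50_000              # the headline vanity metric, auto-plays included
views_under_10_seconds = 47_500   # roughly 95% stopped within 10 seconds
views_to_completion = 1_200       # what the objective actually asked for

print(f"Raw views:          {total_views:,}")
print(f"Dropped in <10 s:   {views_under_10_seconds / total_views:.1%}")
print(f"Completion rate:    {views_to_completion / total_views:.1%}")
# The raw view count looks like the best campaign yet; the completion rate
# shows it underperformed against the objective it was set.
```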

Data: it is all in the interpretation. Reducing cheese intake saves lives!!

And here it begins …

Welcome to my blog page.

This blog has been set up as part of the Master of Marketing course I am studying at Monash University, and will feature a number of posts relating to Social Media Marketing during the course of the semester.
