
How to Test Social Data Accuracy: Self-Reported Data vs. Third-Party Data

In an increasingly sophisticated paid social environment, where targeting criteria multiply by the day yet still rely on self-reported data, it becomes imperative to measure performance based on audience quality, not purely on quantitative metrics.

As a result, the question of self-reported social data vs. third-party data becomes key to delivering quality AND quantity. A second question, prompted by the need to look at performance beyond vanity metrics such as new followers or post impressions, is the accuracy of self-reported data across social platforms.

So how do you test social data accuracy, and how do you decide whether third-party data might be a better choice for reaching your audience on a social platform?

To answer those questions, we recently ran a small test of our own at Social@Neo to analyse the accuracy of employment data specifically, as offered by LinkedIn vs. Facebook, against a known data set.

How we tested:

We worked with Ogilvy Group’s HR departments in the US and the UK to collect data about Ogilvy staffers across all companies of the group. The data set we received was split by company, age range and sex.

Both Facebook’s and LinkedIn’s advertising platforms allow us to segment users by the following criteria: the company they work for, location and sex.

So we created a target audience segment using the criteria available in both the known data set and the two networks’ advertising platforms: employed by one of the Ogilvy companies and based in the UK or the USA.

Each platform gave us the number of users matching those criteria. Here are the results against the known data set:

[Screenshot: audience counts from LinkedIn and Facebook vs. the known data set]

As the data provided by Ogilvy Group’s HR department only included permanent employees, we expected a variation of +/- 5%.

However, the variation was as high as +/- 17%.
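The deviation check behind these numbers is simple to sketch in Python. The counts below are hypothetical placeholders chosen to illustrate the +/- 17% result, not the actual figures from our test:

```python
# Hypothetical figures for illustration only; the real test counts
# came from the HR data set and the two platforms' audience tools.
known_headcount = 10_000          # permanent employees per HR data (assumed)
platform_counts = {
    "LinkedIn": 11_700,           # assumed: ~17% above the known set
    "Facebook": 8_300,            # assumed: ~17% below the known set
}

# Tolerance we expected, given that the HR data excludes
# freelancers and part-timers.
TOLERANCE = 0.05

for platform, count in platform_counts.items():
    deviation = (count - known_headcount) / known_headcount
    verdict = "within" if abs(deviation) <= TOLERANCE else "outside"
    print(f"{platform}: {deviation:+.1%} deviation, "
          f"{verdict} the +/-{TOLERANCE:.0%} tolerance")
```

With these placeholder counts, both platforms land at a 17% deviation and fall outside the expected tolerance band.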

LinkedIn reported 17% more users declaring they work for one of the Ogilvy group companies in the USA or the UK. While the variation is quite high, the higher number is likely to be closer to reality, since the known data set excludes non-permanent staff.

Facebook reported 17% fewer users declaring they work for one of the Ogilvy group companies in the USA or the UK. Even allowing for the assumptions made at the start of the test, the number is very low and shows a significant inconsistency between the real data and the platform’s self-reported data.

Furthermore, the percentage of women who declared on Facebook that they work for one of the Ogilvy group companies is much higher than the percentage provided by the HR department.

 

What does this mean? Three things:

  1. If you have access to a known data set, it is always worth testing it against the data offered by the social platform to assess accuracy and whether you need third-party data.
  2. As always, and across all platforms, the more niche and accurate your targeting, the more you are likely to pay. As expected, LinkedIn data is more accurate than Facebook’s when it comes to targeting based on work information. This also explains LinkedIn’s higher CPC: you pay for the higher quality of the engagements, not just for the number of clicks.
  3. While not a very detailed test, these results give us a reference point for the deviation from reality. Neither network is fully accurate, but because the Ogilvy data includes only FTEs, not freelancers or part-timers, it is reasonable to assume that LinkedIn’s data is the more accurate of the two.