Should we be excited about Facebook Attribution? I certainly think so.
Not because Facebook have finally solved the attribution problem (though their efforts to enable cross-platform and cross-device analysis are definitely a big step forward).
The reason I’m excited is because an attribution competition between the two biggest advertising platforms creates opportunities for advertisers. Limitations in one can be offset by capabilities in the other, and biases can be identified and mitigated through cross-referencing.
Neither of the two major free attribution services is perfect – but taken together, they can be extremely valuable for measurement and optimisation.
In this post I will explain how to integrate both Google and Facebook’s free attribution services according to an overarching strategy.
Capabilities and limitations of Google and Facebook attribution
| Solution | Data-driven attribution | Cross-platform | Cross-device | Conversion Lift Testing | Verifiability |
| --- | --- | --- | --- | --- | --- |
| Google Analytics | Only in paid version | Limited to Google properties and Facebook clicks (but not impressions) | Yes | Yes | Low: data is confined within the Google-owned ad network |
| Facebook Attribution | Yes, but only for Facebook ads | Multiple partners, but some major social platforms missing | Yes | Yes | Some: cross-platform capabilities enable cross-referencing |
Data Driven Attribution: start with Google Analytics
In the paid version of Google Analytics, GA360, users have access to one of the industry’s best applications of machine learning for measurement.
Data Driven Attribution (DDA) looks at all clicks – converting and non-converting traffic – in order to estimate the value of a particular ad, or keyword, or ad group, etc. From the initial tests we’ve done at Brainlabs, DDA outperforms other attribution models, e.g. ‘Last Click’, by about 5% in terms of the incremental gains it enables.
So, if you have paid for GA360, then the first step in the process should be to use DDA to gather initial insights into how to optimise your ad spend. The results can then be compared, to some extent, against Facebook’s version of data driven attribution – but only for the portion that is being spent on Facebook ads.
If you have the free version of GA, there are still seven attribution models to choose from. Facebook has almost as many options, but these are less well tested and developed than Google’s. That’s why I’d recommend, whether you’re using the free or premium service, starting with GA.
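To make the difference between models concrete, here’s a minimal Python sketch using invented conversion paths. It contrasts last-click credit with a simple linear (fractional) allocation; GA360’s actual DDA is far more sophisticated, modelling converting and non-converting paths, but the underlying idea of redistributing credit across touchpoints is the same.

```python
# Illustrative only: compares last-click credit with a simple linear
# allocation over invented conversion paths. GA360's real DDA uses a
# far more sophisticated approach over much more data.

from collections import defaultdict

# Hypothetical multi-touch paths, each ending in a conversion worth 100
paths = [
    ["paid_search", "facebook", "email"],
    ["facebook", "paid_search"],
    ["email", "paid_search"],
]
CONVERSION_VALUE = 100.0

def last_click(paths):
    # All credit goes to the final touchpoint before conversion
    credit = defaultdict(float)
    for path in paths:
        credit[path[-1]] += CONVERSION_VALUE
    return dict(credit)

def linear(paths):
    # Credit split evenly across every touchpoint in the path
    credit = defaultdict(float)
    for path in paths:
        share = CONVERSION_VALUE / len(path)
        for channel in path:
            credit[channel] += share
    return dict(credit)

print("Last click:", last_click(paths))
print("Linear:    ", linear(paths))
```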
Cross-device analysis: use both, but prioritise Facebook
The good news is that Facebook and Google both provide this function in the free version of their attribution services.
However, Facebook has a natural advantage over Google which should lead to more in-depth and useful analysis.
Facebook has really strong cross-device tracking because people are typically logged in to its properties on all their devices, so it can follow users from one device to another far more reliably than other platforms can.
I’ve included an example below. It’s a really impressive feature of Facebook Attribution, and one that I would highly recommend capitalising on.
Cross-platform analysis
This is where using two different attribution solutions really starts to make a difference. GA is not able to measure Facebook impression data, so the first piece of cross-platform analysis you can perform is to use Facebook Attribution to fill in that gap.
Facebook Attribution also has far more partners that are willing to allow Facebook to factor their channel into the overall analysis, enabling a fairly comprehensive view of a user’s online behaviour (see a list of partners here). Some major channels like LinkedIn and SnapChat are missing, and neither Facebook’s nor Google’s attribution service includes offline media yet.
Also, by comparing how Google and Facebook differ (if they do differ) on the value placed on their own and the other’s ads, you can feel greater confidence in the accuracy of your attribution.
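One way to do that cross-referencing is to pull channel-level conversion credit out of each platform and line the numbers up side by side. The sketch below uses invented figures rather than real report exports, and the column names are my own, but it shows the kind of comparison I mean.

```python
# A hedged sketch of cross-referencing channel-level conversion credit
# from Google Analytics and Facebook Attribution. The figures are
# invented; in practice you'd load your own report exports here.

import pandas as pd

ga = pd.DataFrame({
    "channel": ["paid_search", "facebook", "email"],
    "conversions_ga": [450, 180, 120],
})
fb = pd.DataFrame({
    "channel": ["paid_search", "facebook", "email"],
    "conversions_fb": [430, 260, 115],
})

merged = ga.merge(fb, on="channel")
merged["difference_pct"] = (
    (merged["conversions_fb"] - merged["conversions_ga"])
    / merged["conversions_ga"] * 100
)

# Large, consistent gaps for a channel (here, Facebook's own ads) are a
# prompt to investigate -- e.g. view-through conversions GA cannot see.
print(merged.sort_values("difference_pct", ascending=False))
```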
Testing: run lift tests on both platforms
Lift tests essentially allow you to run a randomised controlled trial for your advertising.
You create a control group and a test group. The test group sees the particular ad you want to measure the value of; the control group doesn’t, but is otherwise as similar as possible to the test group. The difference in performance provides you with a decent estimate of the incremental value of a particular ad.
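If you want a feel for the maths behind a lift result, the sketch below works through an invented example: it estimates incremental conversions, relative lift, and a rough significance check. Both platforms calculate this for you, so treat it as illustration rather than a replacement for their reporting.

```python
# A minimal sketch of how incremental lift from a conversion lift test
# might be estimated. The numbers are invented.

from math import sqrt

test_users, test_conversions = 100_000, 1_200        # saw the ad
control_users, control_conversions = 100_000, 1_000  # held out

p_test = test_conversions / test_users
p_control = control_conversions / control_users

incremental_conversions = (p_test - p_control) * test_users
relative_lift = (p_test - p_control) / p_control

# Two-proportion z-test as a rough significance check
p_pooled = (test_conversions + control_conversions) / (test_users + control_users)
se = sqrt(p_pooled * (1 - p_pooled) * (1 / test_users + 1 / control_users))
z = (p_test - p_control) / se

print(f"Incremental conversions: {incremental_conversions:.0f}")
print(f"Relative lift: {relative_lift:.1%}")
print(f"z-score: {z:.2f}")
```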
For a recent case study on this, I recommend an article by Witold Wrodarczyk, who used conversion lift tests for a YouTube campaign.
Of course, lift tests are one amongst multiple ways of testing the performance of your ads. Facebook, for example, also offers ‘Split Tests’ and ‘Test and Learn’ – the latter of which enables advertisers to choose from a list of questions about their account to make strategic improvements.
Building a continually optimised attribution solution
When you add all the above elements together, it should look something like this:
This is just one way of doing it – the small variations don’t matter too much, but the principle is that you can use each platform to inform and improve the other.
Even before Facebook Attribution was launched, there was plenty of scope for testing, learning and adapting both the attribution and advertising strategy just on GA. When you add in Facebook Attribution, though, you really do start to see a broader and more accurate picture of your online advertising performance. It’s a great complement to Google.
Though it will initially add some complexity, I strongly recommend that all advertisers make the time investment in Facebook Attribution. As multi-touch attribution keeps getting harder and harder, knowing which tools to leverage, and how, is essential.