So Facebook's been at it again.
If you missed this story, here's the short version:
Facebook bought Onavo in 2013, ostensibly a VPN app whose user promise was to “keep you and your data safe” and “block potentially harmful websites and secure your personal information.”
What Onavo really did was transfer huge streams of device data back to Facebook: which apps are being used, when, and for how long.
In 2018, Apple banned Onavo from its App Store as it contravened guidelines:
"apps should not collect information about which other apps are installed on a user’s device for the purposes of analytics or advertising/marketing and must make it clear what user data will be collected and how it will be used."
This week, it came to light that Facebook Research had recruited users - many as young as 13 - and paid them $20 a month to install an app with similar functionality to Onavo.
Actually, the new app was just the old Onavo re-badged and distributed through Apple's back door (its policies DO allow this functionality for corporate apps aimed at employees - they need to be downloaded separately and are not available through the App Store).
Facebook hogged the headlines, but they're not alone. They may be the most cynical offenders, but Google has also been caught up in it.
If you're a marketer, researcher or digital analyst you're probably very familiar with the types of projects Google and Facebook were doing.
Many of us have used similar methods to understand path-to-purchase journeys, measure media consumption and generate insights into mobile usage.
Behavioural tracking solutions come in many flavours. Typically, users give permission for software installed on their desktop, tablet or smartphone to capture and share some of the data they generate: which websites they visit, which apps they use or - on mobile devices - where they go in the physical world.
Panel providers like Netquest and Respondi have sub-sets of users opted-in to share some of this data. It can be analysed alongside survey responses to build a combined picture of attitudes and behaviours.
Verto Analytics has its own behaviourally-tracked panel for ad hoc projects and syndicated reports; RealityMine, Wakoopa, Embee and DDMR can all help you build tailored projects or panels to collect this passively tracked data.
Its use is growing fast, and if you're not using it now, you soon will be.
So some of you reading the reports about Facebook and Google will think, "What's wrong with that? The users consented and are being paid. It's a media-driven hatchet job."
You're going to have to shift your thinking.
The privacy backlash is a gathering storm, and it will take companies down. You won't have to be as reckless and arrogant as Cambridge Analytica to get caught in it.
Apple - who really needs some new shine on their waning star - is about to weaponise privacy in the fight against Facebook and Google. And they have the most effective PR and media influence engine on the planet (their own privacy SNAFUs notwithstanding).
GDPR was only the start of taking privacy seriously. Getting compliant was painful - but for many brands it's just a more elaborate series of tick boxes and popups that most consumers ignore.
It's all compliant with the letter of the law even if it's counter to the spirit.
So what does this all mean if you want to run some passive tracking research?
Here's some advice.
Think like a lawyer
Expect the worst outcome and do everything you can to mitigate it.
A media exposé that damages your brand? A class action suit that costs a fortune? Being escorted from your desk by HR and security?
As marketers, we often see the legal team as an annoying constraint on what we want to do. But instead of looking at them as the compliance police, we should think of them as our conscience.
The real point of all these regulations is to make the data exchange with users fair and transparent - and we need to keep that front and centre when planning new projects.
Don't abdicate responsibility
There's a long-running ad campaign for Febreze air freshener - #Noseblind - whose core idea is that we all get so used to bad smells that we no longer notice them.
For years, research buyers have claimed not to notice the faint odour of turd wafting out of online survey data.
But ignoring these smells won't be an option when gathering personal data from privacy-aware consumers.
If you're Samsung or Primark, you invest heavily to make sure you don't have conflict minerals or slave labour in your products.
If you work with personal data, it's incumbent on you to satisfy yourself that your supply chain is ethical and your vendors are fulfilling their privacy and quality obligations.
Participants for the Facebook Research project were actually recruited through Applause - but they're not the ones taking the heat.
Demonstrate clearly that you value privacy
OK, this is the big one - and it means going way beyond the tick box approach.
The Facebook / Google defence for the tracking apps above is essentially "users were fully informed and they consented".
If you've ever clicked through all the GDPR permission toggles on certain publisher sites, you'll realise how farcical the concept is. You could spend a day reading Ts & Cs from all the integrated adtech tools and still not feel fully informed.
Clearly it's about the spirit as much as the letter when dealing with passive tracking data.
So how do you make the whole value exchange more meaningful, transparent and trust-based?
1. Humanise the people behind the research (YOU)
Zuckerberg famously said that we now live in a post-privacy world.
That does my head in.
It only applies in one direction: users sacrifice privacy, but brands assert their right to it aggressively - by refusing to speak to journalists, dishing up platitudes from the PR team or issuing terse legal statements.
As researchers, we get to know intimate details about users, but they get nothing of us.
We should try to reduce this asymmetry by sharing something of ourselves with the users we ask so much from.
At the start of a project, show your face: record a video for your participants; talk directly to the people whose data you want; tell them about your team, your job and why your company needs to understand its users.
If this feels alien, it's because we've all allowed data-centric projects to separate us from real people.
And if those real people appreciate that there's another real person asking for their help, they'll be far more understanding and less likely to kick up a stink in the media.
2. Explain what you'll do with their data
Tell them about the data you want them to share and why it will help you.
Use simple, clear, relatable language.
Don't hide behind generics or legalese. Use words and phrases your Nan* would understand.
"We want to know how much time you spend using different apps. This helps us come up with better ideas for our own apps. For example if we know people do a lot of messaging, we might decide to make a chat tool for our website."
And get your words sense-checked in some qualitative cognitive testing.
*Noun. British colloquialism for Grandmother.
3. Don't collect more than you need
Here's what a leading security researcher said about the Facebook Research app:
“If Facebook makes full use of the level of access they are given by asking users to install the Certificate, they will have the ability to continuously collect the following types of data: private messages in social media apps, chats from instant messaging apps – including photos/videos sent to others, emails, web searches, web browsing activity, and even ongoing location information by tapping into the feeds of any location tracking apps you may have installed.”
This has the potential to be horribly invasive - and it clearly runs counter to the spirit of GDPR, which says you need a proper reason to collect all the personal data you ask for.
And all this data is a nightmare to work with. Big trawler fishing jobs are just a bad idea all round.
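Data minimisation is easiest to honour if it's enforced at the point of collection rather than cleaned up later. Here's a minimal sketch of the idea - whitelisting only the fields the research question actually needs before anything is stored. The field names are purely illustrative, not from any real tracking SDK:

```python
# Hypothetical sketch: enforce data minimisation by whitelisting fields
# at collection time. Field names are illustrative only.

ALLOWED_FIELDS = {"app_id", "session_start", "session_end"}

def minimise(raw_event: dict) -> dict:
    """Drop everything except the whitelisted fields before storage."""
    return {k: v for k, v in raw_event.items() if k in ALLOWED_FIELDS}

raw = {
    "app_id": "com.example.chat",
    "session_start": "2019-02-01T09:15:00Z",
    "session_end": "2019-02-01T09:22:00Z",
    "gps_location": (51.5, -0.1),   # not needed for app-usage research
    "contacts_hash": "ab12cd34",    # definitely not needed
}

print(minimise(raw))  # only the three whitelisted fields survive
```

If a field isn't on the list, it never reaches your servers - which also makes your GDPR purpose-limitation story much easier to tell.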
4. Augment the tick box with something meaningful
You can't escape some sort of consent box ticking. That's how audit trails work.
But you can help mitigate the risk of users not realising what they signed up for: get them to play back what they've agreed to.
Give each user a succinct, 3-4 sentence Plain English (or equivalent) version of their agreement with you - and have them record a video as they read it out:
"I understand that I will be sharing XXX data with you. You will be able to see which apps I use and when I make phone calls. I know what you want this for and I'm happy to share it in exchange for payment."
5. Keep an eye on personal data exchanges
Personal data exchange models have emerged over the last couple of years. You can expect to see a lot more activity in this area as privacy-consciousness grows.
Research data is also moving in this direction.
Citizen Me is a self-service research platform that combines surveys with passively-captured data from its panel of over a million users. Members can choose to exchange (for each individual data point and each individual project) their behavioural and profiling data along with their answers to surveys.
The user is entirely in control, and is transparently rewarded for each exchange they make.
PY Insights by PowrofYou won the 2019 IIeX Innovation EU competition, and provides passive metering technology and services on both its own proprietary panel and - via API - on third party survey panels. The proprietary panel operates on a 'fair data exchange' basis, with users able to choose whether or not to share for each data source.
Measure Protocol won the 2018 IIeX Innovation Competition, and is building "a marketplace for person-based data where individuals are paid fairly and data usage is secure and transparent".
And Veriglif is building another blockchain-based ecosystem to tie together the different sources of consumer data from survey panels, DMPs and other consumer data providers.
More of these trust-based platforms will emerge as users expect fairer and more transparent ways of controlling their own data.
These suggestions all look like adding extra work.
Sorry about that.
But if you want to use behavioural data and avoid a privacy backlash, you need to invest the time to do it properly.