We have become very used to being followed around the Internet when we're online, although that process will change over the coming years as third-party cookies are phased out and firms turn to first-party data in order to stay in touch with you.

Right now, though, another highly valuable trove of data is being collected on all of us, this time in the physical world: imagine an electronic billboard with a camera that scans your features to identify your age and gender, then flashes up an ad specifically targeting your demographic.

Facial recognition technology is developing at pace, both as a law enforcement tool for governments and in applications for companies wanting to target individual customers. How these software systems are used, or misused, looks set to become a major point of debate in the next few years.

Privacy rules are struggling to keep up with software that can screen large groups of people in real time, amassing data without permission and risking riding roughshod over the most basic civil liberties.

For instance, Amazon.com Inc. indefinitely extended a moratorium on police use of its Rekognition algorithms, which a January 2019 study by two artificial intelligence researchers showed made more mistakes on people with darker skin, particularly women. Anyone familiar with AI will already understand that algorithmic bias is real, and that it stems from the humans who created the algorithm in the first place.

Given the technological advances of the last couple of decades, it won't be long before we experience personalised advertising in the physical world. That will be great for marketers, but as we reach that stage it is of the utmost importance that the companies behind facial recognition technology address any and all underlying biases.

This has the power to surpass anything we have come to know in terms of privacy, so let's make sure it's as fair and even-handed as it needs to be... please.