7-Eleven took photos of some Australian customers’ faces without consent, privacy commissioner rules

Up to 3.2m facial images collected over a 10-month period from people who used in-store tablets to fill out feedback surveys

Convenience store giant 7-Eleven has disabled facial recognition technology that captured customers’ faces as they filled out feedback surveys in 700 of its Australian stores, after the privacy commissioner found it interfered with their privacy.

Up to 3.2m facial images had been collected over a 10-month period.

In June 2020, 7-Eleven rolled out tablets in its 700 stores across New South Wales, the ACT, Victoria, Queensland and Western Australia to allow customers to fill in surveys.

Each tablet had a built-in camera that took photos of customers when they started the survey and again when they completed it. The photos were uploaded to an Australian-hosted server, where the facial image was converted to an encrypted algorithmic faceprint, and a person’s approximate age and gender were recorded based on an assessment of the faceprint.

The faceprint was then cross-referenced with all other faceprints generated by the tablet in the previous 24 hours, and any matches were flagged for review.
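The determination describes this matching step only in outline. As a rough illustrative sketch, and not 7-Eleven’s or its vendor’s actual implementation, the 24-hour cross-referencing could be expressed in Python along the following lines, where compute_faceprint is a hypothetical stand-in for the proprietary faceprint generation and exact string equality stands in for whatever similarity test the real system applied:

```python
import hashlib
from dataclasses import dataclass
from datetime import datetime, timedelta


def compute_faceprint(image_bytes: bytes) -> str:
    # Placeholder for the vendor's proprietary step that converted an uploaded
    # facial image into an encrypted algorithmic faceprint; a hash stands in
    # here only so the matching logic below can run.
    return hashlib.sha256(image_bytes).hexdigest()


@dataclass
class SurveyCapture:
    faceprint: str
    captured_at: datetime


def flag_repeat_respondents(
    new: SurveyCapture,
    recent_on_same_tablet: list[SurveyCapture],
    window: timedelta = timedelta(hours=24),
) -> list[SurveyCapture]:
    """Cross-reference a new faceprint against faceprints generated by the same
    tablet in the previous 24 hours and return any matches for manual review."""
    cutoff = new.captured_at - window
    return [
        capture
        for capture in recent_on_same_tablet
        if capture.captured_at >= cutoff and capture.faceprint == new.faceprint
    ]
```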

The facial images were held by 7-Eleven for seven days and the company said the faceprints “effectively expired” after 24 hours, but did not say whether they had been deleted.

Between June 2020 and March 2021, 1.6m surveys were completed in 7-Eleven stores. The company shut down the review system in September after the privacy commissioner, Angelene Falk, shared a preliminary finding that collecting the images had interfered with the privacy of customers who completed surveys.


In a final decision released on Thursday, Falk said the large-scale collection of such sensitive biometric information “was not reasonably necessary for the purpose of understanding and improving customers’ in-store experience” – and was obtained “without consent”.

Falk said she accepted “implementing systems to understand and improve customers’ experience is a legitimate function for 7-Eleven’s business [but] any benefits to the business in collecting this biometric information were not proportional to the impact on privacy”.

7-Eleven had initially tried to defend the practice, telling the commissioner the facial recognition technology was designed to prevent staff members or others from completing multiple surveys in one day. However, the company did not hand over any information about how many surveys were found not to be genuine.

The company also argued people had the option of not using the tablets and the company had posted signs in stores stating: “By entering the store you consent to facial recognition cameras capturing and storing your image.” The privacy policy on 7-Eleven’s website also mentioned it might collect biometric information but did not connect this with the feedback tablets.

The company argued the facial images and faceprints “were not personal information because they are not used to identify, monitor or track any individual”. The arguments were all rejected by the commissioner.

Ahead of the final decision, 7-Eleven disabled the image capture component of the survey system. Falk also ordered 7-Eleven to destroy all faceprints within 90 days.

A 7-Eleven spokesperson said the company accepted the decision but argued the system was “used by many businesses across the retail sector” and was entirely voluntary – with no other personal information collected.

“All images taken by the system in our stores have been permanently deleted,” the company said.

Anna Johnston, the principal at Salinger Privacy, told Guardian Australia the case “clearly shows that putting things in your privacy policy does not equal consent” and the argument that someone’s face was not personal information did not wash.

“The very selling point of facial recognition technology from a vendor’s perspective, their sales pitch, is facial recognition technology is great because we can distinguish people uniquely, with a high degree of accuracy. To turn around and say that that is not personal information is, I think, a bit of a stretch,” Johnston said.

The Human Rights Law Centre senior lawyer Kieran Pender said facial recognition technology raised significant human rights and privacy issues.

“Given these risks, the increasing use of facial recognition technology without proper safeguards – including by law enforcement and as part of home quarantine apps – is alarming,” he said.

“The Morrison government should urgently enact laws that provide robust safeguards against the potential misuse of facial recognition technology.”

Extracted from The Guardian
