The Federal Trade Commission (FTC) has approved a Final Consent Order against Everalbum, a cloud photo storage app, for misrepresenting its privacy practices in violation of Section 5 of the FTC Act. The order requires the company to destroy the facial recognition models and algorithms it developed using users’ photos without their permission.
Section 5(a) of the FTC Act provides that “unfair or deceptive acts or practices in or affecting commerce . . . are . . . declared unlawful.” 15 U.S.C. Sec. 45(a)(1).
As the FTC notes,
“Deceptive” practices are defined in the Commission’s Policy Statement on Deception as involving a material representation, omission or practice that is likely to mislead a consumer acting reasonably in the circumstances. An act or practice is “unfair” if it “causes or is likely to cause substantial injury to consumers which is not reasonably avoidable by consumers themselves and not outweighed by countervailing benefits to consumers or to competition.” 15 U.S.C. Sec. 45(n).
As the FTC’s complaint states,
Since 2015, Everalbum has provided Ever, a photo storage and organization application, to consumers. Ever is available as both an iOS and Android mobile application (“app”), as well as in a web and desktop format. Globally, approximately 12 million consumers have installed Ever.
Ever allows consumers to upload photos and videos to Ever’s cloud servers from sources such as the user’s mobile device, computer, or accounts with social media services, such as Facebook or Instagram, or cloud-based storage services, such as Dropbox or OneDrive. By storing photos and videos on Ever’s servers, consumers can free up storage space on their devices. Ever uses automated features to organize users’ photos and videos into albums by location and date.
In 2017, Everalbum launched a “Friends” feature that uses face recognition technology to group photos by the faces of the people who appear in them.
For many users, this feature was enabled by default, without any affirmative action on their part.
The company initially used publicly available face recognition technology to power the feature. It then started to develop its own face recognition technology, using images it extracted from its users’ photos.
Everalbum offered users who no longer wished to use the app the ability to deactivate their accounts. When a user tried to delete their Ever account, they would receive a message like this one: “We’re sorry to see you go! If you choose to deactivate your account, you will permanently lose access to [##] photos and [##] albums.”
The company also stated: “[Y]ou can deactivate your account at any time by signing into our app, going to ‘Settings’ > ‘General Settings’ > ‘Deactivate’. Please note that this will permanently delete all photos and videos stored on your account as well.”
However, according to the FTC’s complaint,
Contrary to the statements Everalbum has made that account deactivation will result in Everalbum deleting the user’s photos and videos, until at least October 2019, Everalbum did not, in fact, delete the photos or videos of any users who had deactivated their accounts and instead retained them indefinitely. Everalbum began implementing in October 2019 a practice of deleting all the photos and videos associated with Ever accounts that have been deactivated for more than three months.
The FTC ordered that the company delete and destroy any photos and videos that it collected from users who requested deactivation of their accounts.
The FTC also ordered deletion or destruction of any “Affected Work Product,” defined as “any models or algorithms developed in whole or in part using Biometric Information Respondent collected from Users of the ‘Ever’ mobile application.”
Everalbum shut down in August 2020, citing competition from Apple and Google photo storage services.
Paravision, Everalbum’s enterprise brand, is still in business and recently appointed a Chief AI Ethics Advisor.
A class action lawsuit against Paravision, alleging privacy violations, was settled in February.