If you’ve been on Facebook in the past couple of days, your timeline has likely been flooded with photos of your friends looking like older versions of themselves. It’s all thanks to FaceApp, a popular AI-powered app that allows users to upload photos of themselves and use different filters to manipulate them.
In addition to the “old” filter, users can use FaceApp to see themselves as the opposite gender, with different hairstyles, with different facial expressions and more. It has topped Google and Apple’s download charts, and even celebrities have gotten in on the craze.
Check out Carrie Underwood and her husband, Mike Fisher, as senior citizens in the photo posted to the singer’s Instagram account:
Pop singers the Jonas Brothers have one of the most popular FaceApp pics, with over 1.6 million “likes”:
“Dawson’s Creek” actor Busy Philipps also got a peek into her future:
“Bridesmaids” actor Chris O’Dowd tried his hand at it, too:
And comedian Whitney Cummings also joined in:
Although FaceApp may seem like harmless fun, it’s raised some privacy concerns.
The app, which is owned by St. Petersburg, Russia-based Wireless Lab and has been around since 2017, may upload users’ photos to the cloud for processing, which could be a concern if the image ended up being stored and shared outside the app.
In a statement to TechCrunch responding to those concerns, the company said this about cloud storage:
“We might store an uploaded photo in the cloud. The main reason for that is performance and traffic: we want to make sure that the user doesn’t upload the photo repeatedly for every edit operation. Most images are deleted from our servers within 48 hours from the upload date.”
The company said it relies on Amazon and Google cloud services, in an effort to quell fears that user data was being sent to servers in Russia.
In addition, users voiced concern that the iOS (Apple) version of the app was able to upload images even if the app settings had been set to deny camera roll access. It turns out this is more of an iOS loophole than one specific to FaceApp.
The company says it only uploads the photo a user selects and never accesses any other images on a user’s device.
Because 99% of users don’t log in, the app doesn’t have access to identifying information, and cybersecurity experts seem to be confident that the app doesn’t access more than it says it will.
The company also says it does not sell user information to third parties and that user data is not transferred to Russia, even though its research and development team is located there.
Concerns about the app’s security spread like wildfire this week before the company had a chance to respond.
Senator Chuck Schumer of New York asked the FBI and the Federal Trade Commission to open an investigation into the app.
BIG: Share if you used #FaceApp:
Because millions of Americans have used it
It’s owned by a Russia-based company
And users are required to provide full, irrevocable access to their personal photos & data pic.twitter.com/cejLLwBQcr
— Chuck Schumer (@SenSchumer) July 18, 2019
Additionally, the Democratic National Committee sent a security alert to the campaigns vying for a 2020 presidential nomination and asked them not to use the app.
“It’s not clear at this point what the privacy risks are, but what is clear is that the benefits of avoiding the app outweigh the risks,” said DNC chief security officer Bob Lord, according to NPR. “If you or any of your staff have already used the app, we recommend that they delete the app immediately.”
What To Do
If you have privacy concerns and would like your images removed from the app, tap “Settings,” then “Support.” Use the “Report a bug” function and put the word “Privacy” in the subject line. FaceApp said it is giving priority to these requests but is “currently overloaded.”
If you choose to use the app, as with all apps, you should do so with caution, according to experts.
“People should be savvy about when apps and memes and games are encouraging everyone to engage in the same way,” Karen O’Neill, a tech consultant, told The Washington Post. “It puts the data in a vulnerable state that becomes something that can train facial recognition and other kinds of systems that may not be intended the way people are using it.”