Introduction
Whether you believe it’s the future or hope it’s a short-lived trend, you’ve probably heard of the Metaverse before. It’s a sort of virtual reality crossed with social media, and Facebook already plans to build the Metaverse into future updates to its platform…which makes sense, since Meta Platforms, Inc. is the parent company of Facebook, Instagram and WhatsApp.
Plenty of internet users think that the next stage of the Internet age will focus more on the Metaverse, too. We hear about new companies planning to launch in the Metaverse every day. Recently, though, Meta has been in the news for a much more sinister reason.
A new FTC order further restricts what data Meta can collect and sell, and what protections the company must add to its cybersecurity posture.
Meta’s Problem with Data Privacy
You may remember that back in 2020, the Federal Trade Commission instituted a privacy order that tightened the regulations on Meta’s privacy program and required a third-party auditor to assess the company’s remediation efforts. Many of these rules trickled down to Meta’s subsidiaries, such as expanding Facebook’s security structure for protecting users’ private information and restricting what it could do with data gleaned from facial recognition software. The FTC fined the company too: Meta paid $5 billion in penalties for the pleasure.
Now the FTC is back, accusing Meta of failing to comply with the 2020 order. The complaints include inadequately managing who minors could communicate with in the Messenger Kids app, despite what parents were told, and giving its app developers access to more information on ALL users than those users were led to believe at sign-up.
This latest order brought by the FTC implements additional regulations on how Meta must maintain data privacy. It also alleges that Meta violated the Children’s Online Privacy Protection Act Rule and seeks remediation for that, too.
New FTC Remediation
The additional rules that would be imposed on Meta Platforms under this new order include:
- Limiting what data it collects from all its platforms, including virtual reality
- Preventing them from profiting off data collected from minors
- Requiring parental consent to collect a child’s data
- Putting more safeguards in place for user information, especially facial recognition data, which can be exploited in a variety of cyber-attacks
- Preventing new products and services from launching until an independent assessor confirms that Meta’s privacy program meets the order’s requirements
- Ensuring that, in any future acquisition or merger, Meta complies with both the FTC order and any privacy requirements the acquired company was already beholden to
- Generally strengthening the privacy compliance regulations laid out in the 2020 ruling
Meta has been given 30 days to respond to the FTC. The platforms this might affect include Meta-owned Facebook, Instagram, WhatsApp and Oculus.
Conclusion
Given that this is the second alleged violation, we can expect fines and bans to grow if more privacy actions are brought against Meta going forward. This FTC action also sets a precedent for any business that invests in artificial intelligence without taking proper measures to protect user privacy along the way. Digital progress should not, and does not need to, come at the expense of data security.
How safe are your favorite social media platforms? How secure will the next application you’re really excited to see go live be?
Ask yourself these questions as you’re making new social profiles and connecting with other people around the web. Excitement and buzz can make you eager to jump right onto new trends, but it’s important to take a step back and remember your cybersecurity best practices first!