Technology giant Meta on Thursday began sending thousands of young Australians a two-week warning to download their digital histories and delete their accounts from Facebook, Instagram, and Threads before a world-first ban on social media accounts for children younger than 16 takes effect. The Australian government recently announced that the three Meta platforms, plus Snapchat, TikTok, X, and YouTube, must take reasonable steps to exclude Australian account holders younger than 16 beginning Dec. 10. California-based Meta is now the first of the targeted tech companies to outline how it will comply with the law, per the AP.
Meta contacted thousands of young account holders via SMS and email, warning that users suspected of being underage will begin losing access to the platforms on Dec. 4. "We will start notifying impacted teens today to give them the opportunity to save their contacts and memories," Meta said in a statement. The company said young users could also use the notice period to update their contact information "so we can get in touch and help them regain access once they turn 16." Meta has estimated there are 350,000 Australians ages 13 to 15 on Instagram and 150,000 in that age bracket on Facebook.
Account holders 16 and older who were mistakenly notified that they'd be excluded can verify their age through a service of their choice by providing government-issued identity documents or a "video selfie," Meta said. Terry Flew of the University of Sydney's Center for AI, Trust, and Governance says such facial-recognition technology has a failure rate of at least 5%. "In the absence of a government-mandated ID system, we're always looking at second-best solutions around these things," Flew tells the Australian Broadcasting Corp.
The government has warned platforms that demanding all account holders prove they are older than 15 would be an unreasonable response to the new age restrictions, and it maintains that the platforms already hold sufficient data about many account holders to establish they aren't young children. Failure to take reasonable steps to exclude underage users could earn platforms fines of up to $32 million.