Navigating new territories: New tools for online safety

In today’s digital age, our lives are increasingly intertwined with the online world. From socialising and shopping to learning and working, the internet has become an indispensable part of the daily routine of New Zealanders. In fact, according to the 2024 Global Digital Report, 87.4% of Kiwis are active social media users, spending an average of 1 hour and 59 minutes on various platforms. Facebook and Instagram top the list in Aotearoa, followed by X and TikTok.

With people spending more time on social media comes a responsibility that cannot be ignored: ensuring online safety for users. At Meta NZ, we believe every person deserves the freedom to explore digital spaces confidently, with a measure of control over what they encounter and how they engage.

We have made substantial investments in products designed to enhance online safety across our platforms with three goals in mind: ensuring safety on our platforms; keeping accounts secure; and educating people on how to protect their information. These investments enable us to leverage advanced technologies such as artificial intelligence and machine learning to detect and mitigate harmful content and behaviour in real time.

Our dedication to online safety goes beyond simply investing in technology. It’s about taking real action. That’s why we’ve got thousands of people at Meta focused on making our platforms safer. We block millions of fake accounts from being created every day and have introduced many new tools and features aimed at protecting people online, such as the Security Check Up feature, login alerts across all our platforms, parental supervision tools and age verification. Here are a few more you should know about:

Expanded age verification feature on Facebook

If Facebook users attempt to edit their date of birth from under 18 to over 18, they will be asked to verify their age, either by uploading a video selfie or by submitting an ID.

Expanded Instagram ‘Limits’ for teens

‘Limits’ now gives teens an easy way to hide interactions from people who aren’t on their Close Friends list, without letting others know their content isn’t being shown, to better safeguard young people from cyberbullying. Meta is also expanding its Restrict feature, which lets teens take action on a specific bully without having to block them.

Stricter message settings for teens on Instagram and Facebook

Instagram has enhanced teen safety by defaulting message settings to prevent unwanted contact. Adults over 19 cannot message teens who don’t follow them, and new restrictions will stop anyone teens don’t follow from messaging them. These stricter defaults apply to users under 16 in New Zealand.

Nudity protection in DMs

When someone receives an image containing nudity, it will be automatically blurred under a warning screen, meaning the recipient isn’t confronted with a nude image and can choose whether or not to view it. We’ll also share a message encouraging the recipient not to feel pressured to respond, with an option to block the sender and report the chat. This feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return.

At Meta we understand that building a safer digital community requires proactive measures and continuous innovation, and we believe that every effort we make is meaningful progress towards a digital future where people can express themselves and connect with others in an environment where they feel safe. We’re committed to evolving and improving every step of the way, so we’ll continue to introduce new safety tools and features.

As Kiwis continue to choose our platforms to spend their time online, we are committed to continuing our innovation journey and leading the way.
