
What’s happening

Apple’s announced a new Safety Check feature to help potential victims in abusive relationships.

Why it matters

This is the latest example of the tech industry taking on tough personal technology issues that don’t have clear or easy answers.

What’s next

Apple is communicating with victim-survivor advocacy organizations to identify other features that can help people in crisis.

Among the long-requested and popular new features Apple plans to bring to the iPhone this fall is one that isn’t just a convenience: using it could mean life or death.

On Monday, Apple announced Safety Check, a new setting designed to aid domestic violence victims. Coming this fall with iOS 16, it’s meant to help someone quickly cut ties with a potential abuser. Safety Check does this by helping a person quickly see with whom they’re automatically sharing sensitive info like their location or photos. But in an emergency, it also lets a person simply and quickly disable access and information sharing for every device other than the one in their hands.

Notably, the feature also includes a prominent button at the top right of the screen, labeled Quick Exit. As the name implies, it’s designed to help a potential victim quickly hide that they’ve been looking at Safety Check, in case their abuser doesn’t allow them privacy. If the abuser reopens the Settings app, where Safety Check is kept, it’ll start at the default general settings page, effectively covering up the victim’s tracks.

<div class="videoPlayer " data-component="videoPlayer" data-video-player-options='{"config":{"policies":{"default":11417438},"tracking":{"can_partner_id":"canPartnerID","comscore_id":"3000085","comscore_home":"3000085","comscore_how_to":"3000078","comscore_news":"3000078","comscore_reviews":"3000087","comscore_videos":"3000088","comscore_sense_id":"cnetvideo","comscore_sense_home":"cnethome","comscore_sense_how_to":"cnethowto","comscore_sense_news":"cnetnews","comscore_sense_reviews":"cnetreviews","comscore_sense_videos":"cnetvideo","nielsen_cid":"us-200330","nielsen_vcid":"c07","nielsen_vcid_reviews":"c05","nielsen_vcid_home":"c07","nielsen_vcid_news":"c08","nielsen_vcid_how_to":"c09","nielsen_vcid_videos":"c20"},"uvpConfig":{"mpx_account":"kYEXFC"}},"playlist":[{"id":"16d6dd56-1ae5-4a23-8ce3-a795ba90dd88","title":"Watch Everything Announced at Apple\u0027s WWDC 2022 Event","description":"At WWDC 2022, Apple revealed a ton of new software including iOS 16, iPad OS 16 and Mac OS Ventura. The company also introduced a new M2 processor for a redesigned MacBook Air and MacBook Pro.","slug":"watch-everything-announced-at-apples-wwdc-2022-event","chapters":{"data":[],"paging":{"total":0,"limit":15,"offset":0}},"datePublished":"2022-06-06 19:35:44","duration":877,"mpxRefId":null,"ratingVChip":"TV-14","primaryTopic":{"id":"752de8fe-6106-4537-bcb9-7a933ad51d48"},"author":{"id":"","firstName":"","lastName":""},"primaryCollection":{"id":"040fa0bc-bf08-43dc-ac3d-ee7869a9fc85","title":"CNET News website

"Many people share passwords and access to their devices with a partner," Katie Skinner, a privacy engineering manager at Apple, said at the company’s WWDC event Monday. "However, in abusive relationships, this can threaten personal safety and make it harder for victims to get help."

Safety Check, and the careful way in which it was designed, is part of a larger effort among tech companies to stop their products from being used as tools of abuse. It’s also the latest sign of Apple’s willingness to wade into building technology that tackles sensitive topics. And though the company says it’s earnest in its approach, it’s drawn criticism for some of its moves. Last year, the company announced efforts to detect child exploitation imagery on some of its phones, tablets and computers, a move that critics worried could undermine user privacy.

Still, victim advocates say Apple’s one of the few large companies publicly working on these issues. While many tech giants, including Microsoft, Facebook, Twitter and Google, have built and implemented systems to police content and behavior on their respective sites, they’ve struggled to build tools that stop abuse as it’s happening.

Unfortunately, the abuse has gotten worse. A November 2020 survey of practitioners who work on domestic violence found that 99.3% had clients who had experienced technology-facilitated abuse, according to Wesnet, the Australian women’s services network that produced the report with Curtin University. Moreover, the groups found that reports of tracking of victims had jumped more than 244% since the survey was last conducted in 2015.

Amid all this, tech companies like Apple have increasingly worked with victim organizations to understand how their tools can be both misused by a perpetrator and helpful to a potential victim. The result is features, like Safety Check’s Quick Exit button, that advocates say are a sign Apple’s building in what they call a "trauma-informed" way.

"Most people cannot appreciate the sense of urgency" many victims have, said Renee Williams, executive director of the National Center for Victims of Crime. "Apple’s been very receptive."

Multiple lock screens displayed on and next to an iPhone.

Apple says there are more than a billion iPhones being used around the world.

Apple/Screenshot by CNET

Tough issues

Some of the tech industry’s biggest wins have come from identifying abusers. In 2009, Microsoft helped create image recognition software called PhotoDNA, which is now used by social networks and websites around the world to identify child exploitation imagery when it’s uploaded to the internet. Similar programs have since been built to help identify known extremist content, livestreams of violence and other material that large tech companies try to keep off their platforms.

As tech has become more pervasive in our lives, these efforts have taken on increased importance. And unlike adding a new video technology or increasing a computer’s performance, these social issues don’t always have clear answers.

In 2021, Apple made one of its first public moves into victim-focused technology when it announced new features for its iMessage service designed to analyze messages sent to users marked as children and detect sexually explicit images. If its system flagged an image, it would blur the attachment and warn the person receiving it, to make sure they wanted to see it. Apple’s service would also point children to resources that could help them if they’re being victimized through the service.

At the time, Apple said it built the message-scanning technology with privacy in mind. But activists worried Apple’s system was also designed to alert an identified parent if their child chose to view the flagged image anyway. That, some critics said, could incite abuse from a potentially dangerous parent.

Apple’s additional efforts to detect potential child abuse images that might be synchronized to its photo service through iPhones, iPads and Mac computers were criticized by security experts who worried the system could be expanded into a broader surveillance tool.

Still, victim advocates acknowledged that Apple was one of the few device companies working on tools meant to support victims of potential abuse as it’s happening. Microsoft and Google didn’t respond to requests for comment about whether they plan to introduce features akin to Safety Check to help victims who might be using Windows and Xbox software for PCs and video game consoles, or Android mobile software for phones and tablets.

iPhones showing child safety messages

Apple introduced a system for child safety in iMessages last year.

Apple

Learning, but much to do

The tech industry has been working with victims’ organizations for over a decade, seeking ways to adopt safety mindsets within its products. Advocates say that in the past few years in particular, many of the tech giants have built dedicated safety teams, staffed in some cases with people from the nonprofit world who worked on the issues the tech industry was taking on.

Apple started consulting with some victims’ rights advocates about Safety Check last year, asking for input and ideas on how best to build the system.

"We are starting to see recognition that there is a corporate or social responsibility to ensure your apps can’t be too simply misused," said Karen Bentley, chief executive of Wesnet, an Australian network of domestic violence services. And she said that’s particularly tough because, as technology has evolved to become easier to use, so has its potential to become a tool of abuse.

That’s part of why she says Apple’s Safety Check is "brilliant," because it can quickly and easily separate someone’s digital information and communications from their abuser. "If you’re experiencing domestic violence, you’re likely to be experiencing some of that violence in technology," she said.

Though Safety Check has moved from an idea into test software and will be made widely available with the iOS 16 suite of software updates for iPhones and iPads in the fall, Apple said it plans more work on these issues. 

Unfortunately, Safety Check doesn’t extend to the ways abusers might track people using devices they don’t own, such as if someone slips one of Apple’s $29 AirTag trackers into a victim’s coat pocket or onto their car. Safety Check also isn’t designed for phones set up under child accounts, which are for people under the age of 13, though the feature’s still in testing and could change.

"Unfortunately, abusers are persistent and are constantly updating their tactics," said Erica Olsen, project director for Safety Net, a program from the National Network to End Domestic Violence that trains companies, community groups and governments on how to improve victim safety and privacy. "There will always be more to do in this space."

Apple said it’s expanding training for its employees who interact with customers, including salespeople in its stores, so they know how features like Safety Check work and can explain them when appropriate. The company has also created guidelines for its support staff to help identify and assist potential victims.

In one instance, for example, AppleCare teams are being taught to listen for when an iPhone owner calls expressing concern that they don’t have control over their own device or their own iCloud account. In another, AppleCare can guide someone on how to remove their Apple ID from a family group.

Apple also updated its personal safety user guide in January to instruct people on how to reset and regain control of an iCloud account that might be compromised or used as a tool for abuse.

Craig Federighi, Apple’s head of software engineering, said the company will continue expanding its personal safety features as part of its larger commitment to its customers. "Protecting you and your privacy is, and will always be, the center of what we do," he said.