<p><strong>New Delhi: </strong>Apple Inc. has revealed a new tool that will scan U.S. iPhones for images of child sexual abuse. While child protection groups have welcomed the technology, many security researchers have warned that it could be misused.</p> <p>Apple also plans to scan its users’ encrypted messages for sexually explicit content as an additional child safety measure, a move that has alarmed privacy advocates.</p> <p>The tool, called "neuralMatch", detects known images of child sexual abuse by scanning photos before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human; if it is confirmed to be child pornography, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.</p> <p>The system has, however, raised privacy concerns. Matthew Green, a top cryptography researcher at Johns Hopkins University, said that innocent people could be framed by being sent images designed to trigger the system, fool Apple’s algorithm, and alert law enforcement. He said researchers have been able to trick such systems fairly easily.</p> <p>Tech companies such as Microsoft, Google, and Facebook have for years shared digital fingerprints of known child sexual abuse images. Apple has used those fingerprints to scan user files stored in its iCloud service, which is not as securely encrypted as its on-device data, for child pornography. Apple, however, has been under pressure from authorities for access to information to help with crimes such as terrorism and child sexual exploitation, and it was one of the major companies to embrace “end-to-end” encryption.</p> <p>Researchers are also concerned about government surveillance, especially of dissidents or protesters. 
The Washington-based nonprofit Center for Democracy and Technology called on Apple to abandon the changes, which it said effectively destroy the company’s guarantee of “end-to-end encryption.” Scanning messages for sexually explicit content on phones or computers effectively breaks that security, it was quoted as saying by AP. </p> <p>Child rights activists and protection groups, however, have praised the tool.</p> <p>"Apple’s expanded protection for children is a game-changer,” John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement, according to an Associated Press report. “With so many people using Apple products, these new safety measures have lifesaving potential for children.”</p> <p>Similarly, Julia Cordua, the CEO of Thorn, a nonprofit founded by actors Demi Moore and Ashton Kutcher that uses technology to help protect children from sexual abuse, said Apple’s technology balances “the need for privacy with digital safety for children.”</p>
Friday, August 6, 2021
Apple To Scan Images Of Sexual Abuse; Welcomed By Child Protection Groups But Raises Concern Of Misuse
About Waqas Ahmad Javed
Tags:
World News