University of Massachusetts Researcher Develops App Safety Assessment Project
Computational Model Evaluates Customer Reviews for Child Safety Concerns
Brian Levine, a computer scientist at the University of Massachusetts Amherst, is frequently approached by his 14-year-old daughter with a common question: “Can I download this app?” To answer it, he sifts through hundreds of customer reviews in the App Store for mentions of harassment or child sexual abuse. Wondering why more resources weren’t available to help parents make such decisions quickly, Mr. Levine spent the past two years building a computational model that assesses customer reviews of social apps. With a team of researchers, he created the App Danger Project, a searchable website that offers clear guidance on the safety of social networking apps.
The project analyzes user reviews for mentions of sexual predators and provides safety assessments of apps that have drawn such complaints. It surfaces reviews that mention sexual abuse and filters out those that do not specifically raise child-safety concerns. The research team did not follow up with reviewers to verify their claims, but it examined each review closely to identify potential risks.
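The project has not published its model, but the description above suggests a two-part screen: find abuse-related language, then require an explicit child-safety context. The following is a minimal Python sketch of that general idea; the keyword lists, sample reviews, and substring matching are illustrative assumptions, not the researchers’ actual method.

    # Illustrative keyword screen in the spirit of the App Danger Project.
    # The term lists below are assumptions, not the project's real lists.
    ABUSE_KEYWORDS = {"predator", "grooming", "sexual abuse", "pedophile"}
    CHILD_CONTEXT = {"child", "kid", "minor", "daughter", "son", "teen"}

    def flag_review(text: str) -> bool:
        """Flag a review only if it mentions abuse AND a child-safety context."""
        lower = text.lower()
        mentions_abuse = any(term in lower for term in ABUSE_KEYWORDS)
        mentions_child = any(term in lower for term in CHILD_CONTEXT)
        return mentions_abuse and mentions_child

    # Invented sample reviews for demonstration.
    reviews = [
        "A predator messaged my daughter through this app.",  # both signals -> flagged
        "Too many ads. Terrible app.",                         # neither -> not flagged
    ]
    flagged = [r for r in reviews if flag_review(r)]
    print(flagged)  # ['A predator messaged my daughter through this app.']

The second condition mirrors the project’s stated exclusion of reviews that mention abuse but do not specifically address child safety.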
Increasingly, predators are exploiting apps and online platforms to solicit explicit images. Last year, law enforcement received around 7,000 reports of children and teenagers who were coerced into sending nude photos and then blackmailed; incidents of this “sextortion” more than doubled during the pandemic. Because Apple’s and Google’s app stores do not let parents search reviews by keyword, Mr. Levine believes the App Danger Project can complement existing services like Common Sense Media by identifying apps that are not doing enough to police their users. The website is free to access, but donations to the University of Massachusetts are encouraged to cover its costs.
Mr. Levine and a team of computer scientists examined more than 550 social networking apps distributed by Apple and Google, counting the reviews of each that mentioned child sexual abuse. They found that roughly a fifth of the apps had two or more complaints about such material, and that 81 offerings across the two stores had drawn seven or more such reviews. Their research builds on previous reports of apps facilitating unwanted sexual interactions, including a 2019 New York Times report on how predators used video games and social media platforms to hunt for victims.
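To illustrate the tallying step those figures imply, here is a small Python sketch that counts flagged reviews per app and applies the two-complaint and seven-complaint cutoffs mentioned above; the app names and counts are invented for demonstration.

    from collections import Counter

    # App names attached to reviews that a screen like flag_review() marked.
    # All names and counts here are made up.
    flagged_apps = ["ChatNowX"] * 8 + ["TeenMeetDemo"] * 3 + ["PuzzleFun"] * 1
    counts = Counter(flagged_apps)

    two_or_more = sorted(app for app, n in counts.items() if n >= 2)
    seven_or_more = sorted(app for app, n in counts.items() if n >= 7)
    print(two_or_more)    # ['ChatNowX', 'TeenMeetDemo']
    print(seven_or_more)  # ['ChatNowX']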
Apple and Google have a financial interest in distributing apps: the companies take up to 30% of app store sales. Three apps with multiple user reports of sexual abuse, Hoop, MeetMe and Whisper, together generated $30 million in sales last year. Yet the investigation conducted by Mr. Levine and his team suggests that such apps have not done enough to monitor user behavior. The researchers are urging Apple and Google to give parents more comprehensive information about the risks certain apps may pose and to move more aggressively against apps with a track record of abuse.
Both Apple and Google say they scan user reviews with their own computational models and investigate allegations of child sexual abuse, removing apps that violate their policies. Apps also carry age ratings, and parents can use software to restrict downloads. The companies have additionally built tools to help app developers detect child sexual abuse material. Still, Mr. Levine believes more can be done, and he questions why problematic apps remain available for download when the companies can identify potential problems.
The App Danger Project identified a significant number of reviews suggesting that Hoop, a social networking app, may not be safe for children; of 32,000 reviews since 2019, 176 included reports of sexual abuse. Hoop’s chief executive acknowledges that the original founders struggled with bots and malicious users but says the situation has improved under new management.
The Meet Group, which owns MeetMe, says it does not tolerate the abuse or exploitation of minors and uses artificial intelligence tools to identify predators and report them to law enforcement. Whisper did not provide a comment.
Sergeant Sean Pierce, who leads the San Jose Police Department’s task force on internet crimes against children, believes some app developers avoid investigating complaints about sextortion in order to minimize their legal liability; the law does not require them to report criminal activity unless they find it themselves. And because many of the apps that connect users for conversations are anonymous, law enforcement often cannot verify reported incidents, making user safety harder to ensure.
Although Apple and Google make hundreds of reports annually to the US clearinghouse for child sexual abuse, they do not specify whether any of those reports relate to apps. Whisper was named, alongside several other apps, as a primary tool used by Joshua Breckel to collect explicit images from victims aged 10 to 15; Breckel was sentenced to 35 years in prison for extortion and child pornography. Chris Hoell, a former federal prosecutor, believes that comprehensive evaluations of reviews, such as those conducted by the App Danger Project, can give parents a valuable tool for protecting their children from similar abuse.