Research Exposes Flaws

Amazon Deploys Additional Checks for Alexa Skills Certification

Amazon began conducting additional checks when certifying Alexa’s voice-control capabilities, or skills, a spokesperson said in a statement Tuesday. In a presentation before the FTC that day, Clemson University Graduate Research Assistant Christin Wilson described Alexa’s certification process as “improper and disorganized.” The agency held PrivacyCon, a conference with researchers meant to help identify consumer risks and better target enforcement efforts, Consumer Protection Bureau Director Andrew Smith said.

The research Wilson co-wrote examined 132 skills that “intentionally violate Alexa’s policy requirements,” all of which “passed the certification.” Researchers reported “31 problematic skills with policy violations and 26 broken skills (out of 362) under the kids category.”

“Customer trust is our top priority and we take violations of our Alexa Skill policies seriously,” Amazon said. “We conduct security and policy reviews as part of skill certification and have systems in place to continually monitor live skills for potentially malicious behavior or policy violations.” Offending skills are blocked and deactivated, the spokesperson emailed. “We are constantly improving these mechanisms and have put additional certification checks in place to further protect our customers.”

Smith noted the agency’s focus on mobile health apps, discussing the use of contact tracing apps during the pandemic. Consumers are increasingly using mobile health apps, and tracing could add a “whole new dimension to that trend,” he said. Expanded access to health information could have an enormous benefit to consumers, but data flow increases the opportunity for data compromise, he said: The FTC “won’t hesitate to take action when companies misrepresent what they’re doing with consumers’ health information or otherwise put health data at undue risk.”

The commission has received more than 131,419 consumer complaints involving COVID-19, Smith told the Senate Consumer Protection Subcommittee Tuesday. He noted the agency provided privacy and online security tips during the pandemic. Smith repeated a request from Chairman Joe Simons that Congress enact privacy legislation granting the agency “civil penalty authority, targeted Administrative Procedure Act rulemaking authority, and jurisdiction over nonprofits and common carriers.”

Significant privacy risks are associated with mental health apps, said a recent report from Harvard Medical School Research Training Fellow John Torous and Beth Israel Deaconess Medical Center Research Assistant Sarah Lagan. Findings showed about 30% of the top-returned apps in a search for “bipolar” lacked a privacy policy, and 33% of the apps said they share personal information with third parties. People should be educated on what data is shared and why it’s valuable, and they should know what they are giving up, Torous said.

Harvard Medical School professor Kenneth Mandl wants comprehensive privacy legislation. “Transparency in apps’ sharing policies with regards to research use and monetization can empower patients to decide to share more data with good actors and avoid those apps unwilling to meaningfully disclose their practices,” he and co-authors wrote in a recent report.

Developers should disclose data sharing practices and allow users to choose what data are shared, said University of Toronto assistant professor Quinn Grundy. Mandl highlighted some strengths of the Apple Health app, saying it was built with a rigorous, privacy-first perspective. Apple doesn’t mine the data, he said; the data is encrypted and remains with the consumer.