Privacy Concerns Arise Over Collection of Voice Data for Profit


The rise of voice-assisted products in homes and workplaces has led to a wave of innovation in the private sector, streamlining order-taking at fast-food restaurants, replacing handheld devices typically used by warehouse workers, and improving smart home devices that adapt to a user’s vocal tics, according to experts and privacy advocates who spoke with ABC News.

But the collection of voice data also fuels targeted marketing based on personal information harvested from the recordings and risks data breaches that could put a person’s voice in the hands of cybercriminals looking to impersonate them, they added.

“This has become a real problem as more and more people use voice-activated devices like Alexa and Siri,” Marc Rotenberg, founder and executive director of the nonprofit Center for AI and Digital Policy, told ABC News. “There’s a ticking time bomb with the collection of voice records.”

Indeed, opponents fear that voice-assisted products could collect more private data than users realize, allowing companies to profit from statements made at home or at work through carefully crafted ads or the sale of personal information.

A consumer’s voice could be used to reveal a wealth of knowledge about them, including their height, weight, ethnicity, personality traits, and possible health issues.

Last June, TikTok updated its privacy policy, expanding the data the company collects to include voice recordings.

Companies that collect voice data could use that information to sell products directly to consumers, or pass the data on to advertisers, said Joseph Turow, a professor at the University of Pennsylvania.

“As we move into a world where people use voice rather than typing in their daily lives, marketers want to know: What can I get from this person’s voice?” he said.

Rotenberg, of the Center for AI and Digital Policy, warned that collecting audio data could also allow malicious actors to access a person’s voice, allowing them to commit fraud or other crimes through identity theft.

Last October, Forbes reported that a criminal using this tactic, known as “deepfake audio,” convinced a Hong Kong-based bank to send $35 million to someone the bank believed was a corporate lawyer.

Apple and Google did not immediately respond to a request for comment on the potential theft and abuse of voice data.

Despite the growing use of voice-assist technology, laws protecting the collection of audio data remain limited, according to lawyers and advocates who spoke with ABC News.

The United States has no federal law governing such data, leaving regulation primarily at the state level. So far, four states have passed laws related to voice data collection: California, Texas, Washington and Illinois.

Advocates argue there is an urgent need to limit voice data collection before voice-assisted products become even more widely adopted.
