A woman using a smart home assistant (Image: RossHelen/Shutterstock)
Freely available software that can mimic a specific individual’s voice produces results that can fool people and voice-activated tools such as smart home assistants.
Security researchers are increasingly concerned by deepfake software, which uses artificial intelligence to alter videos or photographs to map one person’s face onto another.
Emily Wenger at the University of Chicago and her colleagues wanted to investigate audio versions of these tools, which use a sample of a person's voice to generate realistic speech, after reading about …