The technology, called Fugatto (Foundational Generative Audio Transformer Opus 1), can modify voices and generate novel sounds and is targeted at producers of music, films and video games. However, the company doesn’t plan to publicly release it any time soon.
“Any generative technology always carries some risks, because people might use that to generate things that we would prefer they don’t. We need to be careful about that, which is why we don’t have immediate plans to release this,” said Bryan Catanzaro, vice president of applied deep learning research at Nvidia.
The advent of AI, although highly beneficial, has allowed fraudsters to create “deepfakes” – fake images, videos or audio that often involve celebrities or politicians, according to The Conversation. In one recent case, reported by ET in May, OpenAI was forced to apologise to actress Scarlett Johansson after using a voice that was hers or strikingly similar to it.
The fear of such technologies is not limited to celebrities and politicians. Earlier this year, ET reported on voiceover artists who were receiving random calls intended to “steal their voices”. The Association of Voice Artists (AVA) in India had reached out to the Federation of Indian Chambers of Commerce & Industry (Ficci) to present its concerns to the IT ministry and work out policies to prevent commercial exploitation.
Large players such as Meta face a similar dilemma. In May, ET reported that Meta believes generative AI deception is being held in check for now. “We’re not seeing generative AI being used in terribly sophisticated ways, but we know that these networks are going to keep evolving their tactics as this technology changes,” David Agranovich, Meta’s threat disruption policy director, had said.
Given these concerns, creators of such generative AI models, including Nvidia, are being careful to prevent abuse of the technology. Big players like OpenAI and Meta, too, have yet to release similar AI audio models to the public.