OpenAI, the maker of the Microsoft-backed consumer-facing artificial intelligence (AI) service ChatGPT, may have scored something of an own-goal with the unveiling of Voice Engine, billed as “a model for creating custom voices”.

While OpenAI’s blog on Friday highlights legitimate uses of voice cloning, sometimes referred to as ‘deepfake voice’, such as providing reading assistance to non-readers and children, its widespread availability could soon metamorphose into a cybersecurity nightmare.

Deepfake voice and video software is already being used by cybercriminals to mimic the voices of senior executives and commit financial fraud and other crimes. But the widespread availability and marketing of deepfake voice software are now set to make cybercrime a virtual cottage industry where any number can play, opening the floodgates to a whole new generation of cybercriminals, terrorists, pranksters, and disgruntled employees.