Scammer Uses AI to Clone Voice of Queensland Premier Steven Miles in Bitcoin Investment Scam

When advertising executive Dee Madigan received a private message on social media from Queensland Premier Steven Miles, her curiosity was piqued. As a long-time friend of Mr Miles, however, she immediately knew she wasn't actually conversing with the prominent political leader.

“I was stuck at home with Covid and bored, so I decided to have a little fun,” she recounted to news.com.au.

The message came from a fake profile on X, formerly known as Twitter. It was clear to Ms. Madigan that she was dealing with a scammer, who quickly steered the conversation towards financial investments.

“It was all pretty standard stuff,” Ms. Madigan explained. “I said I was interested in investing but pretended to have trouble setting up a trading account, just to annoy the guy. I wanted to see how far he would go.”

In an attempt to draw out the scammer, she suggested they jump on a phone call so he could walk her through the process. “I figured that would be the end of it,” she said, expecting the scammer to drop the act once a direct conversation was required.

However, the scam took an unexpected turn. Instead of abandoning the charade, the scammer agreed to the call, raising Ms. Madigan’s suspicions further. When the call came through, she was shocked to hear what sounded exactly like the voice of Steven Miles on the other end. Unbeknownst to her, the scammer was using advanced AI voice cloning technology to mimic the Premier’s voice convincingly.

The realistic imitation underscored the lengths to which scammers will go to deceive their targets, and it highlights the growing threat posed by AI-driven scams, which are becoming increasingly difficult to detect.

Ms. Madigan's experience is a cautionary tale about the evolving tactics of cybercriminals. It underlines the need for heightened vigilance and skepticism when dealing with unsolicited investment opportunities, especially those that invoke familiar public figures. The use of AI to clone voices adds a new layer of complexity to these scams, making it all the more important to verify the identity of anyone requesting personal or financial information.