While talk of integrating artificial intelligence with the cryptocurrency industry has mostly focused on how AI can help the industry combat scams, experts are overlooking the fact that it could have the complete opposite effect. In fact, Meta recently warned that hackers appear to be taking advantage of OpenAI’s ChatGPT in attempts to gain entry into users’ Facebook accounts.
Meta reported blocking more than 1,000 malicious links masquerading as ChatGPT extensions in March and April alone, and went as far as calling ChatGPT “the new crypto” in the eyes of scammers. In addition, searching the keywords “ChatGPT” or “OpenAI” on DEXTools, an interactive crypto trading platform that tracks a large number of tokens, reveals more than 700 token trading pairs mentioning one of the two keywords. This shows that scammers are exploiting the hype around the AI tool to create tokens, even though OpenAI has announced no official entry into the blockchain world.
Social media platforms have become popular channels for promoting new scam coins online. Scammers take advantage of the widespread reach and influence of these platforms to generate a significant following within a short period. By leveraging AI-powered tools, they can amplify their reach even further and manufacture a seemingly loyal fan base of thousands of fake accounts. These fake accounts and automated interactions can then be used to lend an illusion of credibility and popularity to their scam projects.
Related: Think AI tools aren’t harvesting your data? Guess again
Much of crypto runs on social proof-of-work: the assumption that if a cryptocurrency or project appears popular and has a large following, there must be a good reason for that popularity. Investors and new buyers tend to trust projects with larger and more loyal followings online, assuming that others have done sufficient research before investing. However, the use of AI can challenge this assumption and undermine social proof-of-work.
Now, just because something has thousands of likes and genuine-looking comments does not necessarily mean it is a legitimate project. This is just one attack vector, and AI will give rise to many others. One such example is the “pig butchering” scam, in which an AI instance can spend several days befriending someone, usually an elderly or vulnerable person, only to scam them in the end. The advancement of AI technologies has enabled scammers to automate and scale fraudulent activities, potentially targeting vulnerable individuals in the cryptosphere.
Scammers may use AI-driven chatbots or virtual assistants to engage with individuals, provide investment advice, promote fake tokens and initial coin offerings, or offer high-yield investment opportunities. Such AI scams are especially dangerous because they can mimic human conversation almost perfectly. In addition, by leveraging social media platforms and AI-generated content, scammers can orchestrate elaborate pump-and-dump schemes, artificially inflating the value of tokens and selling off their holdings for significant profits, leaving numerous investors with losses.
Related: Don’t be surprised if AI tries to sabotage your crypto
Investors have long been warned to look out for deepfake crypto scams, which use AI technologies to create very realistic online content that swaps faces in videos and photos or even alters audio content to make it seem as if influencers or other well-known personalities are endorsing scam projects.
One very prominent deepfake that affected the crypto industry was a video of former FTX CEO Sam Bankman-Fried directing users toward a malicious website promising to double their crypto.
In March 2023, the so-called AI project Harvest Keeper scammed its users out of around $1 million. Around the same time, projects calling themselves “CryptoGPT” began to emerge on Twitter.
However, on a more positive note, AI also has the potential to automate the tedious, monotonous aspects of crypto development, making it a great tool for blockchain experts. Tasks that every project requires, such as setting up a Solidity development environment or generating boilerplate code, become far easier with AI assistance. Eventually, the barrier to entry will be lowered significantly, and the crypto industry will be less about development skills and more about whether one’s idea has genuine utility.
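As a rough illustration of what that AI-assisted boilerplate generation can look like, the following Python sketch asks a large language model to draft an ERC-20 skeleton. It assumes the OpenAI Python SDK is installed and an API key is configured in the environment; the model name and prompt are illustrative rather than prescriptive, and any generated Solidity would still need human review, testing and auditing.

```python
# A minimal sketch of AI-assisted boilerplate generation, assuming the OpenAI
# Python SDK (openai>=1.0) is installed and OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = (
    "Generate a minimal ERC-20 token contract in Solidity 0.8.x "
    "with a fixed total supply and inline comments."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name; use whatever model is available
    messages=[{"role": "user", "content": prompt}],
)

# The generated Solidity is a starting point only; it should never be
# deployed without review and auditing.
print(response.choices[0].message.content)
```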
In some niche cases, AI will have a surprising way of democratizing processes that we currently assume are reserved for an elite class, in this case, well-studied senior developers. And with everyone having access to advanced development tools and launchpads in crypto, the sky’s the limit. Yet with AI also making it easier for projects to scam people, users must exercise caution and due diligence before investing, such as watching out for suspicious URLs and never putting money into something that has sprung up seemingly out of nowhere.
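Since the piece closes on due diligence, here is a toy Python sketch of one such check: comparing a link’s host against a short list of well-known domains to spot lookalikes. The domain list, similarity threshold and example URLs are hypothetical illustrations, and string similarity alone is nowhere near sufficient to judge whether a site is safe.

```python
# Toy lookalike-domain check using only the Python standard library.
from difflib import SequenceMatcher
from urllib.parse import urlparse

# Illustrative list of legitimate domains; a real check would use a far larger set.
KNOWN_DOMAINS = ["openai.com", "uniswap.org", "coinbase.com"]

def closest_known_domain(url: str) -> tuple[str, float]:
    """Return the known domain most similar to the URL's host, with its similarity score."""
    host = urlparse(url).netloc.lower().removeprefix("www.")
    scored = [(domain, SequenceMatcher(None, host, domain).ratio()) for domain in KNOWN_DOMAINS]
    return max(scored, key=lambda pair: pair[1])

for url in ["https://open-ai.com/airdrop", "https://uniswaps.org"]:
    domain, score = closest_known_domain(url)
    if 0.6 < score < 1.0:  # similar to a known brand but not identical: treat with suspicion
        print(f"{url} resembles {domain} (similarity {score:.2f}), proceed with caution")
```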
This article is for general information purposes and is not intended to be and should not be taken as legal or investment advice. The views, thoughts and opinions expressed here are the author’s alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.