Social Junk
What the hell is going on in Instagram comments?
Between horny spam bots, get-rich-quick schemes, and misinformation, the social media giant is at war with trash — and the trash is winning.
Whenever Travis Scott or Kylie Jenner share a photo with their millions of followers on Instagram, it’s almost inevitable that the first comment on it will be something along the lines of “Watch my squirting videos I just uploaded and prepare a tissue to clean your sperm later.”
It’s the kind of graphic, explicit comment that has become ubiquitous on the Facebook-owned app, particularly on posts from celebrities and other accounts with large audiences.
Behind these comments are bot networks spreading spam meant to get people to sign up for dubious porn services or pyramid schemes that promise to make them rich. They also run scams offering to secure that coveted verified blue check mark in exchange for money or, in less harmful cases, simply spam follow-for-follow requests in hopes of gaining more followers for certain meme accounts.
Cat and mouse
This phenomenon isn’t completely new: porn and scam bots have been flooding Instagram comments for years. But despite Instagram’s repeated pledges that it is working to mitigate the issue, this type of spam remains rampant on the platform — and it only seems to be getting worse. If anything, as the game of cat and mouse unfolds between Instagram and the bot networks, it’s the spammers who seem to be getting smarter about avoiding the app’s automated safety systems. And they’ve expanded to Twitter, too.
At first, these bot accounts would try to lure people into following them by leaving comments such as “We gonna ignore the fact that I've GOT A HUGE BOOTY?" or "DON'T LOOK at my STORY, if you don't want to M A S T U R B A T E !” on high-profile accounts. That includes not just Travis Scott or Kylie Jenner, but also the comments sections of Chrissy Teigen, LeBron James, Kim Kardashian, and basically any celebrity you can think of, as well as media accounts like ESPN or SkySports.
Just as I’m writing this, for instance, Sneakernews (which has over 9 million followers) posted a new picture on its Instagram, and one of the first comments says “you will be happy with me look at my story.” Go to the profile of the person who left that comment and it’s filled with pictures of a half-naked woman, whose bio reads: “ready to serve sex chat | sex video calls | walk together and drink together | have sex.” That’s followed by a “check directly on my link,” which redirects you to dubious websites like “Fuckbuddy” and “Livecam Masturbie” that eventually ask you to sign up for an account and enter personal information, including an email address and credit card number.
While this behavior is common among these porn bots, they have changed their strategy over time. Some now use private profiles instead of public ones, or leave less racy, more basic comments like “WOW,” as they try to find new ways to evade Instagram’s takedown systems. Others have moved from offering sex services to financial ones. For example, in one of the more than a dozen spam accounts Input discovered and reviewed, a comment from what appeared to be a normal user read: “Success is not final, failure is not fatal it’s the courage to continue that counts. I earn $7500 in 7 days with $1500 continuously in every of my trades, contact Mr Walker and have a change of story for a better life @lachlanwalker_.”
There are estimates that there could be 150 million fake accounts on the app
Interestingly enough, the account tagged in that comment belongs to someone claiming to be an “Options Analyst” who promises to help people “earn up to $10,000 weekly simply by investing.” Another comment followed the exact same pattern, but with a Bitcoin twist: “I never believed Bitcoin and binary options was real until i was introduced to: @fritzparkington i started with 1000 now I’m making 9100 every week.” Like the other account tagged in a spam comment, “Fritz Parkington” claims to be an “ACCOUNT MANAGER” who offers “GUARANTEED PROFIT ON TRADES.”
Whether it’s porn or pyramid schemes these accounts are trying to sell, this inauthentic practice keeps growing more prevalent on Instagram, and it’s unclear how many of these bots are out there or who is behind the networks. There are estimates that there could be as many as 150 million fake accounts on the app, according to analytics agency Instascreener — which isn’t completely far-fetched when you consider that Instagram has more than 1 billion monthly active users.
Can the bots be stopped?
“It's important to us that the interactions people have on Instagram are genuine, and we're working hard to keep the community free from spammy behavior,” a Facebook spokesperson told Input. “Services that offer to boost an account's popularity via inauthentic likes, comments and followers aren't allowed, and we're developing technology to remove this activity from Instagram.” The company also said it removed a couple of the accounts we reported for being fake and violating its Community Guidelines and Terms of Use, including one that said “WOW!!” and was the top and most-liked comment on a recent post from Travis Scott.
To Instagram’s credit, while many of the bot comments live long enough to garner hundreds or thousands of likes from actual humans who get a kick out of them, its systems do eventually find and remove the spam content. That said, the problem is so widespread that it has become a meme of its own: An Instagram account by the name of Bot Police, which has nearly 400,000 followers, was created solely for the purpose of helping report “those fools to take em down and take em out.” Other ordinary people, meanwhile, will mock the bots by leaving comments like “These bots faster than Travis’s [Scott] Bugatti” on celebrity pages.
Facebook says reducing inauthentic activity on Instagram is a priority, noting that it will keep rolling out safety measures to reduce and stop fake accounts, using a mix of artificial intelligence and human moderators. The company added that, as its tools continue to remove inauthentic likes, follows, and comments from accounts relying on third-party apps to boost their reach, these spam methods will become less effective on Instagram. Facebook said it will take some time to filter them out — and it may not be possible to completely get rid of them — but it said it’s working hard in this area and is committed for the long term.
Right now, though, Facebook has another, more serious matter on its hands: reducing the spread of misinformation around the novel coronavirus (COVID-19) on Instagram. Let's just hope whatever algorithms are driving the porn and spam bots don't turn their attention to fake news, because, not unlike America and the coronavirus at this moment, on Instagram the worst may be ahead of us.