Instagram is reported to have been asking some of its users to verify they are human by providing a video selfie showing multiple angles of their face. The social media platform, owned by Meta, has long struggled with bot accounts, which typically leave spam comments, harass users, and can be used to artificially inflate like and follower counts.
In a follow-up tweet, Instagram confirmed it is doing this to combat the high number of bots currently on its platform. XDA Developers first reported Instagram testing the feature last year, but technical problems halted the rollout at the time. Instagram, however, claims it will not be collecting any biometric data.
A flagged user sees the prompt, “We need a short video of you turning your head in different directions. This helps us confirm that you’re a real person and confirm your identity.” After recording the short video, a message asks the user to submit it: “Thanks for completing these steps. Submit this video to help us confirm that you’re a real person and confirm your identity.”
Despite the seemingly straightforward process, various users on Twitter are reporting glitches, with some unable to record the video and others getting errors when submitting their video selfies.
Other users have been trying to create shady accounts to trick Instagram’s algorithm into asking them for a verification video selfie, with varying success. Instagram’s examples of suspicious behaviour include, but are not limited to, following a large number of accounts or commenting on a large number of posts within a short period of time.
“One of the ways we use video selfies is when we think an account could be a bot. For example, if the account likes lots of posts or follows a ton of accounts in a matter of seconds, video selfies help us determine if there’s a real person behind the account or not,” Instagram Comms wrote on Twitter.
Despite Meta making it clear that it is not using facial recognition for the verification process, the move has rightly caused some jitters online, as accounts belonging to young children have also been affected, meaning Instagram may have requested footage of children as part of verification. Instagram and Meta are, of course, no strangers to controversy; the company says the videos will be deleted 30 days after submission.
Not long ago, the platform was in the crosshairs of critics over plans to develop an Instagram-like app targeting children under the age of 13. The subsequent backlash led it to shelve the project, although it has not been ruled out entirely and could be resurrected in the future with better marketing.
At the end of the day, reducing the number of bots on the platform will require an automated process, as Instagram currently relies on its employees to manually review the suspected accounts. The sheer number of bot accounts makes that undertaking a fool’s errand unless a better solution emerges.