Instagram has trained AI to warn you about offensive captions

Instagram has announced that, starting today, it will warn users whenever they are about to post a ‘potentially offensive’ caption. The AI has been trained to identify potentially offensive words and flag the caption to the user, stating that it looks “similar to others that have been reported.”

According to Instagram, this warning may give users a chance to pause and reconsider what they were about to post. It appears as a pop-up screen that also offers options to edit the caption or learn why it may be considered inappropriate. Users can still post the caption as is if they feel it is appropriate.

With this alert, Instagram aims to curb cyberbullying, which is common across social media platforms such as Facebook, Instagram, Twitter, and YouTube.

Instagram earlier rolled out a similar feature for comments, where the AI warns users about a potentially offensive comment and offers them a chance to rephrase it. Another feature, which lets users ‘shadowban’ their bullies, was introduced in October.

The results of these features have been promising, and the nudges have made users reconsider their words when given the chance, Instagram claimed in a blog post. Earlier, the company announced a feature that restricts users under the age of 18 from viewing posts in which influencers promote plastic surgery and weight loss products.

Published on: Tuesday, December 17, 2019, 08:23 PM IST