Twitter has rolled out improved prompts on iOS and Android that will encourage users to pause and reconsider a potentially harmful or offensive reply – such as insults, strong language, or hateful remarks – before tweeting it.
Specifically, the upgraded feature is better at spotting “strong language” and now takes into account your relationship with the person you’re replying to.
“For example, if two accounts follow and reply to each other often, there’s a higher likelihood that they have a better understanding of preferred tone of communication,” Anita Butler and Alberto Parrella from Twitter said in a joint statement on Wednesday.
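Twitter has not published the details of this check. Purely as an illustration of the kind of relationship signal described above, a toy heuristic might look like the sketch below; the function name, data fields, and threshold are all hypothetical, not Twitter’s actual implementation.

from dataclasses import dataclass

@dataclass
class Interaction:
    mutual_follow: bool   # do the two accounts follow each other?
    recent_replies: int   # replies exchanged between them recently

def likely_established_rapport(history: Interaction, reply_threshold: int = 5) -> bool:
    """Return True if the pair plausibly understands each other's preferred tone,
    in which case a 'reconsider your reply' prompt might be suppressed."""
    return history.mutual_follow and history.recent_replies >= reply_threshold

# Frequent back-and-forth between mutuals -> prompt likely suppressed.
print(likely_established_rapport(Interaction(mutual_follow=True, recent_replies=12)))   # True
# Strangers with little interaction -> prompt more likely to appear.
print(likely_established_rapport(Interaction(mutual_follow=False, recent_replies=2)))   # False

In practice such a signal would be one of several inputs, alongside the language of the reply itself, rather than a standalone rule.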
In 2020, Twitter first tested prompts that encouraged people to pause and reconsider a potentially harmful or offensive reply before they hit send.
“Starting today, we’re rolling these improved prompts out across iOS and Android, starting with accounts that have enabled English-language settings,” the company said.
Early tests revealed that, when prompted, 34 per cent of people revised their initial reply or decided not to send it at all.
“After being prompted once, people composed, on average, 11 per cent fewer offensive replies in the future,” Twitter said.
If prompted, people were less likely to receive offensive and harmful replies back.
Since the early tests, Twitter said it has refined the systems that decide when and how to send these reminders, including the relationship and language considerations described above.
Twitter said it will continue to explore how prompts – such as reply prompts and article prompts – and other forms of intervention can encourage healthier conversations on Twitter.