Nasty replies on Twitter will require a little more thought to send.
The tech company said Wednesday it was releasing a feature that automatically detects "mean" replies on its service and prompts people to review the replies before sending them.
"Want to review this before Tweeting?" the prompt asks in a sample provided by the San Francisco-based company.
Twitter users will have three options in response: tweet as is, edit or delete.
The prompts are part of wider efforts at Twitter and other social media companies to rethink how their products are designed and whether those designs have built-in incentives that encourage anger, harassment, jealousy or other bad behavior. Facebook-owned Instagram, for example, is testing ways to hide like counts on its service.
Twitter began testing the prompts a year ago and said it was now rolling them out widely, first to English-speaking users.
In the tests, Twitter found that 34 percent of people who were prompted revised their initial reply or decided not to reply at all. After being prompted once, people composed on average 11 percent fewer offensive replies in the future, according to the company.
The tests helped to train Twitter's algorithms to better detect when a seemingly mean tweet is just sarcasm or friendly banter, the company said.