Facebook has rolled out a new safety measure covering user-uploaded and shared photos and videos that show graphic violence, according to the Guardian.
The measure comprises warning messages that pop up before content depicting violence is viewed. The social networking site started putting the measure in place in December.
Videos of such nature previously would auto-play on Facebook. With the new precaution, users need to click on the videos before they are played.
Another feature of the measure is that graphic videos and images are hidden from users who have identified themselves as under 18 years of age, according to the BBC.
Users, however, can easily find their way around the age restriction by declaring a false birth year, the Guardian notes.
Explicit videos that are not graphically violent do not carry the warning, according to the Guardian report. Moreover, only videos that other users have flagged as violent are subject to the measure.
Meanwhile, content that is "shared for sadistic pleasure or to celebrate or glorify violence," as Facebook's terms and conditions put it, is automatically removed from the website.
Facebook said in a statement that it counts on its users to practice discretion in uploading and sharing certain photos or videos.
"When people share things on Facebook, we expect that they will share it responsibly, including choosing who will see that content," a Facebook spokesperson said.
"We also ask that people warn their audience about what they are about to see if it includes graphic violence. In instances when people report graphic content to us that should include warnings or is not appropriate for people under the age of 18, we may add a warning for adults and prevent young people from viewing the content," the spokesperson continued.
Among the first videos covered by the new measure was footage of police officer Ahmed Merabet being shot dead in the Charlie Hebdo attacks last week in Paris, according to Australian Muslim Times.