Facebook has rules that bar precisely these kinds of pictures, but they are generally enforced only when members complain, not through advance screening by the company. Photos come in by the millions; Facebook says its users share 30 billion pieces of content every month. Users also grant the company nearly unlimited rights to use that data any way it wants.
From an article about an EMT who posted a photo of a victim’s corpse on Facebook. The company removed it, obviously, but the photo presumably passed before the eyes of at least one screener during the deletion process.
How long can people stand to do this work? And do they get counseling? Same questions for the screeners at YouTube and Flickr. (And good God, 4chan.)