Try uploading a nude picture to a private Facebook conversation, or type something highly problematic about terrorism or the drug trade, and see how that goes. Facebook is very much responsible for the activity it makes possible.
Alarm bells will ring and your account will get flagged for review. As it should be when serious offenses are concerned.
For those of you who still think that site owners are not responsible for what happens in their conversation system: times have changed. Over the last decade I have seen more than a few competing websites go down, get raided, or get prosecuted for what their members posted. Do you really think you can allow your members to commit very serious offenses on your website? If you run a site for kids, then it's your job to be proactive against pedophiles. If you run a site about addiction, then it's your job to keep it free of drug trafficking. If you run a site about firearms, then you had better prevent international arms deals from going down on your website. And so on. If your website does not fall into one of those danger zones, you are lucky.
But to claim that members' messages should never be read seems as utopian to me as believing that everything on the web should be free.
I am not saying that anyone should read their members' general private messages, but serious abuse of a website and breaches of the law should be dealt with, actively and proactively. At least on big boards dealing with topics prone to very serious problems.
I use scripts that screen my members' messages and flag potentially problematic ones. To preserve privacy, I am the only one with access, and I only skim and review the flagged messages. I clearly advertise this to my members. I hate doing it, but it's necessary until I find ways to fully automate the process. In the last decade I have encountered Charles Manson Family members preying on female members, murderers, stalkers, international drug traffickers, right-wing extremist nuts, scammers, pedophiles, hackers, and live suicides, to name a few of the nuggets you can encounter on a big board. Access to the conversation system has saved lives and has prevented serious legal problems.
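The actual scripts aren't shown here, so this is only a minimal sketch of the kind of screening described: pattern lists per category, and a review queue that holds nothing but the flagged messages, so unflagged conversations are never read. The category names, patterns, and function names (`flag_message`, `review_queue`) are all hypothetical illustrations, not the real system.

```python
import re

# Hypothetical watchlist. A real deployment would use curated,
# regularly updated phrase lists and likely scoring, not bare substrings.
WATCHLIST = {
    "grooming": [r"\bhow old are you\b", r"\bsend (me )?pics\b"],
    "trafficking": [r"\bprice per (gram|kilo)\b", r"\bwire the money\b"],
    "self-harm": [r"\bi want to end it\b", r"\bgoodbye forever\b"],
}

def flag_message(text: str) -> list[str]:
    """Return the categories whose patterns match the message text."""
    lowered = text.lower()
    return [
        category
        for category, patterns in WATCHLIST.items()
        if any(re.search(p, lowered) for p in patterns)
    ]

def review_queue(messages: list[str]) -> list[tuple[str, list[str]]]:
    """Keep only flagged messages for human review; the rest stay unread."""
    return [(m, hits) for m in messages if (hits := flag_message(m))]
```

The privacy property the post describes lives in `review_queue`: a human moderator only ever sees messages that already matched a pattern, everything else passes through untouched.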
Ideally this would indeed not be necessary, so it would be nice if someone came up with a better approach to dealing with abuse in conversations.