- 8 February 2018
It takes a nation to protect the nation
Mr. Zuckerberg and Ms. Sandberg,
Thank you for your many positive accomplishments. Please reverse the policy under which Facebook shut down a non-violent page of secularists and atheists in Pakistan. This shutdown is not making the world “a more open place.”
Those who wish to crush the human rights of Pakistani secularists protested to Facebook about this non-violent page of an oppressed minority. Facebook supported the bullies -- and the non-violent Saaenji page was deactivated: https://www.facebook.com/saaenji
This page is in the Urdu language. It was an act of bravery even to create the page.
Meanwhile, Facebook has long allowed a page that provides specific illustrated instructions on how to stone people to death according to God’s law: https://www.facebook.com/photo.php?fbid=311857638916798&set=a.1...
When the stoning page was repeatedly reported to Facebook, Facebook replied that the level of graphic violence is acceptable – yet Facebook backed down to bullies who seek to oppress non-violent minorities. The Pakistani secular page is a voice of non-violent rationalism. Mr. Zuckerberg and Ms. Sandberg, based on your history, we are confident that you are people of good will. This violation of human rights cannot be your personal intent.
We are confident that you will condemn, and reverse, this unconscionable policy immediately. We request that:
1) the non-violent Saaenji page be reinstated;
2) the page advocating violence be shut down (or at least have consistent guidelines -- See #1); and
3) policies be created to prevent bullies -- mob rule -- from crushing the rights and the voices of those in peaceful minorities.
Sean Faircloth, Director of Strategy & Policy
Richard Dawkins Foundation for Reason & Science
This article contains descriptions of child sexual abuse and other acts readers may find disturbing.
"It's mostly pornography," says Sarah Katz, recalling her eight-month stint working as a Facebook moderator.
"The agency was very upfront about what type of content we would be seeing, in terms of how graphic it was, so we weren't left in the dark."
In 2016, Sarah was one of hundreds of human moderators working for a third-party agency in California.
Her job was to review complaints of inappropriate content, as flagged by Facebook's users.
She shared her experience with BBC Radio 5 live's Emma Barnett.
"They capped us on spending about one minute per post to decide whether it was spam and whether to remove the content," she said.
"Sometimes we would also remove the associated account.
"Management liked us not to work any more than eight hours per day, and we would review an average of about 8,000 posts per day, so roughly about 1,000 posts per hour.
"You pretty much learn on the job, specifically on day one. If I had to describe the job in one word, it would be 'strenuous'.
"You definitely have to be prepared to see anything after just one click. You can be hit with things really fast without a warning.
"The piece of content that sticks with me was a piece of child pornography.
"Two children - the boy was maybe about 12 and the girl about eight or nine - standing facing each other.
"They weren't wearing pants and they were touching each other. It really seemed like an adult was probably off camera telling them what to do. It was very disturbing, mostly because you could tell that it was real.
"A lot of these explicit posts circulate. We would often see them pop up from about six different users in one day, so that made it pretty challenging to find the original source.
"At the time there was nothing in the way of counselling services. There might be today, I'm not sure."
Sarah says she would probably have taken up counselling if it had been offered.
"They definitely warn you, but warning you and actually seeing it are different.
"Some folks think that they can handle it and it turns out they can't, or it's actually worse than they expected."
"You become rather desensitised to it over time. I wouldn't say it gets any easier but you definitely do get used to it.
"There was obviously a lot of generic pornography between consenting adults, which wasn't as disturbing.
"There was some bestiality. There was one with a horse which kept on circulating.
"There's a lot of graphic violence, there was one when a woman had her head blown off.
"Half of her body was on the ground and the torso upwards was still on the chair.
"The policy was more stringent on removing pornography than it was for graphic violence."
"I think Facebook was caught out by fake news. In the run-up to the US election, it seemed highly off the radar, at least at the time I was working there.
"I really cannot recall ever hearing the term 'fake news'.
"We saw a lot of news articles that were circulating and reported by users, but I don't ever recall management asking us to browse news articles to make sure that all the facts were accurate.
"It's very monotonous, and you really get used to what's spam and what's not. It just becomes a lot of clicking.
"Would I recommend it? If you could do anything else, I would say no."
The BBC shared Sarah's story with Facebook.
In response, a Facebook spokesman said: "Our reviewers play a crucial role in making Facebook a safe and open environment.
"This can be very challenging work, and we want to make sure they feel properly supported.
"That is why we offer regular training, counselling, and psychological support to all our employees and to everyone who works for us through our partners.
"Although we use artificial intelligence where we can, there are now over 7,000 people who review content on Facebook, and looking after their wellbeing is a real priority for us."
So a moderator reviews 1,000 posts per hour, which works out to about 3.6 seconds on each. That would explain why they simply judge pages on the basis of pre-conceived prejudices and opinions, which have been implanted by the mainstream media, other social media, and the current cultural narrative.
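The figures quoted in the interview above (about 8,000 posts reviewed over an eight-hour shift) can be checked directly; a minimal sketch of the arithmetic:

```python
# Throughput figures as quoted in the interview: ~8,000 posts per ~8-hour day.
posts_per_day = 8000
hours_per_day = 8

posts_per_hour = posts_per_day / hours_per_day   # 1,000 posts per hour
seconds_per_post = 3600 / posts_per_hour         # 3.6 seconds per post

print(f"{posts_per_hour:.0f} posts/hour, {seconds_per_post:.1f} s per post")
# prints: 1000 posts/hour, 3.6 s per post
```

Note that 3.6 seconds per post is far below the "about one minute per post" cap mentioned earlier in the interview; the cap was a maximum, not the average pace.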
This podcast contains more information about the Facebook moderators.
By Paul Joseph Watson
on May 8, 2019
Facebook Calls me ‘Dangerous’. Imagine My Shock. No, Really…
Facebook will shortly be put on legal notice about the harm that their actions have caused and will be mandated to turn over all information and internal discussions as to why I was designated as a “dangerous” person and why I was banned.
Last week I was permanently banned by Facebook for being a “dangerous person”. I found out about it not through Facebook, which failed to even send me a single email, but through media reports.
They’ve put me in the same category as Louis Farrakhan, a man who compared Jews to termites and once described Adolf Hitler as a “very great man”.
The Instagram (owned by Facebook) ban was even “funnier” given my page consisted mainly of selfies and videos of myself and my girlfriend feeding ducks. Super dangerous.
But as humorous as it is, I take exception to being defamed as a “dangerous person”.
To whom am I a danger, precisely? Mark Zuckerberg? A billionaire who wants to create a cult out of 2.4 billion people? A creepy oligarch who wants to dictate the thoughts that can be expressed by a third of the earth’s entire population? Who’s the bigger danger?
Yes, Kinana, I was aware that when you made that new forum, it should really have been an addition to this one. I was tempted to delete and move it here, but didn’t, in case you’d already posted links to it.
A cross post once in a while is fine. :)
Although Facebook is currently the worst for censorship, the evil cabal are now moving on to censoring other platforms; the excellent Chateau Heartiste blog has been shut down: http://voxday.blogspot.com/2019/05/always-read-fine-print.html
Potential YouTube solutions - universal video catalogue: https://www.youtube.com/watch?v=zru9sLqrNH4&feature=em-uploademail