Two former Microsoft employees are suing the company because they say the disturbing content they had to view for their jobs caused them to develop post-traumatic stress disorder (PTSD).
Both men are suing the tech giant for damages, alleging disability discrimination, violations of the Consumer Protection Act and negligence. The amount is to be decided during trial, according to the complaint filed in district court.
When they confronted their employer about the trauma they experienced, Microsoft told them to play video games or take more smoke breaks instead of providing adequate mental health services, the suit alleges.
Greg Blauert and Henry Soto's lawsuit, filed on Dec. 30 of last year, says that for years they had to watch horrifying videos on the internet in order to help keep Microsoft's platforms free from content that would disturb its users or break the law.

Blauert, Soto and their families are suing for what they say are irreversible psychological damages. Both plaintiffs applied for workers' compensation and were denied.
The two ex-employees allege that Microsoft had a comprehensive mental healthcare plan for members of a similar department, the Digital Crimes Unit, but that it neglected to extend the benefit to their Online Safety Team.
When Blauert attempted to get help, his superiors told him that "limiting exposure to depictions, taking walks and smoking breaks, and redirection [of] his thoughts by playing video games would be sufficient to manage his symptoms,” according to the suit.
Microsoft said it disagrees with the plaintiffs' claims. "Microsoft takes seriously its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services, as well as the health and resiliency of the employees who do this important work," a spokesperson for the company said.
The lawsuit sheds light on an extremely taxing job in the tech industry that rarely gets attention: content moderation — the removal of offensive or disturbing material.
Social networks and sites where users generate content, like YouTube, Facebook and Twitter, all have armies of people who sift through disturbing imagery for a living.
Their job is deceptively powerful — they're the ones who decide what should or shouldn't be removed from a platform, effectively policing free speech on the internet.
They exercise important leverage over issues like human rights, government dissent and privacy.
But the job of content moderation is also difficult and taxing.
Moderators' days are spent debating whether a video depicting something like a beheading should be given a place on some of the world's most popular websites. It's complicated to decide whether a piece of content — even if it's grotesque — is newsworthy enough to stay.
Tech companies have chosen to marginalize content moderators, even though they play a crucial role with global impact. Their work is not often performed at a fancy office in the Bay Area, but overseas. Countries with large English-speaking populations and low labor costs, like the Philippines and India, are bearing the brunt of the work.
While content moderation is sometimes compared to working in a call center, moderators' work is the reason we can use services like YouTube and Facebook without encountering things like bestiality. They also help sites maintain a pristine image.
Content moderators also often save lives. When child pornography is uploaded, for example, they're responsible for contacting appropriate authorities, like the National Center for Missing & Exploited Children.
Neither Soto nor Blauert has returned to work.
This is the full statement Microsoft provided Mashable:
We disagree with the plaintiffs’ claims. Microsoft takes seriously its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services, as well as the health and resiliency of the employees who do this important work.
Microsoft applies industry-leading technology to help detect and classify illegal imagery of child abuse and exploitation that are shared by users on Microsoft Services. Once verified by a specially trained employee, the company removes the imagery, reports it to the National Center for Missing & Exploited Children, and bans the users who shared the imagery from our services.
This work is difficult, but critically important to a safer and more trusted internet. The health and safety of our employees who do this difficult work is a top priority. Microsoft works with the input of our employees, mental health professionals, and the latest research on robust wellness and resilience programs to ensure those who handle this material have the resources and support they need, including an individual wellness plan. We view it as a process, always learning and applying the newest research about what we can do to help support our employees even more.