SASB Eyes Standards on Internet Content

Guidelines are expected next year on data privacy, freedom of speech, and harmful content.

The Sustainability Accounting Standards Board (SASB) is setting standards for content governance on social media platforms, the board decided at a meeting last week.

SASB will develop guidelines around data privacy, freedom of speech, and harmful content, the organization said. The decision comes amid growing investor concern about technology companies’ mismanagement of user-generated content and advertisements. The board has no power to enforce such standards, but it hopes to use them to hold companies to account.

“These problems are not new and they’re not going away,” Greg Waters, the technology and communications sector analyst at SASB, said in a presentation. The group hopes to complete the project by the end of next year. 

Investors worry that internet companies, which bear the brunt of responsibility for moderating content on their platforms, will lose advertising dollars if they mismanage harmful content. Content management also requires costly investments in machine learning or in additional workers to handle the backlog.

About three-quarters of the top 20 internet media and services companies have exposure to issues involving user-generated content, including Facebook and Google parent Alphabet, which together hold a duopoly on online advertising, accounting for about 60% of US online ad revenue.

A string of child exploitation controversies on Google’s YouTube in recent years has spurred a number of advertiser boycotts. And revelations that Cambridge Analytica harvested Facebook user data for targeted political campaigning during the 2016 election cycle have dogged the former social media darling.

The top 20 list does not include other companies that own social media businesses, such as Microsoft, owner of the LinkedIn platform, or Oracle, which aims to strike a deal with video-sharing app TikTok Global and its Chinese owner ByteDance. The negotiations come amid a standoff between Beijing and the Trump administration.

SASB will not develop content moderation guidelines covering worker safety, which the research team decided was outside the scope of the standards-setting project, though members said they will continue to review risks around mental health.

In recent years, contract workers at content moderation firms have reported suffering trauma after reviewing hours of violent videos involving murders, rapes, and suicides. This week, a former YouTube moderator sued the company for failing to adequately protect workers from the mental toll of the job.

“As we continue to build out the human capital pipeline in terms of the tech sector broadly, this is going to be very high on my list of things to continue to evaluate,” Waters said. 

Related Stories: 

Is Setting ESG Standards Too Puzzling? Get Wide Feedback, SASB Says 

Sustainability Leaders to Work on Common Corporate Reporting Standard

Increasing Body of Evidence Bolsters Case for ESG Investing

