
Meta/Facebook

Child Safety Report

The U.S. Surgeon General’s report on Social Media and Youth Mental Health called this an “urgent public health issue.” Social media affects children’s brains differently than adults’ brains. It also poses physical and psychological risks that many children and teens are unprepared for, including sextortion and grooming, hate group recruitment, human trafficking (for any purpose), cyberbullying and harassment, exposure to sexual or violent content, invasion of privacy, self-harm content, and financial scams, among others.

In 2023 we filed a Child Safety Report shareholder resolution at Meta. It asks the company to adopt targets and publish an annual report with quantitative metrics appropriate for assessing whether Meta has improved its performance globally on child safety impacts and actual harm reduction for children on its platforms.

Our resolution received 16.27% support. It should be noted that CEO Mark Zuckerberg gets ten votes per share, while regular shareholders get one vote per share; consequently, he controls over 60% of the vote. Looking at the non-management-controlled vote alone, our resolution received majority support of 53.8%. The 817 million shares voted in our favor were worth more than $216 billion based on the stock’s closing price on the day of the annual meeting, more than the combined shareholdings of the company’s four largest institutional investors: Vanguard, BlackRock, Fidelity and State Street.

 

Child Sexual Exploitation Online

In 2022 there were nearly 32 million reported cases of online child sexual abuse material (CSAM), 85% of which stemmed from Meta platforms including Facebook, WhatsApp, Messenger and Instagram. This represents a 69% increase from Facebook’s nearly 16 million reports in 2019, when shareholders first raised this issue with the company. Meta’s plan to apply end-to-end encryption across all its platforms without first stopping CSAM could effectively render invisible 70% of the CSAM cases that are currently being detected and reported.

 

Meta’s rush to expand end-to-end encryption across its platforms without addressing child sexual exploitation has led to an immense backlash and poses a legitimate risk to children worldwide. Governments, law enforcement agencies and child protection organizations have harshly criticized Meta’s planned encryption, claiming that it will cloak the actions of child predators and make children more vulnerable to sexual abuse. Pending legislation in the U.S. Congress and in other countries could make Meta legally liable for CSAM. The company faces increasing regulatory, reputational and legal risk as a result.

 

Meta executives have admitted that encryption will decrease its ability to report CSAM, saying, “if it’s content we cannot see then it’s content we cannot report.” Yet Meta is intent on applying encryption as soon as possible and considers this a valid tradeoff despite the risk of increased child sexual abuse. Meta’s CEO Mark Zuckerberg stated, “Encryption is a powerful tool for privacy, but that includes the privacy of people doing bad things. When billions of people use a service to connect, some of them are going to misuse it for truly terrible things like child exploitation, terrorism, and extortion.”

As the world’s largest social media company and the largest source of reported child sexual exploitation online, Meta’s actions will, for better or worse, have a major impact on global child safety.

 

Our shareholder resolutions have asked the company to assess the risk of increased sexual exploitation of children as it develops and offers additional privacy tools such as end-to-end encryption.

Proponents of this resolution are not opposed to encryption and recognize the need for improved online privacy and security. A number of technological developments should allow anti-CSAM practices to coexist with encryption; these need to be tested and applied before Meta goes forward with its encryption plans. Support for this resolution is not a vote against privacy but a message to management that it needs to take extra precautions to protect the world’s most vulnerable population: children.

Shareholder resolutions & statements

 

Our 2022 shareholder resolution received 17.3% of the vote. This represents 56.7% of the non-management-controlled vote (up from 43% in 2020). The resolution garnered the support of over 910 million shares valued at about $167 billion (based on the closing stock price on the day of the annual meeting).

 

Our 2021 shareholder resolution received 17.25% of the vote, representing 56% of the non-Zuckerberg-controlled vote. The resolution garnered the support of nearly 980 million shares worth about $321 billion. Our 2020 resolution at Facebook received a 12.6% vote (about 43% of the non-management-controlled stock), with 712 million shares worth over $163 billion.

Read the 2022 Meta/Facebook shareholder resolution, 2021 Facebook shareholder resolution, and 2020 Facebook shareholder resolution asking the company to report on “the risk of increased sexual exploitation of children as the Company develops and offers additional privacy tools such as end-to-end encryption.” Our initial 2020 resolution was filed after an exponential surge in online child sex imagery. The volume of online child sexual abuse material continues to grow dramatically.

 

Sarah Cooper, a survivor of sex trafficking who was contacted by a predator through Facebook Messenger, spoke at the 2022 and 2021 Facebook annual shareholder meetings, where she described the link between Facebook and her own horrific ordeal. Read or listen to Sarah's 2021 statement. Proxy Impact CEO Michael Passoff spoke at the 2020 Facebook annual shareholder meeting. Read Michael's statement on the business and social risk from Facebook's encryption plan.

Proxy memo / Exempt solicitation

 

Shareholders can file detailed information with the SEC to solicit support for their shareholder resolution. This document is officially called an "exempt solicitation," but is often referred to as a proxy memo.

Read the 2022 Meta-Facebook exempt solicitation which provides details on the link between social media and child sexual abuse, Meta’s central role, the impact of end-to-end encryption on CSAM, the false choice between privacy or child protection, the financial risk to Meta, law enforcement and child protection agencies' call for a delay to encryption, Meta's response, and a rebuttal to Meta’s proxy statement.

Press releases

 

Read our Facebook press releases:

Articles

Read our articles on Facebook and child sexual exploitation:
