Meta (META) proxy push asks report on tying child-safety metrics to executive pay

Form Type
PX14A6G

Rhea-AI Filing Summary

Meta Platforms faces layered legal, regulatory and reputational risks tied to child safety that proponents say warrant a Compensation Committee report on linking child-safety performance to executive pay. The proposal requests a feasibility report assessing whether executive incentives can be aligned with measurable child-protection outcomes.

The filing cites two recent jury verdicts ($375 million; $6 million), an MDL with 2,407+ cases, a Delaware insurance ruling denying defense coverage, and an April 29, 2026 preliminary EU Digital Services Act finding that could carry fines up to $12 billion. Proponents state they hold over $800 million of Class A stock and ask shareholders to vote FOR Proposal #10.


Insights

Request asks the Compensation Committee to evaluate measurable child-safety metrics tied to pay.

The proposal centers on governance: proponents request a report assessing the feasibility of integrating child-safety performance into executive compensation. The filing cites the absence of specific, weighted child-safety metrics in current bonus frameworks and highlights prior majority shareholder support for related proposals.

Dependencies include the committee's ability to define objective metrics and weightings; the filing notes that the resolution does not mandate a specific pay design. Subsequent disclosures could clarify metric design, weighting, and implementation timing.

Proponents frame recent verdicts, MDL volume, insurer denial, and an EU DSA finding as a material risk stack.

The filing cites a $375 million New Mexico verdict, a $6 million Los Angeles verdict, >2,407 MDL cases, an insurance denial ruling, and an April 29, 2026 preliminary EU DSA breach finding with potential fines up to $12 billion. It argues these outcomes could drive court-ordered product redesign and revenue effects.

Material impact depends on future MDL trial outcomes, confirmation of the EU DSA finding, and whether insurers or regulators impose quantified remedies; the filing requests a compensation-linked accountability assessment as a governance response.

MDL case count: 2,407 cases in federal multidistrict litigation as of March 2026
Potential EU fine: up to $12 billion (6% of worldwide turnover) per the preliminary DSA finding (April 29, 2026)
Ad revenue dependence: 98% of company revenue sourced from advertising
NEO target bonus change: target bonuses increased from 75% to 200% of base salary, per the Proxy Statement
Proponent stake: proponents hold over $800 million in Meta Class A stock

MDL (legal): "federal multidistrict litigation (MDL) which includes more than 2,407 active cases"

Digital Services Act (DSA) (regulatory): "preliminary findings that Instagram and Facebook are in breach of the Digital Services Act"

Abatement plan (legal): "bench trial phase asking Meta to pay $3.7 billion in an abatement plan"

CSAM (other): "reporting of online child sexual abuse materials (CSAM) nearly doubled from 2019 to 2023"

CSAM means images, videos or other recordings that show sexual abuse or exploitation of children. It matters to investors because companies that host, distribute or fail to block such material face legal penalties, heavy compliance costs, damage to reputation and loss of customers, similar to a retailer being held responsible for a dangerous product on its shelves, which can trigger expensive recalls and lasting trust problems.

Algorithmic amplification (technical): "reduced algorithmic amplification for young users required by DSA guidelines"

 

SECURITIES & EXCHANGE COMMISSION 
WASHINGTON, D.C. 20549
NOTICE OF EXEMPT SOLICITATION

 

NAME OF REGISTRANT:  Meta Platforms Inc.

 

NAME OF PERSONS RELYING ON EXEMPTION: Proxy Impact

 

ADDRESSES OF PERSONS RELYING ON EXEMPTION:  100 Beekman St., Apt 27C, New York, NY 10038

 

Written materials are submitted pursuant to Rule 14a-6(g)(1) promulgated under the Securities Exchange Act of 1934. Submission is not required of this filer under the terms of the Rule, but is made voluntarily in the interest of public disclosure and consideration of these important issues.

 

   
 

 

 

 

 

Meta Platforms Inc.

 

Proposal #10 — Report on the Feasibility of Linking Child Safety Improvements to Executive Compensation

 

Annual Meeting May 27, 2026

 

 

Contact: Michael Passoff, CEO, Proxy Impact, michael@proxyimpact.com

 

 

RESOLVED CLAUSE: Shareholders request the Board's Compensation Committee publish a report (at reasonable expense, within a reasonable time, and omitting confidential or proprietary information) assessing the feasibility of integrating performance on improving child safety into Meta's senior executive compensation program, which it describes in its annual proxy materials.

 

SUMMARY

 

Meta faces escalating litigation, regulatory, and operational risks related to child safety practices, including adverse jury verdicts, expanding global regulation, increasing pressure for platform redesign and documented harms to children and teens. These developments may create material financial and governance risks for shareholders.

 

Meta's executives have been aware of the damage their products cause to children and teens for nearly a decade, yet have not taken protective action, resulting in the current and growing legal and regulatory risks facing the company. This resolution asks the Compensation Committee to assess the feasibility of linking child safety performance to executive pay to incentivize executives to take action to prevent further harms to children and teens and protect the company against legal damages and regulatory risk.

 

Key concerns include:

 

Legal Liability

 

·Two landmark lawsuits resulted in adverse jury verdicts in March 2026: $375 million in New Mexico (with $3.7 billion requested in abatement) and $6 million in California.
·These lawsuits set a precedent for over 2,400 active cases brought by children, families, school districts, and 42 state attorneys general, with more lawsuits expected.
·Meta's own insurers refuse to cover these claims, putting financial responsibility for these and future cases directly on the company.

Regulatory Risk

·On April 29, 2026, the European Commission issued a preliminary finding that Meta breached the EU Digital Services Act with respect to children under 13, carrying a potential fine of up to $12 billion, plus billions more in additional fines until Meta is in compliance.
·15+ countries are enacting or advancing bans on social media access for minors.

Reputational Risk

Meta has received a constant stream of negative media coverage of its child safety issues, including reports that Meta's AI chatbots engage in sexually explicit roleplay with minors and that Instagram's algorithms connect vast pedophile networks, per Wall Street Journal investigations.

Structural Revenue Risk

·Ongoing litigation and emerging regulation may require platform redesign, including changes to recommendation systems, engagement-maximizing features, age-verification systems, and data practices for minors.

 

 

Proxy Impact: Proposal #10 Report on the Feasibility of Linking Child Safety Improvements to Executive Compensation. May 6, 2026. Page 1

  
 

 

·Platform redesign could reduce user engagement, limit behavioral data collection, increase compliance costs, and weaken advertising-targeting efficiency.
·Approximately 98% of Meta’s revenue is derived from advertising and material changes to engagement metrics or data availability may directly affect revenue growth and operating margins.

 

 

This proposal requests a report focused on the feasibility of linking child safety outcomes to executive pay so that Meta's compensation structure reflects the company's stated commitment that “At Meta, child protection is always a top priority.” 1 The proponents of this resolution hold over $800 million of Meta Class A stock.

 

 

LEGAL, INSURANCE, REGULATORY, AND REPUTATIONAL RISKS FACING META FROM ITS CHILD SAFETY POLICIES

 

U.S. Litigation: Two Landmark Jury Verdicts in Two Days and over 2,000 pending cases

 

On March 24, 2026, a New Mexico jury ordered Meta to pay $375 million in civil penalties for misleading users about the safety of its platforms while children were targeted by sexual predators.2 A bench trial phase began May 4, 2026, with the New Mexico Attorney General asking that Meta be declared a public nuisance, pay $3.7 billion under an abatement plan, and make design changes to its platforms, including mandatory age verification, predator removal, and restrictions on encrypted messaging for minors.3

 

On March 25, 2026, a Los Angeles jury found Meta and YouTube negligent for designing platforms harmful to young people, awarding $6 million. It is the first civil action to hold Meta accountable for causing addiction and mental health problems.4 Experts have described the back-to-back verdicts as Big Tech's 'Big Tobacco moment.’5

 

Meta’s stock dropped 8% after the verdicts.6 These cases are considered bellwether lawsuits, which serve as test cases for a larger group of similar suits and can heavily influence future outcomes. This is significant because Meta faces federal multidistrict litigation (MDL) comprising more than 2,407 active cases as of March 2026, brought by individuals, school districts, and attorneys general from 42 states.7 The first federal trial is scheduled for Oakland in summer 2026.

 

Insurance Denial

 

The Delaware Superior Court ruled in March 2026 that Meta's own insurers, including Hartford Casualty Insurance Company, are not obligated to defend child safety claims, reasoning that the claims arise from intentional design choices and are therefore not covered by standard Commercial General Liability policies. Any damages awarded will flow directly to Meta's balance sheet.8

 

EU Digital Services Act — Preliminary Breach Finding

 

On April 29, 2026, the European Commission announced preliminary findings that Meta's Instagram and Facebook are in breach of the Digital Services Act (DSA) for failing to prevent children under 13 from accessing their services.9 The Commission found that children can enter false birth dates with no verification, that reporting tools require up to seven clicks to access, and that Meta disregarded readily available scientific evidence on the vulnerability of younger children. An estimated 10–12% of children under 13 are on Instagram or Facebook. If the preliminary findings are confirmed, a fine of up to 6% of Meta's total worldwide annual turnover — estimated at up to $12 billion — may be imposed, along with ongoing periodic penalties of up to 5% of Meta’s worldwide annual turnover until compliance is achieved.10
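As a rough arithmetic check (not from the filing: the turnover base is an illustrative assumption, since the filing does not state Meta's actual annual turnover, only that 6% of it is estimated at up to $12 billion):

```latex
\underbrace{6\%}_{\text{DSA maximum fine rate}}
\times
\underbrace{\$200\text{ billion}}_{\text{assumed worldwide annual turnover}}
= \$12\text{ billion}
```

This is consistent with the filing's figure; a materially different reported turnover would scale the maximum fine proportionally.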

 

International Platform Bans

 

Meta has been at the center of concerns about social media’s negative impacts on minors, and at least 15 countries have enacted or are advancing legislation banning or restricting minors' social media access. Australia's ban for under-16s came into force in December 2025, and regulators are considering fines of up to AUD $49.5 million (about US$32 million) against Meta and other social media companies for non-compliance.11 France passed an under-15 ban in January 2026; Indonesia banned under-16s in March 2026; Malaysia's ban took effect in January 2026. Denmark, Spain, Norway, Slovenia, Portugal, and the UK are all advancing legislation.12 Six EU nations have formed a coalition to embed a mandatory under-16 restriction in the upcoming EU Digital Fairness Act.13 These bans directly threaten Meta's youngest-user pipeline, the cohort with the highest lifetime advertising value, and impose compliance costs that compound with every new jurisdiction.

 

 


 

Reputational and Talent Risk

 

Meta’s child safety issues have been widely covered by the media, including CEO Zuckerberg testifying before Congress,14 investigative reports by the Wall Street Journal15 16 and New York Times,17 18 former high-level employees publicly criticizing Meta’s child safety practices on book tours and at multiple Congressional hearings,19 the 2023 Surgeon General’s report calling social media’s impact on children’s mental health an “urgent public health issue,”20 and the ongoing worldwide coverage of the child safety trials, which will continue with thousands of pending lawsuits. Additionally, child advocacy groups have increased public demonstrations and protests targeting Instagram and Facebook.21

 

Sustained reputational damage from child safety failures creates a toxic employer brand, which can be particularly damaging in a competitive labor market.

 

 

KEY CONCERNS REGARDING META'S CHILD SAFETY PRACTICES

 

Addictive Design

 

Social media affects adolescents’ and children’s brains differently than adults’ brains.22 Meta’s design features deliver high-dopamine stimulus to an age group that is not biologically able to regulate its own attention. Addictive design features include infinite scroll, which eliminates natural stopping cues; algorithmic reward schedules, which use unexpected content to maximize seeking behavior; stimulus speed, which reduces attention span; social feedback, such as “likes” and comments, which triggers social comparison and hypersensitivity; and ephemeral content, which creates artificial scarcity and "Fear of Missing Out" (FOMO), compelling daily engagement. Underlying all of the above is Meta’s knowledge that its product design was causing harm and that it repeatedly failed to make substantial improvements.23

 

Failure to Disclose Harms

 

Internal documents show that Meta knew of potential mental and physical harms to minors and failed to disclose them. As early as 2017, Meta identified that its products were addictive to children. In 2019, Meta halted an internal study after it found that users who stopped using Instagram and Facebook for a week showed lower rates of anxiety, depression and loneliness.24 In 2021, Meta's own internal research documented that Instagram's beauty filters had negative impacts on teen self-image, with links to body dysmorphia and eating disorders and increased depression, anxiety, and suicidal ideation. A formal proposal to ban these filters was rejected by Meta CEO Mark Zuckerberg.25

 

Hiding the Problem

 

Meta’s reporting of online child sexual abuse materials (CSAM) nearly doubled from 15.8 million cases in 2019 to 30.7 million in 2023, which accounted for 85% of all reports to the National Center for Missing and Exploited Children.26 Meta’s plan to apply end-to-end encryption to multiple platforms prompted a letter from top U.S., U.K. and Australian law enforcement officials to CEO Zuckerberg warning that encryption would enable millions of child sex images and videos to go undetected, endangering children and shielding predators.27 Despite this, Meta began applying end-to-end encryption in 2024, and CSAM reports fell by 6.9 million from the year before encryption was applied.28

 

 


 

In a 2025 Congressional hearing, two former Meta researchers testified that Meta’s virtual reality products exposed minors to bullying, nudity, and sexual content and that Meta disregarded and censored their research, and in at least one case, deleted evidence of sexual harassment.29

 

Guardrails Abandoned for AI Growth

 

On April 26, 2025, the Wall Street Journal reported that Meta AI chatbots engaged in sexually explicit roleplay as underage characters, including with reporters who identified themselves as minors. Internal evidence showed Meta loosened AI safety guardrails to increase traffic despite staff objections, which were overruled by CEO Mark Zuckerberg.30 Generative-AI child sexual exploitation is growing at an astonishing rate: the National Center for Missing and Exploited Children received 4,700 such reports in 2023 and 1.5 million in 2025.31

 

Algorithmic Amplification of Exploitation

 

A 2023 Wall Street Journal investigation found that Instagram's recommendation algorithms actively guided pedophiles toward sellers of child sexual abuse material.32

 

No Measurable Accountability

 

Across all of these issues, the core criticism from investors, law enforcement, child safety organizations, and regulators is consistent: Despite mounting evidence that Meta’s child safety impacts are getting worse, Meta continues to announce child safety policies and practices but provides no quantitative data to demonstrate their effectiveness — hence this resolution's call for metrics-linked executive accountability.

 

 

LEGALLY MANDATED PLATFORM REDESIGN IS LINKED TO REVENUE STREAM

 

The litigation is no longer only a financial liability; it is becoming a direct driver of mandatory platform redesign. The New Mexico bench trial phase, which began May 4, 2026, seeks court-ordered changes including effective age verification, removal of predators, and restrictions on encrypted messaging for minors.33 These are the precise platform features that Meta has resisted changing commercially.

 

The Los Angeles trial verdict, finding negligent design, directly implicates Meta's core product architecture — infinite scrolling, algorithmic amplification, and engagement-maximizing features. A federal multidistrict litigation (MDL) trial scheduled for Oakland in summer 2026 will set legal precedents for thousands of active cases.

 

The EU Digital Services Act preliminary finding of April 29, 2026 separately identifies a required set of design changes, including: private-by-default accounts for minors; reduced algorithmic amplification for young users; disabling compulsive-use features such as streaks, autoplay, and push notifications; and strengthened moderation and reporting tools.34 If finalized, these requirements could compel substantial operational and product changes across Meta’s EU operations.

 

Meta receives 98% of its revenue from advertising. Potential court-ordered or regulatory-driven platform redesign could affect engagement metrics, data collection practices, and advertising revenue generation which are at the heart of Meta’s advertising business. As of February 2026, Meta’s stock traded at about 24 times forward earnings, a valuation based on an assumption of continued advertising growth and behavioral data collection, along with artificial intelligence development.35

 

Court- or regulator-mandated product redesign, combined with laws limiting minors’ access to social media and the growing reputational risk of Meta not being seen as safe for younger users, may reduce engagement metrics and ad revenue and impair AI development.

 

 


 

THE IMPACT OF META'S CHILD SAFETY PRACTICES ON INVESTMENT PORTFOLIOS

 

Meta acknowledged on its 2025 and 2026 earnings calls that child safety issues pose the risk of material financial losses. The legal cases, social media bans, regulatory shifts, and pending litigation will only increase the materiality of child safety to Meta's business model.

 

CEO Mark Zuckerberg testified personally in the New Mexico trial, which resulted in a $375 million verdict against the company.36 This is no longer an abstract reputational risk. It is a CEO-level accountability event with a nine-figure jury outcome.

 

For index fund investors, the concern extends beyond Meta itself. Meta and Alphabet (Google/YouTube) together account for approximately 8–9% of the S&P 500's total market capitalization, at a time when the top 10 companies account for roughly 40% of the index, approximately double the concentration of 2015–2016.37 Financial impairment of Meta from litigation reserves, forced platform redesign, or structural revenue regulation would not remain isolated in individual portfolios but would affect investors at all levels, from institutional and high-net-worth investors to the retirement funds of middle-class workers. And because other companies face similar litigation, the child safety MDL trajectory may also be viewed as a material risk factor in other tech sector holdings.

 

The concurrent risk layers — a preliminary EU DSA finding carrying up to a $12 billion fine, 2,407+ MDL cases with two adverse jury verdicts in the past two months, insurance carriers refusing to defend, and a bench trial phase seeking mandatory platform changes — represent a risk stack that is growing faster than Meta's announced remediation efforts.

 

 

THE FINANCIAL RISKS FROM META'S CHILD SAFETY PRACTICES

 

Direct Financial Exposure

 

The two March 2026 jury verdicts, $375 million (with $3.7 billion requested in abatement) and $6 million, represent only the beginning of a litigation cycle that structurally mirrors the trajectory of tobacco and opioid litigation. As the federal MDL trial proceeds in Oakland in summer 2026 and the 42 state attorneys general cases advance, settlement pressure across the full docket will increase. Each MDL verdict that goes against Meta weakens its negotiating position in the remaining pool of cases and increases its financial burden.

 

Insurance Non-Coverage

 

Meta's insurers are refusing to defend child safety claims, arguing the damages arise from intentional design choices. The full financial exposure of any adverse verdict, settlement, or regulatory penalty will flow directly to Meta's balance sheet.38

 

Regulatory Fine Exposure

 

The EU Digital Services Act preliminary finding announced April 29, 2026, carries a maximum fine of 6% of Meta's total worldwide annual turnover — potentially up to $12 billion based on 2025 revenues — plus periodic payments of up to 5% of Meta's total worldwide annual turnover during non-compliance.39 Violations of the U.S. Children’s Online Privacy Protection Act (COPPA) carry civil penalties of up to $53,088 per violation per day.40 The Senate’s proposed GUARD Act, which seeks to reduce harmful AI interactions with minors, stipulates civil penalties of up to $250,000 per violation, as well as criminal fines of up to $250,000 per violation.41 Meta has billions of users on its platforms, many of them underage.

 

Revenue Structural Risk

 

Proposed U.S. legislation, including a trio of bills called the UnAnxious Generation package, would impose a 50% tax on digital advertising revenue derived from minors.42 International bans are already removing the youngest, highest lifetime-value (LTV) user cohort from Meta's addressable advertising base, a consequence of the company's refusal to adjust its algorithms or enforce policies that reduce harm to children, while compliance infrastructure costs compound with each new jurisdiction that implements a ban.

 

 


 

META’S OPPOSITION STATEMENT FAILS TO ADDRESS THE RESOLUTION

 

Meta's Board Response to this resolution rests on four bullet-point assertions. Each is addressed in turn below.

 

"At Meta, we prioritize helping keep young people safe online."

The gap between Meta's stated priority and its actual performance is precisely the concern shareholders are raising. Meta platforms pose physical and psychological risks that many children and teens are unprepared for, including sextortion and grooming, hate-group recruitment, human trafficking, cyberbullying and harassment, exposure to sexual or violent content, invasion of privacy, body shaming, self-harm content, and financial scams, among others. Internal company documents released during the recent lawsuits show that Meta knew harm was being done and did little or nothing to stop it.

 

"We believe that the current structure of our compensation program supports our objectives..."

Meta's Board claims its existing compensation structure — which it describes as emphasizing 'long-term incentives' tied to 'company priorities' — is sufficient to drive action on child safety. Yet the current structure contains no specific, measurable, weighted child safety performance condition of any kind. A pay structure that is discretionary, with no objective metrics attached, cannot be said to 'support' any specific company priority — child safety or otherwise.

 

Additionally, the Board's claim that its program 'adequately incentivizes our executives to act on our company priorities' is directly contradicted by the outcomes. Meta's child safety performance during the period this compensation structure was in place has resulted in:

·two jury verdicts against the company, one totaling $375 million (with a request for a $3.7 billion abatement) and another totaling $6 million;
·a preliminary EU Digital Services Act breach finding carrying potential fines of up to $12 billion;
·2,407+ active MDL lawsuits;
·Meta's own insurers refusing to defend child safety claims.

 

If the compensation structure is incentivizing adequate child safety action, the outcomes do not reflect it.

 

"Our compensation, nominating & governance committee is best positioned to determine the structure of our executive compensation program."

Proponents fully agree that the Compensation Committee is best positioned to determine the structure of executive pay. Numerous large-cap companies' Compensation Committees link safety performance directly to executive bonuses in sectors where safety risk is existential — energy, mining, pharmaceuticals. Meta operates in a sector where the safety risk is now producing nine-figure jury verdicts.

 

The Compensation Committee is indeed positioned to design an appropriate structure; shareholders are asking it to demonstrate that capacity.

 

"Our board of directors believes the requested report is unnecessary and would not provide additional benefit to our shareholders in light of our existing practices."

The Board offers no analysis to support this conclusion. It does not identify what 'existing practices' it believes render the report unnecessary, nor does it explain what 'additional benefit' standard it is applying or why the requested report fails to meet it.

 

It is not credible to assert that no shareholder benefit would flow from the Compensation Committee assessing the feasibility of aligning executive financial incentives with reducing the growing legal, regulatory and reputational risks stemming from Meta’s failed child safety policies and practices.

 

 


 

WHY SHAREHOLDERS SHOULD SUPPORT A REPORT ON THE FEASIBILITY OF LINKING CHILD SAFETY IMPROVEMENTS TO EXECUTIVE COMPENSATION

 

The Governance Gap

 

Meta's Proxy Statement says its compensation program is designed to 'align the interests of our executives with those of our shareholders in the overall success of our company.' 43 Yet Meta's CEO receives $1 in base salary, no performance bonus, and no equity awards tied to any measurable outcome.44 For other executives, the Compensation Committee has relied on discretionary payments without objective performance metrics.

 

A coalition of 36 institutional investors representing $3.6 trillion in assets under management sent a formal letter to Meta in October 2024 specifically citing the absence of performance-based pay conditions as a governance failure.45 The investors also raised concern about inadequate child safety practices.

 

In 2021, 2022, 2023, and 2024, a majority of Meta's independent (non-management) shareholders voted for child safety related proposals (56%, 57%, 54%, and 59%, respectively), yet all were blocked by CEO Mark Zuckerberg's supervoting share structure.46 The company has still not published the child safety quantitative metrics and performance targets that shareholders have strongly supported since 2020.

 

The Current Bonus Structure

 

Meta's Proxy Statement lists four bonus pay priority categories for 2025, one of which is to “Make progress on societal issues related to our business.” Bonus pay is at the Board’s discretion, and none of these categories is assigned a specific weighting or dollar amount of the target bonus. The Proxy states that 2025 bonuses were determined based on the increase in total revenue, continued user growth, the launch of wearable products, and the development of the company's AI program.47 Societal issues, despite being listed as one of the four priorities, were absent from these bonus considerations.

 

In 2025, target bonuses for Named Executive Officers (excluding Zuckerberg) increased from 75% to 200% of base salary, creating a substantial bonus pool to which meaningful child safety conditions could be applied.48 On March 25, 2026, the day of the second child safety jury verdict, Meta filed a revised executive bonus plan offering mega-grants of up to $900 million and tying senior executives’ pay to stock price appreciation that can be fully realized only if the company reaches a market capitalization of $9 trillion (500% growth) by 2031.49

This suggests that achieving such exponential growth will be the priority, regardless of the changes needed to improve child safety.

 

Online child safety concerns will not go away, and all indications are that demands for action will intensify. The potential costs from lawsuits and fines can scale up quickly: even small daily fines for violations can balloon into large amounts given the vast number of under-18 users on Meta's platforms. And if the U.S. lawsuits and EU regulations succeed, other countries will likely adopt similar measures. Product redesign will require significant effort and cost and will likely need to be incorporated into AI development as well.

 

Child safety is an issue that needs to be dealt with, and incentivizing executives to ensure this may be the most efficient and economical means of doing so.50

 

 


 

What the Research Says

 

A 2025 study across 40 countries and 32 industries found that ESG-linked executive compensation is associated with a statistically significant reduction in the gap between a company's stated commitments and actual practice.51 A separate academic study confirms that ESG pay linkage fails only when targets are vague, low-weighted, or disconnected from financial consequences — all of which can be avoided in this case.52

 

The financial argument for linking Meta executive pay to child safety outcomes is unusually strong compared to most ESG pay proposals:

 

·First, the liability is not hypothetical: Meta already faces hundreds of millions of dollars in adverse verdicts, thousands of active lawsuits awaiting decisions, and billions of dollars in potential regulatory penalties, and the risk has been acknowledged as material on Meta's own earnings calls.

 

·Second, the current compensation structure contains zero performance-based accountability mechanisms — a governance failure that institutional investors representing trillions in AUM have formally objected to in letters to CEO Zuckerberg over consecutive years.

 

·Third, the data is there. Meta tracks CSAM reports, underage account removals, user time, harassment reports, and regulatory compliance status, among many other data points, allowing child safety performance to be measured precisely enough to serve as a compensation metric.

 

·Fourth, the academic evidence on when ESG pay linkage works is unambiguous: it requires specific, measurable, adequately weighted metrics tied to outcomes that are financially material to the company. Child safety regulatory fines and litigation reserves meet all four criteria in a way that most ESG metrics do not. There is therefore a clear opportunity to make child safety improvements financially significant for the leadership team.

 

 

Assessing Performance-Linked Compensation

 

This resolution does not mandate a specific compensation structure. It asks the Compensation Committee to assess the feasibility of linking child-safety performance to executive pay.

 

The resolution before shareholders is the least invasive accountability mechanism available. Yet the signal it sends, that the board takes child safety seriously enough to consider making it financially material for senior executives, would have immediate effects on regulatory relationships, litigation optics, and institutional investor confidence. A board actively assessing a child-safety pay linkage has a materially stronger position in discussions with regulators, class counsel, and institutional shareholders than one that has demonstrably ignored the issue for years.

 

Given the litigation and regulatory trajectory, declining even to explore a child-safety pay link is a risk-management failure that is becoming increasingly difficult for a fiduciary to defend.

 

 

We ask that you Vote FOR Proposal #10 — Report on the Feasibility of Linking Child Safety Improvements to Executive Compensation

 

 

 

THE FOREGOING INFORMATION MAY BE DISSEMINATED TO SHAREHOLDERS VIA TELEPHONE, U.S. MAIL, E-MAIL, CERTAIN WEBSITES AND CERTAIN SOCIAL MEDIA VENUES, AND SHOULD NOT BE CONSTRUED AS INVESTMENT ADVICE OR AS A SOLICITATION OF AUTHORITY TO VOTE YOUR PROXY. THE COST OF DISSEMINATING THE FOREGOING INFORMATION TO SHAREHOLDERS IS BEING BORNE ENTIRELY BY THE FILER. PROXY CARDS WILL NOT BE ACCEPTED BY THE FILER. PLEASE DO NOT SEND YOUR PROXY TO THE FILER. TO VOTE YOUR PROXY, PLEASE FOLLOW THE INSTRUCTIONS ON YOUR PROXY CARD.

 

 


 

Endnotes


1 Meta website, Safety Center, “Online Child Protection - At Meta, child protection is always a top priority,” May 6, 2026. https://www.meta.com/safety/topics/online-child-protection/

2 NY Times, “Meta ordered to pay $375m over child safety violations,” March 24, 2026. https://www.nytimes.com/2026/03/24/technology/meta-new-mexico-child-safety-violations.html

3 The Guardian, “New Mexico proposes $3.7bn fine for Meta and sweeping changes to its social platforms,” May 6, 2026. https://www.theguardian.com/technology/2026/may/05/new-mexico-meta-court-fine

4 NBC News, "Jury finds Meta and YouTube negligent in landmark lawsuit on social media safety," March 25, 2026. $6 million awarded; Meta found 70% responsible. https://www.nbcnews.com/tech/tech-news/verdict-reached-landmark-social-media-addiction-trial-rcna263421

5 Forbes, “Big Tech’s Big Tobacco Moment: What the Meta and YouTube Ruling Means For You,” March 31, 2026. https://www.forbes.com/sites/anishasircar/2026/03/31/big-techs-big-tobacco-moment-what-the-meta-and-youtube-ruling-means-for-you/

6 MSN, “Meta Stock Dropped 8% Today. Here is Why and What You Need to Know,” March 26, 2026. https://www.msn.com/en-us/money/general/what-investors-need-to-know-about-the-verdict-experts-say-jeopardizes-the-social-media-business/ar-AA1ZtzBf

7 Motley Rice, MDL tracker, In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, MDL No. 4:22-MD-03047-YGR (N.D. Cal.); as of March 2026, filings exceeded 2,407. https://www.motleyrice.com/social-media-lawsuits

8 Insurance Journal, “Meta Loses Insurance for Defense in Major Social Media Addiction Litigation,” March 23, 2026. https://www.insurancejournal.com/magazines/mag-features/2026/03/23/862428.htm

9 European Commission, Preliminary Findings re: Meta Instagram and Facebook, Digital Services Act, April 29, 2026. https://ec.europa.eu/commission/presscorner/detail/en/ip_26_920

10 Euronews, "EU finds Meta in breach of digital rules over children on Instagram and Facebook," April 29, 2026. Potential fine up to 6% of global annual turnover, estimated up to $12 billion. https://www.euronews.com/next/2026/04/29/eu-finds-meta-in-breach-of-digital-rules-over-children-on-instagram-and-facebook

11 Built In, "Which Countries Are Banning Social Media for Children?" March 3, 2026. Australia fines can reach A$49.5 million (~$32M USD) per serious breach. https://builtin.com/articles/social-media-bans-children

12 TechCrunch, “These are the countries moving to ban social media for children,” April 23, 2026. https://techcrunch.com/2026/04/08/social-media-ban-children-countries-list/

13 Visual Capitalist, "Which Countries and States are Banning Kids' Social Media?" February 23, 2026. Six EU nations formed a coalition in early 2026 to coordinate bans. https://www.visualcapitalist.com/minimum-age-laws-social-media-world-map/

14 CBS News, “Mark Zuckerberg accused of having ‘blood on his hands’ in fiery Senate hearing on internet child safety,” January 31, 2024. https://www.cbsnews.com/news/mark-zuckerberg-meta-x-child-exploitation/

15 Wall St Journal, “Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show,” September 14, 2021. https://www.wsj.com/tech/personal-tech/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739

16 Wall St. Journal, “Meta’s ‘Digital Companions’ Will Talk Sex With Users—Even Children,” April 26, 2025. https://www.wsj.com/tech/ai/meta-ai-chatbots-sex-a25311bf

17 New York Times, “The Internet Is Overrun with Images of Child Sexual Abuse. What Went Wrong?” September 29, 2019. https://www.nytimes.com/interactive/2019/09/28/us/child-sex-abuse.html

18 New York Times, “Meta Rejected Efforts to Improve Children’s Safety, Documents Show,” January 31, 2024. https://www.nytimes.com/2024/01/31/technology/meta-childrens-safety-documents.html

19 ABC News, “Facebook’s own research found Instagram fuels eating disorders in young people,” October 6, 2021. https://www.abc.net.au/triplej/programs/hack/facebook-whistleblower-says-instagram-content-hurts-teens/13573020

20 CNN, “Social media presents ‘profound risk of harm’ for kids, surgeon general says, calling attention to lack of research,” May 24, 2023. https://www.cnn.com/2023/05/23/health/social-media-kids-surgeon-general-advisory-wellness

21 TechCrunch, “Parents who lost children to online harms protest outside of Meta’s NYC office,” April 24, 2025. https://techcrunch.com/2025/04/24/parents-who-lost-children-to-online-harms-protest-outside-of-metas-nyc-office/

22 American Psychological Association, “Why young brains are especially vulnerable to social media,” August 3, 2023. https://www.apa.org/news/apa/2022/social-media-children-teens

23 CRVScience, “Addiction by Design: The Landmark Case Against Meta and Google,” February 18, 2026. https://www.crvscience.com/post/addiction-by-design-the-landmark-case-against-meta-and-google#google_vignette

24 The Tech Oversight Project, “Meta’s Unsealed Internal Documents Prove Years of Deliberate Harm and Inaction to Protect Minors,” November 22, 2025. https://techoversight.org/2025/11/22/meta-unsealed-docs/

25 The Hill, “Zuckerberg ‘vetoed,’ ‘ignored’ plans to boost teen well-being on Meta platforms, lawsuit alleges,” November 9, 2023. https://thehill.com/policy/technology/4302145-zuckerberg-vetoed-teen-mental-health-facebook-instagram/

26 National Center for Missing and Exploited Children, CyberTipline 2024 https://ncmec.org/content/dam/missingkids/pdfs/cybertiplinedata2024/2024-reports-by-esp.pdf

27 US, UK, and Australia law enforcement leaders’ “Open Letter: Facebook’s ‘Privacy First’ Proposals,” October 8, 2019. https://www.justice.gov/archives/opa/press-release/file/1207081/dl?inline=

 

 


 

 

28 International Business Times, “Unsealed Court Documents Reveal Meta Staff Flagged 7.5 Million Annual Child Abuse Reports That Would Vanish After Messenger Encryption,” March 15, 2026. https://www.ibtimes.co.uk/unsealed-court-documents-reveal-meta-staff-flagged-75-million-annual-child-abuse-reports-that-1785609

29 TechCrunch, “Meta Suppressed Children’s Safety Research Four Whistleblowers Claim,” September 8, 2025. https://techcrunch.com/2025/09/08/meta-suppressed-childrens-safety-research-four-whistleblowers-claim/

30 Wall Street Journal, "Meta AI Chatbots Engaged in Sexual Roleplay With Minors," April 26, 2025. https://www.wsj.com/tech/ai/meta-ai-chatbots-sex-a25311bf

31 NCMEC, CyberTipline Data 2024: 1,325% increase in reports involving Generative AI in 2024, rising from 4,700 to 67,000 reports. https://www.missingkids.org/gethelpnow/cybertipline/cybertiplinedata

32 Wall Street Journal, "Instagram's Algorithm Connects Vast Pedophile Network," June 2023. https://www.wsj.com/articles/instagram-vast-pedophile-network-4ab7189

33 Associated Press, “New Mexico seeks child safety restrictions on Meta apps and algorithms in trial’s 2nd phase,” May 4, 2026. https://apnews.com/article/meta-new-mexico-social-media-child-safety-0ee254c8e8b37c9c28cea591d661cbc4

34 TechPolicy Press, "EU Intensifies Child Safety Enforcement, Flags Gaps in Meta Age Checks," April 29, 2026. DSA 2025 Guidelines require private accounts by default, reduced algorithmic amplification, and disabled compulsive-use features for minors. https://www.techpolicy.press/eu-intensifies-child-safety-enforcement-flags-gaps-in-meta-age-checks/

35 Winvesta, “Meta’s child safety lawsuit: What New Mexico’s legal action means for investors.” February 18, 2026. https://www.winvesta.in/blog/us-market-news/metas-child-safety-lawsuit-what-new-mexicos-legal-action-means-for-investors

36 ABC News, "Meta hit with $375 million verdict in New Mexico child safety case," March 26, 2026. https://abcnews.com/GMA/Family/meta-hit-375-million-verdict-new-mexico-child/story?id=131393490

37 Guinness Global Investors, "Is There a Rising Concentration Risk in the S&P 500?" August 2025. Top 10 S&P 500 constituents = ~40% of total index market capitalization. https://www.guinnessgi.com/insights/sp-500-concentration-risk

38 Insurance Journal, “Meta Loses Insurance for Defense in Major Social Media Addiction Litigation,” March 23, 2026. https://www.insurancejournal.com/magazines/mag-features/2026/03/23/862428.htm

39 Euronews, "EU finds Meta in breach of digital rules over children on Instagram and Facebook," April 29, 2026. Potential fine up to 6% of global annual turnover, estimated up to $12 billion. https://www.euronews.com/next/2026/04/29/eu-finds-meta-in-breach-of-digital-rules-over-children-on-instagram-and-facebook

40 FTC, COPPA Rule amendments, January 2025. Civil penalty up to $53,088 per violation per day. https://www.ftc.gov/news-events/news/press-releases/2024/08/ftc-investigation-leads-lawsuit-against-tiktok-bytedance-flagrantly-violating-childrens-privacy-law

41 Global Policy Watch, “Senate Judiciary Committee Advances GUARD Act Regulating Minor Use of AI,” May 8, 2026. https://www.globalpolicywatch.com/2026/05/senate-judiciary-committee-advances-guard-act-regulating-minor-use-of-ai/

42 AInvest.com, "Big Tech's Legal and Regulatory Risks in Emerging Youth Protection Laws," December 2025. Proposed UnAnxious Generation package includes 50% tax on digital advertising revenues derived from minors. https://www.ainvest.com/news/big-tech-legal-regulatory-risks-emerging-youth-protection-laws-assessing-long-term-shareholder-strategic-resilience-2512/

43 Meta Platforms, Inc. DEF 14A (Proxy Statement), April 2026. https://d18rn0p25nwr6d.cloudfront.net/CIK-0001326801/968f6478-70a7-4426-af16-6190d30a390f.pdf

44 Meta Platforms, Inc. DEF 14A (Proxy Statement), April 2026. https://d18rn0p25nwr6d.cloudfront.net/CIK-0001326801/968f6478-70a7-4426-af16-6190d30a390f.pdf

45 Sarasin & Partners, Pre-Declaration of Votes, May 23, 2025; coalition letter on behalf of 36 investors with $3.6 trillion AUM, October 2024. https://sarasinandpartners.com/row/stewardship-post/meta-platforms-pre-declaration-of-votes/

46 Proxy Impact, “Meta/Facebook child safety resolutions,” May 2026. https://www.proxyimpact.com/facebook

47 Meta Platforms, Inc. DEF 14A (Proxy Statement), April 2026. https://d18rn0p25nwr6d.cloudfront.net/CIK-0001326801/968f6478-70a7-4426-af16-6190d30a390f.pdf

48 The Register, "Meta Pumps Executive Bonuses," February 22, 2025. Target bonus raised from 75% to 200% of base salary for NEOs. https://www.theregister.com/2025/02/22/meta_pumps_executive_bonuses/

49 TheNextWeb.com “Meta grants executives up to $921 million in stock options on the same day it lays off 700 workers,” March 27, 2026. https://thenextweb.com/news/meta-9-trillion-valuation-executive-stock-options-layoffs

50 Forbes, “A Safer Meta Starts with How Its Executives Are Paid,” March 26, 2026. https://www.forbes.com/sites/bhaktimirchandani/2026/03/26/a-safer-meta-starts-with-how-its-executives-are-paid/

51 Eliwa et al., "Aligning Pay with Purpose: ESG-Linked Compensation and ESG Decoupling," Business Strategy and the Environment, October 2025. 36,055 firm-year observations across 40 countries. https://onlinelibrary.wiley.com/doi/full/10.1002/bse.70258

52 "Regulatory and Investor Demands to Use ESG Performance Metrics in Executive Compensation: Right Instrument, Wrong Method," Journal of Corporate Governance, Taylor & Francis, 2024. https://www.tandfonline.com/doi/full/10.1080/14735970.2024.2350139

 

 


 

 

 

FAQ

What does Meta Proposal #10 ask shareholders to do (META)?

Vote FOR Proposal #10, which requests that the Compensation Committee publish a feasibility report on linking child-safety performance to executive pay. The report is to assess metrics, weighting, and implementation, prepared at reasonable expense and within a reasonable time, and omitting confidential or proprietary information.

Who filed this exempt solicitation and what stake do they report?

The filing was submitted by Proxy Impact (contact: Michael Passoff). Proponents state they hold over $800 million of Meta Class A stock and that dissemination costs are borne by the filer, not by Meta.