Investors Raise Concerns to Meta Regarding Child Safety on Social Media Platforms

Rhea-AI Summary

Meta shareholders, led by Lisette Cooper, PhD, from Fiduciary Trust International and Proxy Impact, have filed a resolution for Meta's annual meeting on May 29, 2024. The proposal urges Meta's Board to set targets within a year to mitigate child safety risks on its platforms and to publish an annual report on progress. Supported by Institutional Shareholder Services (ISS) and Glass Lewis, the resolution aims to address dangers such as cyberbullying, grooming, and exposure to harmful content. Cooper highlights the long-term benefits for investor security, emphasizing Meta's responsibility amid rising legal and regulatory pressures globally.

Positive
  • Resolution aims to protect children on Meta platforms and improve long-term financial stability.
  • Institutional Shareholder Services (ISS) and Glass Lewis recommend voting in favor of the resolution.
  • Potential for improved public perception and trust in Meta’s commitment to user safety.
  • Alignment with new legislation in the U.S., U.K., and E.U. could mitigate future financial and legal penalties.
  • Focus on setting clear targets and performance metrics for evaluating child safety improvements.
Negative
  • Meta faces significant legal and regulatory challenges globally regarding child safety issues.
  • Social media platforms linked to widespread child safety concerns, including exploitation and mental health risks.
  • End-to-end encryption on Facebook Messenger may reduce the visibility of child abuse reports.
  • Meta fined over $400 million by Ireland’s Data Protection Commission for child safety breaches.
  • Potential for continued lawsuits and penalties if child safety measures are deemed insufficient.

Meta's current situation regarding child safety presents a multifaceted challenge that deserves detailed scrutiny. Instituting child safety measures could safeguard Meta against regulatory and legal repercussions. From a business perspective, however, implementing these measures may initially incur high financial costs for the necessary changes in technology, policies, and staffing. In the short term, these expenses could compress Meta's operating margins.

Furthermore, increasing scrutiny and legal obligations could influence advertisers' perceptions and weigh on advertising revenue, which forms a significant part of Meta's income. Conversely, successfully addressing child safety concerns might significantly enhance Meta's reputation, yielding long-term benefits. Enhanced user trust could foster greater engagement and loyalty, potentially growing the user base and advertising revenues over time.

Finally, the involvement of multiple regulatory bodies across different regions means that Meta must navigate a complex legal landscape. Failure to comply with these regulations could result in hefty fines and further damage to the company's reputation. In contrast, proactive compliance could alleviate some regulatory pressures and position Meta as a responsible leader in the tech industry.

Analyzing the legal ramifications of Meta's current situation requires understanding multiple layers of jurisdictional oversight. The new legislation in the U.S., U.K. and the E.U. imposes stringent requirements on tech companies to monitor and remove child sexual abuse material. Failing to do so could lead to significant fines and damages, as evidenced by Meta's €405 million fine by Ireland’s Data Protection Commission in 2022 for not safeguarding children’s information on Instagram.

The proactive measures called for in the shareholder resolution align with these legal requirements and can serve as a compliance strategy to mitigate legal risks. Adopting the proposed targets and metrics could help Meta demonstrate its commitment to regulatory compliance, potentially reducing the likelihood of future lawsuits and penalties. However, the implementation of these measures must be diligent and transparent to withstand legal scrutiny.

Moreover, shifting towards more transparent practices, as suggested by the resolution, can have the added benefit of reducing the incidence of reactive, crisis-driven legal responses, which often result in higher financial and reputational costs. In the long term, consistent compliance and proactive measures will be important for maintaining operational stability and shareholder confidence.

From a financial standpoint, the proposed resolution could have mixed implications for Meta's short-term and long-term performance. In the immediate future, the financial burden of implementing new safety measures and the potential for increased operational costs due to regulatory compliance cannot be ignored. This could lead to a squeeze on profit margins and may have a negative impact on quarterly earnings, which are closely watched by investors.

However, in the longer term, improving child safety on Meta’s platforms could yield substantial benefits. Enhanced public perception and investor confidence can translate into stable user growth and potentially more resilient advertising revenue. Furthermore, reducing legal and regulatory risks could prevent costly penalties and litigations, preserving Meta's financial health over time.

Overall, while the short-term financial outlook may face some pressures due to increased spending, the long-term view presents a potential for stronger, more sustainable growth driven by improved trust and compliance. Investors should weigh these factors carefully, considering both immediate financial performance and the broader impacts on Meta’s market position and reputation.

Meta Shareholders Represented by Proxy Impact—Including Lead Filer Lisette Cooper, PhD, Vice Chair of Fiduciary Trust International—Seek to Protect Children and the Long-Term Financial Performance of Meta

NEW YORK--(BUSINESS WIRE)-- Fiduciary Trust International, a global wealth manager and wholly-owned subsidiary of Franklin Templeton, announces that Meta shareholder Lisette Cooper, PhD, vice chair, has filed a resolution for shareholders to vote upon at Meta’s annual meeting in Menlo Park, CA on May 29, 2024.

The proposal, filed on behalf of Dr. Cooper and other Meta shareholders by Proxy Impact, calls on Meta’s Board of Directors to, within one year, adopt targets for reducing dangers and threats to children on its global social media platforms, as well as quantitative metrics for assessing the company’s improvement in this area. The resolution also calls for Meta’s Board of Directors to ensure these targets and performance metrics are published in an annual report, enabling investors and stakeholders to judge how effective Meta’s tools, policies, and actions for protecting children have been.

Institutional Shareholder Services (ISS) and Glass Lewis, the two largest proxy advisory services, both recommend voting for the resolution.

“Meta is the largest social media company in the world, with billions of users, but its platforms—including Facebook, Instagram, Messenger, and WhatsApp—have been shown to pose a variety of physical and psychological risks to children and teens,” said Lisette Cooper, PhD, vice chair of Fiduciary Trust International. “As a parent, and an investor, with a deep personal connection to this issue, I support this shareholder resolution as a meaningful step to encourage Meta’s leadership to do more to protect the young people who use its platforms—which we believe will also protect the long-term security of shareholders’ investments.”

Meta’s social media platforms have been linked to many dangers to the physical and mental wellbeing of children and teenagers. These range from sextortion, grooming, and human trafficking to cyberbullying, harassment, exposure to sexual or violent content, depression, anxiety, self-harm, and self-image distortion.

  • The National Center for Missing and Exploited Children reported that its CyberTipline received nearly 36 million reports of online exploitation of children in 2023, including child sexual abuse material, child sex trafficking, and online enticement—and almost 31 million of them came from Meta platforms.
  • A Wall Street Journal investigation published in June 2023 found that Meta’s algorithms for Instagram guide pedophiles to sellers of child sexual abuse materials, essentially “connecting a vast pedophile network.”
  • Meta has also begun end-to-end encryption of Facebook Messenger, despite warnings from law enforcement and child safety organizations that doing so will hide millions of reports of child sexual abuse materials—masking the actions of predators, and making children more vulnerable.
  • In the wake of the U.S. Surgeon General’s Advisory on social media and youth mental health, 42 U.S. state attorneys general have filed lawsuits against Meta, claiming Facebook and Instagram algorithms are designed to intentionally make the platforms addictive, and that they harm young people’s mental health.
  • In September 2022, Meta was fined €405 million, or just over $400 million, by Ireland’s Data Protection Commission for not safeguarding children’s information on Instagram.

“The Internet is like the Wild West for children and teens. Meta and other social media companies need to do more to prevent their technology from being weaponized against their youngest users,” said Michael Passoff, chief executive officer of Proxy Impact. “The more tech companies try to evade responsibility for the harm caused by algorithms designed to maximize user engagement, the more the world is fighting back. Shareholders in Meta and other social media companies can make an enormous difference by raising their voices against business practices that treat children as collateral damage.”

If Meta does not sufficiently address child safety issues, it faces potential financial, regulatory, and legal penalties under new legislation in the U.S., U.K., and European Union.

  • The E.U.’s Digital Services Act and Digital Markets Act, which went into effect in February 2024, require companies like Meta to identify, report, and remove child sexual abuse materials.
  • The U.K.’s Online Safety Act of 2023 includes measures to keep children and other online users safe from harmful and fraudulent content.
  • In the U.S., the REPORT Act was signed into law on May 7, 2024. The legislation will strengthen the capabilities of the National Center for Missing and Exploited Children’s national tipline to collect reports of online exploitation, and require the reports and evidence to be preserved for a longer period—thereby giving law enforcement more time to investigate and prosecute.

Dr. Cooper’s daughter Sarah is a founding member of the Brave Movement and has been deeply involved in the Heat Initiative’s campaign as a survivor/lived experience expert. She is a survivor of child sexual abuse by an older man who misrepresented himself on Facebook Messenger. Sarah Cooper has spoken at two of Meta’s previous annual meetings.

Dr. Cooper is also a member of the Interfaith Center on Corporate Responsibility’s working group on child safety and technology. Since 2019, Proxy Impact and Dr. Cooper have worked with members of the Interfaith Center on Corporate Responsibility to empower investors to utilize their leverage to encourage Meta and other tech companies to strengthen child safety measures on social media.

About Fiduciary Trust International

Fiduciary Trust International, a global wealth management firm headquartered in New York, NY, has served individuals, families, endowments and foundations since 1931. With over $102 billion in assets under management and administration as of March 31, 2024, the firm specializes in strategic wealth planning, investment management and trust and estate services, as well as tax and custody services. The New York-based firm and its subsidiaries maintain offices in Coral Gables, FL, Boca Raton, FL, Fort Lauderdale, FL, West Palm Beach, FL, St. Petersburg, FL, Radnor, PA, Lincoln, MA, Los Angeles, CA, San Mateo, CA, San Francisco, CA, Washington, DC, Wilmington, DE, Reston, VA, and Atlanta, GA. For more information, please visit fiduciarytrust.com, and for the latest updates, follow Fiduciary Trust International on LinkedIn and X: @FiduciaryTrust.

About Franklin Templeton

Franklin Resources, Inc. [NYSE: BEN] is a global investment management organization with subsidiaries operating as Franklin Templeton and serving clients in over 150 countries. Franklin Templeton’s mission is to help clients achieve better outcomes through investment management expertise, wealth management and technology solutions. Through its specialist investment managers, the company offers specialization on a global scale, bringing extensive capabilities in fixed income, equity, alternatives and multi-asset solutions. With more than 1,500 investment professionals, and offices in major financial markets around the world, the California-based company has over 75 years of investment experience and over $1.6 trillion in assets under management as of April 30, 2024. For more information, please visit franklintempleton.com and follow us on LinkedIn, Twitter and Facebook.

About Proxy Impact

Proxy Impact provides shareholder engagement and proxy voting services that promote sustainable and responsible business practices. For more information, visit www.proxyimpact.com.

Copyright © 2024 Fiduciary Trust International. All rights reserved.

Rebecca Radosevich: 212-632-3207

rebecca.radosevich@franklintempleton.com

Sabrina Scarpa: 973-309-0051

ft@jconnelly.com

Source: Fiduciary Trust International

FAQ

What is the resolution filed by Meta shareholders about?

The resolution calls on Meta's Board to set targets to reduce child safety risks on its platforms and publish an annual report on progress.

When will Meta's annual meeting take place?

Meta's annual meeting will be held on May 29, 2024, in Menlo Park, CA.

Who supports the child safety resolution at Meta?

Institutional Shareholder Services (ISS) and Glass Lewis both recommend voting for the resolution.

What are some of the dangers children face on Meta platforms?

Dangers include cyberbullying, grooming, human trafficking, exposure to harmful content, depression, and anxiety.

How many reports of online child exploitation were linked to Meta platforms in 2023?

Almost 31 million reports of online child exploitation were linked to Meta platforms in 2023.

What financial penalty did Meta face for child safety breaches on Instagram?

Meta was fined over $400 million by Ireland’s Data Protection Commission in September 2022.

What is the impact of end-to-end encryption on Facebook Messenger?

End-to-end encryption could reduce the visibility of child abuse reports, making children more vulnerable.

What new legislation impacts Meta’s child safety measures?

The E.U.’s Digital Services Act, the U.K.'s Online Safety Act, and the U.S. REPORT Act are new laws impacting Meta's child safety measures.
