
 

Notice of Exempt Solicitation

 

NAME OF REGISTRANT: Alphabet Inc.

NAME OF PERSONS RELYING ON EXEMPTION: Loring, Wolcott & Coolidge Fiduciary Advisors, LLP

ADDRESS OF PERSON RELYING ON EXEMPTION: 230 Congress Street, Boston, MA 02110

WRITTEN MATERIALS: The attached written materials are submitted pursuant to Rule 14a-6(g)(1) (the “Rule”) promulgated under the Securities Exchange Act of 1934, in connection with a proxy proposal to be voted on at the Registrant’s 2022 Annual Meeting.

 

Dear Shareholders,

 

The Sustainability Group of Loring, Wolcott & Coolidge, Robeco, and NEI Investments (“Proponents”) have filed a shareholder proposal requesting that the Audit and Compliance Committee commission an independent Human Rights Impact Assessment report, conducted by a reputable third party at reasonable cost, evaluating the efficacy of Alphabet's existing policies and practices to address the human rights impacts of its content management policies to address misinformation and disinformation across its platforms – item 16 in the company’s 2022 proxy statement.

 

Alphabet Inc. (“Alphabet” or “the Company”) argues its existing policies, oversight, and transparency are sufficient. However, in response to the Company’s request for no-action relief, the SEC concluded that “it appears that the Company’s public disclosures do not substantially implement the Proposal.”1

 

Misinformation around the efficacy and safety of COVID-19 vaccines, and the disinformation perpetuated by the Russian government regarding its war of aggression in Ukraine, are top-of-mind examples of the profound risks that content mismanagement can pose to society. At the same time, Alphabet faces fundamental business risk if its content management policies do not meet the expectations of users, advertisers, and regulators. To its credit, Alphabet has openly acknowledged that the “spread of disinformation is antithetical to our mission and is potentially harmful to our business success.”2 The Company also has relevant policies and provides some narrative disclosure around its approach to this issue, including some outcomes and enforcement actions. However, Alphabet has failed to provide investors with an independent assessment of the efficacy of the Company’s existing policies and oversight structures. To rectify this, the proposal asks for an expert assessment, using the well-established form of a Human Rights Impact Assessment, consistent with the Company’s existing commitment to protect human rights by conducting robust due diligence. A vote in favor of the proposal would provide all investors with valuable information on Alphabet’s risk management performance, while also supporting investors in implementing their own obligations to assess human rights risks in their portfolios. Further, the increased transparency could help to bolster confidence in the company’s approach and reduce reputational and regulatory risks.

 

Please note the Proponents recognize that considerations related to misinformation and disinformation are exceedingly complex, with many competing and even contradictory factors that may indeed eclipse what any one company—even one with the size and reach of Alphabet—can mitigate unilaterally. This proposal does not assume there are easy answers, nor does it attempt to supplant the judgment of management and the Alphabet Board of Directors (“the Board”). Rather, it seeks to fill an urgent need by providing increased transparency, guided by independent expertise and using a familiar and appropriate form for the Company, which Proponents agree is necessary for Alphabet’s Board, management, and shareholders alike to consider whether the Company’s existing policies and oversight concerning mis- and disinformation are sufficient.

 

We firmly believe this proposal is in shareholders’ best interest and we encourage all shareholders to support it by voting FOR Proxy Item #16.

 

The views expressed are those of the authors and Loring, Wolcott & Coolidge Fiduciary Advisors, LLP as of the date referenced and are subject to change at any time based on market or other conditions. These views are not intended to be a forecast of future events or a guarantee of future results. These views may not be relied upon as investment advice. The information provided in this material should not be considered a recommendation to buy or sell any of the securities mentioned. It should not be assumed that investments in such securities have been or will be profitable. This piece is for informational purposes and should not be construed as a research report.

 

NOTE: This is not a solicitation of authority to vote your proxy. Please DO NOT send us your proxy card; LWCFA is not able to vote your proxies, nor does this communication contemplate such an event. LWCFA urges shareholders to vote FOR Proxy Item 16 following the instructions provided on management’s proxy mailing.

 

 

_____________________________

1 https://www.sec.gov/divisions/corpfin/cf-noaction/14a-8/2022/sustainabilityalphabet041222-14a8.pdf

2 https://www.sec.gov/Archives/edgar/data/0001652044/000130817922000262/lgoog2022_def14a.htm p 94-95

 

  1
 

 

Table of Contents

 

Rationale in support of a “FOR” vote 2
   
Existing disclosures do not meet the essential request of the proposal 3
   
Human rights impacts can be amplified by Alphabet’s control over our information environment 3
   
Companies and investors are obligated to respect human rights and conduct sufficient due diligence 3
   
Companies face risk if human rights impacts are insufficiently managed 4
   
Investors and companies increasingly rely on Human Rights Impact Assessments to provide necessary transparency 4
   
Shareholder proposal summary 5
   
Proponent response to Alphabet Opposing Statement 5
   
Annex: Supplemental background information in support of a “FOR” vote 6

 

 

Rationale in support of a “FOR” vote

 

The war in Ukraine and the ongoing global pandemic are just two of the most recent, tragic reminders of the ways in which human rights—including those directly impacted by misinformation (information that is incorrect or misleading3) and disinformation (information that is deliberately false and often intended to obscure the truth4)—can create material risks for companies, investors, and markets. In particular, Alphabet faces significant reputational and regulatory risks if users, advertisers, and regulators lose confidence in its existing policies and approach. In its Opposing Statement to the proposal, Alphabet makes clear it is committed to respecting human rights and has a dedicated human rights program. This proposal would help to reinforce the existing human rights infrastructure by providing valuable transparency, and is consistent with the Company’s responsibilities to conduct due diligence5 in this area. Given the significant human rights risks posed by misinformation and disinformation, as well as Alphabet’s seemingly ubiquitous products and tremendous influence in shaping our information environment, the Proponents believe it is particularly important for its policies to be independently assessed.

 

Toward that end, Proponents request a Human Rights Impact Assessment (HRIA), a report type used increasingly by tech peers and other large companies—even Alphabet itself, through its largest subsidiary, Google. It is a well-known form with clear boundaries that is not overly complex for investors, while remaining flexible enough for company management to shape its specific parameters. Most importantly, the proposal seeks the independent perspective necessary to evaluate the efficacy of the Company’s policies related to mis- and disinformation. Merely demonstrating the existence of such policies and certain outcomes and enforcement, which the Company has made public, fails to address the heart of the proposal.

 

 

_____________________________

3 Merriam-Webster. Available at https://www.merriam-webster.com/dictionary/misinformation

4 Merriam-Webster. Available at https://www.merriam-webster.com/dictionary/disinformation

5 “Corporate human rights due diligence: emerging practices, challenges and ways forward,” Summary of the report of the Working Group on Business and Human Rights to the General Assembly, October 2018. Available at https://www.ohchr.org/sites/default/files/Documents/Issues/Business/ExecutiveSummaryA73163.pdf

 

  2 | Page 
 

 

Existing disclosures do not meet the essential request of the proposal

 

The Proponents recognize that the Company has policies and practices related to mis- and disinformation and provides some reporting on their outcomes. However, the act of disclosing these policies publicly, even with certain cherry-picked outcomes, does not necessarily demonstrate they are effective, let alone sufficient. That is why the proposal seeks a third-party report, applying the HRIA framework, to provide shareholders with an independent assessment of the efficacy of Alphabet’s existing policies and practices as written and implemented, and of whether they are sufficient to address misinformation and disinformation. In responding to the Company’s no-action request, the SEC staff likewise found that the Company’s public disclosures do not substantially implement the Proposal.6

 

Further, while Alphabet touts its Global Network Initiative (“GNI”) membership and evaluations,7 both held through Google, these do not fulfill the proposal’s core request. According to the GNI, it “helps companies respect freedom of expression and privacy rights when faced with government pressure to hand over user data, remove content, or restrict communications.”8 The GNI is not focused on either misinformation or disinformation, nor on establishing controls for harmful content, and its evaluations thus do not meet the shareholders’ request for disclosure. With multiple experts having raised specific concerns about Alphabet’s products, structure, and existing policies, an independent assessment is all the more important.9

 

Finally, while the Proponents are pleased that the Audit and Compliance Committee charter has been updated to clarify the Board’s role in overseeing human rights impacts, this also fails to address the central ask of the proposal. Commissioning an HRIA will provide a vital tool for the Board to help fulfill its mandate by receiving independent, third-party reporting to augment its due diligence process.

 

Human rights impacts can be amplified by Alphabet’s control over the information environment

 

Through its tremendous size and seemingly ubiquitous products and services, Alphabet plays an outsized role in facilitating the dissemination of information globally. Accordingly, it faces greater-than-average risks pertaining to mis- and disinformation, and is therefore particularly obligated to conduct thorough due diligence in this area and to ensure its policies are effective. An HRIA can provide a critical tool in this process.

 

Companies and investors are obligated to respect human rights and conduct sufficient due diligence

 

Alphabet—like all companies—has the responsibility to respect human rights. In its Opposing Statement, Alphabet makes clear it is “committed to respecting the rights enumerated in the Universal Declaration of Human Rights and its implementing treaties, as well as upholding the standards established in the United Nations Guiding Principles on Business and Human Rights…”10 To uphold these obligations, it must conduct sufficient due diligence to assess its relevant impacts. Similarly, investors must also undertake due diligence to assess human rights risks in their own portfolios. The findings of an HRIA regarding mis- and disinformation would benefit both the Company and investors by allowing them to better meet these obligations.

 

 

_____________________________

6 https://www.sec.gov/divisions/corpfin/cf-noaction/14a-8/2022/sustainabilityalphabet041222-14a8.pdf

7 The Global Network Initiative (GNI), of which Alphabet is a member, is a multi-stakeholder initiative whose mission “is to protect and advance freedom of expression and privacy rights in the ICT industry by setting a global standard for responsible company decision making and serving as a multi-stakeholder voice in the face of government restrictions and demands”. See https://globalnetworkinitiative.org/team/our-mission/

8 https://globalnetworkinitiative.org/

9 See, for instance, Whitepaper: “How Google Fights Disinformation,” February 2019. Available at https://www.blog.google/documents/37/How_Google_Fights_Disinformation.pdf/

10 https://www.sec.gov/Archives/edgar/data/0001652044/000130817922000262/lgoog2022_def14a.htm p 94-95

 

  3 | Page 
 

 

Companies face risk if human rights impacts are insufficiently managed

 

Lack of effective policies could present both regulatory and reputational risks for the Company. If Alphabet fails to manage its human rights-related risks, it could face a breakdown of trust from users and advertisers—a dangerous proposition for a company that derives the vast majority of its revenues from advertising.

 

Similarly, global technology companies like Google are facing pressure from regulators and pushback from users over how they moderate the constant stream of information on their platforms. In April 2022, the European Union agreed to landmark legislation that would require Alphabet and other tech giants to address misinformation more aggressively and increase transparency.11 According to Freedom House, a nonprofit group that tracks the global state of democracy and internet policy, “Authorities in at least 48 countries pursued new rules for tech companies on content, data, and competition over the past year.”12 In the US, controversy has erupted regarding Section 230 of the Communications Decency Act, which provides a safe harbor against liability for Internet hosting companies like Alphabet. Proposals to amend that law have been surfacing, based on the inadequacy of current safeguards against harmful misinformation.13

 

Investors and companies increasingly rely on Human Rights Impact Assessments to provide necessary transparency

 

Along with growing recognition of company and investor obligations to protect human rights, investors are increasingly turning to HRIAs to ensure sufficient transparency is provided. According to the Interfaith Center on Corporate Responsibility, 14 proposals expressly calling for an HRIA have been filed since 2016 with most clustered in 2020-2022, making clear the form is familiar and valuable to investors.14 Alphabet itself has used an HRIA to evaluate the risk profile of its celebrity recognition tool—a clear nod to its value.

 

HRIAs provide a systematic framework, and a process familiar to both investors and companies—without being overly prescriptive in a way that supplants the judgment of the Board or management—that would enable investors for the first time to evaluate whether Alphabet’s existing policies to address mis- and disinformation are effective. Moreover, an HRIA would help fulfill the company’s obligation under Principle 17 of the Guiding Principles on Business and Human Rights, which states “In order to identify, prevent, mitigate and account for how they address their adverse human rights impacts, business enterprises should carry out human rights due diligence. The process should include assessing actual and potential human rights impacts, integrating and acting upon the findings, tracking responses, and communicating how impacts are addressed.”15 And since investors also have the obligation to conduct sufficient due diligence to assess human rights risks in their portfolios, an HRIA would help to facilitate this process too.

 

 

_____________________________

11 Satariano, Adam, “E.U. Takes Aim at Social Media’s Harms With Landmark New Law,” The New York Times, April 22, 2022.

12 Shahbaz, Adrian and Funk, Allie, “Freedom on the Net 2021: The Global Drive to Control Big Tech” Freedom House, September 16, 2021. Available at https://freedomhouse.org/sites/default/files/2021-09/FOTN_2021_Complete_Booklet_09162021_FINAL_UPDATED.pdf

13 See, for instance, Ghaffary, Shirin and Heilweil, Rebecca, “A New Bill Would Hold Facebook Responsible for Covid-19 Vaccine Misinformation,” Vox, July 22, 2021. Available at https://www.vox.com/recode/2021/7/22/22588829/amy-klobuchar-health-misinformation-act-section-230-covid-19-facebook-twitter-youtube-social-media

14 ICCR Proxy Resolutions and Voting Guides. Available at https://www.iccr.org/resources/iccrs-proxy-resolutions-and-voting-guide

15 “Guiding Principles on Business and Human Rights”, United Nations Human Rights Office of the High Commissioner, 2011. Available at https://www.ohchr.org/sites/default/files/documents/publications/guidingprinciplesbusinesshr_en.pdf

 

  4 | Page 
 

 

Given the profound impacts that mis- and disinformation can have on human rights and the risks Alphabet faces if it fails to adequately address this, the Proponents believe it is imperative for the Company to provide greater transparency and use a human rights lens to evaluate whether its policies and practices are sufficient.

 

Shareholder proposal summary

 

Resolved: Stockholders request the Audit and Compliance Committee commission an independent Human Rights Impact Assessment report (“the Report”), conducted by a reputable third party at reasonable cost, evaluating the efficacy of Alphabet's existing policies and practices to address the human rights impact of its content management policies to address misinformation and disinformation across its platforms. A summary of its findings should be published, omitting confidential, proprietary, or legally privileged information, or admissions relevant to pending litigation.

 

Supporting Statement:

 

The Report should reflect international standards and guidelines such as the UNGPs, and evaluate:

 

·Existing governance and oversight mechanisms to evaluate how Alphabet's senior leadership and directors incorporate human rights due diligence—especially related to misinformation and disinformation—in their decision-making.

·Whether existing policies are effectively limiting the proliferation of misinformation and disinformation on its platform;

·Potential recommendations, if necessary, to strengthen measures to mitigate human rights harms associated with the dissemination of misinformation and disinformation.

 

Proponent response to Alphabet Opposing Statement

 

Company Claim: Committed to human rights; efforts have been reviewed by an independent third party.

Response:

·    An HRIA would help the company effectuate its current commitments to respect human rights.

·    GNI provides third party expertise, but is inherently limited by its focus on freedom of expression and privacy. The risks of mis- and disinformation are far more expansive and warrant a broader evaluation.

Company Claim: Continued efforts are driven by a dedicated human rights program.

Response:

·    The proposed HRIA would support and advance existing efforts by providing expert feedback on whether these programs are indeed effective.

·    The assessment would ensure that existing resources are deployed effectively and efficiently.

Company Claim: Maintains robust policies and transparency on management of mis- and disinformation.

Response:

·    The heart of the proposal is not whether policies exist, but whether they are effective and sufficient.

·    The HRIA would ensure that investors (and the Company) can evaluate the efficacy of existing policies.

Proponent Conclusion: The company has not made the case that it has assessed the efficacy of its policies and practices. As such, support for the proposal is warranted.

 

  5 | Page 
 

 

Annex: Supplemental information in support of a “FOR” vote

 

Review of existing disclosures and governance structures

 

While the Company discloses some policies and practices related to misinformation and disinformation and provides some narrative examples of its approach, including outcomes and enforcement actions, the Proponents maintain these are inadequate to allow investors to assess the efficacy of such policies.

 

The Company publishes an annual Transparency Report and maintains a transparency hub; YouTube has removal policies and Community Guidelines; and the Company discloses the outcomes of these policies through the Ad Safety report and other sources. In addition, it publishes blogs and white papers that make clear the Company is working hard to improve its content moderation policies, improve transparency around content removals, and reduce bias. While these disclosures show investors the degree to which the Company has rules in place ostensibly restricting or prohibiting certain forms of content, they do not provide enough information to evaluate the efficacy of existing policies, nor guidance on whether they are sufficient. Moreover, while the data disclosed reveals the volumes of content that the Company has had to remove for violating community guidelines, existing disclosure does not explain the extent to which current policies actually obstruct the flow of disinformation and misinformation that has a substantial impact on human rights.

 

Regarding governance structures, the Audit and Compliance Committee charter has been updated to clarify the board’s role in overseeing human rights impacts, but again this fails to address the central ask of the proposal – to seek information on the effectiveness of existing oversight structures and how directors incorporate due diligence into their decision-making. The Proponents believe that commissioning a Human Rights Impact Assessment will provide a vital tool for the board to help fulfill its mandate by receiving independent, third-party reporting to augment its due diligence process.

 

Alphabet’s influence over the information ecosystem

 

Alphabet’s size and influence shape our information environment—it is hard to overstate the pervasiveness of its products. For example, after Alphabet recently reported record financial results, The New York Times noted the following:

 

Google’s strong results were a reminder of the underlying power of its business and how, regardless of the circumstances surrounding it, the Company will continue to thrive as long as people are active on the internet. Google search remains the preferred on-ramp to the internet. YouTube is an essential online destination for entertainment, information and music. While it lags Amazon and Microsoft, Google is well positioned to capitalize on the seismic shift of businesses outsourcing technology infrastructure to the cloud.16

 

 

_____________________________

16 Wakabayashi, Daisuke, “Alphabet’s profit increased 36 percent, to $20.64 billion, in the fourth quarter,” The New York Times, February 1, 2022. Available at https://www.nytimes.com/2022/02/01/technology/google-alphabet-earnings.html

 

  6 | Page 
 

 

Google, Alphabet’s largest subsidiary, holds a market share of around 90 percent in online search, a position that helped generate over $209 billion in advertising revenue worldwide in 2021.17 Additionally, according to Global Media Insight, “Technically, YouTube is the second-largest search engine, after Google.”18 Further, according to YouTube, “500 hours of content are uploaded to YouTube every minute.”19

 

Moreover, Alphabet is the largest digital ad company in the world and 85% of smartphone users worldwide utilize its Android operating system – something the Company uses to secure its reach.20 According to the Wall Street Journal, “Google uses its Android operating system as a vehicle to extend its advertising reach by putting its search engine, mapping system and YouTube video network into the hands of mobile users.”21

 

Certainly, Alphabet’s platforms have tremendous influence shaping our information environment. As a result, Google, along with social media peers, receives outsized attention in the content moderation debate because it has “an effective monopoly on online information flows in certain segments of society,” writes Danah Boyd, founder of Data & Society.22

 

News reports have raised concerns regarding the efficacy of Alphabet’s existing policies and practices. For example, in January 2022 an international group of more than 80 fact-checking organizations—which Google cites as an example of an outside group with which it partners23—sent a letter to YouTube CEO Susan Wojcicki claiming that YouTube’s misinformation policies are insufficient and calling for increased transparency around their application.

 

The open letter stated:

It’s been almost two years since the COVID-19 pandemic started. The world has seen time and time again how destructive disinformation and misinformation can be for social harmony, democracy, and public health; too many lives and livelihoods have been ruined, and far too many people have lost loved ones to disinformation. As an international network of fact-checking organizations, we monitor how lies spread online—and every day, we see that YouTube is one of the major conduits of online disinformation and misinformation worldwide. This is a significant concern among our global fact-checking community.

 

What we do not see is much effort by YouTube to implement policies that address the problem. On the contrary, YouTube is allowing its platform to be weaponized by unscrupulous actors to manipulate and exploit others, and to organize and fundraise themselves. Current measures are proving insufficient. That is why we urge you to take effective action against disinformation and misinformation, and to elaborate a roadmap of policy and product interventions to improve the information ecosystem—and to do so with the world’s independent, nonpartisan fact-checking organizations.

 

 

_____________________________

17 Google: ad revenue 2001-2021. Statista, 2022. Available at https://www.statista.com/statistics/220534/googles-share-of-search-market-in-selected-countries/

18 “YouTube user statistics 2022,” Global Media Insight, January 3, 2022. Available at https://www.globalmediainsight.com/blog/youtube-users-statistics/

19 “YouTube by the Numbers.” Available at https://blog.youtube/press/

20 Graham, Meghan and Graham, Jennifer, “How Google’s $150 billion Advertising Business Works,” CNBC, May 18, 2021. Available at https://www.cnbc.com/2021/05/18/how-does-google-make-money-advertising-business-breakdown-.html

21 Mickle, Tripp et al. “Google plans to curtail Cross-App Tracking on Android Phones,” The Wall Street Journal, February 16, 2022. Available at https://www.wsj.com/articles/google-plans-to-curtail-cross-app-tracking-on-android-phones-11645016401

22 Boyd, Danah, “Google and Facebook Can’t Just Make Fake News Disappear,” Data & Society, March 27, 2017. Available at https://points.datasociety.net/google-and-facebook-cant-just-make-fake-news-disappear-48f4b4e5fbe8

23 Whitepaper: “How Google Fights Disinformation,” February 2019. Available at https://www.blog.google/documents/37/How_Google_Fights_Disinformation.pdf/

 

  7 | Page 
 

 

The examples are too many to count. Many of those videos and channels remain online today, and they all went under the radar of YouTube’s policies, especially in non-English speaking countries and the Global South. Though the Company has recently made some moves to try to address this problem, daily activity on the platform suggests these efforts are not working—nor has YouTube produced any quality data to prove their effectiveness.24

 

This conclusion is deeply troubling and supports the need for an independent assessment of Alphabet’s existing policies called for in the proposal.

 

Moreover, other researchers are raising concerns about Alphabet’s technologies and role in the overall information ecosystem. Kate Starbird, co-founder and researcher at the Center for an Informed Public at the University of Washington, told NPR:

 

We've done research on disinformation around [the] 2016 election, around the civil war in Syria, conspiracy theories of crisis events. I've got a bunch of different cases. Over and over again, YouTube is the dominant domain in those conversations. It's not Facebook. They're all pulling in content from YouTube. So what YouTube does is it creates these content resources that get mobilized on other platforms. And so it’s not just a problem within YouTube; it's actually a problem for the whole information ecosystem—the fact that YouTube hosts and allows those videos to be resources that are repeatedly mobilized in these other platforms at opportunistic times to spread mis- and disinformation.25

 

Additionally, she noted:

 

We don’t have great insight into YouTube because it is harder to see. It’s one of the platforms that’s hardest to collect data about. And it’s very—it’s almost inscrutable for us compared to some other platforms, like Twitter, where we can collect lots of data and look at things. So YouTube fits centrally into the phenomenon but it actually—it is harder to access for our research teams.26

 

Further, one review by the Anti-Defamation League has suggested that among the many social media platforms, YouTube has unique challenges:

 

YouTube’s design and architecture suggest numerous reasons for concern. First, YouTube is an open platform that depends on user-generated content and thus allows people with fringe or extremist views to compete directly with established media and information sources. Second, the financial incentives that YouTube provides based on viewership and watch time may encourage creators to appeal to people with extreme views and provoke controversy. Third, YouTube’s algorithm makes recommendations based in part on past user behavior. These recommendations can influence user behavior, especially because the top recommendation is played after the current video concludes by default. Fourth, video requires transcription and takes longer for humans to review than text, posing difficult content moderation challenges.27

 

The Proponents contend that as a result of these concerns, an independent assessment is imperative.

 

 

_____________________________

24 “An Open Letter to YouTube’s CEO from the World’s fact-checkers,” By the International Fact-Checking Network, January 12, 2022. Available at https://www.poynter.org/fact-checking/2022/an-open-letter-to-youtubes-ceo-from-the-worlds-fact-checkers/

25 “Exploring YouTube and the Spread of Disinformation,” Transcript from National Public Radio’s Morning Edition, April 13, 2021. Available at https://www.npr.org/2021/04/13/986678544/exploring-youtube-and-the-spread-of-disinformation

26 Ibid.

27 Chen, Annie et al, “Exposure to Alternative & Extremist Content on YouTube,” Part of the Anti-Defamation League’s Belfer Fellowship Series, February 2021. Available at https://www.adl.org/media/15868/download

 

  8 | Page 
 

 

Human rights risks posed by misinformation and disinformation

 

Public Health: According to the World Health Organization (WHO), disinformation and misinformation regarding COVID-19 have been shaping health outcomes with, at times, deadly consequences. As the Director-General of the WHO put it, “we’re not just fighting an epidemic; we’re fighting an infodemic. Fake news spreads faster and more easily than this virus, and is just as dangerous.”28 The Company has relevant policies and has taken concerted action to address COVID-19-related misinformation and disinformation. However, to date, the Company has not publicly released any third-party assessment of the efficacy of these policies or of whether they are sufficient to protect against human rights risks.29

 

In another recent example, in March 2022 the United States Surgeon General requested information from tech giants about the spread of misinformation related to COVID-19 on their sites. According to the New York Times:

 

President Biden’s surgeon general on Thursday formally requested that the major tech platforms submit information about the scale of Covid-19 misinformation on social networks, search engines, crowdsourced platforms, e-commerce platforms and instant messaging systems.

 

A request for information from the surgeon general’s office demanded that tech platforms send data and analysis on the prevalence of Covid-19 misinformation on their sites, starting with common examples of vaccine misinformation documented by the Centers for Disease Control and Prevention.

 

The notice asks the companies to submit ‘exactly how many users saw or may have been exposed to instances of Covid-19 misinformation,’ as well as aggregate data on demographics that may have been disproportionately exposed to or affected by the misinformation.

 

The surgeon general, Dr. Vivek Murthy, also demanded information from the platforms about the major sources of Covid-19 misinformation, including those that engaged in the sale of unproven Covid-19 products, services and treatments.

 

‘Technology companies now have the opportunity to be open and transparent with the American people about the misinformation on their platforms,’ Dr. Murthy said in an emailed statement. He added: ‘This is about protecting the nation’s health.’30

 

 

_____________________________

28 Address by Dr. Tedros Adhanom Ghebreyesus, Munich Security Conference, February 15, 2020. Available at https://www.who.int/director-general/speeches/detail/munich-security-conference

29 The United States is signatory to the International Covenant on Economic, Social and Cultural Rights, which includes article 12: The States Parties to the present Covenant recognize the right of everyone to the enjoyment of the highest attainable standard of physical and mental health.

30 Alba, Davey, “The Surgeon General Calls on Big Tech to Turn Over Covid-19 Misinformation Data,” The New York Times, March 3, 2022. Available at https://www.nytimes.com/2022/03/03/technology/surgeon-general-covid-misinformation.html

 

  9 | Page 
 

 

Free, fair and safe elections:31 This critical human right is jeopardized by disinformation campaigns that have proliferated on Alphabet platforms. For example, in the United States, Alphabet’s YouTube was subpoenaed by the House of Representatives Select Committee investigating the violent insurrection that occurred at the US Capitol on January 6th, 2021, in part because “YouTube was a platform for significant communications by its users that were relevant to the planning and execution of the January 6th attack on the United States Capitol.”32 The letter continued: “The Select Committee believes Alphabet has significant undisclosed information that is critical to its investigation, concerning how Alphabet developed, implemented, and reviewed its content moderation, algorithmic promotion, demonetization, and other policies that may have affected the January 6, 2021 events.”33

 

The Company is cooperating with the congressional committee and has noted: “We have strict policies prohibiting content that incites violence or undermines trust in elections across YouTube and Google's products, and we enforced these policies in the run-up to January 6 and continue to do so today."34

 

However, the efficacy of the Company’s efforts to control disinformation and misinformation that can distort and discredit election outcomes remains an open question.

 

Civil rights in times of war: In the current crisis in Ukraine, world leaders are coming together to encourage social media platforms—including Alphabet—to do more to address misinformation related to Russia. In a February 27, 2022 joint letter to the chief executives of Alphabet’s Google and YouTube units, along with Twitter and Facebook, the premiers of Poland, Lithuania, Latvia, and Estonia contend that “Although the online platforms have undertaken significant efforts to address the Russian government’s unprecedented assault on truth, they have not done enough.”35 Further, the letter states, “Russia’s disinformation has been tolerated on online platforms for years; they are now an accessory to the criminal war of aggression the Russian government is conducting against Ukraine and the free world.”36

 

Further, on February 25, 2022, Senator Mark Warner sent a letter to Alphabet’s CEO noting that “your platforms continue to be key vectors for malign actors—including, notably, those affiliated with the Russian government—to not only spread disinformation, but to profit from it. YouTube, for instance, continues to monetize the content of prominent influence actors that have been publicly connected to Russian influence campaigns. Just yesterday, for instance, my staff was able to find RT, Sputnik and TASS channels’ content specifically focused on the Ukraine conflict to be monetized with YouTube ads.”37 Sen. Warner made clear: “As one of the world’s largest communications platforms, your company has a clear responsibility to ensure that your products are not used to facilitate human rights abuses, undermine humanitarian and emergency service responses, or advance harmful disinformation.”38 He urged the companies, among other things, to “conduct an audit of Google and YouTube’s advertising business, including its compliance with sanctions.”39

 

 

_____________________________

31 The European Parliament published “The impact of disinformation on democratic processes and human rights in the world” in April 2021. The report notes that Article 21 of the Universal Declaration of Human Rights (UDHR) states:

1. Everyone has the right to take part in the government of his country, directly or through freely chosen representatives; …

3. The will of the people shall be the basis of the authority of government; this will shall be expressed in periodic and genuine elections which shall be by universal and equal suffrage and shall be held by secret vote or by equivalent free voting procedures.

The report also notes that according to the UN Human Rights Committee, states are obliged to ensure that ‘Voters should be able to form opinions independently, free of violence or threat of violence, compulsion, inducement or manipulative interference of any kind’. Available at https://www.europarl.europa.eu/RegData/etudes/STUD/2021/653635/EXPO_STU(2021)653635_EN.pdf

32 Letter to Mr. Sundar Pichai, Chief Executive Officer, Google LLC, Alphabet Inc. from the Select Committee to Investigate the January 6th Attack on the United States Capitol, January 13, 2022. Available at https://january6th.house.gov/sites/democrats.january6th.house.gov/files/2022-1-13.BGT%20Letter%20to%20Alphabet%20-%20Cover%20Letter%20and%20Schedule_Redacted.pdf

33 Ibid.

34 Wolfe, Jan, “U.S. House Panel Subpoenas Social Media Firms in Jan. 6 Attack Probe,” Reuters, January 13, 2022. Available at https://www.reuters.com/article/usa-capitol-tech-idCAKBN2JN252

35 Chee, Foo Yun and Chalmers, John, “Google, Facebook, Twitter, Must Combat Ukraine Fake News—Polish, Baltic Leaders,” Reuters, February 28, 2022. Available at https://www.reuters.com/technology/google-facebook-twitter-must-combat-ukraine-fake-news-polish-baltic-leaders-2022-02-28/

36 Ibid.

37 Letter from Senator Mark R. Warner, Virginia, to Mr. Sundar Pichai, Chief Executive Officer of Alphabet, February 25, 2022. Available at https://www.warner.senate.gov/public/_cache/files/4/a/4a3293e2-b07c-49fe-b052-3b3733ee586b/598A05F2F5B91764B6652B9ED516AD3B.google-final.pdf

38 Ibid.

39 Ibid.

 

  10 | Page 
 

 

Obligations to respect human rights

 

Alphabet—like all companies—has the responsibility to respect human rights. In 2011, governments around the world came together in the UN Human Rights Council to unanimously endorse the UN Guiding Principles on Business and Human Rights (UNGPs), which establish that all companies, including Alphabet, have a responsibility to respect human rights, including civil, political, economic, social, cultural and labor rights.40

 

According to the Office of the United Nations High Commissioner for Human Rights:

 

“In order to meet their responsibility to respect human rights, business enterprises should have in place policies and processes appropriate to their size and circumstances, including:

a) A policy commitment to meet their responsibility to respect human rights;
b) A human rights due diligence process to identify, prevent, mitigate and account for how they address their impacts on human rights;
c) Processes to enable the remediation of any adverse human rights impacts they cause or to which they contribute.”41

 

According to guidance produced by the Office of the United Nations High Commissioner for Human Rights to clarify the responsibilities of companies, the responsibility to respect human rights is not optional. The guidance makes clear the following:

 

The responsibility to respect human rights is not, however, limited to compliance with such domestic law provisions. It exists over and above legal compliance, constituting a global standard of expected conduct applicable to all businesses in all situations. It therefore also exists independently of an enterprise’s own commitment to human rights…There can be legal, financial and reputational consequences if enterprises fail to meet the responsibility to respect. Such failure may also hamper an enterprise’s ability to recruit and retain staff, to gain permits, investment, new project opportunities or similar benefits essential to a successful, sustainable business. As a result, where business poses a risk to human rights, it increasingly also poses a risk to its own long-term interests.42

 

There is an increasing focus on the unique role that technology companies play in relation to human rights. According to the United Nations Human Rights B-Tech Project:

 

1. Technology company business models, and the commercial underpinnings of 21st century technological advances, are being increasingly criticized for creating or exacerbating negative impacts on a range of human rights. Business executives and entrepreneurs across the technology industry are being called on to address this concern. That companies do so in credible ways is fast becoming essential to gain (or regain) trust from stakeholders, build resilience into business models and sustain their legal and social license to operate.
2. Under the UNGPs, companies are expected to conduct human rights due diligence across all of their business activities and relationships. This includes addressing situations in which business model-driven practices and technology design decisions create or exacerbate human rights risks. This will require engagement from boards of directors, executives, entrepreneurs, and founders that have an influence on company strategy, not only individuals traditionally leading the implementation of a company’s human rights, ethical or responsible business programs.
3. Institutional investors—including asset managers, pension funds, private equity firms, and venture capitalists—have a responsibility to respect human rights consistent with the UNGPs.43

 

 

_____________________________

40 UN Office of the High Commissioner for Human Rights, “Guiding Principles on Business and Human Rights: Implementing the United Nations ‘Protect, Respect and Remedy’ Framework” June 16, 2011. Available at https://www.ohchr.org/documents/publications/guidingprinciplesbusinesshr_en.pdf

41 Ibid.

42 UN Office of the High Commissioner for Human Rights, “The Corporate Responsibility to Respect Human Rights: An Interpretive Guide”, 2012. p 13-14. Available at https://www.ohchr.org/documents/publications/hr.pub.12.2_en.pdf

43 “Addressing Business Model Related Human Rights Risks: A B-Tech Foundational Paper,” B-Tech and the Office of the UN High Commissioner on Human Rights, 2020. Available at https://www.ohchr.org/Documents/Issues/Business/B-Tech/B_Tech_Foundational_Paper.pdf

 

  11 | Page 
 

 

As a result, Alphabet’s responsibility to respect human rights—and to conduct adequate due diligence, which can include Human Rights Impact Assessments—transcends statutory compliance, and the ability of its shareholders to request human rights-related disclosure is consistent with these obligations.

 

For investors to similarly meet their own obligations to safeguard human rights, they must have a better understanding not just of the nature of company policies and processes, but of their efficacy in mitigating human rights harms. Investors need this information from portfolio companies to fulfill their own obligations under the UNGPs and other human rights laws, commitments and obligations.44

 

Risks of human rights mismanagement

 

Regulatory risks: Governments and regulatory bodies have already begun implementing measures to hold companies accountable, given the large-scale impacts of business on human rights. Voluntary, business-led efforts have been limited in their efficacy. Examples of these legislative efforts include:

 

·The Digital Services Act package in the European Union (EU), which consists of the Digital Services Act and Digital Markets Act. Through this package the European Council explicitly targets “very large digital platforms and services to analyze systemic risks they create and to carry out risk reduction analysis” with respect to issues such as 1) dissemination of illegal content, 2) adverse effects on fundamental rights, 3) manipulation of services having an impact on democratic processes and public security, and 4) adverse effects related to gender-based violence and to minors, along with serious consequences for the physical or mental health of users;45

·The German Supply Chain Act, which is to come into force on January 1, 2023, mandates human rights and environmental due diligence for the global supply chains of companies that sell in Germany;46 and

·France’s Duty of Vigilance Law, which came into effect in 2017, requires large French companies to publish an annual vigilance plan which must establish effective measures to identify risks and prevent human rights and environmental impacts by the company, its subsidiaries and applicable subcontractors and suppliers.47

 

 

_____________________________

44 United Nations Office of the High Commissioner on Human Rights report “Taking stock of investor implementation of the UN Guiding Principles on Business and Human Rights” Available at https://www.ohchr.org/sites/default/files/Documents/Issues/Business/UNGPs10/Stocktaking-investor-implementation-reader-friendly.pdf

45 The European Commission’s description of the Digital Services Act package. Available at https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package

46 As reported by the Business and Human Rights Resource Centre. Available at https://www.business-humanrights.org/en/latest-news/germany-act-on-corporate-due-diligence-in-supply-chains-published-in-federal-law-gazette/ . See also DLA Piper’s summary of the German Supply Chain Act. Available at https://www.dlapiper.com/en/us/insights/publications/2021/09/german-supply-chain-act-lieferkettensorgfaltspflichtengesetz/

47 The Business and Human Rights Resource Centre’s summary on France’s Duty of Vigilance Law. Available at https://www.business-humanrights.org/en/latest-news/frances-duty-of-vigilance-law/

 

  12 | Page 
 

 

As the EU and individual EU member states advance legislation requiring greater corporate accountability and transparency on human rights and digital rights issues, companies will face continued scrutiny and additional regulatory risks from non-compliance. This activity also suggests that other global regulatory authorities may consider similar legislative changes or additions. In response to such efforts, companies may have to enact changes to policies, processes and disclosure in order to meet these new expectations. It is necessary for investors to understand how our company is positioned to meet the requirements of these shifting legislative and regulatory contexts. Beyond any costs required to comply, the risk of non-compliance could present significant reputational risks for Alphabet.

 

Reputational risks: Companies with a digital presence face heightened reputational risks from being associated with adverse human rights impacts. Advertising revenue is central to the economic competitiveness of Alphabet’s business model (as well as those of many of its peers), and any serious reputational concerns will ultimately impact revenues. The 2020 boycott of advertising on Facebook by major brands concerned about the company’s inaction on hate speech is one example of the real-world consequences of human rights-related reputational risks.48 While temporary boycotts undoubtedly impact the bottom line, the real risk lies with the possibility of a broad-based crisis of confidence in the company’s ability to manage human rights-related risks, which could have long-term impacts on the ability of the company to continue to generate advertising revenues.

 

Increasing reliance on Human Rights Impact Assessments

 

As recognition of company and investor obligations to protect human rights grows, investors are increasingly seeking HRIAs in order to ensure companies are providing sufficient transparency. This is generally understood to be part of fulfilling a company’s obligation under the UNGPs, and can help companies establish trust. The UNGPs establish that companies should “account for how they address their human rights impacts” and “be prepared to communicate this externally, particularly when concerns are raised by or on behalf of affected stakeholders” (UNGP 21).49 The UNGPs also state that “communication can take a variety of forms, including in-person meetings, online dialogues, consultation with affected stakeholders, and formal public reports”.50 In light of the information and expertise asymmetry between technology company personnel and the public regarding how technology products, services and solutions work, effective communication will be a critical part of reinforcing trust from users, customers, society-at-large and policy makers.

 

The value of conducting HRIAs is becoming widely recognized and many of Alphabet’s peers are conducting them as well. Google itself has used an HRIA to evaluate its celebrity recognition tool,51 and many other tech peers are using HRIAs to evaluate the human rights implications of their policies.

 

 

_____________________________

48 Hsu, Tiffany and Friedman, Gillian, “CVS, Dunkin’, Lego: The Brands Pulling Ads from Facebook Over Hate Speech,” The New York Times, June 26, 2020. Available at https://www.nytimes.com/2020/06/26/business/media/Facebook-advertising-boycott.html

49 The UNGPs as referenced by the United Nations Office of the High Commissioner for Human Rights. “Key Characteristics of Business Respect for Human Rights: A B-Tech Foundational Paper”. Available at key-characteristics-business-respect.pdf (ohchr.org)

50 Ibid.

51 “Google Celebrity Recognition API Human Rights Assessment: Executive Summary,” BSR, August 2019. Available at https://services.google.com/fh/files/blogs/bsr-google-cr-api-hria-executive-summary.pdf

 

  13 | Page 
 

 

For example, Meta Platforms Inc. has used HRIAs to assess its operations in a number of countries where it has been the center of controversy. Its subsidiary Facebook recently released “three independent human rights impact assessments we commissioned in 2018 to evaluate the role of our services in Sri Lanka, Indonesia and Cambodia, along with details on how we’ve responded to the recommendations in each assessment. The assessments build on the work we’ve done over the last two years, beginning with creation of a human rights team to inform our policies, products, programs and partnerships around the world.”52

 

Additionally, Microsoft has a strong commitment to conducting HRIAs. According to Article One, “From 2017 to 2018, Microsoft partnered with Article One to conduct the first-ever human rights impact assessment (HRIA) of the human rights risks and opportunities related to artificial intelligence (AI).”53 More recently, after receiving a shareholder proposal on the topic, in October 2021 Microsoft committed to conduct another HRIA. According to Microsoft, “We recently decided to conduct additional human rights due diligence regarding the role of our technology and its potential impact on certain communities in select situations.”54

 

Just as the Company deemed it valuable to conduct an HRIA in order to evaluate the risk profile of its celebrity recognition tool, the Proponents believe Alphabet would derive value from undertaking a similar process to evaluate the efficacy of its policies to address misinformation and disinformation.

 

Conclusion

 

The Company has put forth disclosures showing the existence of policies and oversight mechanisms, along with white papers, as evidence that the proposal is substantially implemented. This misconstrues the essential objective of the proposal: to ensure investors have the information necessary to evaluate the efficacy of existing policies.

 

Currently, the Company does not have any independent, third-party assessment of whether its existing policies and practices are effective and sufficient to protect against potential human rights abuses related to misinformation and disinformation. The Proponents acknowledge that policies exist, some governance structures are in place, and the Company is taking steps to address and remove content. However, none of these steps fills or replaces the need for an independent, third-party assessment evaluating the efficacy of such policies and governance structures.

 

Therefore, Proponents maintain this is a reasonable ask—one that shareholders have made before and one with which Google itself and Alphabet’s peers have experience. The assessment would provide the transparency necessary for the Company and outside stakeholders to assess the efficacy of existing policies. Accordingly, we believe this proposal is in the best interests of all shareholders and the Company, and we encourage all shareholders to support the proposal.

 

We ask that all shareholders vote FOR Proxy Item #16.

 

 

_____________________________

52 Sissons, Miranda (Director of Human Rights) and Warofka, Alex (Product Manager, Human Rights), “An Update on Facebook’s Human Rights Work in Asia and Around the World,” May 12, 2020, Meta Newsroom. Available at https://about.fb.com/news/2020/05/human-rights-work-in-asia/

53 Article One Case Studies. Available here: https://www.articleoneadvisors.com/case-studies-microsoft

54 “Taking on Human Rights Due Diligence,” Microsoft on the Issues, October 20, 2021. Available at https://blogs.microsoft.com/on-the-issues/2021/10/20/taking-on-human-rights-due-diligence/

 

 

 

  14 | Page