Tech tyrant dies at 56

Former YouTube CEO Susan Wojcicki died Friday at 56 after a two-year battle with non-small cell lung cancer, her husband Dennis Troper announced on Facebook.

“It is with profound sadness that I share the news of Susan Wojcicki passing,” Troper wrote.

“My beloved wife of 26 years and mother to our five children left us today after 2 years of living with non small cell lung cancer. Susan was not just my best friend and partner in life, but a brilliant mind, a loving mother, and a dear friend to many. Her impact on our family and the world was immeasurable. We are heartbroken, but grateful for the time we had with her. Please keep our family in your thoughts as we navigate this difficult time.”

A platform for government disinformation

Wojcicki served as YouTube’s CEO from 2014 to 2023. Under her stewardship, the video-sharing giant became a major propaganda tool for governments and institutions around the world, including the United States government and the United Nations. The Google-owned platform relied heavily on suppressing content that challenged “authoritative” narratives. During COVID-19, YouTube became a leading purveyor of government disinformation by openly censoring medical professionals and researchers who questioned federal science.

Step 1: Remove all disfavored opinions

In 2022, Wojcicki explained at the World Economic Forum’s Davos summit how YouTube serves as a government bullhorn by “fighting misinformation” in three ways.

The first, she said, is by removing anything that violates YouTube’s many policies. 

“So, the first would be from a policy standpoint, we would look at content that we would think about in terms of being violative of our policies,” said Wojcicki. “So if you look at COVID, for example, we came up with ten different policies that we said would be violative.” 

The then-CEO mocked users who questioned the origin of the virus. 

“Like an example of that would be saying that COVID came from something other than a virus. And we did see people attacking 5G equipment, for example, because they thought that it was causing COVID,” she said as chuckles were heard around the room. “And so that would just be an example of a policy that we would remove. So we do remove content based on those policies. We actually publish that in a transparency report.” 

Wojcicki did not elaborate on how, if at all, the company verifies that certain views and theories are untrue.

Step 2: Insert government information

The second way, Wojcicki said, is that in addition to removing information, YouTube inserts messaging approved by authorities.

“The second one would be really raising up authoritative information. So if you are dealing with a sensitive subject like news, health, science, we are going to make sure that what we're recommending is coming from a trusted, well-known publisher that can be reliable.” 

It is unclear how YouTube determines which publishers are “trusted” and “reliable.”

Step 3: Hide and demonetize ‘borderline content’

In addition to removing contrary information and feeding authority-approved information, YouTube also downranks, or hides, content it deems to be “lower quality.”

“The third is making sure that if there's content that's borderline content that technically meets our policy but is lower quality, that's content that we basically will not recommend to our users. Our users can still access it, but they will not recommend it,” Wojcicki explained.

She did not define “lower quality” or disclose the process by which content is deemed as such. 

Finally, Wojcicki said that the company does not monetize information that is “generally understood” to be inaccurate, though she did not explain how the company assesses whether something is “generally understood” as such, or by whom.

“And then lastly, we're just really careful about what we monetize. So, we always want to make sure that there's no incentive. So, for example, with regard to climate change, we don't monetize any kind of climate change material. So, there's no incentive for you to keep publishing that material that is propagating something that is generally understood as not accurate information.”