34 Ways the Wall Street Journal Got Google Wrong

Sam Ruchlewicz

There is no doubt that search engines are among the most important – and least understood – tools available to us. Search engines bring some semblance of order to the otherwise chaotic and unimaginably large corpus of information known as the web.

In recent years, “big tech” (Google, Facebook, Microsoft, Amazon, Twitter, etc.) has come under increasing scrutiny for a myriad of business practices: from aiding the spread of fake news published by foreign actors, to anti-competitive practices, to disputes over intellectual property and copyright infringement.

There are real concerns about the outsized roles that these companies play in shaping our world – from how we get information, to how we connect with one another, to how we make purchases. None of those concerns, however, justifies shoddy, agenda-driven or otherwise improper journalism – especially when it is used to smear one of the “big tech” companies. The latest victim of the “tech-lash” is Google, which came under fire after a Wall Street Journal article accused the tech giant of (among other things) manipulating organic search results to suit the whims of powerful companies.

The Wall Street Journal – and the four reporters who published this story – deserve substantial criticism for their work. It is certainly long (the original article is ~8,000 words), but quantity of words does not entail quality of work. In what follows, I’ve broken down 34 of the false, misleading, and illogical claims raised in the article. I’ve used the MSN version of the article, as it is not behind a paywall and is accessible to all readers. My hope is that this starts an open and honest dialogue between reporters, SEOs, business owners, and tech insiders as to what really happens behind the scenes in search.

As I’ve previously written on Twitter, articles like these – especially those published by usually authoritative sources like the WSJ – hurt the entire digital marketing industry. When business owners think the search game is rigged against them, that Google uses “black magic” in search results, and that there is nothing they can do about it, they tend to move away from doing what’s best for their site and their business. That leads to bad outcomes, which only further the belief that Google is acting in a nefarious way.

I’m not going to pretend that search is simple (it’s not) or that doing SEO isn’t frustrating (it is). But it is – for better or worse – a (relatively) level playing field. And with that, let’s get into it. All references are in the order in which they appear in the article. Quotes are from the MSN version of the article, accessed on Sunday, November 17, 2019 at 7 pm EST. Emphasis is mine, unless noted otherwise.

Claim #1:

“Google made algorithmic changes to its search results that favor big businesses over smaller ones”

There is no evidence that Google explicitly (or implicitly) changes its search results (algorithmically or manually) to favor big businesses over smaller ones. Google has been extremely transparent about what it takes to rank well – and the basics really haven’t changed since the first updates in 2003/2004 (a toy sketch of how these might combine follows the list):

  • Produce great, original, high-value content that is responsive to the needs of your users and facilitates task accomplishment
  • Secure back-links from other authoritative*, high-quality sites (* = Google’s definition of “authoritative” is a bit different than most traditional ones)
  • Present that great content on a website that is easy to use (for both people and bots) and adheres to the current technical standards
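Here’s that sketch – and let me be crystal clear: Google’s actual ranking systems are proprietary and vastly more complex. Every signal name and weight below is invented, purely to illustrate how content, links and technical quality could be blended without any “big business” dial:

```python
import math

# Toy illustration only: Google's real ranking systems are proprietary and far
# more complex. Every signal and weight here is invented.

def toy_page_score(content_quality: float,
                   authoritative_backlinks: int,
                   technical_health: float) -> float:
    """Blend the three basics; quality scores are assumed normalized to [0, 1]."""
    # Dampen links with a log so the 10,000th link matters far less than the 10th.
    link_signal = min(math.log1p(authoritative_backlinks) / 10.0, 1.0)
    return 0.5 * content_quality + 0.3 * link_signal + 0.2 * technical_health

# A small, well-run site can outscore a big one that neglects the basics:
small = toy_page_score(content_quality=0.9, authoritative_backlinks=40, technical_health=0.9)
big = toy_page_score(content_quality=0.4, authoritative_backlinks=5000, technical_health=0.6)
print(small > big)  # True
```

Note that nothing in a model like this knows (or cares) how big the business behind a site is – which is exactly the point.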

Does it just so happen that bigger businesses tend to be better at doing those things than smaller businesses? Yes, probably. Larger companies (in general) tend to be better at marketing than smaller businesses (as in, they usually have marketing teams, while smaller businesses usually have team members wearing many hats, one of which is marketing). But does that mean that Google is “favoring” big businesses? Absolutely not. This is a classic example of correlation without causation.

Like all search engines, Google’s job is to present the pages that are most responsive to the query and intent of the searcher. If Google didn’t favor the things in the bulleted list above, and returned a bunch of poorly-designed pages chock full of spam links and terrible content, how often would people use Google? Never. That’s not good for Google’s business, nor is it good for the experience of the individual users.

There are countless examples of small businesses ranking extremely well (especially in local search results) for high-value queries. The reason for that? Those businesses produced great, uniquely valuable, high-quality content, secured authoritative back-links and presented that content on a well-built website.

Claim #2:

“In at least one case made changes on behalf of a major advertiser, eBay Inc., contrary to its public position that it never takes that type of action.”

More on this one later. But let’s just say that (a) I don’t think a company with a market cap of $900B is going to expose itself to massive liability for a paltry $30M in search ad revenue, (b) there is a clear separation of church and state (so to speak) at Google across the Ads & Search teams and (c) just because Google made adjustments to its search algorithm that happened to impact many sites (including eBay) does NOT mean that it made them for or because of eBay. Post hoc ergo propter hoc.

Claim #3:

“Google’s engineers regularly make behind-the-scenes adjustments to other information the company is increasingly layering on top of its basic search results. These features include auto-complete suggestions, boxes called “knowledge panels” and “featured snippets,” and news results, which aren’t subject to the same company policies limiting what engineers can remove or change.”

There is a LOT wrong here:

First: let’s start with the obvious (originally pointed out by Bill Slawski on Twitter): this claim rests on a false-dilemma logical fallacy. How do the reporters know what the policies are, and whether or not the various SERP features are subject to them? Also, there is ample evidence that SERP features are scored using the same algorithm as traditional blue links, as discussed by Kalicube.pro here.

Second: while there is no denying that search is incredibly complex, we do have a general idea of what Google wants: freshness, topicality, relevance/quality, entities, RankBrain, PageSpeed, structured data, etc. While the application of these factors may vary across different SERP features (again, unconfirmed), there’s strong evidence that everything is combined in a way that allows the features most responsive to the intent and query of the user to appear at the top. So, in many ways, the data suggests that everything is governed by the same algorithmic policies.

Further, Google is quite transparent about how SERP features work. You can read about structured data here, and featured snippets here. Want the entire SERP features gallery? No problem. Here you go.
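For a sense of what that structured data actually looks like, here’s a minimal, hypothetical example of schema.org “Article” markup, built as a Python dict and serialized to the JSON-LD a page would embed. The values are invented; Google’s structured data documentation lists the supported types and properties.

```python
import json

# Hypothetical schema.org "Article" markup -- the kind of structured data
# Google's documentation describes for rich results. All values are invented.
article_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "34 Ways the Wall Street Journal Got Google Wrong",
    "author": {"@type": "Person", "name": "Sam Ruchlewicz"},
    "datePublished": "2019-11-18",
}

# A site would embed this inside a <script type="application/ld+json"> tag.
print(json.dumps(article_markup, indent=2))
```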

Claim #4:

“Despite publicly denying doing so, Google keeps blacklists to remove certain sites or prevent others from surfacing in certain types of results. These moves are separate from those that block sites as required by U.S. or foreign law, such as those featuring child abuse or with copyright infringement, and from changes designed to demote spam sites, which attempt to game the system to appear higher in results.”

This is a very confused, very spicy meatball of a claim. Let’s break this one down:

First: the “blacklists” claim comes straight out of Project Veritas, which has been discredited by just about everyone, including the Cato Institute. H/T to Kristine Schachinger for getting this one first on Twitter.

Second: Google does “filter” search results to remove explicit content. They are extremely transparent about it (read here) and the intention is to prevent an innocent search at the office from turning up porn or who knows what else (the internet is a bit of a crazy place).

Third: filtering search results to demote spam sites has been happening since the first major update (Florida) in 2003 – and probably before. For as long as search engines have been directing traffic around the internet, people have been trying to game the system to get that traffic to their site. This isn’t new or news. This is the reality of the internet. Further: Google has been pretty transparent about what websites should (and should not) do via its Webmaster Guidelines. They are published and available here.

Claim #5:

“In auto-complete, the feature that predicts search terms as the user types a query, Google’s engineers have created algorithms and blacklists to weed out more-incendiary suggestions for controversial subjects, such as abortion or immigration, in effect filtering out inflammatory results on high-profile topics.”

This is another example of where (ironically enough) a Google search would’ve cleared up a lot of confusion around how auto-complete works. The entire policy, including the rationale for it, is available here. No cloak-and-dagger secrecy. No shady practices. Just a desire to prevent inappropriate, hateful and otherwise problematic suggestions from appearing.

In addition to the above, it is important to remember that there are other factors at play when autocomplete triggers – including the velocity of search (i.e., whether something is trending) and the topics involved. As Google has clearly stated, the goal of these policies is that “autocomplete should not shock users with unexpected or unwanted predictions.”
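To illustrate the mechanics (and only the mechanics – Google’s actual systems are not public), here’s a deliberately naive sketch of how a policy layer might sit on top of popularity-ranked suggestions. The blocked-phrase list, the trending boost, and the scoring are all invented for illustration:

```python
# Naive sketch of a policy filter on top of popularity-ranked suggestions.
# The blocked list, trending boost, and scoring are invented; filtering a
# suggestion does not prevent anyone from actually searching for it.

BLOCKED_SUBSTRINGS = {"hateful phrase", "violent phrase"}  # hypothetical policy list

def suggest(prefix, candidates, trending, k=5):
    """Return up to k suggestions matching prefix, minus policy violations.

    candidates maps query -> popularity score; trending is a set of queries
    currently spiking in search velocity.
    """
    scored = []
    for query, popularity in candidates.items():
        if not query.startswith(prefix):
            continue
        if any(bad in query for bad in BLOCKED_SUBSTRINGS):
            continue  # policy layer: never *suggest* it
        boost = 2.0 if query in trending else 1.0  # search velocity matters
        scored.append((popularity * boost, query))
    return [q for _, q in sorted(scored, reverse=True)[:k]]
```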


Claim #6:

“Google employees and executives, including co-founders Larry Page and Sergey Brin, have disagreed on how much to intervene on search results and to what extent. Employees can push for revisions in specific search results, including on topics such as vaccinations and autism.”

Google is a company that employs lots of people who write massively complex algorithms designed to sift through unfathomable amounts of information, all of which was created with unknown agendas. Search is (as the WSJ has pointed out) an extremely consequential part of the digital world. Most internet users trust Google to deliver them the most accurate, relevant and appropriate results.  In fact, many people view being listed on Page 1 of Google as a mark of credibility.

To its credit, Google has been extremely transparent in how it discloses its policies around sensitive issues, as well as on the larger questions of what pages should appear higher in SERPs via its Quality Raters Guidelines.

Claim #7:

“To evaluate its search results, Google employs thousands of low-paid contractors whose purpose the company says is to assess the quality of the algorithms’ rankings. Even so, contractors said Google gave feedback to these workers to convey what it considered to be the correct ranking of results, and they revised their assessments accordingly, according to contractors interviewed by the Journal. The contractors’ collective evaluations are then used to adjust algorithms”

Two things here:

  • Google pays Quality Raters about $13.50 an hour – which seems halfway decent, at the very least. McDonald’s (on the other hand) only pays people an average of $9.45 an hour.
  • The more substantive point here – namely, that Quality Raters have a direct impact on search results – is simply not true. Again, Google is extremely transparent in both the guidelines that Raters are expected to use, and how those ratings may impact search results indirectly.

“THE JOURNAL’S FINDINGS undercut one of Google’s core defenses against global regulators worried about how it wields its immense power—that the company doesn’t exert editorial control over what it shows users. Regulators’ areas of concern include anticompetitive practices, political bias and online misinformation.”

Claim #8:

“Far from being autonomous computer programs oblivious to outside pressure, Google’s algorithms are subject to regular tinkering from executives and engineers who are trying to deliver relevant search results, while also pleasing a wide variety of powerful interests and driving its parent company’s more than $30 billion in annual profit. Google is now the most highly trafficked website in the world, surpassing 90% of the market share for all search engines.”

Once again, lots to unpack here. Someone should remind the “journalists” who wrote the piece that throwing a bunch of salacious claims together with no evidence does not constitute “proof” of anything, aside from the ability to type.

To the first claim, that Google makes lots of changes to its algorithm: yes. That happens. Most of these changes are minor updates that impact an extremely small percentage of SERPs; there are major ones (the most recent of which was BERT) that do impact the wider search landscape. If you’d like to review these changes, Moz keeps a pretty good list of the major ones. However, regular updates to any algorithm (just like regular updates to your OS or mobile device) do not entail “tinkering” by “executives and engineers” with the intention of “pleasing a wide variety of powerful interests.”

On its face, it seems like the explosion in the number of “changes” to the Google algorithms could be related to Google exerting editorial control (which is certainly the story the WSJ would like to spin). The reality of the situation is actually rather bland and logical: if you’d like to read about it, here’s a great article from Moz.

The short version: Google makes lots of small changes, each of which impacts only a tiny, tiny fraction of SERPs. As the number of elements involved in SERPs has increased (SERP features, local search, the proliferation of image, video, mobile and voice search, etc.), so too must the number of “improvements.” Let’s also remember that Google works around the world – so a change that impacts how various grammatical constructs in German are treated would (a) count as an “improvement” and (b) not really be relevant to editorial control.

Finally: Google’s “algorithm” is actually a series of complex algorithms which work together to provide users with the most relevant, informative and authoritative results for their query.

Claim #9:

“Google said 15% of queries today are for words, or combinations of words, that the company has never seen before, putting more demands on engineers to make sure the algorithms deliver useful results.”

So, engineers are “tinkering” with results that they don’t even know exist? What?

This entire line of reasoning demonstrates a serious lack of understanding regarding how search works. Engineers aren’t trying to figure out what results should appear for individual searches. Engineers  are focused on creating and refining algorithms that use various technologies (like NLP and ML), along with policies (which they’ve detailed extensively) to solve the meta-problem of search: providing relevant results to queries using many data points.

For instance, Google uses natural language processing (NLP) to break down queries into relevant components (i.e. parse the intent of the query), overlays it with lots of other information (location, language, web browsing history, device type, etc.), then retrieves pages from the index that are most responsive to that query. This process works regardless of whether the query has been searched before (though having actual user data detailing the outcome is helpful in refining results). While the actual algorithms are kept secret, the general information on how the process works is not.
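Here’s a highly simplified sketch of that retrieve-and-rank flow. Real systems use learned models for parsing and scoring; the whitespace tokenizer, inverted index, and term-overlap score below are stand-ins for illustration – but note that even this toy version handles a query it has never seen before, because it operates on terms, not on whole queries:

```python
from collections import defaultdict

index = defaultdict(set)  # term -> set of doc ids (a toy inverted index)

def add_document(doc_id, text):
    """Index a document under each of its (lowercased) terms."""
    for term in text.lower().split():
        index[term].add(doc_id)

def search(query, context=None):
    """Parse the query into terms, retrieve candidates, rank by term overlap.

    context (location, language, device, history, ...) would re-rank results
    in a real engine; it is accepted here only to mirror the description above.
    """
    terms = query.lower().split()  # stand-in for real NLP/intent parsing
    candidates = defaultdict(int)
    for term in terms:
        for doc_id in index.get(term, set()):
            candidates[doc_id] += 1  # term overlap as a toy relevance score
    return sorted(candidates, key=candidates.get, reverse=True)
```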

Claim #10:

“AS PART OF ITS EXAMINATION, the Journal tested Google’s search results over several weeks this summer and compared them with results from two competing search engines, Microsoft Corp.’s Bing and DuckDuckGo, a privacy-focused company that builds its results from syndicated feeds from other companies, including Verizon Communications Inc.’s Yahoo search engine.

The testing showed wide discrepancies in how Google handled auto-complete queries and some of what Google calls organic search results—the list of websites that Google says are algorithmically sorted by relevance in response to a user’s query. (Read about the methodology for the Journal’s analysis.)”

There is so, so much wrong with this. Let’s first get this one out of the way: if you’re going to assess a massively complex machine that handles 3.8M queries a minute, you’ll need a larger sample size than 17 queries over a period of 31 days (in three periods). Sample size matters.
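To put a rough number on that: a back-of-the-envelope 95% margin of error for a proportion estimated from 17 queries (treating them, generously, as independent random samples) is enormous:

```python
import math

# Worst-case (p = 0.5) margin of error at 95% confidence (z = 1.96) for n = 17.
n, p, z = 17, 0.5, 1.96
moe = z * math.sqrt(p * (1 - p) / n)
print(f"±{moe:.0%}")  # ±24% -- roughly a quarter of the entire scale
```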

The terms used by the WSJ were also incendiary and politically charged, which itself would likely trigger Google’s policies around autocomplete intended to reduce shocking, unwanted or unexpected results.

Claim #11:

“A 2016 internal investigation at Google showed between a 10th of a percent and a quarter of a percent of search queries were returning misinformation of some kind, according to one Google executive who works on search. It was a small number percentage-wise but given the huge volume of Google searches it would amount to nearly two billion searches a year.

By comparison, Facebook faced congressional scrutiny for Russian misinformation that was viewed by 126 million users.”

In 2016 there were about 1.2T searches on Google, so 0.10% would be 1.2B searches and 0.25% would be 3B searches. Not sure where the 2B figure comes from (the quick arithmetic is below), but let’s set that aside and focus on the more substantive issue.
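The arithmetic, for the record (the ~1.2T figure is the 2016 estimate above):

```python
searches_2016 = 1.2e12       # ~1.2T Google searches in 2016 (estimate above)
low, high = 0.0010, 0.0025   # "a 10th of a percent" to "a quarter of a percent"
print(f"{searches_2016 * low / 1e9:.1f}B to {searches_2016 * high / 1e9:.1f}B")
# -> 1.2B to 3.0B: a wide range, not a point estimate of "nearly two billion"
```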

There is a fundamental difference between ~2B SERPs *including* Google-defined misinformation of some kind (which includes content defined as lowest quality) and Facebook showing Russian misinformation about a Presidential election that was *viewed* by 126M people (roughly half of US adults).

For one: it’s rather difficult to compare searches with users – a single user could search for (and view) 1,000 SERPs containing misinformation; that’s 1 person and 1,000 SERPs. Given the nature of misinformation, it is entirely plausible that the number of actual people who viewed a SERP containing what Google defined as misinformation could be far less.

For another: the definition of misinformation used by Google is fundamentally different than the one used in conjunction with the Facebook incident, as Google’s definition includes any site that is deemed to be “lowest quality.”

Last (but certainly not least): Google SERPs include (at least) 10 blue links, along with other SERP features. Just because a SERP contains one piece of misinformation does not entail (i) that the link to the misinformation was ever viewed or (ii) even if it was viewed, that it was actually clicked on.

Claim #12:

“Google assembled a small SWAT team to work on the problem that became known internally as “Project Owl.” Borrowing from the strategy used earlier to fight spam, engineers worked to emphasize factors on a page that are proxies for “authoritativeness,” effectively pushing down pages that don’t display those attributes.”

As mentioned above, Google’s definition of “authoritativeness” is far more technical than the one used in mainstream conversation, and the use of E-A-T (expertise, authoritativeness & trustworthiness) rating guidelines is not new.

Claim #13:

“The U.S. Justice Department earlier this year opened an antitrust probe, in which Google’s search policies and practices are expected to be areas of focus. Google executives have twice been called to testify before Congress in the past year over concerns about political bias. In the European Union, Google has been fined more than $9 billion in the past three years for anticompetitive practices, including allegedly using its search engine to favor its own products.”

OK, while this is true, the EU fines were about (1) pre-installing Google Search in Android, (2) prioritizing Google shopping results, and (3) AdSense competition. What does any of that have to do with organic search rankings?

For more information on Google’s colorful history with the EU, here’s a helpful Wikipedia page.

Claim #14:

“GOOGLE RARELY RELEASES detailed information on algorithm changes, and its moves have bedeviled companies and interest groups, who feel they are operating at the tech giant’s whim.”

This…isn’t entirely true. There are a number of Google experts, including John Mueller and Gary Illyes, who often answer questions about search changes, updates, etc. There are also a number of public resources and communications channels where information and updates on search are published.

That all being said, the premise that companies who rely on Google for a large share of their web traffic (and, in turn, their business) are at the whim of Google is true. If you don’t want to be reliant on Google for traffic, then get a different business. The reality is that Google is providing a service that *happens* to benefit you, but only if your company (a) complies with their terms and conditions, (b) continues to deliver value for your users in the form of high-quality, authoritative content, and (c) does all of that via a website that meets performance standards. Google has provided resources and guidance for (a) and (b), along with tools (including Webmaster Tools & PageSpeed Insights) to help with (c).

Claim #15:

“In one change hotly contested within Google, engineers opted to tilt results to favor prominent businesses over smaller ones, based on the argument that customers were more likely to get what they wanted at larger outlets. One effect of the change was a boost to Amazon’s products, even if the items had been discontinued, according to people familiar with the matter.”

We’ve already covered the “big business vs. small business” point above. Also, I’ll just let Ms. Levin from Google cover the discontinued products:

“Ms. Levin said there is no guidance in Google’s rater guidelines that suggest big sites are inherently more authoritative than small sites. “It’s inaccurate to suggest we did not address issues like discontinued products appearing high up in results,” she added.”

Not nefarious. Just a complex problem that is usually made more difficult by platforms continuing to try to index pages for discontinued items.

Claim #16:

“Google engineers said it is widely acknowledged within the company that search is a zero-sum game: A change that helps lift one result inevitably pushes down another, often with considerable impact on the businesses involved.”

Again, Google does not directly help or hurt any individual business or organization. Google’s job (and why they are a $900B company) is to deliver the set of results that are most responsive to the query and most relevant to the user, based on the information Google has. That’s it.

If your business relies on large volumes of Google traffic to make money (directly or indirectly), then your business model has a risk factor. No business is entitled to organic traffic from Google (or any search engine). Being ranked on Page 1 is not a right; it is a privilege. This privilege is earned over time by doing the hard work necessary to comply with Google’s guidelines and deliver extraordinary content and value for your users.

Claim #17:

“Some very big advertisers received direct advice on how to improve their organic search results, a perk not available to businesses with no contacts at Google, according to people familiar with the matter. In some cases, that help included sending in search engineers to explain a problem, they said.”

This is disingenuous at best, as (a) Google’s webmaster guidelines, quality rater guidelines, expert insights, and more are all publicly available, and (b) the very next paragraph explicitly states that this information was all available publicly (emphasis mine):

“If they have an [algorithm] update, our teams may get on the phone with them and they will go through it,” said Jeremy Cornfeldt, the chief executive of the Americas of Dentsu Inc.’s iProspect, which Mr. Cornfeldt said is one of Google’s largest advertising agency clients. He said the agency doesn’t get information Google wouldn’t share publicly. Among others it can disclose, iProspect represents Levi Strauss & Co., Alcon Inc. and Wolverine World Wide Inc.”

Claim #18:

“One former executive at a Fortune 500 company that received such advice said Google frequently adjusts how it crawls the web and ranks pages to deal with specific big websites. Google updates its index of some sites such as Facebook and Amazon more frequently, a move that helps them appear more often in search results, according to a person familiar with the matter.”

This is a very confused series of words. Let’s try to break it down:

Crawling & indexing is an expensive process for any search engine – it requires time and resources to find each page, make a copy, follow each link and repeat. And, like any good business, Google manages and optimizes its resource-intensive activities in order to maximize profit (NOT talking about advertising here – just the simple costs associated with crawling and indexing the web). That is why Google “frequently adjusts how it crawls the web.”

Google does update the index of more highly trafficked sites more frequently, simply because (a) that’s the only way the new content published on those platforms makes it into search results and (b) sites with large volumes of traffic are likely the ones that most users will be searching for.

To illustrate, let’s consider a hypothetical example: the New York Times and Joe’s Blog both publish an article – for the NYT, it’s their 50th of the day; for Joe, it’s his first in a month. As a fairly popular publication, the NYT receives ~146M visits per year (an average of ~400k per day); Joe’s blog receives 500 visits per day. As we’ve discussed above, crawling & indexing is an expensive task, so Google (and every other search engine) needs to prioritize which sites it will crawl first, and it does this based on a number of factors, including overall site popularity, the probability of finding new content, etc.

In the case of our two websites, Google is likely to crawl the NYT more frequently than they crawl Joe’s blog, because (a) the probability of them finding new content that is not currently in the index is higher and (b) because more people are likely to be looking for the content found in the NYT than on Joe’s blog, which means that the content on the NYT site is likely to have more impact. Again, let’s return to basics: Google’s job is to be responsive to the intent and informational needs of the searcher. More searchers are likely to want to see a NYT article than one on Joe’s blog. Therefore, Google prioritizes its limited resources in such a way that the NYT gets crawled more frequently than Joe’s blog. This isn’t nefarious, it’s logical.

However, Google updating its index does NOT help a site appear more often in search results – the index is simply the search engine’s copy of the website. Updating a specific site’s section of the index more frequently may make *new* content eligible to appear in SERPs more rapidly but indexing the same bad content 10,000 times will not make it appear more frequently in SERPs.
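For the mechanically inclined, here’s a toy version of that prioritization logic – a priority queue over the factors just described. The weights and figures are invented; Google’s actual crawl-budget logic is not public:

```python
import heapq
import time

def crawl_priority(daily_visits, new_pages_per_day, seconds_since_crawl):
    """Invented heuristic: crawl sooner where new, sought-after content is likely."""
    days_stale = seconds_since_crawl / 86_400
    return daily_visits * new_pages_per_day * days_stale

now = time.time()
queue = []  # max-heap via negated priority
for site, visits, new_pages, last_crawl in [
    ("nytimes.com", 400_000, 50, now - 3_600),             # hypothetical figures
    ("joes-blog.example", 500, 1 / 30, now - 30 * 86_400),
]:
    heapq.heappush(queue, (-crawl_priority(visits, new_pages, now - last_crawl), site))

print(heapq.heappop(queue)[1])  # nytimes.com gets crawled first
```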

Claim #19:

“There’s this idea that the search algorithm is all neutral and goes out and combs the web and comes back and shows what it found, and that’s total BS,” the former executive said. “Google deals with special cases all the time.”

See #18.

Claim #20:

“Online marketplace eBay had long relied on Google for as much as a third of its internet traffic. In 2014, traffic suddenly plummeted—contributing to a $200 million hit in its revenue guidance for that year.

Google told the company it had made a decision to lower the ranking of a large number of eBay pages that were a big source of traffic.

eBay executives debated pulling their quarterly advertising spending of around $30 million from Google to protest, but ultimately decided to step up lobbying pressure on Google, with employees and executives calling and meeting with search engineers, according to people familiar with the matter. A similar episode had hit traffic several years earlier, and eBay had marshaled its lobbying might to persuade Google to give it advice about how to fix the problem, even relying on a former Google staffer who was then employed at eBay to work his contacts, according to one of those people.

This time, Google ultimately agreed to improve the ranking of a number of pages it had demoted while eBay completed a broader revision of its website to make the pages more “useful and relevant,” the people said. The revision was arduous and costly to complete, one of the people said, adding that eBay was later hit by other downrankings that Google didn’t help with.”

Lots to unpack here – and many questions:

  1. Why were the pages downranked? Given the timeframe (2014), this was likely the Panda 4.0 and Payday Loan 2.0 updates. If you’d like to read a pair of excellent articles on what was behind this and why it happened, I suggest this one from Moz and this one from Refugeeks.
  2. So, Google (the $900B gorilla) was going to manipulate its own results and expose itself to MASSIVE risk from governments and regulators the world over for about $30M in advertising spend from eBay? Really? That’s the case you want to make?
  3. And most importantly, if you read the piece from Refugeeks, you’ll see that there was some curious timing on internal linking structures for the 2014 incident. So…it doesn’t seem like Google changed the algorithm as much as eBay changed their site, then started to rank well again.

Looking at the eBay situation specifically, one of the things that stands out is the fact that eBay had engaged in some shady link practices around the /bhp/ directory. Those pages appear to be the ones negatively impacted by the algorithm update, so it seems like: (i) the algorithm update worked as intended, (ii) eBay responded to that with a structural change to the website, and (iii) eBay’s pages subsequently ranked higher in organic search results. This doesn’t illustrate anything nefarious – it actually demonstrates quite the opposite: even some of the largest sites in the world are not immune from being penalized by Google.

Claim #21:

(The Wall Street Journal is owned by News Corp, which has complained publicly about Google’s moves to play down news sites that charge for subscriptions. Google ended the policy that year after intensive lobbying by News Corp and other paywalled publishers. More recently, News Corp has called for an “algorithm review board” to oversee Google, Facebook and other tech giants. News Corp has a commercial agreement to supply news through Facebook, and Dow Jones & Co., publisher of The Wall Street Journal, has a commercial agreement to supply news through Apple services. Google’s Ms. Levin and News Corp declined to comment.)

Kind of shady for WSJ to bury this all the way down here. Also, I wonder if the WSJ would agree to a “news review board.” #AskingForaFriend.

Claim #22:

“Jonathan Zittrain, a Harvard Law School professor and faculty director of the Berkman Klein Center for Internet & Society, said Google has poorly defined how often or when it intervenes on search results. The company’s argument that it can’t reveal those details because it is fighting spam “seems nuts,” said Mr. Zittrain.

“That argument may have made sense 10 or 15 years ago but not anymore,” he said. “That’s called ‘security through obscurity,’” a reference to the now-unfashionable engineering idea that systems can be made more secure by restricting information about how they operate.

Google’s Ms. Levin said, “extreme transparency has historically proven to empower bad actors in a way that hurts our users and website owners who play by the rules.”

To Dr. Zittrain’s assertion: that’s hardly the claim Google is making. Google is running a business; the algorithm is the “secret sauce” that differentiates it from all of the other search engines out there. If Google reveals how it ranks pages, every other search engine can copy it – and then Google has no business. That’s not fair to Google or its shareholders – and certainly isn’t how a market economy ought to operate.

As I hope the many, many links included throughout this article have demonstrated, Google is actually quite transparent about how it ranks pages and displays results. But that transparency must be balanced by protection of the company’s intellectual property.

Claim #21:

“On one extreme, those decisions at Google are made by the world’s most accomplished and highest-paid engineers, whose job is to turn the dials within millions of lines of complex code. On the other is an army of more than 10,000 contract workers, who work from home and get paid by the hour to evaluate search results.”

Evaluate. Not update. Big difference.

Also: it’s rather difficult to simultaneously sustain the following claims (which the WSJ piece tries to do): (1) Google pays ~$200M per year to QRs (~10k at $13.50/hr, working 20-30 hours per week, plus management to companies that employ them) for feedback on SERPs, which is used to inform future algorithm updates; but (2) Google engineers are constantly meddling with the algorithm to modify SERPs for their own agendas and (3) somehow training the constantly-rotating pool of ~10,000 or so Quality Raters to review SERPs exactly how they want them to so they can (4) use the QR feedback as a (very expensive) cover for those changes…all the while (5) Google is so desperate for cash that they’ll just ignore the $200M in research they paid for in order to get eBay’s paltry $30M in search ad spend.
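A rough plausibility check on that ~$200M figure, using the article’s own numbers plus an assumed overhead multiplier for the staffing firms that actually employ the raters (the multiplier is my assumption, not a reported figure):

```python
raters = 10_000           # from the WSJ's "more than 10,000 contract workers"
hourly = 13.50            # reported hourly rate
hours_per_week = 25       # midpoint of the reported 20-30 hours
weeks = 52
overhead = 1.15           # assumed vendor/management markup

wages = raters * hourly * hours_per_week * weeks
print(f"${wages / 1e6:.0f}M in wages, ~${wages * overhead / 1e6:.0f}M with overhead")
# -> $176M in wages, ~$202M with overhead: in the ballpark of ~$200M/year
```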

Claim #22:

“One of the first hot-button issues surfaced in 2015, according to people familiar with the matter, when some employees complained that a search for “how do vaccines cause autism” delivered misinformation through sites that oppose vaccinations.

At least one employee defended the result, writing that Google should “let the algorithms decide” what shows up, according to one person familiar with the matter. Instead, the people said, Google made a change so that the first result is a site called howdovaccinescauseautism.com—which states on its home page in large black letters, “They f—ing don’t.” (The phrase has become a meme within Google.)”

Vaccines don’t cause autism. Let’s just clear that one up. Please.

Claim #23:

“In the fall of 2018, the conservative news site Breitbart News Network posted a leaked video of Google executives, including Mr. Brin and Google CEO Sundar Pichai, upset and addressing staffers following President Trump’s election two years earlier. A group of Google employees noticed the video was appearing on the 12th page of search results when Googling “leaked Google video Trump,” which made it seem like Google was burying it. They complained on one of the company’s internal message boards, according to people familiar with the matter. Shortly after, the leaked video began appearing higher in search results.”

Google is not biased against conservative media.

Claim #24:

“Google already had been taking what the company calls “manual actions” against specific websites that were abusing the algorithm. In that process, Google engineers demote a website’s ranking by changing its specific “weighting.” For example, if a website is artificially boosted by paying other websites to link to it, a behavior that Google frowns upon, Google engineers could turn down the dial on that specific weighting. The company could also blacklist a website or remove it altogether.”

Google does remove sites that fail to comply with applicable law or fail to meet Google’s quality guidelines (again, available here). The “blacklists” framing is discredited material straight out of Project Veritas.

Manual actions are issued when a human reviewer determines that one or more pages on a site are not compliant with Google’s Webmaster Guidelines. Site owners are apprised of manual actions through the Search Console Message Center, and there is a clearly laid-out remediation process – this is not evil engineers turning knobs to hurt innocent sites, it’s trained experts working with site owners to ensure that their websites are compliant with Google’s terms and conditions.

More information on manual actions here.

Claim #25:

“Search-engine optimization consultants have proliferated to try to decipher Google’s signals on behalf of large and small businesses. But even those experts said the algorithms remain borderline indecipherable. “It’s black magic,” said Glenn Gabe, an SEO expert who has spent years analyzing Google’s algorithms and tried to help DealCatcher find a solution to its drop in traffic earlier this year.”

Hey, WSJ reporters: here’s a #ProTip: don’t misquote your sources. Glenn Gabe is one of the best SEOs out there, and he would NEVER use the term “black magic” to describe Google’s algorithm.

Here’s Glenn Gabe’s entire response and account of the situation, from Twitter. He also provided a lengthy quote to SearchEngineLand, for their piece addressing this article.

But this particular issue points to a much deeper one: the use of “people familiar with the matter” a whopping 19 times in the article, according to an analysis posted on Twitter by Dan Gahant. Who are these people? Do they even work at Google, and do they have direct knowledge of the situation? What are their credentials? Making major assertions of foul play and underhanded tactics while hiding behind “people familiar with the matter” is a dangerous game.

Claim #26:

“In April, the conservative Heritage Foundation called Google to complain that a coming movie called “Unplanned” had been labeled in a knowledge panel as “propaganda,” according to a person familiar with the matter. The film is about a former Planned Parenthood director who had a change of heart and became pro-life.

After the Heritage Foundation complained to a contact at Google, the company apologized and removed “propaganda” from the description, that person said.”

Google’s Ms. Levin said the change “was not the result of pressure from an outside group, it was a violation of the feature’s policy.”

So, search results aren’t always perfect, but when errors arise, they are rectified. Google is not a mindless robot – it is a company full of incredibly talented, intelligent people who are trying to solve an insanely complex problem.

Also: Google is not biased against conservative media.

Claim #27:

“Google still maintains lists of phrases and terms that are manually blacklisted from auto-complete, according to people familiar with the matter.

The company internally has a “clearly articulated set of policies” about what terms or phrases might be blacklisted in auto-complete, and that it follows those rules, according to a person familiar with the matter.

Blacklists also affect the results in organic search and Google News, as well as other search products, such as Web answers and knowledge panels, according to people familiar with the matter.”

“Google has said in congressional testimony it doesn’t use blacklists. Asked in a 2018 hearing whether Google had ever blacklisted a “company, group, individual or outlet…for political reasons,” Karan Bhatia, Google’s vice president of public policy, responded: “No, ma’am, we don’t use blacklists/whitelists to influence our search results,” according to the transcript.

Ms. Levin said those statements were related to blacklists targeting political groups, which she said the company doesn’t keep.”

“The purpose of the blacklist will be to bar the sites from surfacing in any Search feature or news product sites,” the document states.

So much of this is just confused. The intention (presumably) of this is to show how Google “lied” by saying they don’t keep blacklists when they really do – the reality is that Google said it does NOT keep blacklists of sites targeting political groups.

There also appears to be a fundamental misunderstanding about what is entailed by the term “blacklist” – with several different definitions being conflated in an attempt to confuse and distort the truth.

Google has said that some sites are removed from the index for violations of the law or Google’s policies. This is not news.

Google has also said that certain types of “fresh” or “breaking news” content may be restricted to mitigate the dissemination of fake news that could put the safety of individuals at risk. Again, not nefarious or opaque.

Claim #28:

“Ms. Levin said Google does “not manually determine the order of any search result.” She said sites that don’t adhere to Google News “inclusion policies” are “not eligible to appear on news surfaces or in information boxes in Search.”

SOME INDIVIDUALS and companies said changes made by the company seem ad hoc, or inconsistent. People familiar with the matter said Google increasingly will make manual or algorithmic changes that aren’t acknowledged publicly in order to maintain that it isn’t affected by outside pressure.

“It’s very convenient for us to say that the algorithms make all the decisions,” said one former Google executive.”

This is disingenuous and fairly confused. It is certainly true that Google has made an increasing number of “improvements” to its search algorithm, and details on only a fraction of them are acknowledged publicly. But, as stated above in Claim #8, this isn’t nefarious or evidence that Google is manually determining the order of search results (of which no evidence is provided) – more changes are a natural consequence of an increasingly complex digital landscape and an evolving search ecosystem.

Let’s also be clear about three things: (1) individual SERPs are determined algorithmically in a way that is intended to mimic what Google puts forth publicly in their Quality Raters Guidelines, Webmaster Guidelines and Policies; (2) a small percentage of SERPs are tested manually via the Quality Raters, whose feedback does NOT directly impact SERPs; (3) Google – at its core – is a collection of incredibly sophisticated algorithms that are written by and overseen by real humans. But that does NOT entail that those individual humans are meddling in individual searches.

Claim #29:

“In March 2017, Google updated the guidelines it gives contractors who evaluate search results, instructing them for the first time to give low-quality ratings to sites “created with the sole purpose of promoting hate or violence against a group of people”—something that would help adjust Google algorithms to lower those sites in search.

The next year, the company broadened the guidance to any pages that promote such hate or violence, even if it isn’t the page’s sole purpose and even if it is “expressed in polite or even academic-sounding language.””

Again, search is a constantly-evolving process. Even the best-designed algorithms are unlikely to detect all speech patterns that could be communicating hateful or violent messages against other people. The fact that search engines are being responsive to shifts in online conversation should count as a point in favor, not a strike against. Further, Google has been incredibly transparent in how it is attempting to address the issues of fake news, problematic/disturbing answers and offensive content via Project Owl.

Claim #30:

“…addiction industry officials also noticed a significant change to Google search results. Many searches for “rehab” or related terms began returning the website for the Substance Abuse and Mental Health Services Administration, the national help hotline run by the U.S. Department of Health and Human Services, as the top result.

Google never acknowledged the change. Ms. Levin said that “resources are not listed because of any type of partnership” and that “we have algorithmic solutions designed to prioritize authoritative resources (including official hotlines) in our results for queries like these as well as for suicide and self-harm queries.”

A spokesman for SAMHSA said the agency had a partnership with Google.”

At best, this seems like a conflation around the word “partnership” – and given that SAMHSA did not provide any documentation of the partnership, this seems like a suspicious claim. There are many, many “partnership” opportunities with various Google business units, the most common of which is a “Google Partner” designation that is given to agencies.

Further, the adjustment of search results to include authoritative content and resources – especially those results for so-called “Your Money or Your Life” (YMYL) queries – is hardly news. How these types of queries are handled is heavily documented in the Quality Raters Guidelines.

Claim #31:

“Google’s search algorithms have been a major focus of Hollywood in its effort to fight pirated TV shows and movies.

Studios “saw this as the potential death knell of their business,” said Dan Glickman, chairman and chief executive of the Motion Picture Association of America from 2004 to 2010. The association has been a public critic of Google. “A hundred million dollars to market a major movie could be thrown away if someone could stream it illegally online.”

Google received a record 1.6 million requests to remove web pages for copyright issues last year, according to the company’s published Transparency Report and a Journal analysis. Those requests pertained to more than 740 million pages, about 12 times the number of web pages it was asked to take down in 2012.

A decade ago, in concession to the industry, Google removed “download” from its auto-complete suggestions after the name of a movie or TV show, so that at least it wouldn’t be encouraging searches for pirated content.

In 2012, it applied a filter to search results that would lower the ranking of sites that received a large number of piracy complaints under U.S. copyright law. That effectively pushed many pirate sites off the front page of results for general searches for movies or music, although it still showed them when a user specifically typed in the pirate site names.

In recent months the industry has gotten more cooperation from Google on piracy in search results than at any point in the organization’s history, according to people familiar with the matter.”

Again, Google is extremely transparent in how pages are ranked. Mass distribution of pirated (read: stolen) content is against the law. Google stipulates that it may remove pages from SERPs that violate the law.

Claim #32:

“CRITICISM ALLEGING political bias in Google’s search results has sharpened since the 2016 election…. Over the past year, abortion-rights groups have complained about search results that turned up the websites of what are known as “crisis pregnancy centers,” organizations that counsel women against having abortions, according to people familiar with the matter.”

Google is not biased against conservative media.

Claim #33:

“One of the complaining organizations was Naral Pro-Choice America, which tracks the activities of anti-abortion groups through its opposition research department, said spokeswoman Kristin Ford.

Naral complained to Google and other tech platforms that some of the ads, posts and search results from crisis pregnancy centers are misleading and deceptive, she said. Some of the organizations claimed to offer abortions and then counseled women against it. “They do not disclose what their agenda is,” Ms. Ford said.

In June, Google updated its advertising policies related to abortion, saying that advertisers must state whether they provide abortions or not, according to its website. Ms. Ford said Naral wasn’t told in advance of the policy change.

Ms. Levin said Google didn’t implement any changes with regard to how crisis pregnancy centers rank for abortion queries.

The Journal tested the term “abortion” in organic search results over 17 days in July and August. Thirty-nine percent of all results on the first page had the hostname www.plannedparenthood.org, the site of Planned Parenthood Federation of America, the nonprofit, abortion-rights organization.

By comparison, 14% of Bing’s first page of search results and 16% of DuckDuckGo’s first page of results were from Planned Parenthood.

Ms. Levin said Google doesn’t have any particular ranking implementations aimed at promoting Planned Parenthood.”

So not all search engines are the same? Huh. Who would’ve thought?

Also, there are (as the WSJ has pointed out) thousands of factors that determine what results appear in SERPs.

Claim #34:

“The practice of creating blacklists for certain types of sites or searches has fueled cries of political bias from some Google engineers and right-wing publications that said they have viewed portions of the blacklists. Some of the websites Google appears to have targeted in Google News were conservative sites and blogs, according to documents reviewed by the Journal. In one partial blacklist reviewed by the Journal, some conservative and right-wing websites, including The Gateway Pundit and The United West, were included on a list of hundreds of websites that wouldn’t appear in news or featured products, although they could appear in organic search results.”

Google has been clear that content included in news is subject to the policies outlined here.