Search query reports, or search terms reports, are a cornerstone of active PPC management in Google Ads & Microsoft Ads search campaigns. Search queries offer insight into our end users' search habits and illustrate how these search terms relate to the keywords that advertisers bid on.
Search queries are not the same thing as the keywords we target.
Rather, search queries are the closest we’ll get to understanding what occurs in our online buyers’ heads. They elucidate what a user is really thinking, complete with their misspellings, ugly grammar, strange attempts to seek out information and more. Through search queries, we’re able to see that moment of truth which results in a click on our ads.
Well, at least that’s how it used to be.
I believe I can speak for my whole profession when I say we were disappointed when Google announced it would limit our access to search term information effective September 1, 2020. And according to early reports, it was worse than we feared, with Seer reporting that 20.4% of queries had gone missing.
This means that Google Ads search advertisers will see fewer search terms in their search terms reports, even when the search term receives a click. Starting back in September 2020, this became the new world order – and there really isn’t much we can do about it.
Simply put, we no longer have full insight into how Google chooses to match our targeted keywords to users’ search queries.
Running a search campaign in Google Ads is often a game of fishing. Paid media advertisers try what we think will work best, test it in the market, and respond based on the data we see. From there, we simply do more of what’s working and less of what’s not.
For example, if we test a keyword like “SEO Company,” we may come to find that “SEO company Chicago” is more successful in terms of our campaign goals. When we go to bolster that better-performing key phrase, we strive to create higher visibility and gain more market share via the more profitable terms.
If a specific keyword variation is working well toward our goal, then we want our ad to show up in the results every single time someone searches “SEO company Chicago.” Our aim is something called “100% impression share,” meaning our ad appears in every eligible auction for that term.
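To put a number on that goal: impression share is simply the impressions your ad received divided by the impressions it was eligible for. A quick sketch of the math (the figures are hypothetical, not from any real account):

```python
# Impression share = impressions received / total eligible impressions.
# All figures below are hypothetical, for illustration only.

def impression_share(impressions: int, eligible_impressions: int) -> float:
    """Return impression share as a percentage."""
    if eligible_impressions == 0:
        return 0.0
    return 100.0 * impressions / eligible_impressions

# Suppose our ad showed 850 times out of 1,000 eligible auctions
# for "SEO company Chicago":
share = impression_share(850, 1_000)
print(f"Impression share: {share:.1f}%")  # → Impression share: 85.0%
```

Reaching 100% means our ad entered (and showed in) every auction we were eligible for – the ceiling we aim at for a proven winner.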
On the flip side, we need to remove our ads from the ad auctions that have little or no value for us.
If we continue with the previous example, we’d want to remove keywords along the lines of “cheap SEO company.” That type of keyword is irrelevant because, truth be told, DTC is not cheap on the spectrum of SEO service costs. As a result, we’d be wasting money on queries entered by unqualified prospects.
There are essentially two ways we use search queries to inform our paid media management: doubling down on the terms that work, and excluding the terms that don’t.
Search query data is a big deal to paid media analysts. It empowers segmentation based on performance data, relative to targeted keywords. To truly understand the impact, we need to turn our attention to the Long Tail.
Written in 2006, “The Long Tail” is a book by Chris Anderson that explores what happens just beyond the “high-volume head” of a demand curve. In the Moz graphic below, you can see how this theory applies to search: the top 100 keywords get millions of monthly searches, while down at the tail of the curve we find the one-time searches.
We typically see this long tail distribution in our search marketing search terms reports. A few variations of a single keyword get most of the impression and click volume, but many one-off search queries make up the long tail of search.
It’s estimated that 16-20% of the daily queries fielded by Google have never been asked before. These rogue queries typically fall within the long tail, which is exactly where we lost visibility with Google’s September 2020 search term purge.
One-off searches are more unpredictable. Counted together, the variations of long tail queries contribute a significant portion of the total search volume an account receives (70%, in fact, according to the Moz graphic – not that we’d want to bid on all of these terms via our paid search campaigns).
However, one-off searches are less likely to reach critical mass, making them a less frequent occurrence in a Google Ads search terms report. This means we’ve lost visibility into which search queries receive clicks – a loss felt most acutely in the long tail.
How many queries that are bad for both your account and budget will now be able to squeeze through?
We ran an analysis on a client with a steady ad spend north of $15K/month to compare visibility before and after Google’s search term purge. Here is what we found:
While our main “head terms” retain much of their data, when we get to the long tail, the impact of Google’s change becomes apparent.
During the summer of 2020, Google Ads allowed us to see 99.5% of the search queries that triggered our ads (we were missing just 18 queries). Fast forward to the fall of 2020, when Google Ads changed search query access for advertisers — we could only see 63.2% of the search queries (we were now missing 1,266 queries).
This loss of visibility disproportionately affected long tail queries, which can be stranger, harder to predict, and thus harder to avoid.
In the summer, 33% of our queries were one-off searches which triggered a single click over a 30 day period. In the fall, we could see only 4.2% of the one-off searches that triggered clicks. That’s a lot of missing one-off clicks that we, as advertisers, have lost insight into.
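For anyone wanting to run this kind of audit on their own account, the visibility math is simple. The query counts below are hypothetical, back-calculated to illustrate the percentages in our example:

```python
# Hypothetical query counts, for illustration of the audit math:
# visibility = queries visible in the report / total queries that triggered ads.

def visibility(visible_queries: int, total_queries: int) -> float:
    """Percentage of ad-triggering queries that appear in the search terms report."""
    return 100.0 * visible_queries / total_queries

summer = visibility(3_582, 3_600)  # 18 queries missing
fall = visibility(2_174, 3_440)    # 1,266 queries missing

print(f"Summer 2020: {summer:.1f}% visible")  # → Summer 2020: 99.5% visible
print(f"Fall 2020: {fall:.1f}% visible")      # → Fall 2020: 63.2% visible
```

The total query count is the part Google no longer hands you directly; you have to infer it by reconciling report totals (clicks, impressions) against the visible rows.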
We cannot have what Google Ads will not give us, so we’re going to have to get creative.
Cue up proactive negative keyword research: proactive negative keyword research applies all the techniques we use to uncover keyword opportunities and turns them on their head, so that we’re looking for all the auxiliary terms that indicate a low-value search term and can proactively exclude them from our campaigns.
The value of proactive negative keyword research has dramatically increased. While we had always performed some level of it before, the frequency for each client now needs to increase.
One of the problems with the search terms report is that it showed low-value keywords only after we’d spent budget on them. While we couldn’t undo the past, we could prevent it from happening again by adding those terms to our negative keyword list.
There are four methods you can use to proactively find and identify good negative keyword candidates, which we’ll now go through in greater detail.
Keywords Everywhere pulls data from Google and displays it conveniently on the Google search results page.
I have been using this Chrome plugin for several years now. There is a paid version, but the free version does the job. It’s great for a quick-and-dirty, on-the-fly keyword review, and it has the added benefit of providing some trending data as well.
You get a helpful trend graph for the search term you used, plus a quick digest of “related keywords” that Google typically provides at the bottom of the page. Additionally, Keywords Everywhere provides a section which details the “people also search for” keywords.
Both resources are excellent to consult in your search for additional negative keywords, and you might even find a few new targeting ideas while you’re at it.
Google can be your friend when thinking through terms related to your search query.
For any search you conduct on Google, there will be “related” search terms featured at the bottom of the page. This is exactly where the Keywords Everywhere tool grabs its information, as detailed in method 1.
Other tools also scrape this information (UberSuggest, KeywordTool.io, etc.), but I like using Google by hand best. It can be time consuming, but you’ll catch a lot more related search terms than you would by combing a pre-populated list.
Just look at how Google’s suggested queries change based on the subtlety of my search. By placing an “a” in front of my search terms, Google retrieves different suggestions.
You can find radically different results based on the placement of your letter (before your head term, between a two-word phrase, or trailing your main query) – and you can repeat this for every letter of the alphabet, or any number.
The objective here is to explore the unknown – to be open to not being able to predict every iteration and find new opportunities you wouldn’t normally consider.
The other option you have at your fingertips while running A-Z manually on Google is to pull on each interesting thread you find and take it further.
For example, with “designer handbag,” I may want to explore the queries connected to the modifier “affordable.” By entering the modifier and clicking through to explore the related terms, you expand your potential keyword universe.
You can find many new keyword opportunities (both positive and negative) through Google Suggest.
However, once your search gets super long tail, this method is more effective at finding proactive negative keywords due to Google’s “low search volume” keyword designation: you can always block these types of searches through negative keywords, but you may not be able to explicitly target these precise variations.
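If you’d rather not type every variation by hand, the permutations themselves are easy to generate. Here is a small sketch (my own helper, not part of any tool mentioned here) that produces every a-z placement – leading, between words, and trailing – for a seed phrase:

```python
import string

def suggest_seeds(phrase: str) -> list[str]:
    """Generate a-z letter-placement permutations of a seed phrase:
    one leading, one between each pair of words, and one trailing."""
    words = phrase.split()
    seeds = []
    for letter in string.ascii_lowercase:
        seeds.append(f"{letter} {phrase}")          # before the head term
        for i in range(1, len(words)):              # between each word pair
            seeds.append(" ".join(words[:i] + [letter] + words[i:]))
        seeds.append(f"{phrase} {letter}")          # trailing the main query
    return seeds

seeds = suggest_seeds("designer handbag")
print(seeds[:3])  # → ['a designer handbag', 'designer a handbag', 'designer handbag a']
```

Each generated seed can then be typed into Google (or fed to an autocomplete tool) to surface its suggestions for review.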
At DTC, we use both SEMrush and SpyFu, each of which requires its own paid license. These tools work by keeping a database of monitored search terms and websites, tracking who shows up for what, and when. It’s not official Google data, but it’s directionally helpful.
They are, however, incomplete. If these tools do not have a record of a specific search term in their own unique database, they will not report data on the term in question.
If you have access to these tools, you can enter your company’s website and see the search terms each tool believes your site appears for. Since you also have Google’s actual data in front of you, you can easily compare your targeting and intent against what these tools report from their databases.
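That comparison boils down to a set difference. A hypothetical sketch of the reconciliation (the terms below are invented for illustration):

```python
# Hypothetical data: terms a third-party tool associates with your site
# vs. the terms actually shown in your Google Ads search terms report.
tool_terms = {"seo company chicago", "cheap seo company", "seo audit", "dr seo chicago"}
google_ads_terms = {"seo company chicago", "seo audit"}

# Terms the tool says you appear for, but that never show in your report –
# candidates worth reviewing as negative keywords:
unreported = sorted(tool_terms - google_ads_terms)
print(unreported)  # → ['cheap seo company', 'dr seo chicago']
```

Anything in the `unreported` bucket deserves a manual look: some entries are tool noise, but others are exactly the click-less terms described below.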
Google has historically only shown search terms that yielded a click. As a result, this has long been a favorite method for finding terms that trigger ads but don’t leave traces in the Google Ads search terms report – AKA the bad terms our ads may appear for but never get clicks on. If a term is bad, let’s remove it and ensure our budgets don’t get siphoned off this way.
When I looked at our agency’s Google Ads account, I found that our ads showed up for “Dr. Jennifer Seo Chicago.” This was a good candidate to enlist as a negative keyword. Seo is a popular last name – and we can see how the user’s need changes based on the additional context. “SEO Chicago” and “Dr. Seo Chicago” are worlds apart.
We added medical terminology like “dr” and “doctor” to our negative keyword list. As a result, our negative keywords will block all future “doctor x SEO” searches from triggering our ads.
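Conceptually, a single-word negative keyword blocks any query that contains that word. A rough sketch of that matching behavior (the queries are hypothetical, and real Google Ads negative matching has more nuance – for instance, close variants of negatives are not blocked):

```python
# Our single-word negative keyword list (hypothetical illustration).
NEGATIVES = {"dr", "doctor"}

def is_blocked(query: str) -> bool:
    """True if the query contains any of our single-word negatives."""
    return bool(NEGATIVES & set(query.lower().split()))

print(is_blocked("dr jennifer seo chicago"))  # → True
print(is_blocked("seo company chicago"))      # → False
```

This word-level behavior is why one short negative like “dr” can fence off an entire family of irrelevant searches.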
While there are some major differences between Google and Bing, Bing Ads still gives you more data in their search terms reports than Google currently does.
When we looked at the same client (over the same periods as our previous example), we found that Microsoft Ads retrieved 87.5% of our total query data even after Google’s search term purge.
While Microsoft Ads does not have the volume of queries of Google Ads, both capture search terms based on user intent. Because of this similarity in search behavior, when we identify bad matches in Bing, we should almost always add those terms to our Google Ads account as well.
There will not be as much data for review, but it appears Microsoft will give us more info on long tail searches which result in ad clicks… at least for now.
Note: Since Microsoft has an established habit of mimicking each change on the Google Ads end, I highly recommend consulting your Microsoft Ads search terms reports while you can. I wouldn’t be surprised if Microsoft announced a search terms purge of their own.
Reactive negative keyword work will keep going. While we didn’t lose insight into everything, our reactive negative keyword management will go slower and operate at a smaller scale than it did previously. We can still mine queries and restructure accounts based on what is and isn’t working, but Google’s changes cost us valuable insights into user search intent.
In paid search management, we will have to apply additional proactive research techniques more frequently to ward off unprofitable search terms for our clients. As things change within the verticals we work in, it is increasingly important to periodically audit the state of our negative keyword lists.
Search is a living thing – world events affect what people desire and influence how they search – so staying atop those changes is an ongoing part of managing paid search advertising accounts. At the end of the day, we want to do everything in our power to make sure our message gets in front of the right audience. Hasn’t that always been the job of marketing?