Negative keyword strategies for bad traffic segments in Google Ads

We’ll see more and more inefficient search patterns and wonder how our budgets disappeared so quickly. However, there are ways to outsmart Google and deal with close variants.

Google wants to drive even more traffic to your keywords by matching search queries that wouldn’t have been triggered in the past. It accomplishes this through the way keyword match types work: close variants have been introduced, and exact match isn’t exact anymore. This gives Google significant power, since it alone controls what “close variant” means.

For us marketers, it means we’re going to see more and more inefficient search patterns that we have to handle somehow. Here are effective strategies for striking back.


    Monitor closely how Google matches search queries with keywords

    There will be changes under the hood, so we need a way to monitor the system. Here are some metrics worth tracking:

    • The impression share of close variants over time
    • The number of unique search queries over time
    • The number of unique single words over time
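
    To make this concrete, here is a minimal monitoring sketch in Python with pandas. It assumes a daily search terms export as CSV; the column names (“date”, “search_term”, “match_type”, “impressions”) are illustrative and have to be mapped to your actual report.

        import pandas as pd

        # Daily search terms export; column names are illustrative.
        df = pd.read_csv("search_terms_report.csv", parse_dates=["date"])

        # Metric 1: unique search queries per day.
        daily = df.groupby("date").agg(unique_queries=("search_term", "nunique"))

        # Metric 2: unique single words per day.
        words = df.assign(word=df["search_term"].str.lower().str.split()).explode("word")
        daily["unique_words"] = words.groupby("date")["word"].nunique()

        # Metric 3: impression share of close variants (match types such as
        # "exact (close variant)" contain the phrase "close variant").
        is_cv = df["match_type"].str.contains("close variant", case=False, na=False)
        daily["close_variant_share"] = (
            df[is_cv].groupby("date")["impressions"].sum()
            / df.groupby("date")["impressions"].sum()
        ).fillna(0)

        # Plot or alert on sudden jumps in any of the three series.
        print(daily)

    A sudden jump in any of these series, without a corresponding change in the account, is exactly the signal to watch for.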

    We observed that the number of unique search queries and words exploded from one day to the next, without any change in the account. This is the main driver of the increasing number of clicks. SEO folks will recognize the timing: Google rolled out a core update in January 2020.

    If you’ve made similar observations for your PPC accounts, please share them.

    In the weeks that followed, we looked for and eliminated the new “noise” that Google was adding to our traffic.

    How can you identify the “noise” in your search queries?

    It’s a bad idea to set negative keywords based on complete search queries, for several reasons:

    • The sample size is very low for most search queries. This means that most bad search patterns are still hidden.
    • If you apply negative keywords to entire search queries, similar search queries may still be active.
    • If you do this at the query level, you’ll eventually run out of negatives (Google limits the number of shared negative keywords).

    A superior approach is to transform the search queries into n-grams. Compared to looking at full search queries, this gives you a larger sample size for bad patterns that were previously hidden. Another benefit: a single n-gram negative also blocks many unknown future search queries that contain the same bad pattern. Use 1-grams for your negatives if possible – if you need more granular negatives, use 2-grams. A minimal sketch of the transformation follows below.

    Google constantly adds “noise” to the traffic. A very efficient way to identify the noise is transforming the search queries into n-grams.
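
    Here is a minimal sketch of that transformation, again in pandas and with illustrative column names:

        import pandas as pd

        def ngrams(text, n):
            """Return all n-grams of a query as space-joined strings."""
            tokens = text.lower().split()
            return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

        df = pd.read_csv("search_terms_report.csv")

        # Explode each query into its 1-grams; use n=2 for 2-grams.
        grams = df.assign(ngram=df["search_term"].map(lambda q: ngrams(q, 1))).explode("ngram")

        # Aggregate performance per n-gram: the sample size per pattern
        # is now much larger than per full search query.
        stats = (grams.groupby("ngram")
                      .agg(clicks=("clicks", "sum"),
                           cost=("cost", "sum"),
                           conversions=("conversions", "sum"))
                      .sort_values("cost", ascending=False))

        # High-spend n-grams with zero conversions are candidates for
        # the negative keyword list.
        print(stats.head(20))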


    Even using this approach, there will still be thousands of words with a small sample size. This is getting worse as Google does more and more “smart matching,” and you can bet there will be a lot of noise in the process.

    Here are some approaches you can use to discover more negatives (a combined sketch follows after the list):

    • Use stemming algorithms to reduce each 1-gram to its base form. This lets you identify other word forms that appear in search queries with too little click data of their own.
      “cheapest” → “cheap”
      “cheaper” → “cheap”
      “cheap” as a standalone word, for example, has enough sample data to be classified as bad. Using the stemming approach, we can easily identify other forms and classify them as negative as well.
      I use Snowball and Porter stemming for this task.
    • Use distance functions (e.g., Levenshtein) to identify misspellings such as “chaep” or “cheep” and flag them as negative.
    • Use semantic similarity to find similar words to “cheap” like “budget” or “free.” I use a model based on Google’s word2vec to find similar terms.
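
    The sketch below combines all three techniques. It assumes NLTK is installed; “bad_words” and “candidates” are toy data, and the word2vec step additionally assumes a pretrained model file is available.

        from nltk.stem import SnowballStemmer

        bad_words = {"cheap"}  # 1-grams with enough data to classify as bad
        candidates = ["cheapest", "cheaper", "chaep", "cheep", "budget"]

        # 1) Stemming: flag words whose stem equals the stem of a known
        #    bad word (nltk's PorterStemmer can be swapped in the same way).
        stemmer = SnowballStemmer("english")
        bad_stems = {stemmer.stem(w) for w in bad_words}
        via_stem = [w for w in candidates if stemmer.stem(w) in bad_stems]

        # 2) Edit distance: flag likely misspellings of known bad words.
        def levenshtein(a, b):
            """Classic dynamic-programming edit distance."""
            prev = list(range(len(b) + 1))
            for i, ca in enumerate(a, 1):
                curr = [i]
                for j, cb in enumerate(b, 1):
                    curr.append(min(prev[j] + 1,                # deletion
                                    curr[j - 1] + 1,            # insertion
                                    prev[j - 1] + (ca != cb)))  # substitution
                prev = curr
            return prev[-1]

        # "chaep" and "cheep" are within distance 2 of "cheap".
        via_distance = [w for w in candidates
                        if any(levenshtein(w, b) <= 2 for b in bad_words)]

        # 3) Semantic similarity: nearest neighbours of a bad word in a
        #    pretrained word2vec model (the model file is an assumption).
        # from gensim.models import KeyedVectors
        # vectors = KeyedVectors.load_word2vec_format("vectors.bin", binary=True)
        # via_semantics = [w for w, _ in vectors.most_similar("cheap", topn=20)]

        print(via_stem, via_distance)

    Whatever a technique flags is a candidate, not a verdict – a quick manual review before adding negatives avoids blocking good traffic.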

    Put it all together in a data-driven process

    Every day there are new, unseen search queries. For us marketers, that means we have to search for negatives constantly. I use performance-based rules at the n-gram level to alert me when new patterns emerge that may be suitable for the negative keyword list.

    In addition, there is a second process: I check whether new, different variants of already blocked n-gram patterns have appeared (see the sketch below).
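
    A minimal sketch of both processes, reusing the per-n-gram “stats” table from the sketch above (the thresholds are illustrative, not a recommendation):

        from nltk.stem import SnowballStemmer

        blocked = {"cheap", "free"}  # 1-gram patterns already set as negatives

        # "stats" is the per-n-gram performance table built earlier.
        ngram_stats = stats.reset_index()  # columns: ngram, clicks, cost, conversions

        # Process 1: alert on new patterns that burn clicks without converting.
        alerts = ngram_stats[(ngram_stats["clicks"] >= 50)
                             & (ngram_stats["conversions"] == 0)]

        # Process 2: surface new variants of already blocked patterns, here
        # via the stemming trick (distance and similarity work the same way).
        stemmer = SnowballStemmer("english")
        blocked_stems = {stemmer.stem(b) for b in blocked}
        is_variant = ngram_stats["ngram"].map(lambda g: stemmer.stem(g) in blocked_stems)
        new_variants = ngram_stats[is_variant & ~ngram_stats["ngram"].isin(blocked)]

        print(alerts["ngram"].tolist())
        print(new_variants["ngram"].tolist())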

    All in all, this saves you a lot of money and prepares you for the next change in Google’s matching logic.

    Key takeaways

    🔵 Google’s close variants lead to more traffic for your keywords because they match search queries that wouldn’t have been triggered in the past.

    🔵 It’s a bad idea to set negative keywords based on complete search queries. A superior approach is to transform search queries into n-grams.

    🔵 Use stemming algorithms, distance functions, and semantic similarity for better results.

    Maximize your PPC strategy

    Are you struggling with handling the “noise” in your search queries? Our team can help you develop efficient strategies to outsmart Google’s close variants. Let us guide you. Contact us today.
