
Thursday, October 23, 2014

Google Panda and the High Risk of Using Aggressive or Deceptive Advertising


By Glenn Gabe
In my previous posts about Panda, I’ve hammered one important point over and over again. User engagement is critically important. If users are showing low engagement, and yielding low dwell time, then that’s a giant invitation to the mighty Panda. So, when conducting Panda audits, I keep a keen eye on factors that can negatively impact engagement, present obstacles for users, and create virtual bamboo.
One consistent problem I have seen while analyzing Panda hits has been aggressive and deceptive advertising. And I’ve seen that much more since Panda 4.0 (including P4.1, which just rolled out on September 23).
Specifically, sites employing deceptive or aggressive advertising are facing big problems Panda-wise: for example, sites that trick users into clicking affiliate links, blend ads with content, or pile on low-quality supplemental content. In addition, I noticed a number of sites impacted by both 4.0 and 4.1 that heavily used pagination to break up articles into many component pages (to gain more ad impressions). And I’m not talking about two or three pages of pagination. I’m talking about 10, 20, or even 30 pages of pagination. Yes, I can feel you cringe as you read that. I did, too.

The Traffic Monetization Catch

So, when Panda focuses on user happiness, it’s not hard to see why sites employing deceptive tactics like what I mentioned above would have a hard time battling the mighty Panda. But you might be wondering why those sites would employ such risky tactics (especially when our furry black and white friend is actively roaming the Web). There’s an easy answer. Money.
With larger-scale websites, there are typically multiple teams working together. And I use "together" loosely here. You have the marketing team, content team, dev team, design team, etc. And of course, if the purpose of the website is to make money, you have the monetization team (or ad team).
Advertising-wise, as traffic climbs the ad team sees the potential of boosting revenue. And that’s totally fine. I get it…companies need to make money. But in my opinion, some ad teams have been too aggressive and have caused situations that heavily contributed to Panda attacks.
Like this one. Notice the giant bamboo slide to no traffic on May 20 (Panda 4.0):
[Image: panda-advertising-traffic-drop]
And there’s the catch. The marketing team drives traffic. The ad team monetizes that traffic. And they often don’t see eye to eye. Part of the problem is SEO education, and part of the problem includes financial goals. Sure, everyone has goals and the ad team has their own. But that can lead to aggressive ad tactics that put websites at risk.

Let’s Run Some Numbers

Hypothetically speaking, let’s say a website is generating $200,000 in revenue per month via advertising and affiliate relationships. But let’s say the site is employing overactive ad tactics like many full-screen floating ads, blended ads, low-quality supplemental content to third-party sites, masked affiliate links, etc. Panda 4.1 rolls out and kicks the website in the gut and it loses 70 percent of its traffic. By the way, I’ve had a number of companies reach out to me with severe hits like that. I even had one company lose 90 percent of its traffic overnight with Panda 4.1.
The site that was generating $200,000 per month could lose $140,000 per month in advertising revenue due to the Panda hit. If that’s the case, then it would be left with only 30 percent of its original $200,000, which is just $60,000. Wow, that’s a huge loss, right? I’ve seen this scenario many times during my Panda work (to various levels). It’s ugly and causes massive amounts of stress for everyone involved.
[Image: panda-advertising-drop]
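To put hard numbers on that scenario, here is a minimal sketch of the arithmetic. The figures are the hypothetical ones from above, and it assumes ad revenue scales roughly linearly with traffic, which is a simplification:

```typescript
// The hypothetical scenario above, as simple arithmetic. Assumes ad
// revenue scales roughly linearly with traffic -- a simplification.
const monthlyAdRevenue = 200_000; // USD per month before the Panda hit
const trafficLossRate = 0.7;      // 70% of traffic gone overnight

const lostRevenue = monthlyAdRevenue * trafficLossRate; // $140,000 / month
const remaining = monthlyAdRevenue - lostRevenue;       // $60,000 / month

console.log(`Lost: $${lostRevenue.toLocaleString()} per month`);
console.log(`Remaining: $${remaining.toLocaleString()} per month`);
```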

The Sinister Surge Doesn’t Help… That’s Why It’s "Sinister"

Another phenomenon that upsets the Panda balance is the sinister surge in traffic prior to an algorithm hit. I wrote about this disturbing situation after seeing it many times since February of 2011 when Panda first rolled out.
Google ends up dishing out more and more organic search traffic, even when there are problems on the site engagement-wise. That means that Google is getting even more engagement data during the surge, even when the site has serious problems. And if Google sees unhappy visitors in aggregate, then Panda can stomp all over the site. I’ve seen it a thousand times.

Warning: Important Point Ahead…Pay Attention

So, you have a surge in visits from Google organic and many of those users are experiencing deceptive or aggressive ad tactics. Both marketers and the ad team often mistakenly believe the surge is a good thing, since they aren’t familiar with Panda. Then boom, the wave crashes, and takes a huge portion of those visits with it (including ad revenue). Then you’re left with serious questions, stress, and confusion. And all of this can happen overnight by the way. Not good.
[Image: panda-advertising-surge]

Advertising Problems and Panda - What I’ve Seen

While helping Panda victims, I’ve come across some glaring advertising issues that cause serious engagement problems. I thought it would be important to list some of them below so you can better understand what I’m referring to. I already mentioned a few above, but I’ll list them here for clarity. Note: these are not the only problematic ad tactics being employed across the Web. They are simply some of the most common issues I have come across.

Full-Screen Floating Ads (aka Overlay Ads)

If you are employing full-screen ads that take over a user’s entire browser window, then you need to understand a few things from a Panda standpoint. Users hate them, so be very careful when you trigger full-screen floating ads and how often you employ them per session. The more people that get annoyed by takeover ads and then jump back to the search results, the more bamboo you are building. Engagement drops, dwell time is low, and you are sending horrible signals to Google about user happiness.
A mockup of an ad overlay:
[Image: panda-advertising-overlay2]
If you do employ full-screen floating ads, then make sure users can exit out of the takeover and that it’s very clear how they can exit. During some audits, I found myself extremely frustrated being forced to watch a full-screen ad (which I would never normally do, by the way). Full-screen ads that literally take over my screen, don’t let me exit, etc. annoy the heck out of me. And many others feel the same way.
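If removing takeovers entirely isn’t on the table, you can at least cap their frequency and make the exit unmistakable. Here is a minimal browser-side sketch; the maybeShowOverlay function and the storage key are hypothetical names for illustration:

```typescript
// Sketch: show the full-screen overlay at most once per browser session,
// and always give the user an obvious way to close it.
const OVERLAY_SHOWN_KEY = "overlay-shown"; // hypothetical storage key

function maybeShowOverlay(overlay: HTMLElement): void {
  // Frequency cap: skip the takeover if this session has already seen it.
  if (sessionStorage.getItem(OVERLAY_SHOWN_KEY) === "1") return;
  sessionStorage.setItem(OVERLAY_SHOWN_KEY, "1");

  overlay.style.display = "block";

  // A prominent, clearly labeled close button -- never hide or disguise it.
  const closeButton = document.createElement("button");
  closeButton.textContent = "Close ✕";
  closeButton.addEventListener("click", () => {
    overlay.style.display = "none";
  });
  overlay.appendChild(closeButton);
}
```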

Auto-Play Video Ads (or Audio Ads)

There’s nothing like hitting a Web page for the first time and immediately seeing a video trigger with audio. Most users frantically try to pause the video or at least mute the audio. I’ve seen ads like these on many Panda victim websites.
And there are times that I’ve seen multiple video ads on one page, all of which started playing at once! I wish I had video of myself trying to find, and then pause, multiple video ads at one time. Needless to say, employing autoplaying video or audio ads can kill engagement.
An example of an autoplaying video ad, plus other serious ad problems:
[Image: panda-advertising-video3]
My recommendation is to make sure users trigger the video and/or audio. Do not autoplay those ads. Again, think about the user and what will drive strong engagement.
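As a rough illustration, here is a sketch of wiring a video ad so it plays only on an explicit click. The .video-ad selector is a hypothetical placeholder for your own markup:

```typescript
// Sketch: require an explicit click before a video ad plays.
// Assumes a <video> element already on the page; selector is hypothetical.
const videoAd = document.querySelector<HTMLVideoElement>(".video-ad");

if (videoAd) {
  videoAd.autoplay = false; // never start on page load
  videoAd.muted = true;     // even user-triggered playback starts muted

  const playButton = document.createElement("button");
  playButton.textContent = "▶ Play video";
  playButton.addEventListener("click", () => {
    videoAd.muted = false;
    void videoAd.play(); // play() returns a Promise in modern browsers
  });
  videoAd.insertAdjacentElement("beforebegin", playButton);
}
```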

Roadblocks (Interstitials)

A roadblock is similar to a full-screen ad, but often redirects you to a different URL where an ad is displayed (in between page visits or even before the first page a user visits). Not only does this completely interrupt the user experience, but you are sending users to a different URL automatically. Upon experiencing a roadblock ad, many users frantically try to return to the page they were on or to get through to the destination page. Roadblocks tend to anger a lot of people.
[Image: panda-advertising-roadblock2]
If you are using interstitial ads, I can tell you that a distinct portion of your traffic is not enjoying the roadblocks you have in place. And there’s a chance that many of those users are popping back to the SERPs. And as I’ve mentioned before, low dwell time is something you want to avoid.

Blending of Ads With Content

During Panda audits, I have seen affiliate links and ads cloaked as content. They match the content so well in design, color, etc., that it’s hard to tell they are ads. But when you click them, you sure know they are…
Being transported to some random third-party site is not exactly what I had in mind after searching for a product, service, or solution. And some of those third-party sites are aggressive with their own tactics (and some even have malware problems, viruses, risky downloads, and more).
[Image: panda-advertising-blended2]
"Hell hath no fury like a user scorned." If you deceive users into clicking ads, then it will come back to bite you. And a Panda bite is worse than your typical animal bite. The pain can last for months (or longer). Do the right thing. Don’t deceive users. Stay out of the gray area of Panda.

Heavy Pagination (for Ad Impressions)

This isn’t as much deception as it is just a horrible user experience. Many publishers charge per impression (typically a CPM, or cost-per-thousand impressions). So, if you have 1 million impressions per day, breaking up articles into smaller pieces across a paginated set could yield 10 to 20 times the number of impressions. The ad team might run the numbers and push to do this.
And I’m here to tell you that excessive pagination can drive users crazy, while also yielding horrible engagement signals. I’ve seen the use of heavy pagination a lot during Panda work (and I’ve seen a serious uptick in sites employing this tactic get hit during Panda 4.0 and 4.1). I’m not sure if that signal was added to Panda recently, but I saw it a lot during my analysis.
38 pages of pagination:
[Image: panda-advertising-pagination3]
And it contains a "view all" page, which would be great if the site didn’t force me to register to see it…
[Image: panda-advertising-pagination3-register]
As a quick example, I’ve been helping a company that got pummeled by Panda (losing more than 60 percent of its traffic overnight). Upon analyzing the site, I noticed they were breaking up their articles into many small pieces (sometimes 10, 20, or 30 or more component pages). On desktop, it was painful to go through an article. Each component page only housed a paragraph or two of content. Then I had to click through to the next page, which of course loaded more ads. But desktop was a breeze compared to mobile. Trying to click through 30 component pages on your mobile phone will literally drive you insane…It was a horrible user experience.

Excessive Pagination - Possible Solutions

Each website is different, and there are several ways to tackle excessive pagination. You could simply migrate all content to one page (the best solution SEO-wise). You could also add a "view all" page and set that up properly SEO-wise – and not force people to register to see it! Then Google would surface that page in the SERPs. And then of course, you could add more content per component page and cut the pagination down by 50 to 75 percent. That’s not the best scenario, but better than providing 20 or 30 pages of pagination.
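As a rough sketch of the "view all" approach, each component page can point its canonical at the complete article so Google consolidates ranking signals there. The URL pattern and function name below are hypothetical:

```typescript
// Sketch: each component page of a paginated article points its canonical
// at the "view all" page, so Google can consolidate ranking signals there.
// The URL pattern and function name are hypothetical.
function canonicalTagFor(articleSlug: string): string {
  const base = `https://www.example.com/articles/${articleSlug}`;
  // The view-all page must be publicly readable -- no registration wall.
  return `<link rel="canonical" href="${base}/view-all">`;
}

// Emitted in the <head> of each component page, for example:
console.log(canonicalTagFor("panda-recovery"));
// -> <link rel="canonical" href="https://www.example.com/articles/panda-recovery/view-all">
```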

Low-Quality Supplementary Content

Supplementary content (a term Google uses in its Quality Rater Guidelines) is any additional content on your Web pages that is neither the core content of the page nor ads. For example, you might provide related articles, your right sidebar probably contains a lot of supplemental content, and you might employ content syndication links from Outbrain, Taboola, and others. And of course, some sites are stacking several content partners on their pages (adding even more supplementary content).
You need to be very careful with the quality of supplementary content and the amount of that content included on your website. Many users don’t know where that content will take them, and they are inherently trusting that clicking those links will be OK. But in reality, some of those links lead to ultra-low-quality pages. I’ve come across many examples of heavy sales landing pages, irrelevant content (based on the original article being viewed), and even some sites with malware and risky downloads.
And as mentioned earlier, supplementary content has made its way into Google’s quality rater guidelines. So yes, this is on Google’s radar for sure. Always think about your users, where you are sending them, and what type of experience they will have. If you can’t guarantee a positive experience, then don’t do it.
An example of supplementary content. Can you tell which links are external vs. internal?
[Image: panda-advertising-content-ads2]

Fixing Advertising Problems After a Panda Hit

Once ad problems are identified, the solution is clear from my standpoint. Companies hit by Panda need to significantly cut back on their aggressive ad tactics. That means removing roadblocks, cutting down full-screen takeover ads, removing blocks of low-quality supplementary content, removing deceptive blocks of advertising, and more.
I explain to clients that they need to do this quickly, so users can start sending positive engagement signals to Google. I also make it clear that this can take a while (months). Some clients move fast to follow my recommendations, and they can often see recovery in a quicker timeframe. But then there are the companies that experience a civil war over advertising strategy.
For example, some ad teams might have already sold deals that they need to honor. But the problem is that there’s no traffic. So the ad team wants to monetize the remaining traffic even more. The marketing team, typically guided by me, now understands Panda, how severe it can be, and how long recovery can take. They want to recover quickly, so they are ready to take action.
In my opinion, Band-Aids are not a long-term Panda recovery plan. Temporary recoveries can happen (as I documented in a recent case study). Avoid the Panda rollercoaster by making significant changes based on an audit. That’s how you avoid subsequent Panda visits.

A Final Note About Panda Recovery and Ad Tactics

When clients recover from Panda, I’m quick to explain a few key points. First, now is not the time to turn back on the ad fire hose! As I explained above, I have seen temporary recoveries. Panda rolls out frequently and if you add the problems back to your site that got you hit in the first place, then you are asking to be hit again. Panda is about long-term quality changes to your site. Don’t revert back to aggressive advertising tactics once you see a surge in traffic.
Second, now is also not the time to stop working on Panda remediation. My advice is to act like the recovery didn’t happen yet. Keep driving forward to fix the problems that were surfaced during the Panda audit. There’s an inherent gray area to Panda (and all algorithms). You want to get as far out of the gray area as possible. If you barely cross the threshold, you can get hit again. I’ve had companies reach out to me with rollercoaster Panda trending over the years. It’s maddening. Avoid that at all costs.

Summary: Understand Your Ad Problems… Because Panda Does

There’s a fine balance between simply providing advertising on your site and annoying the heck out of users to the point of insanity. From a Panda standpoint, it’s critically important that you don’t cause serious user engagement issues by employing aggressive or deceptive ad tactics. If you do, users will be unhappy, they will bounce off your site back to the SERPs, low dwell time will ensue, and Google will pick this up. And that’s a recipe for SEO disaster. Always think about user engagement. Panda does.

Wednesday, May 21, 2014

The Link Graph Conundrum: Why Citations Remain Critical to SEO Survival


By Eric Enge
It's a popularly held belief that the link graph is broken. This post will explore the roots of the problem, and why it is such a tough problem for Google and Bing to resolve.
The Link Graph: Still Alive and Kicking
It all starts with the original Larry Page and Sergey Brin thesis. At the time they were developing this concept, the leading search engines were almost solely dependent on keyword analysis of the content on your page to determine rankings. Spammers had so thoroughly assaulted this model that change had become an imperative, lest the concept of a search engine go the way of the dinosaurs.
Here are a couple of key sentences at the beginning of the thesis:
The citation (link) graph of the web is an important resource that has largely gone unused in existing web search engines. We have created maps containing as many as 518 million of these hyperlinks, a significant sample of the total. These maps allow rapid calculation of a web page's "PageRank", an objective measure of its citation importance that corresponds well with people's subjective idea of importance. Because of this correspondence, PageRank is an excellent way to prioritize the results of web keyword searches.
The concept of a "citation" (bolding above was mine, for emphasis) is a critical one. To understand why, let's step away from the web and consider the example of an academic research paper, which might include citations that look like this:
[Image: Academic Citations]
The writer of the paper normally chooses what appears in this list to acknowledge major sources referenced during the creation of the paper. If you did a study of all the papers in a given topic area, you could fairly easily identify the most important ones, because they would have the most citations (votes) from other papers.
Using a technique like the PageRank algorithm, you could build a citation graph where these "votes" are not all counted equally (e.g., a vote from a paper that itself has many citations counts for more than a vote from a rarely cited paper). And, just like the PageRank algorithm, you could apply the calculation recursively to identify the most important papers; a minimal code sketch follows the list below. The reasons this works well in the academic citation environment are:
  1. Small Scale: The number of papers in a given academic space is reasonably finite. You might have hundreds, or thousands, of documents, not millions.
  2. No Incentive to Spam: You can't really buy a citation placement in an academic paper. If you were the author of a paper and had some illogical references in your citations, the perceived authority of your own paper would be negatively impacted.
  3. Small Communities: In a given area of academic research, all the major players know each other. Strange, out-of-place behavior stands out in a way that it doesn't in an open, chaotic environment like the web.
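Here is a minimal sketch of the PageRank idea over a tiny citation graph. It is illustrative only; real implementations handle dangling nodes, convergence checks, and vastly larger graphs:

```typescript
// A minimal PageRank sketch over a tiny citation graph.
type Graph = Record<string, string[]>; // node -> list of nodes it cites

function pageRank(
  graph: Graph,
  damping = 0.85,
  iterations = 50
): Record<string, number> {
  const nodes = Object.keys(graph);
  const n = nodes.length;

  // Start with rank spread evenly across all papers.
  let rank: Record<string, number> = {};
  for (const node of nodes) rank[node] = 1 / n;

  for (let i = 0; i < iterations; i++) {
    const next: Record<string, number> = {};
    for (const node of nodes) next[node] = (1 - damping) / n;

    for (const node of nodes) {
      const cited = graph[node];
      if (cited.length === 0) continue; // this sketch ignores dangling nodes
      // Each citation passes on a share of the citing paper's own rank,
      // so a vote from a well-cited paper is worth more.
      const share = (damping * rank[node]) / cited.length;
      for (const target of cited) next[target] += share;
    }
    rank = next;
  }
  return rank;
}

// Paper C is cited by both A and B, so it accumulates the most rank.
const papers: Graph = { A: ["C"], B: ["C"], C: ["A"] };
console.log(pageRank(papers));
```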

Citations and the Web

At the time of the Page-Brin thesis, the spammers of the world were attacking search engines using a variety of keyword stuffing techniques. Practical implementation of a link-based algorithm was a revelation, and it had a huge impact very quickly. The spammers of the world had not yet figured out how to assault the link based model.
As Google gained traction, this changed. Link buying and selling, large scale link swapping, blog and forum comment stuffing, and simply building huge sites and placing site-wide links on them were some of the many tactics that emerged.
Fast forward to 2014 and it appears that Google has partially won this battle. The reason we can say that they have partially won is that these days almost no one publishes articles in support of spammy link building tactics.
[Image: Unhappy Spammers]
In fact, the concept of link building itself has been replaced with content marketing, which the overwhelming majority of people position as being about building reputation and visibility. This has happened because Google has gotten good at detecting enough of the spammers out there that the risks of getting caught are quite high. No business with investors or employees can afford to invest in spammy techniques because the downside risks aren't acceptable.
On the other hand, if you spend enough time studying search results, you can easily find many example sites that use really bad link building practices ranking high in the search results for some terms. If you're playing by the rules and one of these sites is outranking you, it can be infuriating.

Sources of the Problem

Why does this still happen? Part of the reason is that the web isn't at all like the world of academic papers. Here are some reasons why:
  1. Commercial Environment with High Stakes: Fortunes are made on the interwebs. People have a huge incentive to figure out how to rank higher in Google.
  2. Huge Scale: It was back in October/November 2012 that Google's Matt Cutts told me that Google knew about 100 trillion web pages. By now, that has to be more like 500 trillion.
  3. No Cohesive Community: The academic community would probably argue that they aren't as cohesive as one might think, but compared to the web there is a clear difference. There are all different types of people on the web, including those who are ignorant of SEO, those who have incorrect information on how it works, those who attempt to abuse SEO, and finally those who try to do it the right way.
  4. User-Generated Content (UGC): Blog comments, forum comments, reviews, and social media sites are all examples of UGC in action. While Google tries to screen all of this out, and most of these platforms use the rel="nofollow" attribute, not all of them do. As a result, spammers implement algorithms to spew comments with rich anchor text references to their sites across the web. (A quick sketch of applying nofollow follows this list.)
  5. Advertising: The web is a commercial place. People sell advertising, and even if their intent is not to sell PageRank, many of them don't use nofollow attributes on the links and simply label the links as "Sponsored" or "Ads". Google is not always able to detect such labeling.
  6. Practical Anonymity: The chances of blowback if you link to a crappy site are much smaller than they are in the academic paper scenario. Because of the scale of the web, the advertising environment, and the structure of web content, a crappy link or two may just be seen as an ad, and the average visitor to a web page simply does not care.
  7. Complete Lack of Structure: Let's face it, the web is a chaotic place. The way sites are built, the way people interact with pages, the types of content, and the varying goals of such content lead to a web that has little real structure.
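On the UGC and advertising points above, here is a minimal sketch of applying nofollow to links a site can't vouch for. The ".comment a" and ".sponsored a" selectors are hypothetical; adapt them to your own markup:

```typescript
// Sketch: apply rel="nofollow" to links you can't vouch for -- links in
// user-generated content and paid placements. Selectors are hypothetical.
const untrustedLinks = document.querySelectorAll<HTMLAnchorElement>(
  ".comment a, .sponsored a"
);
untrustedLinks.forEach(link => {
  link.rel = "nofollow"; // tells search engines not to pass PageRank
});
```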
[Image: One Little Corner of the Web]

Why Haven't Google and Bing Fixed This?

Of course the search engines are trying to fix it. Don't pay any attention to anyone who suggests otherwise.
Google lives in terror of someone doing to them what they did to AltaVista. A fundamentally better algorithm would represent a huge threat to their business. And, of course, Bing would love to be the one to find such a new algo.
The money at stake here is huge, and both search engines are investing heavily in trying to develop better algorithms. The size of the spoils? The current market cap of Google is $356 billion.
The reason why they haven't fixed it is because they haven't figured out how to yet. Social media signals aren't the answer either. Nor is measuring user interaction with the SERPs, or on the pages of your site. These things might help, but search engines would have already started weighting them quite a bit more than they have if they were the answer.

What Does This Mean To You?

Frankly, it's a tough environment. Here it is in a nutshell:
  1. Publishers that use crappy link building practices may outrank you on key terms, and they may stay there for a while.
  2. Google will continue to discover and punish bad tactics to the best of their ability, uneven though that may be. They do this well enough that any serious business just needs to stay away from such tactics (most likely that means you!).
  3. Search engines will keep looking for creative new ways to reduce their dependence on links. This will include more ways to use social media, user interaction signals, and other new concepts as well. However, Cutts says that links are here to stay as a ranking factor for many more years.
  4. As search engines use more and more of these new signals, we aren't going to get a roadmap as to what they are. Yes, they patent new ideas all the time, but you won't know which patents they use and which ones they don't. In addition, even when they use an idea from a published patent, the practical implementation will likely differ greatly from what you see in the patent.
[Image: Your Direction Might be Unclear]
It isn't an ideal situation. Your best course of action? Focus your efforts on building your reputation and visibility online outside of the search engines. Ultimately, you want to build your own loyal audience. Here are a few ideas for doing that:
  1. Organic social media: Just recognize that this opportunity may be transient too. As we have seen, Facebook is reducing organic visibility in order to drive revenue growth. For that reason, new emerging social platforms are particularly powerful opportunities to get visible, provided that you pick the right horse to ride.
  2. Earned Media (Guest Posting): Cutts may have signalled The Decay and Fall of Guest Blogging for SEO, but writing regular columns on the top web sites in your market is something you should strive to do anyway. Don't view it as an SEO activity; it's still a surefire way to build up reputation and visibility.
  3. Speaking at Conferences: This is a great technique as standing up in front of a room full of people and sharing your thoughts allows them to begin developing a connection with you.
  4. Writing Books or eBooks: Another traditional reputation builder, but a really good one. Don't underestimate the work in writing a book though. However hard you think it is, the reality is 4 to 10 times harder.
  5. Develop Relationships with Influential Media and Bloggers: Building meaningful relationships with other people that already have large audiences and adding value to their lives is always a good thing.
These activities will all give you alternative ways to build your reputation, visibility, and traffic. They also give you the best chance that your site will be sending out the types of signals that search engines want to discover and value anyway.
Ideally, your reputation will be so strong that Google's search results will be damaged in the event you aren't ranking for relevant terms, because searchers will be looking for you. You don't have to like the way the environment operates, but it's the environment we have. Complaining won't help you, so just go out and win anyway!