I haven’t blogged here in a while; I haven’t had much to blog about. Life goes on, wheels turn, the sun rises. Business- and Internet-wise, things haven’t changed much; I’ve just been in cruise control.
Then… September 29th.
For the last year or so Google has been in overdrive with updates and tweaks. I’m starting to think they might simply have too many employees who feel the need to keep themselves busy. I thought the results were fine before, and if they want to keep working on something, how about new features? Continually tweaking search results that aren’t broken seems wasteful. These changes have hit lots of other webmasters, but never me; my sites were never affected, until now.
So around September 29th two updates happened: Google’s 20th so-called “Panda” update, which is supposed to be an anti-content-farm update originally targeted at the big content farms that push out mile-wide, inch-deep 300-word articles ten times a minute, and a so-called EMD (exact match domain) update that penalizes low-quality sites that have their keywords in the domain name.
First, I find the EMD thing stupid, and I really mean stupid. It is like judging a book by its cover or a person’s character by the color of their skin. If a site is quality it should rank well; if it isn’t, it shouldn’t. Second-guessing your own algorithm by penalizing certain domains is just stupid. The off-page quality-score system that gave birth to Google works; there is no need to second-guess it with mere humans making arbitrary decisions about what type of site they think is good.
I do not think I was affected by the EMD update, though, since I can see no pattern to it. I favor keyword-rich domains, and always have, but I have not seen losses on exact-match searches; I’m still #1 for those on the sites I check. It is other searches, on those sites’ content pages, where I lost rankings.
Case #1, my gardening blog. Gardening is my #1 hobby; if I could quit everything else and just garden I would be happy. I would sell all my other sites at the right price, but I would keep my garden blog. I started this blog 7 years ago because I liked gardening, could easily write about it, and, for obvious reasons, knew how to start blogs. It is 100% unique content written by me (or user submissions for comments/posts). On many of my sites I use a copyright statement footer link; this site doesn’t even have that. It has been #1 for gardening blog for many years, and still is. It has also been #1 for garden blog for many years… but no longer. Now it is something like 5th. The traffic loss on Sept 29th was about 50%. This is pretty much a straight-up WordPress blog, not a content farm, so how could it possibly be labeled as one? I hate thin, short content. All the content is unique and usually at least 1000+ words (I write long posts). So if this was hit by Google’s Panda content-farm algorithm, what are they doing over there? Honestly? I don’t even try to SEO this site; I just write good content and other blogs link to me. I can’t think of any change that would make this site more white hat. It is a standard blog with unique content; isn’t that supposed to be the ideal?
Case #2, a Google Maps mashup site/application I made a few years ago called Wildcrafting.net. This site was so noncommercial it didn’t have ads on it until this week; I finally stuck one ad unit on it a few days ago (after it lost the traffic). This site was an outgrowth of my survival forums (more on those later), so it does have one link to another site, but only on the homepage. It has no footer links either. The site has some non-unique content in the form of a USDA database, but also tons of unique content submitted by users, and all told the content is presented in a 100% unique way. This site still ranks well for the exact-match domain term, at #3 behind the exact-match .com and Wikipedia. But it lost 50% of its traffic, or thereabouts.
Case #3, my literature site, my oldest and highest-traffic content site at 13 years old. Traffic has always fluctuated, but I did detect a drop around Sept 29th as well; it looks to be a 10 to 20% loss. For many, many years I have held the #2 search position with this site for author-name keywords such as William Shakespeare, or for book-title keywords, usually only behind Wikipedia. I don’t monitor all 300 authors or 4000 books, but on the ones I do monitor I did lose rankings. For instance, for William Shakespeare I went from 2nd to 7th, and Wikipedia also went from 1st to 2nd. The new 1st is a site previously unknown to me that I can’t remember ever seeing ranked before. For some other authors and books I dropped as well; for others I held steady. This site is so old and has ranked well for so long that it has really good, authoritative incoming links from academic sources, newspapers, magazines, etc. I don’t think it’s related, but I did remove several thousand low-performing affiliate links (which were already nofollowed, by the way). We’ll see if that makes a difference. The site, publishing public domain literature, does of course have non-unique content. But I also pay writers to produce author biographies and book summaries, and book introductions are often user submitted, so it does have unique content on what I call the author or book hub pages (unique except for the many sites that steal my content; I could spend weeks on DMCA requests with this site), and those are the pages that seem to have been hit.
Case #4, my second-oldest site at 11 years, Wilderness Survival. It has a mix of public domain and proprietary content, but the most popular stuff is all public domain. It has been #1 for wilderness survival for over a decade, with lots of user-submitted content and lots of good incoming links. This site showed no traffic decline at all. Very steady. It does have a footer copyright link.
I had other sites that either held steady or lost traffic as well (such as the site you’re reading, which lost traffic too; I’ll admit I don’t write here much anymore, but please, this is no content farm), but these 4 illustrate… well… nothing. The two least commercial, most unique sites got hit the most; the two most monetized were hit the least. It seems unrelated to uniqueness of content, and in truth, the more unique the content the worse the hit, which is weird. I’ll do nothing to the survival site, because it is fine, apparently. For my gardening blog I’ll probably do nothing, because I can see nothing that is bad; maybe I’ll go through the code and make sure I’m not missing anything hidden. Wildcrafting I’ll leave as is, because again I see nothing that should be done. My literature site, being so huge, could also use a looking-over. I already removed those (previously nofollowed) low-performing affiliate links; maybe I’ll do more of that.
Ironically, the two sites with a footer copyright link lost the least traffic, so maybe I need to add an off site footer link to the other two.
Mostly, I’m hoping Google realizes this change, whatever it was, did not actually accomplish anything good and they roll it back.
October 12th, 2012 at 11:07 am
I haven’t looked at your site, well, not in a few years at least. How many ads are above the fold?
http://searchengineland.com/too-many-ads-above-the-fold-now-penalized-by-googles-page-layout-algo-108613
October 12th, 2012 at 11:58 am
Except that is old; I know they supposedly did a refresh, but the timeline doesn’t match my problems. Plus, one of the sites hit hardest had exactly 0 ad units.
I do find it humorous that Google, through AdSense, will tell you to place ads prominently, while Google, through search, now says not to.
This is another example of Google trying to second-guess their existing algorithm. If all off-page factors, filtered for spam, indicate a site is high quality, then actual humans have voted and decided the ad ratio is okay.
If things do not correct themselves I may experiment with redesigns, moving ad units around, but it doesn’t seem like that is what got me here.
October 13th, 2012 at 9:01 am
The new no. 1 Shakespeare site is the most content-farm-like thing I know. The woman who makes it has loads of similar sites, all with bad layouts and spammy content. Her traffic stats are public through Quantcast: http://www.quantcast.com/william-shakespeare.info?country=GB&contains=
I really don’t understand the latest update; it seems to have increased the rankings of spammy sites for a lot of queries. I literally have sites that copy my content ranking higher than mine with the original.
October 22nd, 2012 at 2:31 pm
Welcome to Panda Hell. You’re lucky you missed the first few years, but everyone is going to get hit eventually, so welcome aboard! Well, except for the big brands.
So, you need to read up on Panda; WebmasterWorld might be a good place to start.
#1 Panda has nothing to do with content farms.
#2 Google is unable (or unwilling) to tell who wrote the original content and who stole it, so get working on the DMCAs. Duplicate content is a Panda killer.
Sorry, I have no good news, except that I have a site that recovered from Panda after 6 months or so of hard work.
Good luck! And don’t bother complaining; nobody of importance is listening.
October 23rd, 2012 at 7:21 pm
Sigh. Some of my sites got hit by the EMD update. The primary keywords for these white-hat sites didn’t just rank lower; they completely disappeared from the SERPs. Two things they all have in common are that the whole primary keyword is in the domain name and that they have ads above the fold. If Google penalized me for this combination then it is stupid, since they have suggested putting an ad above the fold even to this day.
I am at the whims of Google. I have been doing this for four years, yet I can’t shake the feeling that one day the revenue will be gone, whether as the result of a major algorithm change or Google taking away my AdSense account for a stupid reason. I seriously have to reevaluate my long-term career path and figure out where to put my time.
November 11th, 2013 at 4:00 pm
Hey Chris,
Been a while since I visited, but I took a look at your sites. Panda is a pain for sure, and there is probably more than one update in play.
You’ve offered me tons of great advice over the years so I’ll offer a few of my suggestions (in case you want them):
1) Add authorship. This plays into trust with Google (which your Shakespeare competition is using) but really plays into the user metrics with Panda. That author image gets a much higher CTR than a SERP listing without one, and it is huge. If you have authorship, and can keep users from clicking back to the SERP with engaging content and teasers that keep them navigating, you’ll fix most of your issues.
2) I kinda mentioned it before, but the user metrics are big. Related content, images, direct-response copy, paginated articles, and anything else that lowers bounce rate, keeps users clicking, and keeps time on site means a lot.
3) Code. Everything is trending toward mobile browsing, and I’m convinced G gives some trust/weight to updated code. Responsive web design for mobile seems to help, in my experience.
4) Freshness and markup. Google wants info, so I give it as much as I can with any relevant schema or rich snippet markup that doesn’t come off as spammy. I especially pay attention to the schema “dateModified” itemprop, which G also seems to care about. Enough that I’m considering a scheduled dateModified cron update plugin to test it (something like the rough sketch below). I’d bet two identical old sites, with one updating just a dateModified tag and nothing else, would have totally different rankings.
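To be clear about what I mean, here is a rough sketch of that scheduled dateModified refresh; it is just an illustration, not a real plugin, and the directory, file layout, and exact meta-tag format are made-up placeholders:

```python
# Rough sketch: bump the schema.org dateModified value in a folder of
# static HTML pages, meant to be run on a schedule (e.g. daily from cron).
# The directory and the exact meta tag format are placeholder assumptions.
import re
from datetime import date
from pathlib import Path

# Matches: <meta itemprop="dateModified" content="...">
DATE_TAG = re.compile(r'(<meta\s+itemprop="dateModified"\s+content=")[^"]*(")')

def refresh_date_modified(html_dir: str) -> None:
    """Set every dateModified meta tag to today's date (YYYY-MM-DD)."""
    today = date.today().isoformat()
    for page in Path(html_dir).glob("*.html"):
        text = page.read_text(encoding="utf-8")
        updated = DATE_TAG.sub(r"\g<1>" + today + r"\g<2>", text)
        if updated != text:  # only rewrite files that actually changed
            page.write_text(updated, encoding="utf-8")

if __name__ == "__main__":
    refresh_date_modified("./public_html")  # placeholder directory
```

The point of the test would be to change nothing else on the pages, so any ranking difference between the two sites could only come from the freshness signal.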
Hit me up if you want to discuss more. I’ve spent literally over a thousand hours on this stuff and have recovered every site of mine that has been hit.