Content Pruning – the Technique That Will Protect Your Rankings from Google Panda

Time has passed since the Panda 4.0 update and there is still a lot of talk around the subject. At the time, headlines were filled with words of panic foretelling impending doom. Yet time proved that the latest Google Panda shouldn’t be discussed in terms of penalties; targeting and deranking low-quality content are the more appropriate concepts. The Panda update affects those who do not comply with Google’s idea of valuable content: high-quality pages that are both useful and entertaining to the reader.

Content Pruning cognitiveSEO

It’s all the rage in the SEO community, and specialists are looking for ways to adapt their strategies to it. Most webmasters rely on digital marketing strategies that produce content on a regular basis. That production may be as small as a blog tab on the site or a section of press releases, or, at its largest, a full editorial operation publishing at scale. If we were to reduce the SEO formula to its crudest form, you could say that success comes down to the number of backlinks and the number of indexed pages. With that raw formula in mind, underperforming content that gets indexed by Google can pull the whole site down in the rankings. The problem is that many webmasters only look forward when optimizing their site for search engines, leaving the old content behind.

New and up-to-date content will always be the focal point, while old content tends to be ignored. That is how a site slowly turns into a huge pile of pages that gather little to no traffic.

After Panda 4.0 rolled out, this forgotten bundle of underperforming content can even drag your site’s rankings down. That’s why content pruning should become an essential part of your ranking strategy. SEOs have long argued this point of view, and I think Google Panda 4.0 showed it is a real possibility.

What Is Content Pruning?

Since Google rolled out the Panda 4.0 update, some webmasters have been forced to reduce their number of indexed pages and began pruning low-quality content that didn’t add any value to the site. Deciding to remove pages from the Google Index is not easy and, if handled improperly, it can go very wrong.

Your site’s obsolete or low-quality content may be one of the problems generating a drop in rankings.

Google has put a lot of effort into making the search algorithm detect quality content the way a human visitor would when reading it. This means that low-quality pages can affect the performance of the whole site: even if the website itself plays by the rules, low-quality indexed content can ruin the rankings of the whole batch.

Even "Reuters" Has Irrelevant Content Indexed

Content pruning isn’t a technique that should be considered only by small sites or emerging businesses. Reuters, for instance, is one of the biggest and best-known international news agencies and has been an important player in the field since the last century. Yet even Reuters should prune its content in order to give users the best experience there is. Looking at the screenshot above, we see a list of pages that searchers can hardly find and that therefore shouldn’t be indexed. Moreover, the listed pages don’t offer valuable information, contain duplicate content (heavily penalized by the Panda algorithm) and definitely hold the whole site back rather than pushing it forward in the rankings.

Why Should I Prune My Own Content?

Every site has its evergreen content, which attracts organic traffic, and some pages that are dead wood. The Google Index should contain only the pages you are interested in ranking; otherwise you risk polluting your rankings. Pages filled with obsolete or low-quality content aren’t useful or interesting to visitors, so they should be pruned for the sake of your website’s health.

Keep the Google Index fresh with content that is worth ranking and that helps your users.

Low-quality pages that make for a stale user experience are unacceptable from Google’s point of view. Your site should be cleaned of them, because they create a poor and confusing experience for searchers. An old site with lots of pages may look impressive, but if your content isn’t on point, it only means you’re writing for the search engine, not for the user.

How Should I Prune My Own Content?

To successfully prune your own content you should follow these steps:

1. Identify Your Site’s Google Indexed Pages

For this task, there are two methods you can use to identify all of your site’s indexed pages:

Method A. The first one, explained below, uses Google Webmaster Tools. Its disadvantage is that the list of pages it displays is not limited to indexed pages: it contains every page GoogleBot found during crawling. You do, however, get the total number of indexed pages and a graph of its evolution in the Index Status report.

Google Webmaster Tools Indexation Status

You can access the whole list of internal links for your website by going to Search Traffic > Internal Links in Google Webmaster Tools. There you can download the data in CSV or Google Docs format, which gives you a list of all the pages on your website (indexed or not) along with the number of links pointing to each. This can be a great starting point for discovering which pages should be subjected to content pruning.

This method is recommended for very big sites that have a lot of pages.

Google Webmaster Tools Download Crawled Pages
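If the export is large, a short script can surface pruning candidates faster than scanning the spreadsheet by hand. Here is a minimal sketch in Python; the column names ("Target pages", "Links") and the file name are assumptions based on a typical export, so adjust them to match the header of your own CSV.

```python
import csv

# Assumed column names -- check your own export's header row.
PAGE_COL = "Target pages"
LINKS_COL = "Links"

def least_linked_pages(csv_path, limit=50):
    """List the pages with the fewest internal links from a GWT export."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    # Pages with very few internal links are often the forgotten ones
    # worth reviewing as pruning candidates.
    rows.sort(key=lambda r: int(r[LINKS_COL].replace(",", "")))
    return [(r[PAGE_COL], r[LINKS_COL]) for r in rows[:limit]]

if __name__ == "__main__":
    for page, links in least_linked_pages("internal_links.csv"):
        print(links, page)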


Method B. Site:Domain Query and Extract Only Indexed Pages.

While the first method returns every page GoogleBot crawled on your site, this one delivers a list that contains only the indexed pages. By using the site:examplesite.com query you will receive only the results that are actually displayed in the search engine results pages.

Google Webmaster Tools - Download Indexed Pages Step1


In order to speed up the process, you need to modify some settings in the search engine. Go to the Search settings page and set Results per page to 100, so that you see more results at once. You should also check the Never show Instant results option under Google Instant predictions; given that you’re going to see many more results per page, you want to remove any obstacles you might encounter.

Google Webmaster Tools - Download Indexed Pages Step2

The next step is done with the help of a bookmarklet that scans the results displayed in the SERP and generates a new window with a numbered list of the links and their anchor texts. To install the bookmarklet, click and drag this button onto the bookmarks bar above:

Google SERPs Extractor

Google SERP Scraper Bookmarklet

Be sure that you are viewing the SERP page when you activate the bookmarklet. As you can see from the image below, it will generate a list with all the links and anchor texts from the SERP. Remember, if your site has more than 100 results, you have to repeat the process for every results page. Also be aware that for big sites the process may take a while; be thorough and try to capture as many indexed links as possible.

Google Webmaster Tools - Download Indexed Pages Step3
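If you prefer a script to the bookmarklet, the same extraction can be done offline. The sketch below is a rough Python equivalent: it assumes you have saved a SERP page from your browser as a local HTML file (scraping Google live is against its terms of service) and that result links sit in ordinary anchor tags. Google’s markup changes often, so treat the parsing logic as an assumption to verify.

```python
from urllib.parse import urlparse

from bs4 import BeautifulSoup  # pip install beautifulsoup4

def extract_result_links(html_path, domain):
    """List links pointing at `domain` from a saved SERP HTML file."""
    with open(html_path, encoding="utf-8") as f:
        soup = BeautifulSoup(f.read(), "html.parser")
    links = []
    for a in soup.find_all("a", href=True):
        href = a["href"]
        # Keep only absolute links that point at the site we are auditing.
        if urlparse(href).netloc.endswith(domain):
            links.append((href, a.get_text(strip=True)))
    return links

if __name__ == "__main__":
    # "serp_page1.html" and the domain are placeholders -- use your own.
    for url, anchor in extract_result_links("serp_page1.html", "examplesite.com"):
        print(url, "|", anchor)
```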

2. Identify Your Site’s Low Ranking Pages

To identify the low-ranking pages you can use Google’s own free tool, Webmaster Tools. It provides data on the number of impressions, clicks, click-through rate and average position. The feature that lets you view all of this is called Search Queries, and you can reach it in the Search Traffic category. There you get a list of the top pages on your website, and identifying low-ranking pages becomes really easy: order the list by the number of clicks to see the least-performing pages.

Google Webmaster Tools - Search


3. Identify Underperforming Pages

Understanding a site’s structure and identifying the obsolete paths and pages are critical if you want to “clean up” your content. You can use Google Webmaster Tools metrics such as the number of clicks or impressions over a certain period to gauge users’ interest in reading a page, or their involvement in doing something actionable on your site. For instance, you can use the Clicks column to find pages that get zero clicks for a certain query. This gives you some insight into what your users are into.

You can also use these metrics for data analysis of your content’s performance. For instance, download the data provided by GWT in CSV or Google Docs format; once you have the files, mark the underperforming pages, keep an eye on them and start filtering the content posted there to get the most out of your content pruning campaign. A short script can help with the first pass, as sketched below.
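As a sketch of that analysis, the Python snippet below flags pages with plenty of impressions but almost no clicks, a common signature of underperforming content. The column names ("Page", "Clicks", "Impressions"), the thresholds and the file name are assumptions; match them to the header of your own Search Queries export.

```python
import csv

# Assumed export columns -- verify against your own CSV header.
PAGE_COL, CLICKS_COL, IMPR_COL = "Page", "Clicks", "Impressions"

def underperformers(csv_path, min_impressions=100, max_ctr=0.005):
    """Pages that are seen in search results but almost never clicked."""
    flagged = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            impressions = int(row[IMPR_COL].replace(",", ""))
            clicks = int(row[CLICKS_COL].replace(",", ""))
            if impressions >= min_impressions and clicks / impressions <= max_ctr:
                flagged.append((row[PAGE_COL], clicks, impressions))
    return flagged

if __name__ == "__main__":
    for page, clicks, impressions in underperformers("search_queries.csv"):
        print(f"{page}: {clicks} clicks / {impressions} impressions")
```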

4. No-Index the Underperforming Pages 

After you have successfully identified the pages that don’t bring you any added value, you should start removing them from the index so they disappear from Google’s radar. Here are two ways you can do this:

  • You can disallow those pages in the robots.txt file to tell Google not to crawl that part of the website. Keep an important caveat in mind if you plan on using this method: a disallow rule stops crawling rather than indexing, and a careless pattern can easily break your rankings, so be sure you know exactly what you are blocking (there is a dry-run sketch under Don’t Jump the Gun below).
  • You can tag certain pages. After you identify the pages that should be de-indexed, apply the meta noindex tag to them. If you still want Google’s crawlers to follow the page’s links, add <META NAME="ROBOTS" CONTENT="NOINDEX, FOLLOW">. That tells Google not to index the page while still crawling the pages linked from it. A sketch for verifying that the tag is in place follows this list.
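Once the tags are deployed, it is worth verifying that they actually made it into the HTML. Here is a minimal sketch using only the Python standard library, assuming your pages are publicly reachable over HTTP; the URL in the example is hypothetical, and the regex is a simple heuristic rather than a full HTML parser.

```python
import re
import urllib.request

# Heuristic: assumes name= comes before content= inside the meta tag.
NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

def has_noindex(url):
    """Fetch a page and report whether a robots noindex meta tag is present."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return bool(NOINDEX_RE.search(html))

if __name__ == "__main__":
    # Hypothetical URLs you decided to prune -- replace with your own list.
    for url in ["https://examplesite.com/old-press-release-2009"]:
        print(url, "->", "noindex" if has_noindex(url) else "MISSING noindex")
```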

Don’t Jump the Gun

De-indexing pages, or parts of them, is quite a big decision, so think it through before you start implementing it. Even though content pruning may present itself as a workaround to the recent Google updates, use it with caution and, as with everything in life, don’t abuse it. It’s natural to want to solve your issues swiftly, especially when you’re facing major ranking and traffic drops. But proceed gradually and know exactly what you block, so that you don’t disallow entire folders or lock the Google crawler out of important sections of your site.

 Excessive Content Pruning
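Before shipping a new robots.txt, you can dry-run it against the URLs that must stay reachable. This is a minimal sketch with Python’s standard-library robot parser, assuming the file is already live at its usual location; the URLs are placeholders for your own important pages.

```python
from urllib.robotparser import RobotFileParser

# URLs that must stay crawlable -- placeholders, use your own money pages.
MUST_STAY_CRAWLABLE = [
    "https://examplesite.com/",
    "https://examplesite.com/products/",
]

parser = RobotFileParser("https://examplesite.com/robots.txt")
parser.read()  # fetches and parses the live robots.txt

for url in MUST_STAY_CRAWLABLE:
    if not parser.can_fetch("Googlebot", url):
        print("WARNING: robots.txt blocks Googlebot from", url)
```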

Conclusion

Good content takes a lot of time to build, and pruning it may take even longer, but it pays off in the long run. While providing new content to your readers should be the main priority, you should still overhaul the old content: neglecting jaded content can harm your website’s rankings! A content pruning campaign is not effective just from a ranking point of view; it can also be part of the content marketing strategy. High-quality content increases the overall credibility of your site, improves the user experience and therefore positively affects conversions.

Google Panda 4.0 and PayDay Loan 2.0 Updates Launched Today

Less than 24 hours ago, Matt Cutts, the head of Google’s webspam team, announced via Twitter that Google is rolling out the Panda 4.0 update starting today. Although there has been plenty of speculation about a possible Penguin 3.0, it’s about that time of the year when SEO news is abuzz with one particularly cute animal: the panda. Only it’s not our Chinese bear darling that’s occupying the minds of web experts, it’s the Google algorithm.

The Google update speculation wasn’t groundless, however. Just like the Penguin update from May 2013, Panda 4.0 was heralded by strong fluctuations in Google’s SERPs.

Screenshot Google Algorithm Fluctuation

Shortly after, Matt Cutts also made the official announcement.

Screenshot Panda 4.0 Announcement

In the last few days, headlines ranged from the mild “Are you ready?” or “Should you be worried?” to the more severe “How to avoid getting slaughtered by the new update”. Elsewhere on the web, plenty of people were taking bets on the exact release date of the third installment. And, as with everything mysterious and powerful, some even tried to predict the date with more-or-less “sure-proof” algorithms.

1 Day / 2 Updates = Panda 4.0 + PayDay Loan 2.0?

Which one are we talking about, you might ask? The answer is quite simple: both!

Both Panda and PayDay Loan have been updated, and both are rolling out.

While Panda 4.0 goes after on-page factors, the PayDay Loan “update” targets “very spammy queries” and is unrelated to the Panda or Penguin algorithms.

Panda 4.0 and PayDay Loan 2.0 – Why Would Google Need Them?

As stated before, Google’s goal is to offer the best possible results and make the user’s experience as good as it can be. Right now Google’s engineering team looks like a relentless, innovation-oriented squad that has created the right context for a new update with better features.

If we want to be prepared for the future, we must first understand what happened in the past. What did Matt Cutts say about how Panda has affected webmasters so far?

“And so, if you think you might be affected by Panda, the overriding kind of goal is to try to make sure that you’ve got high-quality content, the sort of content that people really enjoy, that’s compelling, the sort of thing that they’ll love to read that you might see in a magazine or in a book, and that people would refer back to, or send friends to, those sorts of things.”

If we take this statement at face value, we can say that Panda is focused on content. For the time being we have to speculate a bit: Google might need a high-quality-content guardian. Should we conclude that the big search engine is not happy with the way content is currently used, or abused, by webmasters?

Google Panda Update

“Great content has to be the foundation of any good site.”

This is what Matt Cutts told us five years ago, and it is what he has kept on saying in the most recent months.

The original PayDay Loan algorithm went after unique link schemes, many of which violate Google’s guidelines, on heavily spammed queries such as “payday loan”, pornographic searches and the like.

“The PayDay Loan has been a very special kind of beast from the very beginning.”

Its explicit purpose is to crack down on spammy and low-quality links, on anchor-text optimization and link-building schemes, and on various other over-optimization tactics. So even though Google seemed to handle the spam issue quite well, improvement is still needed in this area, as spammers will always find new ways of “cheating” the system. A Google spokesperson told Search Engine Land: “Over the weekend we began rolling out a new algorithmic update. The update was neither Panda nor Penguin — it was the next generation of an algorithm that originally rolled out last summer for very spammy queries.”

Panda 4.0 – a Boost for Small Businesses?

The question on everybody’s lips right now is: what is the big change that Panda 4.0 brings? Will Panda 4.0 be more like “Star Wars: The Phantom Menace”, a flawed but game-changing reboot, or more like “Alien 4”, business as usual and not that exciting? For one, it doesn’t look like it will necessarily be the final sequel in the series.

As the topic is very fresh, we can only assume what Panda 4.0 targets. I am pretty sure that in a month or so we will see some real, palpable effects of the update. Until then, we can infer some changes from the announcement Matt Cutts made at the Search Marketing Expo in March, where the head of Google’s search spam team stated that they were working on a new generation of Panda.

Matt Cutts explained to the audience that this new Panda update should help small businesses and small websites do better in Google search results.

A “softer” Panda update was also rolled out in July 2013. The same Matt Cutts said:

“We are looking at Panda and seeing if we can find some additional signals, and we think we’ve got some, to help refine things for sites that are kind of in the border zone, the gray area a little bit. And so if we can soften the effect a little bit for those sites that we believe have got some additional signals of quality, that will help sites that were previously affected – to some degree.”

It is questionable whether ranking benefits were really visible after this “softened” update, or whether they reached the intended parties. We can only wait and see if the new update will really do any good for small businesses or if this is just wishful thinking.

Penalties aside, a common complaint on the web is that algorithm changes mostly hurt small, local websites to the benefit of national or international businesses.

It has never been easy to game the algorithm, and nowadays results don’t come cheap at all.

Quite often, the battle isn’t between the most relevant sites but between the ones with the deepest pockets. It doesn’t matter if a small, local business has better prices and services: as long as it can’t be found on Google, it won’t get much traffic. Lacking a big advertising budget, small businesses focus on generating high-quality content based on what their users search for to find their pages. Even so, with just a few clues to tell them whether they’re getting it right, small businesses rarely win the battles with the big fish.

Will Panda 4.0 give a helping hand to small businesses, boosting them to the top of the rankings, or will the first page of results still be dominated by “the big guys”? It’s best not to get ahead of ourselves, as we’re still not sure what impact the fourth Panda will have.

Competition

Is Your Website Ready for Panda 4.0, PayDay Loan 2.0 or Any Other Update?

Speaking of the Star Wars movies, there’s one line from that archetype of wisdom, Master Yoda, that should stick with you:

“There are many things you have to unlearn before you start learning”.

Website Audit

First, you should actively and constantly audit your website in order to be aware of any search optimization issues. Having a clue about what Panda 4.0 and PayDay Loan 2.0 will target should pretty much be enough to get you on the right track. There are many things to work on in order to remove what will most likely hurt you: links from guest-blogging networks, links from spam sites, exact-match anchor links, over-optimized anchor links and so on.

Site Architecture

You might have heard this before, but a mindful webmaster should always pay attention to crawling. If you want to be indexed properly, make sure that not only the first page but all the pages of your website are easy to crawl (at least the ones you want crawled).

Another architectural element to keep in mind is speed. Shaving seconds off your website’s load time can increase the conversion rate and decrease the bounce rate.
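As a quick, rough way to keep an eye on load time, you can measure how long a full page fetch takes. This is a minimal sketch using only the Python standard library; it measures raw download time rather than full browser rendering, and the URL is a placeholder.

```python
import time
import urllib.request

def fetch_seconds(url):
    """Rough load-time check: seconds to download the full HTML response."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=30) as resp:
        resp.read()
    return time.perf_counter() - start

if __name__ == "__main__":
    url = "https://examplesite.com/"  # placeholder, use your own page
    # Average a few runs to smooth out network noise.
    runs = [fetch_seconds(url) for _ in range(3)]
    print(f"{url}: {sum(runs) / len(runs):.2f}s average over {len(runs)} runs")
```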

Your URLs can also influence the overall impression of your website. A user’s first contact with your site is its URL as seen in the SERP, and a more readable URL is more likely to be clicked.

How you interlink the pages on your site is also very important: the structure has to make sense and be easy for the Google bot to crawl.

Quality

One of the reasons Panda saw the light of day in the first place was to watch over the quality of content. Even so, easy-like-Sunday-morning articles can still be found in the wild. Don’t write an article just for the sake of an editorial agenda; write it for the sake of meaningful ideas and interesting research you have done or observed.

Engagement

We won’t put our money on what Panda 4.0 is really about, but we can say for sure that it all comes back to quality. Ask yourself: are you generating content for your audience or just to fill holes in your publishing schedule? You need to create content that engages customers, provides added value, responds to their needs and is as relevant as it gets.

Keywords

Google has made some interesting changes along the way in how it understands the meaning of words and in ranking content on the strength of intent rather than exact keyword match. Even so, keyword misuse is still very common, and Panda 4.0 might have a real problem with it.

Don’t create similar web pages just for the sake of optimizing some specific keywords.

Make sure that every single page of your website stands on its own, and optimize each one with Google’s guidelines in mind. You can use the Google Keyword Planner to find the keywords that perform best and are the most suitable for your website and your business.

Freshness

You should focus on topics of interest to your audience: not just fresh and hot topics, but subjects they genuinely care about.

New content will not only show your readers that you are always on the ball, it will also bring benefits in terms of crawl frequency.

Social

Make your content “social”. Allow for Facebook, Twitter and Google+ sharing, or even make comments dependent on logging in with social network user profiles. Expand your social media mix: add YouTube, LinkedIn, Pinterest, and whatever else works for you.

Titles, Headers, Description

There are also things you need to add or include (not just things to remove) in order to take full advantage of the new Google algorithms. Since anchor texts will matter less in the future, it’s time to embrace the idea that the best way to get the attention of search engines is to write not for the search engines but for the users. Write the kind of article you yourself would like to read on the subject. Open with a first paragraph that clearly explains what the story is about. Use subheaders and a very easy-to-read structure. Go in-depth and add screenshots and diagrams, or even videos.

Conclusion

It’s quite hard to draw conclusions right after the announcement of an update. What should you be aware of? What should you stay away from? I think the best advice remains to use your common sense. If you are building natural, healthy, quality content that gets you to the relevant audience and you are not trying any shady link scheme, then all these updates shouldn’t have a negative impact on your website.

What are the biggest challenges that the new Google PayDay Loan 2.0 and Google Panda 4.0 Updates will bring?


Black Hat SEO – A Never Ending Story + A Google Hole

Yesterday at SMX, Matt Cutts tweeted about a new update that is targeting highly spammed keywords such as:

  • Payday loans
  • Buy Viagra
  • Adult stuff
  • Other highly competitive keywords (practically everything that makes tons of money…)

I am glad that Google has finally addressed the problem that I publicly mentioned a year ago. (It is really funny that my post went up on June 13, 2012 and the “response” came on June 12, 2013.)

This is a very important update because it tries to find a way to penalize highly aggressive spam sites faster.

Looking at some of the queries affected by this update, I can see a trend across all of them in terms of returned results. It is a mix of:

  • “Authority” sites (more or less)
  • Newspapers & Informational sites
  • Webspam (yes it is still here)
  • Unrelated sites (due to massive spam)

 Results after the Update


After a quick study of the results it seems that, depending on the spam level for each keyword, Google now has a higher chance of showing irrelevant content or too much informational content. In the “buy Viagra” case, a commercial keyword with immediate user intent, I can only see results that provide unrelated content, plus spam sites. The spam has been reduced, but now Google might need to work on the relevancy of the results.

Here are a few hypothetical questions:

What if in a specific niche no sites “play” by Google’s rules?

What sites would Google choose to rank?

Policing these keywords is really tough because of the ecosystem that has formed around them. For years, almost anyone trying to rank on these keywords did a lot of shady stuff. Even the legit companies did.

The exaggerated competition, and the “holes” in the system, led to never-ending spamming by almost everyone in an attempt to trick the system. Out of the top 10 results, you could easily see 7 to 10 spammy ones.

My personal theory on why these keywords are so tough to police algorithmically is that the entire ecosystem around them is formed of spammy sites. Let’s give it a hypothetical number: 95% spam, another 3% informational sites, and maybe 2% legit companies in a particular, super highly competitive niche.

If all the algorithm’s rules were applied “à la carte”, we might end up with no commercial sites ranking on those keywords and, instead, a lot of informational content for a set of commercial keywords. Google would end up with cleaner results, in terms of sites that play by its rules, but at the cost of the user experience.

That is why these keywords may be so hard to police from a conceptual point of view rather than a technical one.

The user doesn’t know and doesn’t care what a spam site is. He expects to find a site that solves his problem when he performs a specific search.

The policing problem Google has is finding the fine border between user experience and ranking relevant sites (with or without shady techniques).

From an algorithmic point of view, here is something that still works.

An Active Google Algorithm “Hole”

There are still “holes” in the Google algorithm, and one of them is Google’s inability to visually render pages as quickly as it crawls them. One of the latest black-hat techniques applied on some highly competitive keywords is to place hidden links that can’t be seen by a normal crawler, only by a visual one. Google catches these spammy link profiles, but not as fast as the sites spam the results. This leads to sites that rank for a few days and then get dropped; the spam process is recurrent and repeats over and over at very high speed.
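To get a feel for what a “visual” check involves, here is a crude static heuristic in Python that flags links whose inline styles suggest they are invisible. It is only a sketch: real hidden-link detection needs an actual rendering engine, since styles usually live in external CSS or are applied by JavaScript, which is exactly the gap this technique exploits.

```python
from bs4 import BeautifulSoup  # pip install beautifulsoup4

# Inline-style fragments that usually mean "this link is invisible".
HIDDEN_HINTS = ("display:none", "visibility:hidden", "font-size:0", "opacity:0")

def suspicious_links(html):
    """Flag <a> tags whose inline style hints they are hidden from view.

    Static heuristic only: it misses styles set in external CSS files or
    applied at runtime by JavaScript, which a rendering crawler would catch.
    """
    soup = BeautifulSoup(html, "html.parser")
    flagged = []
    for a in soup.find_all("a", href=True):
        style = a.get("style", "").replace(" ", "").lower()
        if any(hint in style for hint in HIDDEN_HINTS):
            flagged.append((a["href"], a.get_text(strip=True)))
    return flagged

if __name__ == "__main__":
    sample = '<a href="https://spam.example" style="display: none">cheap pills</a>'
    print(suspicious_links(sample))
```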

Here is a post I wrote half a year ago that addresses this “hole”.

I will end this post with the desire to learn more from your experiences. I am looking forward to your comments.
