13 Things We Learned from 10 Years of Writing SEO Friendly Blog Posts

We have written and optimized – for our own blog or other pages – thousands of articles and SEO friendly blog posts in the last 10 years. What better way to learn from past successes and mistakes? We’d like to share them with you, not just as a holiday gift, but mostly because sharing our insights with others is what makes us and the people around us stronger and more resilient.

 

Google’s rules and updates have changed too many times to be worth counting, and the content writers on this blog have changed a lot as well, trying to uncover Google’s mystery box. We’ve lost weight, hair, nerves, and money in the process but, believe us, we have learned a lot, and most of the time the hard way.

 

So, we’re going to expose ourselves in a vulnerable manner; yet, if this article saves you from repeating some of the mistakes we made, we declare ourselves happy. 🙂

 

Things we learned from writing SEO articles

 

The end of the year is usually a time to reminisce and reflect on the time that has passed and hopefully extract some amount of wisdom – no matter how small – to take into the new year. For many of us, 2021 will be different, since we’ve likely had to go through this exercise a lot earlier than the end of the year.

 

So, we decided to look even further back, in hopes that the lessons we find will speak to something greater than the general terribleness of this year.

 

We’ve run an SEO blog for ten years now. And we’ve learned one or two things along the way. Here are 13 of them (yes 13, in 2020):

 

  1. Optimize Everything from URLs to Conclusions
  2. Don’t Assume Anything! Check Grammar, Facts, Quotes
  3. Being Consistent Is Difficult. It’s Also the Key to Success
  4. Originality Is Good, but so Is Content Update
  5. Google Algorithms Come and Go, Quality Content Stays
  6. Creativity Is A Lot of Hard Work
  7. Questions Are More Useful than Answers
  8. Forget Academic Writing
  9. Solve Your Readers’ Problems, not Your Dilemmas
  10. The Title of the Article Will Influence Its Performance
  11. Write Less but Write Better
  12. Write Less, Promote More
  13. Stick to Your Principles
 

Optimize Everything from URLs to Conclusions

 

I know, you already got this: you know that you need to have well optimized content, meaning that you need to use your targeted keyword within the title once and in the body content at least ten times, and Google will have no other choice than to rank your content. And if it doesn’t, well, we all know that Google sucks, so it’s their fault.

 

Of course, this is an exaggeration. But it has a grain of truth in it.

 

As search marketing changed, we evolved as well, learning what both our readers and Google expect from us.

 

We all know that optimized content is the key to success. But optimization shouldn’t stop at the body of the content. You should also optimize the elements below (there’s a small audit sketch right after the list):

 

  • The Title
  • The URL
  • The Meta Description
  • The Slug
  • The Images 
  • The Internal Links
  • The Outbound Links
  • The Text Length
  • The Article Main Image
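
To make the checklist above easier to act on, here is a minimal audit sketch in Python (using the requests and BeautifulSoup libraries). It is a generic illustration, not a cognitiveSEO tool, and the length thresholds and the audit_post helper are assumptions chosen for the example:

import requests
from bs4 import BeautifulSoup

def audit_post(url):
    """Rough on-page checklist: title, meta description, slug and image alt text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    issues = []

    title = (soup.title.string or "").strip() if soup.title else ""
    if not title:
        issues.append("missing <title>")
    elif len(title) > 60:                       # rough SERP display limit
        issues.append(f"title is {len(title)} chars and may get truncated")

    meta = soup.find("meta", attrs={"name": "description"})
    description = meta.get("content", "").strip() if meta else ""
    if not description:
        issues.append("missing meta description")
    elif len(description) > 160:
        issues.append("meta description longer than ~160 chars")

    slug = url.rstrip("/").rsplit("/", 1)[-1]
    if "_" in slug or slug.isdigit():
        issues.append(f"slug '{slug}' is not descriptive (use hyphenated keywords)")

    for img in soup.find_all("img"):
        if not img.get("alt"):
            issues.append(f"image without alt text: {img.get('src', '?')}")

    return issues

# Example: print(audit_post("https://example.com/blog/my-post/"))

None of this replaces editorial judgment; it just catches the mechanical misses before you hit publish.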

 

We used to overlook some of these items. Or, at least, we did not give them the right importance. This is how we ended up with URLs like this: https://cognitiveseo.com/blog/26/the-best-link-building-blogs-experts-and-their-tutorials-from-2011/ or with three articles published one after another with the titles: How to Get Links, How to Earn Links and How to Acquire Links.

 

You’ll find plenty of material on this blog on how to optimize all of the elements mentioned above, so we won’t insist too much here. But here are some resources you might find useful:

 

SEO Writing for Copywriters

Internal Linking Strategy 

Title & URL Influence on Rankings Research

Link Building Campaigns

How to Write SEO Friendly Titles

 

Don’t Assume Anything! Check Grammar, Facts, Quotes

 

Do you know what Euripides and my manager have in common? They are both stuck in my head repeating this sentence: Question everything! 

 

While this might be an obvious one, take a sincerity test: when was the last time you double or triple checked a well-known fact? Even basic ones like “cracking your knuckles will give you arthritis” or “Eskimos have dozens of words for snow.” You know them, so you use them in your content without checking them again.

 

Whether it’s a well-known SEO fact, a very famous quote or an obvious grammar rule, double check it. 

 

Some time ago we published an article that talked about Wikipedia and whether advertising can be placed on the big encyclopedia’s site. And as an example, we offered Gibraltar.

We were super excited about the article and we thought it would stir a lot of controversy. And it did. Yet, the apple of discord wasn’t whether you can inject advertising, but whether Gibraltar is a country or not. It is a territory disputed between the UK and Spain; we knew that, but we assumed that if it has its own capital and currency, it must be a country. Well, it’s not, and we had to stop and moderate the comments section because of this issue. We changed the screenshot and turned “country” into “territory” but it was too little, too late.

 

(screenshot: the Gibraltar advertising example)

 

We also had our share of “grammar shaming” from our users. A few years ago, we published a few pieces (that we assumed were written correctly from a grammar point of view) and we got some emails from our readers that weren’t so flattering.

We had spent so much time in doing research, documenting and reading a lot, and what people noticed first were the grammar mistakes.

Of course, we were proud enough to assume that we knew grammar, and that it wasn’t what should have mattered first. But a few emails from our users later, we decided to collaborate with an English teacher. And that was one of the best decisions we made.

 

When it comes to grammar, things are a bit tricky, we know. If your first language is not English, chances are you’ll need someone to look over your content. At least in the beginning, to give you some guidance. Even if you’re one of those who goes overboard when correcting grammar mistakes, a fresh eye is always a good idea.

Because you want your articles, your research, your blog to become an authority in your market, and this would be almost impossible with typos or style errors in your masterpiece. 

If someone else is checking your work, that doesn’t make you less of a content writer or copywriter. It will make you a better one. 

 

Being Consistent Is Difficult. It’s Also the Key to Success

 

Ever wondered why so many TV shows that go on forever sooner or later suffer a significant drop in quality?

 

Coming up with a good idea can happen to any of us. Coming up with a good idea every week for years and years? That takes more than creativity.

 

It takes hard work and discipline and accepting the fact that some weeks it’s going to be a lot harder than others. But the important thing is to keep going and keep wanting to put out good work. Things aren’t always going to be in a straight line. Difficult times are followed by better ones. And sometimes the best rewards come after a period of struggling and pushing through.

 

Do you see the screenshot below? It’s pretty nice growth, isn’t it? It’s a screenshot of cognitiveSEO’s visibility taken from our Site Explorer.

The chart below looks this good because we did two things:

 

  1. We published consistently (at least once a week).
  2. We optimized everything mentioned in the previous section, from URLs to conclusions.

 

(screenshot: cognitiveSEO’s visibility growth in Site Explorer)

(I admit that I gaze at this chart from time to time when my motivation is gone, or I procrastinate too much.)

 

Yes, we also used our Content Optimizer a lot and it worked great for us. But we were consistent most of all.

We published frequently and we optimized everything for each and every article, be it an in-depth research or a short case study.

And it did pay off. 

 

Originality Is Good, but so Is Content Update

 

Coming up with original ideas is no doubt key to progress. But so is updating existing ones, despite being less glamorous.

 

Gaining knowledge tends to happen in small and uncertain steps, rather than in leaps and bounds. Part of the process is refining and testing existing knowledge. Which is why we’ve often opted to update articles and research rather than start from scratch.

 

And it turned out to be a very good decision. We even made an article on how content optimization increased our SEO visibility big time.

The results are summed up in the chart below. 

 

(chart: keyword rankings after content optimization)

 

You surely have articles that are not relevant anymore. Or research you’ve worked so hard on that is no longer bringing in any traffic.

 

Instead of letting them rot, try to resuscitate them, if it makes sense. You’ll win more traffic and improve your overall blog quality.

Worst case scenario, this strategy won’t bring you much more traffic but at least you’ll have your blog articles up to date. 

 

Google Algorithms Come and Go, Quality Content Stays

 

This is similar to one-hit-wonder music bands.

We all remember the hit; we sing it at birthday parties for a while, yet we probably won’t buy the album just for that one tune.

Same thing may happen to your content. If you want your readers to look at your blog/brand/name with respect and put you in the trustworthy content category, make a habit out of delivering quality.

 

As the saying goes, we are what we repeatedly do.

Excellence is not an act, but a habit.

You’ve written a blog post and you have thousands of shares and appreciations? That’s great. Yet, that one-time performance won’t keep you at the top for long. On the contrary, once you’ve set the bar high, you need to keep clearing it to keep delivering killer content.

 

And yes, Google updates are a harsh reality. Yet, we have thousands of readers, clients and users, and it has rarely happened that very good quality content got penalized.

We’re not saying that it never happens. Unfortunately, we’ve seen good quality content being penalized. Still, treat quality as a preventive measure.

 

If a policeman stopped you in traffic, would you be super confident that you did nothing wrong, or would you feel a bit panicked as you know you probably broke a few rules here and there?

 

Same with content and updates. I know, it’s way easier said than done, but try to write content in such a manner that if a quality update pops up, you won’t feel that scared.

 

But don’t just take our word for it. Here’s what Google says about what you should do when they update their algorithm.

 

(screenshot: Google’s recommendations on algorithm updates)

 

Creativity Is A Lot of Hard Work

 

The famous American television and radio host Larry King used to tell his audience a witty story about his father.

He said that his father, of Ukrainian origin, came to the US thinking that America was the greatest land of all, where even the streets were paved with gold. However, shortly after arriving, his father realized three things:

  1. The streets weren’t paved with gold.
  2. The streets weren’t paved at all.
  3. He was the one to pave the streets.

 

There’s a dangerous cliché about creativity being a lightning-in-a-bottle phenomenon: young, hip creatives staying up late at night and coming up with wild ideas out of thin air. And that’s… definitely a more exciting version than reality.

 

But the truth is that so many times creativity is time and patience and incremental progress.

And it’s a collaborative effort more than it is an individual one. Ultimately, you discover there is truth in the adage that spontaneous things take a lot of preparation.

 

Don’t pressure yourself into being creative 100% of the time. You might be creative most of the time without even noticing.

And we know it’s easier said than done but, even if you don’t feel like it, start writing. Just start, and the rest will follow.

 

Larry King’s father was a great man, we’re sure of it. Yet, as inspirational as these success stories are, this is what they tend to remain: stories.

Of course, not all of them. Inspiration exists, but it has to find you working.

 

Questions Are More Useful than Answers

 

We’ve asked a lot of questions over the years and we haven’t always been able to answer all of them.

Understanding the right question to ask is, more often than not, the more challenging task.

 

Getting access to large amounts of data is no longer a serious problem. What to do with the data will largely remain one for years to come.

 

So, whenever you want to perform research, or elaborate on some stats, please remember: it’s not just that statistical interpretation and analysis require a certain skillset. There are also strategic and sometimes even ethical choices to be made about how to frame the results or even what to look at in the first place.

 

Over the years, we have performed research on billions of data points. And no, there is no exaggeration here. Literally billions of data points. 

Each time we started with a question in mind we wanted answered; yet, almost every time, we realized along the way that we had asked the wrong question.

Of course, this is the beauty of research. But don’t be too proud when performing research. Even if your computer freezes after opening dozens of Excel files, even if your stats software crashes, never ignore the other questions that pop up from the research.

 

Take for instance this research on the influence of titles and URLs on rankings. The initial plan was to analyze a few hundred article titles and see if and how a title can influence rankings.

We ended up analyzing 35k keywords in both titles and URLs because we realized that what we should actually look for is the importance of a keyword’s occurrence in the title, URL, domain, subdomain and URI.
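
For illustration, here is a small sketch of that kind of check for a single keyword, title and URL. It is not the research code we used, and the subdomain split below is deliberately naive (real tooling would use a public-suffix list):

from urllib.parse import urlparse

def keyword_occurrence(keyword, title, url):
    """Report whether a keyword appears in the title, domain, subdomain and URI path."""
    kw = keyword.lower().replace(" ", "-")
    parsed = urlparse(url.lower())
    host_parts = parsed.netloc.split(".")
    # Naive split: everything before the last two labels is treated as the subdomain.
    subdomain = ".".join(host_parts[:-2]) if len(host_parts) > 2 else ""
    domain = ".".join(host_parts[-2:])
    return {
        "title": keyword.lower() in title.lower(),
        "domain": kw.replace("-", "") in domain.replace("-", ""),
        "subdomain": kw in subdomain,
        "uri": kw in parsed.path,
        "full_url": kw in url.lower(),
    }

print(keyword_occurrence("link building",
                         "The Best Link Building Blogs",
                         "https://cognitiveseo.com/blog/26/link-building-blogs/"))

Repeated over tens of thousands of keyword and URL pairs, a table like this is the kind of raw data such a study would aggregate.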

 

And as you can see in the chart below, in some cases, that huge amount of analyzed data told us…well…nothing relevant. Yet, it was still worth discovering it. 

 

(chart: cognitiveSEO title & URL research results)

 

Forget Academic Writing

 

We’re going to keep this short. When we talk about academic writing, we don’t necessarily mean scientific articles.

But there’s a certain rigor to that writing which might not appeal to the general audience. This is not to say you should dumb things down.

 

There’s a simple, two-step approach for making this happen:

  1. Read what you’ve written out loud.
  2. If some parts don’t sound like something you’d say to a friend in real life, change them.

 

Solve Your Readers’ Problems, not Your Dilemmas

 

A lot of content comes from wanting to share experience. You know something most other people – even those in your field – might not know. You are excited about that and want to make them aware of your newly gained wisdom. The important thing to remember is that the focus should stay on sharing and not on you as the source of wisdom. That’s not to say your opinions have no place in your writing, on the contrary.

 

But always ask yourself these 3 questions:

 

  1. Will this info help my readers better understand the subject?
  2. Will this info help my readers better apply this knowledge for doing something practical?
  3. Will this info help me look cool but not add anything valuable to this piece?

 

If the answer to the first two questions is a resounding “Yes”, then you should definitely include that piece of information in your copy. If the answer to these questions is “No” and the only “Yes” comes from the third question, well… you’ve got yourself a pretty good conversation opener for the next party.

 

The Title of the Article Will Influence Its Performance

 

We’re not talking here only about the fact that keywords used in the titles of your copy carry a high ranking importance. We’ve conducted a study which suggests that keyword appearance in the title can make a clear difference between ranking 1st and 2nd. We’re also talking about the “catchiness” factor. And no, I am not talking about clickbait titles, but about titles that are relevant in your industry.

 

We’ve published so much, and some content registered success while some didn’t, but one thing is sure: if it has anything related to Google in the title, it’s going to perform pretty well. 

When comparing any study, research, opinion article that we’ve published to one that has “Google” in the title, chances are that the latter will get the most traffic, shares, links, etc. 

 

(chart: performance of posts with “Google” in the title)

 

So, you might be thinking: why don’t you add “Google” to all titles? And now we come back to the previous points mentioned here: we don’t want to do clickbait; we want to be relevant, to offer quality and respect to our readers.

 

When creating the title of your content, you should really think things through, as the title must be not only relevant and attractive but also SERP friendly.

 

Write Less but Write Better

 

I am sure you’ve noticed this as well for years: fluff content in most industries (especially digital marketing).

There seems to be very little focus on SEO, audiences, conversions, and how articles and content will be helpful for the readers. All of these things should be figured out BEFORE you write anything. It is obvious when something was written strictly for an SEO goal.

 

Without value, fluff content isn’t going to help you even if you rank well for it.

 

There is no reason to create content just so that you have it. You have to plan each piece of content based on which audience(s) you want to reach, what goals you have for each piece of content, how you will use it on social media to further your goals, and how you are hoping to rank.

This takes a lot of time and effort, but it is better than wasting time and resources on content that brings nothing to your company. Without showing the ROI for your work, your job becomes expendable.

 

Write Less, Promote More

 

Don’t neglect content promotion. You might be overly focused on content marketing processes. And that’s definitely a good investment!

However, a lot of content marketers tend to skip the steps related to content promotion. We know, we did. 

 

Without promotion,  the majority of your work never reaches your targeted audience.

 

We recommend writing less and devoting more time to content promotion. Start your content plan by writing a list of channels and influencers that can help you reach more relevant users and will allow you to start getting traffic and leads from each published post. 

 

(image: content promotion plan)

source: multichannelmerchant.com

 

Stick to Your Principles

 

There will be many cases when you might be tempted to write click-bait articles or write just for the sake of writing, without offering real value.

 

Because you’ll be looking at your competitors, and you’ll see that they might get results even with not-so-great content. You’ll invest time, money, and nerves in well-documented research that won’t always perform the way you want. You will get angry, and you will swear you won’t spend another night trying to bring quality to the Internet.

 

I am sure this happens to every content writer at least once. Yet, if you do stick to your principles, if you do invest time and quality in each and every article you write, it does pay off. These are not just empty words. Long-term effort in content writing does pay off.

Don’t write anything you wouldn’t read. 

There are no easy gains, indeed. Yet, what matters at the end of the day is good long-term performance.

 

You might have read this article thoroughly, or you’ve just browsed the main titles, saying to yourself: I knew that. Knowledge is power. But what matters at the end of the day is what you do with that knowledge. All the actions you perform daily define who you are as a person, a marketer, or a business owner. So make sure your actions have an impact on something, no matter how big or small. 

 

The Canadian writer Margaret Atwood once said that the internet is 95 percent porn and spam. So, let’s make the remaining 5% damn good.

Do Private Blog Networks (PBNs) Still Work in 2024? Should You Build One?
If you use SEO to promote your website, you most probably have heard of PBNs (Private Blog Networks). The concept isn’t very difficult to understand (although it is difficult to execute), but with the overload of information out there, you might be conflicted.

 

Do PBNs still work in 2024? Are there any risks? Well, the short answers are yes and yes. So should you build one? That’s up to you. However, this article’s purpose is to help you make an educated decision. This won’t be a guide on how to build one; however, if you’ve thought about building a PBN for your websites, then make sure you give this a good read.

 


 

  1. What Is a PBN?
  2. Do PBNs Still Work?
    1. How Google tricked everyone and was always one step ahead
    2. The fine line between PBN and owning multiple sites
  3. PBN Advantages & Disadvantages
    1. Advantages
    2. Disadvantages
  4. Should You Build a PBN in 2024?

 

Disclaimer: Just to get things straight from the start, we do NOT recommend building a PBN. They are risky and against Google’s terms of service. They also cost both time and money which you could be investing in safer and more efficient, evergreen tactics.

 

What Is a PBN?

 

PBN stands for Private Blog Network. Long story short, it’s a group of websites that a webmaster owns for the sole purpose of boosting the rank of other websites.

 

These networks fall into the BlackHat SEO link building tactics category. Obviously, at first, they didn’t. But after people started abusing them, Google took action. Before 2014, PBNs were the stuff. The hype was high and everyone started building them. Everyone wanted to get in for the ride.

 

That was until Google completely destroyed a few PBN services and some popular marketing bloggers’ PBNs. Here’s just one example of a single PBN service out of many that have fallen:

 

 

The truth is that PBNs are nothing more than Web 2.0s on steroids. Instead of using subdomains from WordPress.com and Blogger, you use expired domains that already have authority built to them. If you purchase the right expired domains and build the network properly, you can get yourself a nice asset.

 

In order to avoid getting caught by Google, BlackHat SEOs have to constantly hide their traces. What a tiresome struggle… This list of traces can contain:

 

Whois Info: If you have 10 domains that all link to one another and are all registered publicly under the same name, everyone is going to know it’s a PBN. Now, most of the time the details aren’t public, and you can also purchase Whois protection for an extra fee (sometimes offered for free), but I’ve heard stories of Google having access to Whois info anyway.

 

IP Address & Hosting: If more websites are on the same IP, they’re definitely connected in some way. That’s why shared hosting accounts are dangerous. If someone spams that IP address, you’re also on the list. You can purchase dedicated IPs for each domain to fix this. (Not only for PBN sites. Do this for all your sites. It helps keep things safe.)

 

Company name and contact details: Some countries legally require you to post this information if you monetize your site in any way (you do, since it’s part of your marketing strategy, so if someone reports you, you risk getting fined).

 

Design, Code, Formatting, Content: Many times, people simply duplicate these websites by copy pasting and changing some basic aspects, like colors and logo. However, the platforms and themes they use are always the same.

 

Analytics Accounts: Have 10 sites under the same Analytics Account? Good luck evading Google.

 

Script IDs: If you have different tools that require tracking, you’ll need separate subscriptions for each and every one of them, otherwise anyone could figure out the connection between two sites.

 

Now these are the most basic ones and they are often publicly available. However, some people say (including me) that Google might be looking at other things as well:

 

E-mail accounts: Register everything under the same e-mail address, especially if it’s a Gmail one, and you’ll end up linking everything to it. One mistake and you can compromise it all.

 

Location: Have you logged into the same e-mail account or worked on all your PBN sites under the same IP at home? Good job, you just told Google you own them. You need proxies and proxies to keep things clean.

 

Doesn’t this sound sort of like a hacker or criminal hiding his !@% from the police? I actually never understood why Google doesn’t try to make these things illegal. I mean… if you’re spamming the internet and accessing the website with bots without the owner’s permission… If you’re constantly pretending to be someone else… Shouldn’t this kind of stuff be fined? I guess they either can’t or they just want people to keep doing them.

 

Now call me paranoid, but we use so many services from Google, such as Chrome, Analytics, MyBusiness and Gmail. They’re all owned by Google. Am I saying that Google is reading your e-mail to figure out if you have a PBN or if you’re buying links? No. But… Maybe?

 

Your Gmail account is probably connected to your smartphone. You have the internet connection on everywhere these days, so Google knows your location. You and another webmaster appear at the same location and 2 hours before your websites start linking to each other? Just saying…

 

Those are just speculations but think of it: If you wanted not to be caught owning two different hosting accounts, would you register under the same e-mail? Most definitely not. I’m not saying that someone from Google is reading your private e-mails and spying through your webcam. I’m just saying that there is a possibility that some sort of algorithm is out there, tracing administrative relationships between websites.

 

Do PBNs Still Work?

 

You’re probably wondering if PBNs still work. How much has Google evolved and how good is it at catching these schemes in 2024?

 

Well, despite what many people think after 2014, PBNs still work and they will work as long as backlinks are a ranking factor, something that won’t change very soon.

 

How Google tricked everyone and was always one step ahead

 

When PBNs started to become popular, I’m sure that both parties were somewhat terrified. The webmasters of getting caught and Google of people discovering the ultimate BlackHat SEO technique.

 

However, Google was also one step ahead. I’ll explain:

 

You see, as mentioned before, Google cares about backlinks. Link building is still a very important ranking factor. They’ve tried removing the links from the algorithm, but apparently, the search engine results are worse:

 

 

A PBN can’t really be detected if it’s done properly. There’s simply no way of finding out, or at least being sure. Everybody links to other websites. People know each other, they talk and they give links. If you banned solely on patterns, innocent people doing White Hat SEO might get hurt. You might as well ban everyone, right?

 

So what did Google do? How did they catch the ones that were truly thinking of building a PBN as a BlackHat way of ranking their money sites high? Well… they didn’t.

 

Instead, Google targeted a few popular PBN services and some popular marketing bloggers that were using them and blogging about them.

 

 

But how? How could Google possibly identify an entire network of private websites, built by professionals and experts in the BlackHat SEO industry? Well… it’s an easy answer. They bought the service. It’s that simple.

 

I mean… everyone could. Even Matt Cutts. Just order the service and know exactly what PBN sites are used. Worst part? Many people and local business owners had to suffer. Their websites got penalized and they did not even know how, because their SEO companies acted as intermediaries and bought the PBN service instead. It happens all the time: “Guaranteed #1 position.” and also guaranteed “It wasn’t me boss” in case of penalty.

 

How did Google get to the marketing bloggers? I have no idea, but I’m sure they were targeted. Don’t you think it’s kind of strange that immediately after, many popular marketing and SEO bloggers got their PBNs penalized, while hundreds of other unknown players kept saying that they worked? Had it been just a Google update and no manual verification, it could’ve been a disaster.

 

After getting penalized, both Spencer Haws and Pat Flynn (two very popular niche site builders and bloggers, you should check them out) turned 180 degrees, writing about how much time and money they put into their PBNs just to lose everything in one second. And then the almighty phrase started showing up everywhere: “PBNs are DEAD!” No… they’re not. They’re just very risky. However, people stopped jumping head first into it.

 

The fine line between PBN and owning multiple sites

 

I collaborated with a pretty big nutrition and fitness company. It was really hard work as I did all the content and promotion (although I had nothing to do with bodybuilding). Anyway, I did a lot of research before writing anything and managed to pull up great quality content that increased the organic traffic of the website by 30% in under 6 months.

 

Bragging aside, one thing that was really difficult about that particular project was that there were simply no websites to get links from. Why? Because almost all the websites were owned by 3-4 companies that were direct competitors. At least that’s what my boss told me, but I tend to believe him since he’d been in the market for over 10 years.

 

I started doing some research to find link opportunities and found out that these websites were all linking to one another.

 

See where I’m going with this?

 

While the fact that I had nowhere to get links from was frustrating, the fact that I discovered a PBN was fascinating.

 

There was just only one issue:

 

This wasn’t really a PBN. Nobody was getting penalized. I even reported this to John Mueller but my request was completely ignored. Why? Well… probably because the websites were completely legitimate businesses, with their own phone numbers, teams and services. You could very well order products from them and they would send them to you under their name.

 

And that’s when I started thinking that the PBN issue is much bigger than it seems. Is that really a PBN? Not sure… Can Google detect it and consider it a PBN? Probably. Should it penalize it? I don’t think so.

 

I believe that multiple websites from the same company can create a network that dominates the first pages of Google. On multiple positions.

 

I did a little research just to prove that this is the case for various markets and countries. While looking into the motorcycle niche, after only 2-3 searches, I bumped into this motorcycle news website (motorcyclenews.com) that ranks very well for some keywords.

 

I then used SpyOnWeb.com to determine if any other websites are linked to it via IP or some code. Turns out that many are. The relationship has been established through the Google Adsense account.
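
If you want to replicate that kind of footprint check yourself, here is a rough sketch. It is an assumption about the general approach (fetching pages and looking for shared AdSense publisher IDs in the source), not a description of how SpyOnWeb works internally:

import re
from collections import defaultdict

import requests

# AdSense publisher IDs (ca-pub-...) are embedded in the source of pages that serve AdSense.
ADSENSE_ID = re.compile(r"ca-pub-\d{10,16}")

def group_by_adsense(urls):
    """Fetch each page and group the URLs by the AdSense publisher IDs they expose."""
    groups = defaultdict(set)
    for url in urls:
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for pub_id in set(ADSENSE_ID.findall(html)):
            groups[pub_id].add(url)
    return groups

groups = group_by_adsense([
    "https://www.motorcyclenews.com",        # sites mentioned in this article
    "https://www.performancebikes.co.uk",
])
for pub_id, sites in groups.items():
    if len(sites) > 1:    # a shared publisher ID hints at an administrative relationship
        print(pub_id, "is shared by:", sorted(sites))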

 

(screenshot: finding administrative relationships with SpyOnWeb)

 

Then, I used cognitiveSEO’s site explorer to take a quick look at what websites are linking to it:

 

(screenshot: motorcyclenews.com backlinks in the cognitiveSEO Site Explorer)

 

I only searched some websites that are in the same niche, like performancebikes.co.uk, ride.co.uk, classicbike.co.uk and mcnsport.co.uk. They all link to motorcyclenews.com. These are only 4 sites, but there could be others that are not so easily correlated.

 

Could this be considered a PBN? Maybe. But they’re all legitimate, authority sites that sell bikes or offer some sort of service. All of them generate organic traffic, offer interesting and good quality content and some even have physical, printed magazines.

 

Is that still unethical? Should it be against Google’s TOS?

 

Imagine that you build a very successful coffee shop. How do you expand? Do you make the same shop bigger and bigger until it eats up the whole city? No, you open a new one in another city and another city and so on. But do you make cheaper, worse coffee and just recommend people to your main shop in the primary city if they want to drink the good one? Of course not. You serve them the best you can if you want them to stay. And when they visit the initial city, they’ll know where to go.

 

It’s the same with these websites. Sure, you can’t call them all Starbucks, but you can build new ones all the time when you want to expand and they’ll all survive and thrive as long as they all provide quality to the users. And not even interlinking or using the same IP will be able to get them penalized.

 

Google’s only request is to provide quality to the users, so that they keep coming to Google to search and find YOU, because that’s how Google survives. If bad websites rank at the top because of some PBN scheme, people won’t like Google anymore.

 

Most PBNs are websites that lack design, personality and usefulness. They just fill the web and use Google’s resources (for crawling, indexing and other things).

 

However, if these websites are good, I don’t think anyone will have a problem. You can’t even call them a private blog network, at least not in the BlackHat SEO way of saying. In my opinion, it’s perfectly White Hat to have multiple high quality websites.

 

However, you can still get penalized…

 

PBN Advantages & Disadvantages

 

Although PBNs do have advantages, in our opinion the disadvantages outweigh them, so we do not recommend that you build a PBN.

 

However, we will outline them, just to make sure you’ll get a good overview of things. 

 

Advantages:

 

More efficient with ranking: Of all the BlackHat SEO techniques, PBNs are probably the most efficient. The better the expired domain names you purchase, the faster you’ll see the results.

Harder to get caught: Also, it will take some time until you get caught, if you get caught. You’ll have to be always looking over your shoulder and erase your every trace. It’s not an easy lifestyle, but it’s safer than blasting links with GSA Search Engine Ranker.

Easy in theory: Buy expired domains at a domain auction, build websites and put content on them, cover up all traces, link to money site. The part with covering up all traces is a little bit more difficult, but you’ll get the hang of it. That is if Google doesn’t hang you first.

Extra profit: You can also profit off PBNs by selling links to other websites. As long as they’re not direct competitors, it doesn’t really matter. However, keep in mind that this exposes you to risks. Someone could compromise your network.

Full control: This one is the best one. I have to admit…. don’t you just like it when you’re in control? No more “Here’s my nice content, will you link to it?”

 

Disadvantages:

 

High cost: At first it might sound cheap. Some shared hosting, a couple of IPs, some domain names and some content. A couple of hundred bucks? Well… make that double. Then multiply it by 50. If you really want to do this the right way, you’ll need good hosting, tons of IPs, proxies and tools. Sure, you might only spend $1,000 on your network if you’re in a low competition market, but will the investment be worth it? For high competitiveness you need to buy expired domains that have high authority, and you need a lot of them, at least 10-25. You also need to make sure they aren’t spammed. They can go for thousands of dollars each, and we haven’t even discussed the content yet. You think 500-word spun articles will do the job? Think again, or the Panda’s gonna catch you.

Takes a long time: Although when you start linking the results are seen rather fast, the whole setup process is time consuming. You start with one site, then expand, but each takes time.

Difficult to manage: You’ll also have to constantly take care of your network. Excluding the content that you need to keep posting, each site comes with its own problems like SPAM, hackers, bills… and remember you have to do them all under a different name, IP and probably device as well, just to make sure.

Can get caught: The worst part is obviously that you always have the risk of getting caught. Is it worth investing all this time and money for it?

 

Should You Build a PBN in 2024?

 

This really depends on who you are as a person. Do you like taking risks? Do you care about Google’s TOS? Some people see Google as this evil entity that controls the internet and would love to profit off it.

 

You see, the truth is that great websites on Google, that always rank at the top, don’t need to build PBNs.

Webmasters that provide great content, that network and connect, that promote their content, products and services properly don’t need to use any kind of schemes to get to the top.

They will always be there or at least they will get there at some point, because the users dictate this.

 

So should you build a PBN? Probably not. We consider it to be wiser to spend all that money and time to develop one website first. Then you can expand to another website and another one and have your little legit network of high quality websites.

Spending thousands of dollars and hundreds of hours simply buying expired domains and filling them with content you don’t really care about, instead of investing in user experience, quality content creation, promotion and better services, doesn’t seem very wise.

 

At the end of the day, it’s a mix of time, skill and luck that decides whether you’ll get caught with a PBN or not. You might get caught after the first 2 weeks. You might never be caught and even your great-grandchildren will use your PBNs to profit. However…

 

If you’re in for the long run SEO game, you’d better stay away from PBNs and focus on evergreen White Hat SEO techniques.

 

Search Engines are always evolving and getting better at detecting tricks every day. Why not focus on playing fair and actually improve the skills that truly matter? Why waste your time with learning how to hide when you can learn how to create better experiences?

 

Whatever side you’re on, we’re curious about your experience with private blog networks. Have you ever owned one? Have you ever tried a PBN service? Have you ever been penalized? Do you still own one and does it still work? Let us know in the comments.

URL Structure. Dos, Don’ts and Best Practices for SEO
Hey, we get it. The URL structure is a difficult SEO topic. It’s not easy to master, but not impossible, either.
In fact, we’re here to make things easier for you.

 

In this article, we’ll try to answer all your questions and provide examples for a better understanding of how the structure of your URLs influences your SEO strategy.

 


 

Even if you own an eCommerce website or are struggling with Local SEO and WordPress, after reading this article you’ll know how to set up your website’s URL structure for great SEO results.

So, keep on reading.

 

  1. What Are URLs?
  2. Why Are URLs Important for SEO?
  3. Does URL Structure Affect Google Rankings?
  4. URL Structure & User Experience
  5. URL Types: Static URLs vs. Dynamic URLs
  6. Click Depth vs. URL Structure
  7. Subdomains vs. Subfolders
  8. Trailing Slash vs. No Trailing Slash
  9. Best URL Structure for Local SEO & WordPress
  10. The Best URL Structure for eCommerce Websites
  11. URL Structure Mistakes
  12. Best URL Structure for SEO (Tips & Tricks)
  13. Does Google Plan to Get Rid of URLs in the Future?

 

 

What Are URLs?

 

A URL (short for Uniform Resource Locator) is the address of a resource on the internet.

 

You can think of it as a regular address, for a house.

 

Servers and browsers use URLs to access web pages and resources on the web. You type in an address, you reach a web resource. It’s pretty simple on the surface.

 

Now, of course, there are a lot of technical aspects to Uniform Resource Locators. However, most of them aren’t an issue for the regular web developer, since they’re handled well by servers and platforms these days.

 

(diagram: the anatomy of a URL)

source: https://sitechecker.pro/

 

Because platforms make it so ‘easy’, the URL structure of a website is often neglected. It’s not easy to understand and nobody tells you why you should pay attention to it.

 

People end up with big sites and bad URL structures and, unfortunately for them, URL issues are pretty nasty.

 

Why?

 

Because they require a lot of patience and double, if not triple check-ups to make sure nothing goes wrong.

 

If you mess things up, you can end up with a big drop in all your rankings.

 

So, it’s a lot better if you get things right from the beginning instead of fixing them later, when the site is big.

 


 

Why Are URLs Important for SEO?

 

A lot of search engine optimization experts say that the URLs are very important for SEO.

 

So, are they?

 

Well… yes, they are.

 

A URL is important as it’s a link between the user and your content.

 

Google shouldn’t really care what your URL is as long as it’s compliant, indexable & unique. But what does this mean exactly?

 

(screenshot: Google’s webmaster guidance on URL structure)

 

What’s really important is what’s behind that URL #thecontent.

 

Many say the URL needs to be short but, in my personal experience, Google handles long URLs just fine. And they can rank well too.

 

So, only “refining” your URLs constantly won’t help you very much to achieve true SEO success.

 

There are other, more important, OnPage SEO tasks to attend to.

 

What Google actually cares about in relation to your URLs is your site’s structure.

 

Structure is related to your URLs, but also to click depth, which we’ll soon talk about.

 

The good thing with URL structure is that you just have to set up things right once (for the bigger picture).

Then, just follow a simple list of best practices (which I’ll share with you soon) when creating new URLs.

 

 

Does URL Structure Affect Google Rankings?

 

URLs can definitely impact SEO.

 

There are a number of issues that are related to URLs that can affect your rankings. Two of the most important ones are keywords and length.

 

First of all,  you have to make sure that your URLs are valid. Only use the allowed URL characters. If you don’t know what those are, then the best thing to do is to stick to letters, numbers and dashes. Not underscores, but dashes.

 

As Google recommends:

 

Consider using punctuation in your URLs. The URL http://www.example.com/green-dress.html is much more useful to us than http://www.example.com/greendress.html. We recommend that you use hyphens (-) instead of underscores (_) in your URLs.
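
As a practical illustration of that recommendation, here is a minimal slug helper (a generic sketch, not a cognitiveSEO or WordPress function) that lowercases a title, keeps only letters and numbers, and joins the words with hyphens instead of underscores:

import re

def slugify(title):
    """Turn a post title into a hyphenated, lowercase slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)    # anything that isn't a letter or digit becomes a hyphen
    return slug.strip("-")

print(slugify("Green Dress (2024 Edition)!"))   # green-dress-2024-edition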

 

Keywords in the URL can also help you rank better for a specific phrase. For example, if I want to write for “Site Explorer” it’s a good idea to have my URL as /site-explorer.

 

Having something completely irrelevant in the URL can negatively impact the rankings of that  page, as the URL should be descriptive of the content within the page.

 

So that’s  why it’s a good idea to do some keyword research before writing your URLs. You can use our Keyword Research Tool.

 

Another important factor is the URL length. This isn’t an official ranking factor, but there is a strong correlation between shorter URL length and higher rankings.

 

It’s a good idea to read this entire article to find out how to have the best URL structure for your website, as it can definitely have an impact on your rankings on the long term.

 

URL Uniqueness

 

A URL has to be unique. Well… there’s no other way around it, actually. You can’t have two of the same URL and not land in the same place.

 

What you need to understand is that there’s a big link between URLs and content.

 

Google likes mapping content to a single URL. That makes it unique.

 

Can You Have the Same Content on Different URLs?

 

Have the same piece of content on different URLs and Google won’t like it.

 

For example, duplicate content is mostly considered a content problem when, in reality, it’s strongly linked to URLs.

 

Don’t believe me? Let me show you what I’m talking about:

 

You have a product that fits two categories on your site. That’s perfectly fine. However, if your standard website URL structure is a hierarchical one, then the product might show up at two different URLs (with the same content).

 

So we could have a plant-A in the category green-plants but also in tall-plants. If the URL structure is hierarchical, it will look something like this:

 

domain.com/tall-plants/
domain.com/green-plants/
domain.com/tall-plants/plantA
domain.com/green-plants/plantA

 

This way, domain.com/tall-plants/plantA and domain.com/green-plants/plantA both host the same content, which makes it duplicate content.

 

That’s why, for big eCommerce websites, it’s a good idea to separate the products from the categories. This way you could have:

 

domain.com/categories/tall-plants
domain.com/categories/green-plants
domain.com/plants/plantA
domain.com/plants/plantB

 

Speaking of eCommerce, if you need to extract data from eCommerce websites, check out this article.

The issue above is strongly related to structure. If you structure your website in a hierarchical way without considering the points mentioned above, you’re bound to have duplicate content issues.

 

Of course, sometimes you can use hierarchical structures to your advantage, such as when you have a simple local website with presentations.

 

domain.com/services/digital-marketing/ads
domain.com/services/digital-marketing/seo
domain.com/services/branding/logo
domain.com/services/branding/design

 

If you know that ‘logo’ and ‘design’ are bound to the ‘branding’ category and ‘ads’ and ‘seo’ are bound to the ‘digital marketing’ category, then there’s no issue in keeping them like that. It actually makes sense to do so!

 

 

URL Structure & User Experience

 

Many SEOs say that URLs are important for a user’s experience. Let’s see why. 

 

Usually, you end up on a website either through the root domain name or from another website, through a link.

 

You’ll rarely type in https://cognitiveseo.com/blog/category/case-studies/ in the browser to access that page.

 

Most probably, you’ll go there through the Google search results or via our navigation menu.

 

Even if it was for you to access that URL from another website, it would probably be under an anchor text, like this: SEO Case Studies.

 

Google has been making efforts to shorten or hide the display of URLs in the browser, if not to remove URLs altogether.
(Yes, indeed… we’ll talk more about this at the end of the article.)

 

Sure, a very long URL can look shady and discourage people from clicking it.

 

What would you rather click?

https://cognitiveseo.com/blog/category/case-studies/

or

https://www.google.ro/search?safe=active&sxsrf=ALeKk03mlWIPa2ZmKmvUqRUZXkcfViGLTQ%3A1583311321632&source=hp&ei=2WlfXsqWJI_ergSCv7BI&q=cognitiveseo&oq=cognitiveseo&gs_l=psy-ab.3..35i39l2j0l8.336.1502..1638…0.0..1.176.1258.9j3……0….1..gws-wiz…….0i203j0i10.0ZJhf6POO0Y&ved=0ahUKEwiK55GntoDoAhUPr4sKHYIfDAkQ4dUDCAY&uact=5

 

Well, if it comes from a reliable source (such as a friend), you’ll probably click it. But otherwise, most likely, you won’t. 

 

From what I know, the longest URLs on the web are Google search results pages and links with Facebook ID parameters. Please feel free to share your opinion on this matter on the comments section below.

 

URLs are, however, important for a blogger’s experience.

 

If you want to get backlinks, you want to make your URLs appealing.

 

You don’t want to discourage a blogger to share your post on social media or link to your site from their blog posts.

 

That’s what I think an ‘SEO Friendly URL’ means. So keep your URLs pretty.

 

URL Types: Static URLs vs. Dynamic URLs

 

URLs can be split into two categories. You have dynamic URLs and static URLs.

 

But which ones should you use?

 

Websites, especially eCommerce stores, have both static and dynamic URLs.

 

In fact, any platform which has a database probably has some sort of dynamic URL protocol.

 

So if I set up a basic HTML website, those would be true static URLs. When I have a platform with a database and I’m trying to pull information from that database (let’s say eCommerce filters, such as colors and sizes) the platform will generate dynamic URLs.

 

(image: static vs. dynamic URLs)

 

In Google’s eyes, all URLs are ‘static’. Once they’re indexed, it’s done. Change it without a 301 and Google will consider it gone and derank it.

 

The issue with dynamic URLs is that there’s an infinite amount of URLs that can be generated. That happens because of filters.

 

If you’re not careful, you won’t be able to keep track of them easily.

 

It’s a good idea to avoid too many parameters in a single URL. Limit them to 2 or 3.

 

This usually occurs when people add too many irrelevant filters and index too many pages.

 

Most of the time, people index all the parameters, which is a bad practice. Why index a page if it doesn’t have any searches?

 

Make sure that the parameters you’re letting Google index actually have searches. So if you have a sweater in 10 colors, see if people search for all those colors.

 

If not, index only the ones that do have searches.

 

Thus, if people only search for ‘red sweaters’, then you will only index domain.com/shop/sweaters?color=red. This means that ?color=blue, ?color=black would remain unindexed.
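
One way to express that rule in code is sketched below. The search volumes, the threshold and the helper name are all made-up values for illustration; in practice the numbers would come from your keyword research:

# Hypothetical monthly search volumes per filter value (e.g. from keyword research).
SEARCH_VOLUME = {"red": 1900, "blue": 0, "black": 30}
MIN_VOLUME = 50    # arbitrary threshold below which the filtered page stays out of the index

def robots_meta_for_filter(color):
    """Index filtered pages only when people actually search for that variant."""
    if SEARCH_VOLUME.get(color, 0) >= MIN_VOLUME:
        return "index, follow"
    return "noindex, follow"

for color in ("red", "blue", "black"):
    print(f"/shop/sweaters?color={color} -> {robots_meta_for_filter(color)}")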

 

Moreover, keep your parameters in an absolute order!

 

What does that mean? It means that if your user selects the color first and then the size, the URL will be ?color=red&size=small, but if they select the size first and then the color, the URL will still be ?color=red&size=small.

 

So the order of the parameters in the URL doesn’t change. It’s the better option.
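
Here is a minimal sketch of what that absolute order could look like server-side. The fixed priority list is an assumption for the example; the point is only that the same set of filters always produces the same query string:

from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

PARAM_ORDER = ["color", "size", "brand"]    # assumed canonical ordering of filter parameters

def normalize_query_order(url):
    """Rewrite the query string so parameters always appear in the same order."""
    parts = urlparse(url)
    params = dict(parse_qsl(parts.query))
    ordered = [(key, params[key]) for key in PARAM_ORDER if key in params]
    # Unknown parameters go last, alphabetically, so the output stays deterministic.
    ordered += sorted((k, v) for k, v in params.items() if k not in PARAM_ORDER)
    return urlunparse(parts._replace(query=urlencode(ordered)))

print(normalize_query_order("https://domain.com/shop/sweaters?size=small&color=red"))
# https://domain.com/shop/sweaters?color=red&size=small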

 

Sometimes, it’s not easy to set up a proper faceted navigation that benefits both the user experience and SEO.

 

If you want to set up a filtering menu properly, read this article about Faceted Navigation & Filters.

 

Canonicalization

 

Keeping an absolute order is not always easy to achieve. You’ll need a good web developer.

 

In case you can’t keep absolute order for parameters, canonicalization is an easy alternative.

 

So if you have both URLs (?color=red&size=small and ?size=small&color=red) you can just pick one as the main URL.

 

Remember to also self canonicalize the main URL.

 

Therefore, if ?color=red&size=small is our main URL, it would have a rel=”canonical” to ?color=red&size=small and then ?size=small&color=red would have a rel=”canonical” to ?color=red&size=small.

 

Confusing, I know, but very important. You can find out more about canonical tags & URLs here

 

301 Redirects

 

I want to make sure I also cover 301 redirects in this article, because they’re really important.

 

If you simply move a web page from one URL to another, Google will just consider the old page gone and the new page a fresh one.

 

(diagram: how a 301 redirect works)

Source: gomage.com

 

This means that it has to rank it again, which means you’ll lose the rankings of the old one and have to put up all the work again to rank the new one.

 

To keep the rankings and make Google understand that the old page simply changed its location, you have to use 301 redirects.

 

You probably know all that, but you’d be surprised how many people forget to properly 301 when merging websites. This has catastrophic consequences, so make sure you properly 301.

 

It’s also a good idea to avoid redirect chains. So, it’s better to have A > C and B > C than A > B > C.
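
If you want to spot chains on your own site, here is a rough sketch using the requests library that follows redirects hop by hop; anything longer than a single hop is a chain worth flattening:

import requests
from urllib.parse import urljoin

def redirect_chain(url, max_hops=10):
    """Follow redirects manually and return the full chain of URLs."""
    chain = [url]
    for _ in range(max_hops):
        response = requests.get(chain[-1], allow_redirects=False, timeout=10)
        if response.status_code not in (301, 302, 307, 308):
            break
        location = response.headers.get("Location")
        if not location:
            break
        chain.append(urljoin(chain[-1], location))
    return chain

chain = redirect_chain("http://example.com/old-page")
if len(chain) > 2:
    print("Redirect chain detected:", " > ".join(chain))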

 

If you’re looking for SEO tools that can trace redirect chains, the CognitiveSEO Site Audit is a good choice. You’ll find what you need under Architecture > URLs / URL Chains.

 

(screenshot: Redirects & Redirect Chains report in the cognitiveSEO Site Audit)

 

Click Depth vs. URL Structure

 

URLs are about technical SEO, not so much about the user. Click depth, on the other hand, is very user oriented.

 

Remember when I said that click depth also matters in the site’s structure?

 

Your site’s structure reflects itself in the click depth and your users react to it.

 

The more users have to click to get to where they want, the less likely they are to convert.

 

(image: click-through rate)

 

The same thing goes with Google. The deeper the click depth to a page, the less important Google thinks it is.

 

Click Depth is also technical, but it reflects the human behavior, more specifically users’ interaction with your website.

 

Click depth matters for SEO. We could even say it’s one of Google’s ranking signals. In fact, Google official John Mueller said it himself.

 

Now if you read my stuff in general, you know I’m not a big fan of just going after what John Mueller says.

 

However, in this case, there’s a lot of proof to back it up.

 

Breadcrumbs

 

Breadcrumbs can be a sketch of your site’s structure.

 

There are multiple ways you can implement breadcrumbs on your site.

 

The first would be in relation with the URL and site structure and the second in relation with the user’s click path.

 

It’s better to implement the first one, in general. A user’s click path can also be followed via the back and forward buttons in the browser.

 

You also have more control on making the breadcrumbs useful to the user if you structure your site properly.

 

(image: breadcrumbs, trails and URL site structure)

 

For example, if you list the featured product Tuna on the Homepage and the user clicks it, a history based breadcrumb system would generate Home > Tuna.

 

Not very useful if the user also wants to see other types of fish.

 

Instead, if I have the domain.com/categories/fish/tuna I can have Home > Categories > Fish > Tuna, regardless of where the user comes from on that page.

 

The breadcrumbs can (and should) be hierarchical, even if the URL structure isn’t.

 

This means that I can have domain.com/shoes/running/ and domain.com/products/nike-xyz

 

Home > Shoes > Running > NikeXYZ, where ‘NikeXYZ’ links to domain.com/products/nike-xyz, ‘Running’ links to domain.com/shoes/running, ‘Shoes’ links to domain.com/shoes and the Home breadcrumb links to domain.com.

 

You can see how the site’s structure doesn’t always reflect in the URL path.
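
If you want to make that hierarchy explicit to Google, breadcrumbs are usually marked up as schema.org BreadcrumbList structured data. Here’s a small sketch that builds that JSON-LD with Python, using the hypothetical Nike trail from above (names and URLs are just the example, not a real site):

# Minimal sketch: build schema.org BreadcrumbList JSON-LD for a hierarchical trail.
import json

# Hypothetical breadcrumb trail; note the URLs don't mirror the breadcrumb hierarchy.
trail = [
    ("Home", "https://domain.com/"),
    ("Shoes", "https://domain.com/shoes/"),
    ("Running", "https://domain.com/shoes/running/"),
    ("NikeXYZ", "https://domain.com/products/nike-xyz"),
]

breadcrumb_list = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": i, "name": name, "item": url}
        for i, (name, url) in enumerate(trail, start=1)
    ],
}

print(json.dumps(breadcrumb_list, indent=2))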

 

Subdomains vs. Subfolders

 

When structuring your site, there’s always the option of using subdomains.

 

A subdomain is what’s before your root domain name. Thus, tools.cognitiveseo.com is a subdomain, while cognitiveseo.com/blog is a subfolder.

 

Subdomains act… sort of like separate websites.

 

Many say there’s no difference between using subdomains vs. using subfolders, but many have also brought proof that it’s safer to use subfolders.

 

If your internal links strategy is set up properly, subdomains should also work very well.

 

While subdomains can rank properly, if you don’t know what you’re doing it’s better to stick with subdirectories.

 

Trailing Slash vs. No Trailing Slash

 

I’m just going to keep this short: it doesn’t matter.

 

Just make sure you keep it consistent and properly 301 to the main version.

 

Google treats https://cognitiveseo.com/blog/23628/url-structure/ and https://cognitiveseo.com/blog/23628/url-structure as separate URLs.

 

If you don’t use a 301, both pages can get indexed and they will cannibalize each other.

 

In the old days of the internet, most web pages would have an extension as they were all seen as file names (such as page.html).

 

The trailing slash would represent a folder instead of a file, but today that’s not the case anymore. Just be consistent and 301 properly.
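
A quick way to check your own setup: request the non-preferred variant and make sure it answers with a 301 to the preferred one. A minimal sketch (Python requests; the URLs are placeholders for your own page):

# Minimal sketch: check that the no-trailing-slash variant 301s to the preferred URL.
import requests

preferred = "https://example.com/url-structure/"
variant = "https://example.com/url-structure"   # same page without the trailing slash

response = requests.get(variant, allow_redirects=False, timeout=10)
print(response.status_code)              # ideally 301
print(response.headers.get("Location"))  # ideally the preferred URL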

 

Relative URLs vs. Absolute URLs

 

Links can be absolute URLs or relative URLs.

 

Absolute URLs include the protocol, the domain (with any subdomain), the path and everything else after it.

 

An absolute URL would be https://www.website.com/page/subpage/.

 

A relative URL would be /page/subpage/.

 

It’s very important to use relative URLs only on your own website, and absolute URLs anywhere your links appear on other websites.

 

So, if you do link building to get backlinks, make sure you always use absolute URLs.

 

For Google, it doesn’t really matter which one you use on your website, but it can affect you if you want to change your domain name or switch from HTTP to HTTPS.

 

If you use absolute URLs as part of your internal linking strategy, when you change your domain, those absolute URLs will remain, thus still linking to the old domain.

 

Sure, you will have 301 redirects set up, but it’s always better to have the new domain in all your internal links.

 

So, when you do internal linking, use relative URLs if possible. That way, when you make changes to your domain, the platform takes care of everything and you won’t have to manually replace thousands of links.
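
The reason relative links survive a domain move is that they resolve against whatever domain serves the page. A tiny sketch with Python’s standard urllib (the old/new domains are hypothetical):

# Minimal sketch: a relative link resolves against whichever domain serves the page.
from urllib.parse import urljoin

relative_link = "/page/subpage/"

print(urljoin("https://old-domain.com/blog/post/", relative_link))
# -> https://old-domain.com/page/subpage/

print(urljoin("https://new-domain.com/blog/post/", relative_link))
# -> https://new-domain.com/page/subpage/  (nothing to update after the move)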

 

 

Best URL Structure for Small Sites, Local SEO & WordPress

 

Small websites can have hierarchical URL structures, as previously mentioned. Just make sure you won’t cause duplicate content issues.

 

If you’re targeting multiple locations, then you should have separate pages for each location you’re targeting.

 

I know, many might say that these are doorway pages and that Google penalizes them.

 

However, they’ve been proven to work countless times, and there’s no real alternative if you want to target each location.

 

Keep it relevant and Google will reward you.

Local SEO URL Structure

If you have a WordPress blog, then you most probably want to keep the pages immediately after the root URL.

 

We’ve separated our blog under /blog because we have a separate WordPress install in /blog which makes it impossible for us to place article URLs immediately after the root domain name.

 

You might also notice the numbers after /blog. That’s an identifier, which was a technical necessity some time ago. It’s better if you don’t have those.

 

So, if you can, go for https://cognitiveseo.com/url-structure/ instead of https://cognitiveseo.com/blog/23628/url-structure-seo/.

 

If you’re wondering why we’re not doing this, here’s the answer: we could remove them, but it would require a big effort mapping all the articles for 301 redirects, and we don’t think it would have a big positive impact on our rankings.

 

Our website has multiple functionalities and is pretty big and complex. We have our tools landing pages in our root domain, so it makes sense to separate our blog in the /blog subdirectory.

 

If you just have a blog, then keep URLs immediately after the root. Brian Dean’s blog on Backlinko.com is a good example.

 

Avoid using the date in the URL if your post is evergreen. It will discourage users from clicking your result in the future and will also make Google think your content is ‘old’.

 

The Best URL Structure for eCommerce Websites

 

When it comes to eCommerce websites, things aren’t that simple with URLs.

 

The safest way to go for it is to separate each section in its own subdirectory.

 

This means that you’ll need a /blog/ or /articles/ prefix for your articles and posts, a /products/ prefix for your products, a /categories/ prefix for your categories and so on.

 

This helps you keep track of your pages. If you ever crawl your website to analyze it, it will be a nightmare to interpret the data if all the post types are lumped into the root domain.

 

Site structure diagram

Source: searchengineland.com

 

Make sure you don’t add too many subcategories. Remember, try to keep the click depth … shallow.

 

Take advantage of the breadcrumbs recommendations I’ve made above.

 

Make sure you know exactly which URL parameters/filters you index and which you don’t.

 

You should definitely read this article about Faceted Navigation for SEO.

 

URL Structure Mistakes

 

There are some things that you must definitely avoid when creating your URL structure.

 

Here’s a list of the biggest mistakes that webmasters make when they create their URLs.

 

Changing URLs without 301

 

As you’re on a page about URLs, if your structure is bad or you’re contemplating changing it, then I can’t stress this enough.

 

Your rankings will drop if you don’t properly 301 from the old pages to the new ones.

 

Remember, if you change the URL, you MUST use a 301 redirect from the old one to the new one.

 

Having multiple variants

 

One problem that many websites have is not properly redirecting all the variants of the site to a single one.

 

For example, you can have HTTP and HTTPS, each with or without WWW.

 

This results in 4 versions which Google sees as separate sites, in a way:

 

http://cognitiveseo.com
https://cognitiveseo.com
http://www.cognitiveseo.com
https://www.cognitiveseo.com

 

Make sure you pick one and 301 redirect all the others to it.

 

You can read more about which version you should choose in our article about WWW vs non-WWW.
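
If you want to spot-check your own setup, here’s a minimal sketch (Python requests; example.com is a placeholder for your own domain) that requests the other three variants and checks that each one answers with a 301 to the version you picked:

# Minimal sketch: verify that all host/protocol variants 301 to one chosen version.
import requests

chosen = "https://www.example.com/"   # the version you picked
variants = [
    "http://example.com/",
    "https://example.com/",
    "http://www.example.com/",
]

for variant in variants:
    response = requests.get(variant, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "-")
    ok = response.status_code == 301 and location.startswith(chosen.rstrip("/"))
    print(f"{variant} -> {response.status_code} {location} {'OK' if ok else 'CHECK THIS'}")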

 

Having multiple URLs for the same content

 

Sometimes, it can happen that different URLs serve the same content. This is called duplicate content and it happens often on eCommerce websites.

 

You can have, for example, two filter parameters such as ‘red’ and ‘small’.

 

However, if all your red products are small and all your small products are red, those pages will mostly be identical.

 

This is just a hypothetical example, but things can scale pretty quickly, creating hundreds if not thousands of very similar URLs with little value.

 

If you want to read more about how to fix this issue, check out our Faceted Navigation Guide.

 

Using ‘bad’ characters

 

Browsers only support certain characters in the URL.

 

Most content management systems know how to handle these and will strip them from the URL if you add them unknowingly.

 

It’s best to avoid parameters and complicated URLs, at least for the pages you want to be indexed and ranked well.

 

Google can handle parameters with numbers and other characters, but the pages you want to rank high for very competitive keywords should ideally be static URLs with keywords in them.

 

Using too many subdirectories & categories

 

If you have an eCommerce website, try to keep things short. Don’t add hundreds of layered subcategories. Only add the important ones.

 

A good way to know which ones are important is to do proper keyword research. If nobody searches for those terms, maybe don’t add them as subcategories.

 

You might have some granular structure that seems important, but if users only search for the 5th level, then maybe make it the first or the second and cut the other ones.

 

Keeping everything in the root domain

 

When you create the structure of the site, make sure to separate the different content types (articles, products, services and so on).

 

Some webmasters consider that the shorter the URL, the better. But that’s not true in every case!

 

If you have a blog on a certain topic, such as Backlinko.com, it might make sense to keep everything in the root domain. You have very few pages and it’s easy to manage.

 

However, if you have a big site, and you have services, products, articles, locations and so on, it will be a nightmare to analyse the website after a crawl if everything is in the root domain.

 

Not using keywords or using too many keywords

 

Make sure you have some of the most important keywords the users are searching for in your URL.

 

Not having keywords at all is a very bad idea, especially if your URLs contain only numbers or dates.

 

So, if you have an article about really good rock bands, don’t let your URL be site.com/03/03/2020/article-1523; instead, make it site.com/top-5-rock-bands-2020.

 

On the other hand, it’s a good idea not to repeat the keywords too many times. It looks spammy and Google can pick up on that.

 

Avoid creating duplicate iterations of the keywords in the URLs.

 

This can happen often on eCommerce websites, when creating categories and not editing their URLs.

 

The content management system will just pick up the title of the page, and the hierarchical URL structure will look like this.

 

musicsite.com/drums/acoustic-drums/acoustic-drum-accessories/

 

A better option would be:

 

musicsite.com/drums/acoustic/accessories/

 

Best URL Structure for SEO (Tips & Tricks)

 

There is no clear best URL structure for SEO as this depends on very many factors. However, in order to maximize the search engine optimization benefits, make sure to follow these best practices for SEO friendly URLs.

 

Use Keywords in Your URLs:

 

Keywords are very important for SEO. It’s a good idea to add them in your URL. These URLs are called semantic URLs.

 

It’s more important to have your target keywords in your title tags and content than in the URL.

 

However, adding them in the URL can bring some benefits:

 

For one, if users look just at the URL, they’ll know what the page is about.

 

Secondly…

 

If you do link building to your page without using keyword rich anchor texts, the URL will act as the anchor text so it’s a good idea to have the keywords there!

 

You can simplify URLs by removing short or less descriptive words such as stop words. Here are some stop word examples: to, the, how, and, for, it, a, why.

 

For example, instead of /how-to-jump-really-high/ you could just go for /jump-higher/ or /improve-jumping/.

 

You don’t always have to remove the stop words from your URL.

 

For example, you might have the target keyword “how to cook” where the URL domain.com/how-to-cook/ is just perfect.
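
If you generate slugs programmatically, a small helper like this illustrates the idea. It’s a minimal Python sketch; the stop-word list is just an example and, as noted above, you don’t always want to drop those words:

# Minimal sketch: turn a title into a clean, hyphenated, keyword-only slug.
import re

STOP_WORDS = {"a", "an", "and", "the", "to", "for", "of", "in", "it", "why", "how"}

def slugify(title, drop_stop_words=True):
    words = re.findall(r"[a-z0-9]+", title.lower())   # strips quotes, commas, etc.
    if drop_stop_words:
        words = [w for w in words if w not in STOP_WORDS]
    return "-".join(words)

print(slugify("Why URL Structure Matters for SEO"))   # url-structure-matters-seo
print(slugify("How to Cook", drop_stop_words=False))  # how-to-cook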

 

It’s also a good idea to add the main keyword in the URL, if you have one and it actually gets searches.

 

In this case, it fits my article: people search for “url structure seo” and my URL is /url-structure-seo/.

 

But people also search for “how does URL structure affect SEO”. Why didn’t I choose this keyword phrase as my URL?

 

Because the first one has more searches 🙂 I’ll let you figure out the rest.

 

If you’re looking for SEO tools that can check that for you on a large scale, make sure to check out CognitiveSEO’s Site Audit. You’ll find what you’re looking for in the Architecture > URLs section.

 

Keywords in URLs SEO Tools

CognitiveSEO Site Audit URL Analyzer Tool

 

Keep the URLs Short:

The popular opinion is that shorter URLs rank better.

 

While I personally still have to investigate this matter further, I keep my URLs short and to the point.

 

Why? Because they are better for user experience. Here’s what our most important pages’ URLs look like:

 

URL Structure SEO  Examples

 

On a WordPress platform (not our case for the main site), they would be generated using data from the post title.

 

Content management systems such as WordPress strip some incompatible characters, so the URLs would end up like this:

 

cognitiveseo.com/site-explorer-by-cognitiveseo-backlink-checker-link-research

cognitiveseo.com/1-keywordtool-by-cognitiveseo-keyword-explorer-content-optimization

 

Not… horrible, but not very good either.

 

And it’s also on the safer side to keep them short. If URL length does actually matter for OnPage SEO, better have it short rather than long, right?

 

While there’s a correlation between shorter URLs and high rankings, it doesn’t 100% mean it’s because of the shorter URLs.

 

Maybe very well optimized sites also like to have prettier, shorter URLs.

 

However, don’t try to make them too short. For example, some use /p/ instead of /products/ and /c/ instead of /categories/.

 

I don’t think that’s necessary. In fact, I think it looks more spammy.

 

Too short might also mean removing some important keywords.

 

Keep URLs Unique

 

Make sure you don’t already have very similar pages before you write and publish a new page.

 

If you do, it might be a good idea to better optimize the other page instead, or target a different topic/set of keywords for the new one.

 

Use Hyphens Instead of Underscores & Avoid Special Characters in Your URLs

 

Hyphens and underscores look very similar, but on the internet they’re treated pretty differently.

 

Google recommends that you avoid underscores in your URLs, as they can cause issues.

 

Underscores are treated as word joiners by Google, while dashes are treated as word separators.

 

People are also used to dashes more. So your URL should be url-structure-seo not url_structure_seo.

 

Also, avoid any special characters in your URL, except the basic ones used for parameters and anchors such as ? & = #.

 

Most platforms won’t even let you do it, but if your URL contains characters such as , or ; or ‘, it can cause problems.

 

If you don’t know what a special character means or does in a URL, then it’s better not to use it.

 

Of course, there’s also the trailing slash /, which is ok to use.

 

Use as Few URL Parameters as Possible

 

Parameters can add to the length and they also make a URL look messy.

 

However, in certain situations, they also add keywords in your URL which can be a good thing if people search for those keywords.

 

Remember to only index the pages people actually search for instead of every possible filter combination your site can come up with.

 

Prioritize & Think about Click Depth

 

Don’t add too many deep pages, such as subcategories inside subcategories #inception.

 

Keep it short and to the point.

 

If you do have a lot of deep pages which are important, make sure you use internal links in your blog posts or other sections of your website so that Google can properly find them.

 

You can also share these pages on social media or other websites from time to time.

 

Avoid Hierarchical URLs When You Have a Site that Changes Often

 

This goes mostly for eCommerce or any site that is very dynamic, such as news sites, car trading sites, events sites, etc.

 

You can use hierarchy if you’re sure a resource won’t change its parent.

 

Don’t Stuff Keywords in Your URLs

 

Keyword stuffing is bad in content, bad in title tags and bad in URLs.

 

Don’t do it!

 

Sometimes, people stuff keywords into their URLs by mistake.

 

Example: randomshoeswebsite.com/shoes/running-shoes/running-shoes-for-women/red-running-shoes-for-women/nike-running-shoes-for-women/

 

I’m not sure it’s the best example, but I hope you get the point.

 

Instead, maybe go for something like: randomshoeswebsite.com/shoes/running/women/nike/red/.

 

Avoid Duplicate, Similar & Thin Content

 

Again, duplicate issues are mostly caused by poor URL structure implementation, bad canonicalization and indexation practices.

 

Make sure you don’t have very similar pages on your website or they will impact your overall website SEO performance.

 

If you’re looking for SEO tools that can help you find duplicate content issues, then the CognitiveSEO Site Audit is perfect for you. You can find what you’re looking for under the Content section.

 

Thin & Duplicate Content SEO

 

Does Google Plan to Get Rid of URLs in the Future?

 

It might be the case that, in the future, Google will pursue its dream of getting rid of URLs.

 

The first step would be not to display them at all, first in the search results and then in the browser itself.

 

However, getting rid of URLs completely is easier said than done.

 

This all started with Google AMP, where Google caches the resources on its own web servers, therefore displaying them under its own ugly URLs, which it then hid.

 

If you want to know more about the subject, read this article about Google trying to remove URLs.

 

Conclusion

 

The URL structure of your website is important. Don’t neglect it! You only have to set it up right once.

 

Once you structure things properly, just follow the best practices. Add your target keywords, think about URL length, avoid keyword stuffing, limit irrelevant URL parameters and you’re good to go.

 

How did you set up your URL structure? Let me know in the comments section below!

The post URL Structure. Dos, Don’ts and Best Practices for SEO appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

]]>
https://cognitiveseo.com/blog/23628/url-structure/feed/ 3
Is Google Blocking Your Site Due to Mixed Content? Identify & Fix Mixed Content Issues https://cognitiveseo.com/blog/23215/mixed-content/ https://cognitiveseo.com/blog/23215/mixed-content/#comments Thu, 10 Oct 2019 09:41:17 +0000 https://cognitiveseo.com/blog/?p=23215 Google has recently announced that Chrome will block mixed content on web pages beginning December 2019. Starting with the Chrome 79 version, Google will gradually move to blocking all mixed content by default. Therefore, if your website has mixed content, it will be blocked and your users won’t be able to access it.   Everything from […]

The post Is Google Blocking Your Site Due to Mixed Content? Identify & Fix Mixed Content Issues appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

]]>
Google has recently announced that Chrome will block mixed content on web pages beginning December 2019. Starting with the Chrome 79 version, Google will gradually move to blocking all mixed content by default. Therefore, if your website has mixed content, it will be blocked and your users won’t be able to access it.

 

Everything from what mixed content is to how to identify/fix it can be found in the following lines. 

 

Is Google Blocking Your Site Due to Mixed Content?

 

The Google Security Team reports that Chrome users now spend more than 90% of their browsing time on HTTPS on both desktop and mobile. The plan to begin blocking mixed content is targeted at addressing insecure holes in SSL implementations of sites that have already made the switch to HTTPS. Here’s everything you need to know about it:

 

  1. What Is Mixed Content?
  2. Why Is Google Blocking Mixed Content?
  3. How Do You Detect Mixed Content?
  4. How Do You Fix Mixed Content Issue?
  5. Does Mixed Content Affect SEO?
 

What Is Mixed Content?

 

Mixed content occurs when a secure web page (a page loaded through HTTPS) contains resources like scripts, videos, images, etc. that are served through an insecure protocol (HTTP). 

 

As you probably guessed from the name, it’s called mixed content because both HTTP and HTTPS content is loaded to display the same page, even though the initial request was secure over HTTPS.

 

In the following lines we’ll let you know why HTTPS is a must. You already know, but as repetition is the mother of learning, we want to highlight that Hypertext Transfer Protocol Secure (HTTPS) is an extension of the Hypertext Transfer Protocol (HTTP) and is used for secure communication over a computer network.

 

The main motivation for HTTPS is the authentication of the website being accessed and the protection of the privacy and integrity of the exchanged data while in transit. 

 

Therefore, managing security risks is key. Furthermore, you need to know that in HTTPS the communication is encrypted using Secure Sockets Layer (SSL). And, to better understand the concept of mixed content resources or insecure content, and why the “s” in HTTPS makes such a big difference, let’s briefly go through SSL certificates.

 

What Are SSL Certificates?

 

SSL certificates are used to confirm the identity of a website. These certificates are issued and signed by certificate authorities with their private keys. Before getting a certificate from them, you must somehow confirm your identity and prove that you are the organization and website owner. Web browsers come packed with a bunch of public keys from certificate authorities. They check if the certificates have been signed with the proper private keys, therefore confirming that the site’s identity has been verified by a trusted authority and not by some random certificate generator. If the certificate is expired or not valid, a red warning will show up. These warning messages will definitely turn the user away.

 

not secure privacy

By using an SSL Certificate, webmasters can improve the security of their websites and better protect their users’ information.

 

You can protect unlimited subdomains of your base domain with a cheap wildcard SSL certificate. Taking it one step further, you need to know that there are two types of mixed content: active and passive.

 

Active mixed content – this is the most harmful type of mixed content. In this case, an attacker can take full control of your page or website and change anything about the page. They can steal passwords, login credentials or user cookies, or redirect users to other sites.

 

Passive mixed content – when it comes to passive content, an attacker can intercept an HTTP request (resources loading via http) for videos or images on your site and replace those images with whatever they want. They can also replace your product pictures or place ads for a totally different product.

 

Why Is Google Blocking Mixed Content?

 

Although Google confirmed in 2014 that it considers HTTPS a ranking factor, all the buzz started when Google released version 68 of the Chrome Web Browser in July 2018. In this version, websites that don’t run on HTTPS are marked as Not Secure. 

 

not secure

 

As you can see in the screenshot above, the browser advises the user of that site not to disclose any passwords or credit cards.

The browser is advising potential customers not to perform any transaction on your site.

And that’s the last thing you want your user to see.

 

Yet, why does mixing rum and cola make a great cocktail, while mixing HTTP and HTTPS is a big no-no?

 

There are many situations that can cause mixed content issues and many reasons why mixed content is harmful, lots of them highlighted by Google itself.  Let’s focus on just a few important ones: 

 

  • Mixed content degrades the security and user experience of your HTTPS site.

 

Whether you like Google’s rules or not, you have to agree with this one: web security is more important than ever. And offering your users the comfort of security is not just a whim but a must.   

 

Imagine that you’re navigating to your bank’s website. If it’s an HTTPS connection, your browser authenticates the website, thus preventing an attacker from impersonating your bank and stealing your login credentials. Also, when transferring money using your bank’s website, this prevents an attacker from changing the destination account number while your request is in transit. 

One of the big advantages of HTTPS is that it lets the browser check that it has opened the correct website and hasn’t been redirected to a malicious site.

 

  • Mixed content is confusing. 

 

If a web page is using HTTPS, then all its resources should be pulled in via HTTPS as well. Otherwise, you’re viewing a web page that’s both secure and not secure. It’s like buying a very good bicycle lock but only using it now and then, and being surprised at the end of the day that your bike was stolen.

 

Let’s say that you’re on a secure web page and you assume that everything is OK because the page is on HTTPS. Yet, if that page loads some insecure images (or other HTTP resources) and you’re on a public Wi-Fi network, you can run into lots of problems, from having your keystrokes monitored to tracking cookies.

 

  • Mixed content weakens HTTPS.

 

You might have heard before about the man-in-the-middle attack (MITM). In computer security, a man-in-the-middle attack is a type of attack where the attacker secretly relays and possibly alters the communications between two parties who believe they are directly communicating with each other. One example is when the attacker makes independent connections with the victims and relays messages between them to make them believe they are talking directly to each other over a private connection, when in fact the entire conversation is controlled by the attacker. It’s like eavesdropping, just that the stakes are much higher than the latest gossip from the office. 

 

Therefore, requesting subresources using the insecure HTTP protocol weakens the security of the entire page, as these requests are vulnerable to man-in-the-middle attacks.

 

Running your site over HTTPS is not an option; it is a must. Not only is it more secure (everything is encrypted), but it also builds trust, is an SEO ranking factor, and provides more accurate referral data. Not to mention that the most important web browsers are blocking pages that are not considered secure.

 

These are just some of the many reasons why Google decided to block mixed content. This update will break a big number of websites, and many, many businesses will lose big time.

Yet, there is hope: you can quickly detect if you have mixed content and you can also fix it. Keep on reading to find out how you can still make your site accessible to your users.

 

How Do You Detect Mixed Content?

 

The easiest possible way to see whether you have any mixed content errors on your site is to run a Site Audit within cognitiveSEO. It is the easiest, safest and most stress-free option, and it doesn’t require any programming skills or developer guides to fix mixed content warnings. You find out if you have mixed content (plus many other issues) on your site with just a few clicks.

 

Just start an analysis of your site and the tool will automatically identify the mixed content issue. There is a section dedicated to this exact matter where you can see the not secure pages of any website and its insecure origins. No headaches, reliable and super simple to identify. 

 

mixed content cognitiveseo

 

It couldn’t get much easier than this. Simply check the reported pages and start fixing them. 

 

How Do You Fix Mixed Content Issue?

 

Once you find the insecure content (the resources being served over HTTP instead of HTTPS), you can start changing the URLs by simply switching them to HTTPS.

 

Fixing the issue is often as simple as adding an “s” to links – http:// to https://.

 

Yet, before you do that, be sure that the HTTP resource is actually available over an HTTPS connection. To check this, simply copy-paste the HTTP URL into a new browser tab and change HTTP to HTTPS. If the resource (URL, image, video, etc.) is available over HTTPS, then you can start changing HTTP to HTTPS in your source code.
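
If you’d rather script that check, here’s a minimal sketch (Python requests; the page URL is a placeholder for your own HTTPS page) that lists the http:// resources referenced in the page’s HTML and tests whether each one also answers over HTTPS:

# Minimal sketch: find http:// resources on a page and test their https:// twins.
import re
import requests

page = "https://example.com/"  # placeholder: your HTTPS page to audit
html = requests.get(page, timeout=10).text

# Grab http:// URLs referenced in src/href attributes (scripts, images, stylesheets...).
insecure = set(re.findall(r'(?:src|href)=["\'](http://[^"\']+)["\']', html, re.I))

for url in sorted(insecure):
    https_url = "https://" + url[len("http://"):]
    try:
        ok = requests.head(https_url, timeout=10, allow_redirects=True).status_code < 400
    except requests.RequestException:
        ok = False
    print(f"{url} -> {'available over HTTPS, safe to switch' if ok else 'NOT available over HTTPS'}")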

 

Mixed content is an issue that can be so easily identified and solved, but if ignored it can cause big problems, like your website being blocked by Google. 

 

Once you’ve solved the issue, go back to the Site Audit to make sure you didn’t miss any insecure content resources. The tool re-crawls your website periodically to spot any new changes, although you can always re-check particular issues to see if they have been solved.

 

recheck issue cognitiveSEO

 

Does Mixed Content Affect SEO?

 

As we stated above, Google made it pretty clear that it values secure content and considers it a ranking factor. It’s listed on their blog, out in the open.

 

The main reason is definitely security. If Google provides its users with better security, it provides better value and the users will be pleased. The fact that internet credit card fraud is on the rise definitely pushed Google into this direction.

 

google ranking factor

 

Google has tested its results with HTTPS as a ranking signal and has seen positive results. It could also mean that webmasters who take security seriously might generally present better websites as they care about the users. 

 

While there is no doubt that mixed content affects SEO (especially with the latest announcement from Google), before search engine optimization one has to think about user trust and user experience. 

 

If you have mixed content, most modern browsers (like Mozilla Firefox, Google Chrome, etc.) will display warnings to indicate to the user that the page contains insecure resources. Because of those warnings and the insecure resources, the user will most likely leave your site, mark it as deceptive and browse websites that offer similar services and are secure. All your digital marketing and content strategy efforts will go down the drain with Chrome blocking your HTTP content. 

So, the answer is yes, mixed content certainly impacts SEO, but more than that, it impacts your users’ trust and that’s something you can’t afford to lose. 

 

We know it’s unlikely, but if you haven’t done it already and you want to switch from HTTP to HTTPS, check out this article for everything you need to know about it. 

 

The post Is Google Blocking Your Site Due to Mixed Content? Identify & Fix Mixed Content Issues appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

]]>
https://cognitiveseo.com/blog/23215/mixed-content/feed/ 1
Vital Hreflang & Multi Language Website Mistakes That Most Webmasters Make https://cognitiveseo.com/blog/17150/multi-language-website-mistakes/ https://cognitiveseo.com/blog/17150/multi-language-website-mistakes/#respond Tue, 27 Aug 2019 09:02:32 +0000 https://cognitiveseo.com/blog/?p=17150 The internet gives a business the power to compete on a global level. Gone are the days when your only competitor was the other shop across the road. If you sell your products or services on a website, you have the power to quickly expand beyond your country’s borders, without spending millions of dollars on […]

The post Vital Hreflang & Multi Language Website Mistakes That Most Webmasters Make appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

]]>
The internet gives a business the power to compete on a global level. Gone are the days when your only competitor was the other shop across the road. If you sell your products or services on a website, you have the power to quickly expand beyond your country’s borders, without spending millions of dollars on opening new physical stores.

 

But in order to do this, you have to speak your customers’ native or preferred language. And, in order to speak their language, you need to translate your website. As it turns out, setting up a multi language website is one of the trickiest things in the digital marketing field. International SEO is pretty hard! Besides translating it the right way, you can run into a lot of other technical issues, most of them regarding incorrect indexation and display of the language versions by Google.

 

Vital Mistakes on Multilingual Sites

 

Even more, Google has recently changed the way it displays websites internationally. You can no longer see the results in another country by simply visiting its Google version. Instead, you have to go through the search settings and select the specific country and language. This shows Google’s interest to make search results more relevant by location, so it’s more important than ever to get everything right!

 

In this small international SEO guide, we’re going to try to solve some of the more complicated issues regarding multilingual websites and hopefully shed some light on the most common hreflang mistakes and other general multilingual website issues that webmasters make when they start expanding internationally. 

  1. Technical Multilingual SEO & Hreflang Related Issues
    1. Bad Implementation of the rel=”alternate” and hreflang Attributes
    2. Conflicts, Bad Implementation and Confusion Regarding the rel=”canonical” Tag
    3. Geographical and IP Based Redirect Issues
    4. Using Robots.Txt or No-Index Tag on Translated Pages
    5. Language Selector Flag URLs
    6. English URLs for Other Languages
    7. Think of Other Search Engines, Too
    8. Focusing All the Links Only on the Main Version
  2. Display & Content Multi Language Issues That Affect UX
    1. Using Automatic Website Translation Software
    2. Not Doing Keyword Research
    3. Not Having Any Cultural Awareness
    4. Not Fully Translating Captchas
    5. Trying to rank an English page everywhere using HREFlang
    6. Fonts and Diacritics
    7. Neglecting Social Media

 

 

Technical Multilingual SEO & Hreflang Related Issues

 

Technical issues in multilingual websites are more common on custom builds. It might not always be the webmaster’s fault, but as long as you have the information and leave the problems there, you have no excuse. Here are some of the most common technical problems around the web and how to properly fix them.

 

Bad Implementation of the rel=”alternate” and hreflang Attributes

 

Oh, the hreflangs! Studies show that around 75% of hreflang implementations have mistakes in them. To be honest, while I was searching for examples online, many multilingual websites did not even have hreflang implemented at all!

 

That’s a concern, because not only does that prevent you from ranking high in other countries, but it also dilutes your website’s content, making it seem less relevant in Google’s eyes.

 

So what is this hreflang attribute? Well, in theory, it’s pretty simple:

 

The hreflang attribute is a way of telling Google “Hey, I have another localized version of my website here, and it’s in this language.”

 

Here’s a video from SEJ where Bill Hunt explains exactly what HREFlang is and how to use it correctly.

 

 

Of course, if you don’t use it, Google is probably able to figure things out on its own. But multilingual sites that help Google figure things out easier are known to get a boost in rankings! Here’s a good example from SeerInteractive that shows a traffic growth after the hreflang attribute has been correctly implemented:

 

Graph from Seer Interactive showing a growth in traffic after correct HrefLang implementation

Graph from Seer Interactive showing a growth in traffic after correct HrefLang implementation

 

Here are the most common mistakes that people make when implementing the hreflang attribute:

 

No hreflang attribute: Of course, the first rule would be to have the hreflang annotation in your HTML. As I said, I found many examples that don’t contain the attribute at all. Here’s just one of them:

 

no hreflang attribute in html

Missing hreflang on fbcareers.com

 

Although you can clearly see that they offer the website in multiple languages, the hreflang attribute is nowhere to be found in the HTML source code:

 

no hreflang attribute

Come out, come out, wherever you are! Hello? Mr. Hreflang? Are you here? …

 

No self-referencing URL: On Google’s official page about multilanguage websites, it’s clearly stated that you must use a self-referencing rel=”alternate” hreflang attribute.

 

If you have multiple language versions of a URL, each language page should identify different language versions, including itself.  For example, if your site provides content in French, English, and Spanish, the Spanish version must include a rel="alternate" hreflang="x" link for itself in addition to links to the French and English versions. Similarly, the English and French versions must each include the same references to the French, English, and Spanish versions.
Google logo Google
https://support.google.com/webmasters/answer/189077?hl=en

 

Here’s an example of a site that is missing the self-referencing hreflang tag.

 

self referencing hreflang missing

The website elcorteingles.es is missing the self referencing Spanish hreflang attribute

 

In another example, you can see from the title that the text is in English and that the English hreflang attribute is missing from the page. However, the page clearly indicates the Spanish version of the website.

 

no self referencing hreflang attribute

Missing self-referencing English hreflang attribute on cricketwireless.com

 

What’s even worse about this case is that the link tag containing the Spanish version is static and implemented in the head template of the entire website. This means that every page will have the same hreflang attribute, continuously misleading Google and harming the website.

 

same hreflang everywhere

Spanish version of the website with self-referencing hreflang but missing English hreflang

 

As you can see above, this time we have the self-referencing attribute in place, but we’re now missing the attribute that specifies the English version we saw earlier.

 

In this case, the correct implementation would include both versions, like this:

 

<link rel=”alternate” hreflang=”en” href=”https://www.cricketwireless.com/” />

<link rel=”alternate” hreflang=”es” href=”https://espanol.cricketwireless.com/” />

 

To prove that bad implementation won’t get you where you want to be, I selected US as the region in Chrome, Spanish as the language and then I searched for cricketwireless.

 

wrong hreflang wrong google display

Google shows English version on Spanish search

 

As you can see, the result isn’t the desired Spanish subdomain. Although the webmaster did specify the Spanish version, they missed out on the other rules. I performed this search in the Spanish region as well, and the Google search results were the same.

 

So if you want your website to rank well across all regions in all languages, make sure you have your hreflang return tags set up, so that Google can figure out which web pages are linked to one another.
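
If you want to spot-check the return tags yourself, here’s a minimal sketch (Python requests; the start URL is a placeholder for the page you’re auditing) that reads a page’s hreflang link tags and verifies that every alternate links back to it:

# Minimal sketch: check that every hreflang alternate links back (return tags).
import re
import requests

def hreflang_targets(url):
    html = requests.get(url, timeout=10).text
    targets = set()
    for tag in re.findall(r"<link[^>]+>", html, re.I):
        if re.search(r'rel=["\']alternate["\']', tag, re.I) and re.search(r"hreflang=", tag, re.I):
            href = re.search(r'href=["\']([^"\']+)["\']', tag, re.I)
            if href:
                targets.add(href.group(1))
    return targets

start_url = "https://example.com/"  # placeholder: the page whose hreflang you're auditing
for alternate in hreflang_targets(start_url):
    links_back = start_url in hreflang_targets(alternate)
    print(f"{alternate} -> {'links back, OK' if links_back else 'MISSING return tag'}")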

 

Not in header: If your hreflang return tags aren’t found in the header, Google will basically analyze the entire page and try to figure things out on its own before realizing the answer was right under its nose. Make sure you have them between your opening and closing head tags.

 

A hreflang attribute specifying the French version of a website should look like this:

 

<link rel=”alternate” hreflang=”fr” href=”http://www.yourwebsite.com/fr/” />

 

It’s similar to the link tags that insert JavaScript or CSS files. You can also use a sitemap or a HTTP header for non-HTML files. However, the link tag in the <head> section of your website is the recommended version.

 

Here’s a really weird implementation, where the tags are outside the head section and inside a <li> tag instead of a <link> one. Strange and interesting, but definitely not the right way to do it.

 

 bad hreflang implementation outside of header

Strange language implementation on semver.org

 

Don’t do that! Use the link tag as mentioned above!

 

Relative URLs: Google can misinterpret relative URLs, so make sure you make them absolute (https://yoursite.com/specific-page instead of just /specific-page/). If the page is a 404 or a relative URL, there might be issues in the overall indexing of your language version.

 

I couldn’t find another example, but you can take a look at the example above from semver.org, where the URLs in the already badly implemented hreflang are relative (/lang/ar) instead of absolute.

 

The correct implementation in this case would be <link rel=”alternate” hreflang=”ar” href=”https://semver.org/lang/ar/” />.

 

It doesn’t point to a specific page: Each page should point to the specific counterpart in another language, not the entire foreign language version. I couldn’t find another example, but we can use a previous one. Because there is a single hreflang attribute on the whole website, different pages actually point to the root of that language version, regardless of the page or language version you’re on.

 

specific url hreflang attribute

Specific page hreflang attribute points to root of language version

 

If you implement a non-dynamic link tag in the header template of your website, all pages will have the same HREFlang. This is a bad idea!

 

In some cases, such as this one, you’re better off not having the hreflang attribute at all rather than having it implemented incorrectly.

 

The correct implementation in this case would be https://www.cricketwireless.com/support/apps-and-services.html with a rel=”alternate” attribute to https://espanol.cricketwireless.com/ayuda/27g/aplicaciones-y-servicios.html.

 

Also, remember that it should have a self-referencing hreflang attribute to itself.

 

Incorrect language / country codes: The language code is very often misspelled. Many times, webmasters and web developers use the country code instead of the language code. Here are some official Google insights:

 

correct language codes hreflang

Official Google statement about country codes in the hreflang attribute

 

So, normally, you have to put the language code, not the country code. The country code is optional and can be added to target specific languages in specific regions. For example, you could target the Spanish speaking audience in the US, or the English speaking audience in France. Is this useful?

 

I don’t know… let’s say that some people from the UK are visiting Italy, and they want to buy some souvenirs. They don’t know any Italian, so they type “buy souvenirs in Venice” into Google. There you go: you’ve just found an English speaking target audience in an Italian region.

 

The full list of language codes can be found here, and the full optional list of country codes can be found here.

 

No x-default attribute for language pick page: Google recommends using one more tag, placed after all the other languages, to specify the language selection page, if there is one. For example, if the homepage just presents a list of languages to choose from, that would be the x-default language version.

 

In the following case, you can see that the homepage of nunnauuni.com is a language selection page. The page is well set up, redirecting users on their second visit accordingly. Although the site has all the other language attributes, including the self-referencing one, it’s missing the x-default tag which specifies the general language selection page.

 

no x-default hreflang attribute for  choose language page

Missing x-default tag on nunnauuni.com

 

The homepage is also missing all the other tags. Instead, it should include them and also have a self-referencing x-default tag. The correct code to be added after the language list in this case is <link rel=”alternate” href=”https://nunnauuni.com/” hreflang=”x-default” />.

 

If you’re using 301’s to geo-redirect users by IP, you can specify the default version in the HTTP header.  To do this in WordPress, you will need to use an HTTP Headers Plugin. The code, however, is a little bit different: Link: http://www.example.es; rel=”alternate”; hreflang=”es-ES”.

 

Untranslated pages hreflang to Homepage: This is a big issue, especially if your Homepage is an important page for your website. It happens mostly because of how the hreflang meta tags were implemented, generally as a result of a plugin.

 

Most plugins seem to have this issue. If a page or blog post isn’t translated, the plugin doesn’t really know what to add to the hreflang link attribute, so it just adds the homepage or a “/”, which can be interpreted as a relative URL for the homepage.

 

Polylang, the translation plugin of my choice when it comes to multi language websites on WordPress doesn’t seem to have this issue. You can set it up to not display the wrong hreflang attribute when the page is untranslated. You should also remove any internal links that change language from the menu if there aren’t any translated versions available.

 

Conflicts, Bad Implementation and Confusion Regarding the rel=”canonical” Tag

 

People still don’t understand what the canonical tag does. They have a vague idea about it, but many times use it the wrong way. In a nutshell, here’s what the canonical tag actually does:

 

The rel=”canonical” tag tells search engines what page to display in their results pages.

 

To better understand the tag, think of it like this: if you have 10 web pages about the same subject, they will start competing with each other in the search engines. This confuses Google, so you can use the canonical tag to help it figure things out, and point to the exact page you prefer being shown in the search engines.

 

The canonical tag should always be a self-referencing one, meaning page A should point to itself, except when you want something else displayed instead of page A in the SERPs. Having a self-referencing canonical tag will help you remove any risk of duplicate content issues generated by dynamic parameters, such as ?replytocom or eCommerce filters.

 

The canonical tag works! I’ll share a story. Some time ago, I published a post on my personal blog, which was syndicated by another publisher. I didn’t manage to get it indexed on my blog and, because the other publisher was more popular, Google indexed their article first. So, in a couple of weeks, they happily ranked in the top 5 with my article. I contacted them, politely asked for them to add the canonical tag and in about a week, Google picked it up and started displaying my page instead.

 

Don’t try to trick Google into displaying just a landing page or some strange page that doesn’t actually serve the user’s intent. It won’t work and you can risk getting penalized.

 

Getting back to multilanguage websites, the canonical tag should be self-referencing the page it’s on, unless otherwise specifically desired. A common error is this:

 

The wrong way to do it is: www.yourwebsite.fr/defile-mode/ with a canonical URL to its English counterpart, www.yourwebsite.com/fashion-show/.

 

If you combine this with an HREFlang attribute, then you’re basically fooling Google, telling it to go from EN to FR and then back from FR to EN again.

 

A good implementation would be: www.yourwebsite.fr/defile-mode/ with a canonical tag pointing to either www.yourwebsite.fr/defile-mode/ (itself) or, if desired, www.yourwebsite.fr/some-other-french-page/.

 

Never use the rel=”alternate” hreflang to solve duplicate content issues, as this is not its purpose. It will only tell Google to show that version of the page for a different location and language in a browser.

 

Geographical and IP Based Redirect Issues

 

I was discussing this recently with someone at a meeting. One of his clients insisted that the English homepage on his site was displaying in French by default, instead of English. The reason? His browser was in French, so the main English page was automatically redirected by a WordPress plugin.

 

Now Matt Cutts said in his cloaking video that geo-redirect isn’t something to worry about. He also says that users coming from France or a French speaking location will be happy to get their content displayed directly in French.

 

 

However, keep in mind that although you can send users from France to the French version, you can’t guarantee that everyone in France uses a French IP or has their browser in French. 

Many people use their browsers in English, for example. This means that they will constantly be redirected, no matter what they do. Also, with VPNs becoming more and more popular, IP isn’t a fool-proof metric either. (if you’re interested in finding more info on what a VPN is, here you can find more about it. )

 

Setting geo-redirection on its own doesn’t help you rank better in other languages. In my opinion, the best way to direct the user to the right version is using the HREFlang attribute to properly display the desired page in their search engine. Of course, if they use a different IP with VPNs, the search engine will still display the wrong version, thinking the user lives somewhere else, but any user using a VPN should be aware of that.

 

geo redirect instead of hreflang

English and French

 

If someone accesses your business website directly, they will either access it through the right country URL or through the homepage. If you have a clearly visible language selector in place, I consider any user to be smart enough these days to get to the right version.

 

In case your website already provides automatic redirection and you choose to keep it, make sure you set the x-default hreflang attribute as well. This will tell Google where the language selection page is and it will display that whenever it is unsure of the user’s true location or preferred language.

Make sure that the language selection flags are clearly visible, on desktop and on mobile.

 

Using Robots.Txt or No-Index Tag on Translated Pages

 

Another common issue when translating pages is forgetting to remove the no-index tag, or leaving it there on purpose. I can understand forgetting it, as you do not want Google to index your alternate language version of the website before it’s finished.

 

But if you leave it on purpose, it doesn’t really make any sense. I’ve read some rumors about people being afraid of duplicate content penalties. Although there is no such a thing as a duplicate content penalty, I understand the issue.

 

You might be thinking “How could someone think French and English versions are duplicates?” At first, I thought so myself, but then I realized it must be about the same language displayed in different locations. For example, en-us and en-gb.

 

Although you could simply use the language selector to display the same version in both regions, it can be useful to have separate versions.

 

This way, you can have different sliders, products or offers in different regions. For example, if you sell T-shirts with messages, some texts might fit the US and only some might fit the UK.

 

If you do have multiple English versions, using the no-index tag is a bad idea if you have all the HREFlang annotations set up properly. If you reference a version with HREFlang and then use no-index on it, you’re basically telling Google “Hey check this out over here!” and then “Ha ha, just joking, nothing to crawl here, go away!”

 

Don’t joke with Google!

 

Language Selector Flag URLs

 

One common mistake that happens is having a static implementation of the language switcher button or flag. Users expect to see what they searched for. If you’re in a subpage of the website, changing the language shouldn’t take the user to the homepage of that language. At least, not all the time. It should take them, preferably, to that specific page, in the desired language.

 

The problem here is that it’s not always that easy to do. You can have, for example, 10 pages in English, but only 8 of them translated into French. What do you do with the other two? Well, you have three options:

 

  • Option one is to send the user to the most relevant French page that you have regarding that subject 
  • Option two is to send them to the homepage
  • Option three is to specify to the user that there is no translated version for that specific page

 

Option three is the worst from my point of view, because users will most likely leave the website on the spot when they receive the message. It basically says “This website doesn’t have what you’re looking for.”

 

The homepage option isn’t such a big deal on a small website, where you only have About Us, Services and Contact Us. People will figure out easily on what page they were previously. It can still affect the user experience a little, but things will be fine.

 

However, if you have a blog or a huge website with thousands of pages and articles, the users will have a very hard time finding what article they were on if you send them to the homepage.

 

A good example of a language selection implementation that always sends you to the homepage can be viewed on clinlife.com.

 

static homepage selector in language selection menu

 

English URLs for Other Languages

 

Since we’ve just spotted the untranslated URL structure in the example above, let’s talk about this. Why not translate all your foreign URLs? I mean, we all know that using some keywords in the URL can help you rank better. Obviously, ‘studies’ will be less helpful in Brazil than ‘estudos’. We know the content isn’t dynamic on a static URL structure because the URL parent changes (/brpt#/ to /caen#/).

If you’re going to translate your website, make sure you translate your URLs as well.

This is often overlooked in eCommerce website builders and even search engine optimization tools and there are many examples that can be given. Here’s one from an eCommerce website:

 

no url translation hreflang

No URL translation on anthropologie.com

 

And here’s another online store making the same mistake:

 

no url translation for website

No URL translation on thenorthface.com

 

Think of Other Search Engines, Too

 

Google here and Google there, but the truth is that in other countries, Google isn’t the most popular search engine! Russia, for example, uses Yandex, and China has Baidu. Different countries also use different search engines in different proportions.

 

search engine use around the globe

Source: www.martinkovac.com

 

Google is censored in some countries, so think twice before spending time translating the content for those regions. Also, consider that other search engines don’t have the exact same algorithms as Google does. It’s good to know, for example, that Yandex doesn’t use links in its algorithm.

 

Focusing All the Links Only on the Main Version

 

This is one of the things that always keeps international competitors far behind the local ones. Google really appreciates local/regional links, so if you have a Spanish website translation, you’re better off having links from .es top-level domains than from .com top-level domains.

 

Local competitors know this and, even more, it’s far easier for them to acquire .es links than it is for an international competitor. They don’t only have to rely on link building, because they can network and attend meetings, meet new people and promote their websites in other ways.

 

It’s also very common for an international website not to focus on its translated versions. But since you’ve spent so much time translating it, shouldn’t you also focus on promoting it?

 

If we take one of our previous examples and feed it to the Site Explorer, we can see the discrepancy:

 

Regional backlinks help you rank in that region

Screenshot from the cognitiveSEO’s Site Explorer, showing the discrepancy between links

 

What’s even worse about that 0.7% is that all those .fr links are pointing to the English language version of the website:

 

regional backlinks points to wrong version

Regional backlinks points to wrong version

 

Call it local SEO if you want, but focus on building some regional backlinks and make sure you build them to the right version.

 

Display & Content Multi Language Issues That Affect UX 

 

Although user experience issues can still be attributed to lack of knowledge, content issues probably have more to do with unawareness. Anyway, here’s what you should be keeping an eye on:

 

Using Automatic Website Translation Software

 

Let’s start with something really common… We all know that Google Translate doesn’t always get it right. It actually… gets it wrong lots of the time (for now).

 

Most people use this technique to get English content into other languages (usually with Google Translate or an Android / iOS app), because search engines aren’t too good at detecting automatically translated content. Sure, Google might not be as good at understanding languages other than English, but users still are. And since UX is such an important metric, it’s a waste of money and time trying to do this at scale.

 

bad translation for websites

 

Human translations are definitely better, as long as the translator actually knows both languages well (and is preferably a native speaker of one of them). Manually translated content is more expensive, but serving users a bad translation will hurt your brand and probably your chance of ever winning that target market in the future. If you want to build something solid, hire a professional human translator.

 

Not Doing Keyword Research

 

Don’t just translate the keywords and expect to get results from it. A professional translator can’t do everything. It’s a good start, but you might also need to contact someone that knows both the native language and search engine optimization to be able to properly identify and add the right phrases in your content.

 

It would also be pointless to compare raw search volumes across languages; English is far more widely used than Italian, for example, so the English numbers will always be higher. The point is that people in different countries search for the same product using different phrases. Do the research!

 

Not Having Any Cultural Awareness

 

If you really want to have an impact, you have to study the culture a little bit. A translator might help, but you might need more than that. You’ll need a local, someone who has actually lived there and can provide some insights. Of course, this takes things to a higher level, but it’s worth doing if you have the resources.

 

A good, easy example to start with is date formats. Some countries use dd/mm/yy, while others use mm/dd/yy. Another good example would be showing an article about pork in a Muslim region. Not a very bright idea. Not only will it be completely irrelevant, but it may also offend a lot of people.

 

Not Fully Translating Captchas

 

This is something common. Many people these days use the Google Recaptcha, but very few actually translate it. The result is something like this:

 

no recaptcha translation

All the content in French, but the captcha is in English

 

Now for you this isn’t a problem, since you’re reading this article. But for someone else who doesn’t speak English, it could be. If they don’t know what to do, they won’t be able to contact you.

 

The webmasters did try to address this by displaying the following message: Cochez la case “I’m not a robot” et suivez les instructions. Ce service nous protège des spammeurs. (Check the “I’m not a robot” box and follow the instructions. This service protects us from spammers.)

 

Problem solved, right? Not quite! Does this look familiar?

 

no recaptcha translation problem translating websites

Second occasional verification. This can also be different each time.

 

Yeah… this can be a little frustrating.

 

no translation makes users unhappy

No captcha translation makes users unhappy

 

But the fix is actually very easy. reCAPTCHA works by loading a JavaScript file, and that JS file can be translated with a URL parameter. If the plugin you’re using doesn’t allow this, you can search the code for the following script:

 

https://www.google.com/recaptcha/api.js

 

Then, just add ?hl=xx after the URL, where xx is the language code, the same as with the hreflang annotations (fr, es, en). To translate it into French, for example, it should look like this:

 

https://www.google.com/recaptcha/api.js?hl=fr
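In practice, assuming you load reCAPTCHA with the standard script tag, the French version of the snippet would look something like this (async and defer are just the usual way of loading the script, not part of the translation fix):

<script src="https://www.google.com/recaptcha/api.js?hl=fr" async defer></script>

If your site already knows the visitor’s language, you can output that language code into the hl parameter dynamically instead of hardcoding it.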

 

Trying to rank an English page EVERYWHERE using HREFlang

 

I’m not talking about people who target English pages to English speakers in Spain, but about people who try to target English pages everywhere, regardless of language or location. This can be done either intentionally or by mistake.

 

Let’s start with the mistake. Say I know for a fact that people in Spain search for my product in English. I want to target that market so I add HREFlang like this:

 

<link rel="alternate" hreflang="es" href="https://yourwebsite.com" />

 

As mentioned at the beginning of the article, this is wrong! Why? Because "es" is a language code, so I’m now targeting Spanish-speaking users everywhere and showing them an English page.

 

The correct way to do it would be to use both the language and the region code. For example, if you want to target English-speaking residents of Spain, you would use en-ES:

 

<link rel="alternate" hreflang="en-ES" href="https://yourwebsite.com" />

 

But now, obviously, someone might want to abuse this… so a shady thing to do would be to separately target English speakers in every single region, like this:

 

<link rel="alternate" hreflang="en-ES" href="https://yourwebsite.com" />
<link rel="alternate" hreflang="en-DE" href="https://yourwebsite.com" />
<link rel="alternate" hreflang="en-BE" href="https://yourwebsite.com" />
<link rel="alternate" hreflang="en-IT" href="https://yourwebsite.com" />
etc.

 

Notice that I’ve used the same URL each time in the example above. If I had a different version for each region (which isn’t a complete duplicate) then it would make sense:

 

<link rel="alternate" hreflang="en-ES" href="https://yourwebsite.com/spain" />
<link rel="alternate" hreflang="en-DE" href="https://yourwebsite.com/germany" />
<link rel="alternate" hreflang="en-BE" href="https://yourwebsite.com/belgium" />
<link rel="alternate" hreflang="en-IT" href="https://yourwebsite.com/italy" />
etc.

 

This is only acceptable if there really are different offers for different regions.

 

If all the versions are identical, it’s basically a waste of time and disk space. It might be alright to target a couple of markets or more, but not all of them. If I want to target all English speakers from all regions, I can simply specify the language and leave the rest to Google:

 

<link rel="alternate" hreflang="en" href="https://yourwebsite.com" /> (This targets all English speaking users, regardless of their region or location)

 

People will always try to find a way to spam. They’ll change only the titles, for example, leaving the rest of the content in English, and use hreflang to target all regions. I’ve also seen multilanguage sites trying to target all languages, without regions, with the same page, like this:

 

<link rel="alternate" hreflang="es" href="https://yourwebsite.com" />
<link rel="alternate" hreflang="it" href="https://yourwebsite.com" />
<link rel="alternate" hreflang="de" href="https://yourwebsite.com" />
<link rel="alternate" hreflang="fr" href="https://yourwebsite.com" />
etc.

 

Unfortunately, I was unable to find a specific example, but I’m sure there are some out there.

 

But Google isn’t stupid. Don’t try to rank a single page everywhere using the HREFlang attribute. Not only will this not work, but it would be against Google’s guidelines and might actually hurt your rankings.

 

The HREFlang attribute should only be used if you truly have something unique/specific to display to that audience, in that language and in that region.
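To put it all together, here is a rough sketch of what a sane hreflang set could look like for a page that genuinely exists in three versions (the URLs are hypothetical, and x-default simply tells Google which version to show users who don’t match any of the listed languages):

<link rel="alternate" hreflang="en" href="https://yourwebsite.com/en/page/" />
<link rel="alternate" hreflang="es" href="https://yourwebsite.com/es/pagina/" />
<link rel="alternate" hreflang="fr" href="https://yourwebsite.com/fr/page/" />
<link rel="alternate" hreflang="x-default" href="https://yourwebsite.com/en/page/" />

Every version listed should carry the exact same set of tags, so that the annotations are reciprocal.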

 

Fonts and Diacritics

 

Fonts can always be a problem when translating a website. You have to make sure that your current font supports all the special characters of the language you’re translating the site into. Otherwise, the missing characters will be rendered in a default fallback font, which can mess up the web design and usually looks horrible! Something like this:

 

font issues when translating to other languages

“When the utilized font doesn’t contain a specific character, the software will use another font for it.”

 

A good thing to check is what happens on your mobile device. Sometimes the characters display properly on desktop but fail to render correctly on mobile. Also, your own computer might display the font properly simply because it has the font installed, while other computers won’t. Testing on a device that doesn’t have the font installed is a good idea.

 

Usually it’s a font implementation issue, so make sure you check with your web designer before deciding to replace the font completely.
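If you’re not sure whether your font covers all the characters you need, a simple safety net is to declare fallback fonts with wide character coverage in your CSS. This is just a sketch, and the font names are placeholders for whatever you actually use:

<style>
  body {
    /* If the brand font misses a diacritic, the browser falls back to Noto Sans, then Arial, then any sans-serif */
    font-family: "BrandFont", "Noto Sans", Arial, sans-serif;
  }
</style>

It won’t look pixel-perfect, but a consistent, deliberate fallback beats random system fonts filling in the gaps.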

 

Neglecting Social Media

 

Last but not least, don’t forget or ignore social media. If you went through all the effort of translating your soon-to-be multilanguage website, you might as well put some effort into promoting it. If you’ve already registered different social media accounts for other countries, put them to good use by posting relevant content there as well.

 

Keep in mind that different social media platforms are popular in different countries. For example, don’t spend time trying to promote your website on Twitter in Eastern European countries (I can tell you for a fact that people don’t really use the platform there). On the other hand, in some countries Facebook is blocked entirely (China).

 

Things also vary depending on the niche you’re in. Tech images and news don’t work well on Pinterest, but cooking recipes and healthy lifestyle/motivational messages do. Thing is, your target audience might be in different places.

 

Having an active social media account is a sign of authority. It means the brand is real and, most importantly, alive. It will help you gain the traction you need in order to rank well in Google with the translated version.

 

Conclusion

 

A multilingual website with properly implemented international SEO is definitely not easy to set up but, hopefully, this article helped you understand how to avoid the most common hreflang mistakes if you’re planning to translate your website. It’s an evergreen piece of content, just as relevant in 2019 or 2020 and most likely in 2021, as some rules are here to stay.

 

Make sure you don’t set up the hreflang tags wrong, or you will create more issues than if you hadn’t added them at all. If you’re on a custom platform or using a custom website builder and want to make sure your implementation is correct, you can try Aleyda Solis’ tool. Use it to generate the correct hreflang tags and then add them, or compare them to your current ones. Remember, they need to be in your <head> section.

 

Keep in mind that your business website is always better off if it’s manually translated by a professional. When the user changes the language from the language switcher, send them to the right page or make it clear that there is no translation available. Don’t trust translation plugins out of the box, and make sure you check how they implement everything.

 

Also, since you’re here, make sure you check out our article about using subdomains vs. using subfolders when building multiple website sections. Both approaches can come in handy but, long story short, you’re better off with domain.com/en than with en.domain.com.

 

Thanks so much for reading this till the end! If you have any comments, ideas or opinions, feel free to share them with us in the comments section.

 

 

Article reviewed by Catalin Dracsineanu

The post Vital Hreflang & Multi Language Website Mistakes That Most Webmasters Make appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

]]>
https://cognitiveseo.com/blog/17150/multi-language-website-mistakes/feed/ 0
How A Company from the Fintech’ Space Grew from 4000 to 420K Visitors in Just 6 Months https://cognitiveseo.com/blog/22747/how-to-increase-website-traffic/ https://cognitiveseo.com/blog/22747/how-to-increase-website-traffic/#comments Thu, 08 Aug 2019 09:18:09 +0000 https://cognitiveseo.com/blog/?p=22747 This is a TRUE SUCCESS story from Jibran Qazi – Founder of Hunter Canada, a cognitiveSEO long-time customer and SEO expert.   Jibran Qazi is an SEO consultant, growth hacker, and founder of Hunter Canada, a company that helps tech firms achieve exponential traffic growth. Enjoy his story, written and documented by himself, and see how […]

The post How A Company from the Fintech’ Space Grew from 4000 to 420K Visitors in Just 6 Months appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

]]>
This is a TRUE SUCCESS story from Jibran Qazi – Founder of Hunter Canada, a cognitiveSEO long-time customer and SEO expert.


 

Jibran Qazi is an SEO consultant, growth hacker, and founder of Hunter Canada, a company that helps tech firms achieve exponential traffic growth. Enjoy his story, written and documented by himself, and see how he managed to grow an app in the Fintech space, increasing its number of monthly visitors from 4000 to 420K+ in just 6 months.  All organic traffic from Google, no PPC, no paid ads.

 

Increased_Website_Traffic_to_420k+_Visitors_in_6_Months (2)

 

  1. Success Story of 420K+ visitors increase 
    1. Social Proof
    2. The Numbers
    3. First 30 Days
    4. First 60 days (2 months)
    5. First 120 Days (4 Months)
    6. First 180 Days (6 months)
  2. How To Get Exponential Results For Your Own Site
  3. The SEO Process
    1. Negative SEO Check on Main (Mysite.com) + Subdomains
    2. Build Citations
    3. Add Your Link to Reliable Directory
    4. Add Text
    5. Moving Forward
    6. Using Phrases As Anchor Text
  4. The Link Building Strategy
    1. Forget About NoFollow Links
  5. Conclusion
 

1. Success Story of 420K+ visitors increase 

 

The length of our contract was 6 months which started on May 29th, 2018 and ended on Nov 29, 2018.

 

Let’s start with some social proof, shall we:

 

1.1 Social Proof

 

Email screenshot

 

If you zoom into the email, you’ll see I got them on the first page for “Invoice” and “Receipt”.  Two massive keywords. Not sure if they are still there but during that time, they sure were.

 

1.2 The Numbers

 

Number of visitors and traffic

 

For obvious reasons, I can’t share the exact numbers but this should give you an idea.

 

Google-analytics-Number-of-visitors

 

Which led to over 350K downloads of their app.

 

App-downloads-Google-Analytics

 

Let’s break down the month by month increase.

 

1.3 First 30 Days

 

For these guys, success meant a 30% increase in traffic within the first 3-6 months. I handed that to them in the first 30 days.

 

In fact, I more than doubled it. Over 60%.

 

First-30-Days-numbers-in-Google-Analytics

 

1.4 First 60 days (2 months)

 

During these months and pretty much every month after I started, the growth was (As expected) simply exponential.

370%+ increase in traffic in just 2 months.

 

First 60 Days growth

 

Over 1200%+ increase in just 3 months.

 

1.5 First 120 Days (4 Months)

 

Over 3000%+ increase in traffic. In fact, this was the first time the traffic went over 140K.

 

First-120-days-growth

 

1.6 First 180 Days (6 months)

 

This is where I took them over 8000% in organic growth and had over 420k people in total. All in 6 months.

 

First-180-Days-growth

 

2. How To Get Exponential Results For Your Own Site

 

To get such results in your own niche, you must rank for some high volume keywords. I got this Fintech company on the first page for the following massive keywords:

  • Invoice;
  • Receipt;

invoice

 

Considering they weren’t even close when I benchmarked everything in the beginning, these are pretty solid results.

 

I literally beat out sites like Wikipedia, Shopify, Office.com and many more established sites with massive advertising budgets.

 

3. The SEO Process

 

Since I can’t share exactly what I did for legal reasons, let me share some “general tips”. I just applied my 3 step formula again:

  1. Fix what’s wrong (Site Architecture).
  2. Optimize Site (On-Page SEO).
  3. Build Links (Starts from day 1).

 

If you’re new to SEO, or not sure what ‘link juice’ is, start here.

 

My SEO Process Timeline

 

The only difference is instead of approaching these steps in a sequence, you should perform all 3 steps at the same time in a holistic way.

 

Now, you can use lots of tools for the first two steps (it depends on your preference, really), but the main one that really helped me get this site (plus my other clients’ sites) to the next level was, hands down, cognitiveSEO.

 

Here’s why:

  • If you want the deepest backlink data, there is just no comparison.
  • For the past 3 years, I’ve been using it for the initial NegSEO cleanup and then ongoing monitoring.
  • When I’m considering getting a backlink from somewhere which looks a little odd, CognitiveSEO allows me to take a deeper look.

 

Guys, the thing is if you want great results, you have to use quality tools. CognitiveSEO is one of them. So generally speaking, here is how you should usually approach a site.

 

3.1 NegSEO Check on Main (Mysite.com) + Subdomains (help.mysite.com)

 

Always do a NegSEO check on your main domain and any subdomains first. Sometimes spammy links from your subdomains can kill your traffic over time. I love how CognitiveSEO allows me to look up subdomains for NegSEO as well. Very important.

 

NegSEO Check on Main (Mysite.com)

 

When disavowing, always make sure you manually check the metrics of the sites you want to disavow. Sometimes you can disavow good ones as well and that could hurt you a lot.

 

There is a world of difference between a spammy site and a low-quality site. You need to disavow spammy sites, not low-quality sites.

 

According to cognitiveSEO, 97% of the sites Google knows about are low-quality sites. Check out Google’s very own backlink profile below:

 

Link Profile Influence cognitiveSEO metric
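For reference, the disavow file you upload to Google’s Disavow Tool is just a plain text list: whole domains you’re sure are spammy go in with the domain: prefix, individual URLs go in as they are, and lines starting with # are comments (the domains below are obviously made up):

# spammy domains found during the NegSEO check
domain:spammy-link-network.xyz
domain:cheap-seo-links.info
# a single bad URL, without disavowing the whole site
http://some-blog.com/spun-article-about-invoices/

Again, only add a site to this file after you’ve manually checked it. Disavowing is easy; recovering links you disavowed by mistake is not.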

 

3.2 Build Citations

 

Citations are links from local business directories. It’s surprising how many businesses (established or not) have paid no attention to these.

 

Quality citations are a great source for backlinks and another great way of diversifying your backlink profile.

 

3.3 Add Your Link to JoeAnt

 

If there is one directory that I swear by to this day, it is JoeAnt. Love how they’ve maintained their quality throughout all these years. So always make sure you are present on JoeAnt.

 

It will give your site two things: a quality backlink, and one more diverse source for your backlinks to come from. In this case, that source is a directory.

 

3.4 Add Text

 

Try your best to turn your main pages (the pages you want to rank) into long copy.

 

These days, you need to have around 3000+ words. You don’t have to do it right away but keep adding text little by little.

 

3.5 Moving Forward

 

Most of these changes that I’ve mentioned here take the first 30-60 days. Usually 60 days as all the changes you are making need to get re-indexed as well. I used a service called One Hour Indexing for this.

 

Oh wait … What about backlinks?

 

Again, nothing special here either. Just your typical, manual, white hat link building (blog comments, forums, directories, etc.).

 

When it comes to SEO, it’s link building that takes most of my time. 2-4 hours a day at the very least. I recommend getting 15-20 links a month and see how your site reacts to it. Then you adjust your numbers accordingly. That’s really it.

 

3.6 Using Phrases As Anchor Text

 

Forget about one or two-word anchor texts. Most of the anchor texts that I use are between 5 to 7 words. Yes you read that right. It looks natural and it works.

 

In most cases, using the keywords isn’t necessary either.

 

4. The Link Building Strategy

 

Now, this is something I haven’t seen anyone discuss so I would like to share this. After finding quality link opportunities, where and how do you point them?

 

Well, first: never link to the homepage, since the homepage usually already has enough links coming in. I’ve read case studies where people only link to the homepage and get great results but, like I said earlier, everyone has their own style and this is what has worked for me.

 

Once I reach the top, I randomize this pattern every 30 days so it all looks natural for Google as well. Think of link building as putting different coats of paint. Just make sure each coat (link building cycle) is different.

 

4.1 Forget About NoFollow Links

 

Forget about whether a link is NoFollow or not. Just get links that fall within your pre-determined metrics range (the one I mentioned above).

 

You need both kinds of links (heck, I say get all kinds of links from all kinds of CMSs) so you get quality links from diverse sources. That’s the key to successful link building.

 

Please stop ignoring NoFollow links. If they are coming from a good, authoritative place, get them. NoFollow links from Forbes or Wikipedia are still world-class links. I never check whether a backlink is NoFollow or not; I haven’t done that in ages.

 

Conclusion

 

At the end of the day, if you do the following, you’ll get great results:

  • Pay attention to where and how the ‘link juice’ of a site flows. Be obsessed with link juice.
  • Make smart changes that work for you and Google. A healthy balance here can always be achieved.
  • Write about things your competitors aren’t.
  • Never stop getting backlinks.
  • Use proper tools to get a deeper look at your backlinks (Current and the ones that you want to acquire).
  • Run your own tests.

 

That’s pretty much it.

 

Disclaimer

This is not a paid post. cognitiveSEO made no agreement with the author.
This is Jibran’s success story, written and documented by himself.

 

Please feel free to share your thoughts on this story with us.

 

About the author

Jibran Qazi

  Jibran Qazi is an SEO consultant, growth hacker, and founder of Hunter Canada, a company that helps tech firms achieve exponential growth. 

The post How A Company from the Fintech’ Space Grew from 4000 to 420K Visitors in Just 6 Months appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

]]>
https://cognitiveseo.com/blog/22747/how-to-increase-website-traffic/feed/ 25
How Duplicate Content Affects SEO & Google Rankings | The Complete Guide https://cognitiveseo.com/blog/22150/duplicate-content-seo/ https://cognitiveseo.com/blog/22150/duplicate-content-seo/#comments Wed, 22 May 2019 05:39:14 +0000 https://cognitiveseo.com/blog/?p=22150 Intentional or unintentional, be it plagiarism or bad technical implementation, duplicate content is an issue that is affecting millions of websites around the web. If you’ve wondered what content duplication is exactly and how it affects SEO and Google rankings, then you’re in the right place.   Whether you think your site is affected by this […]

The post How Duplicate Content Affects SEO & Google Rankings | The Complete Guide appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

]]>
Intentional or unintentional, be it plagiarism or bad technical implementation, duplicate content is an issue that is affecting millions of websites around the web. If you’ve wondered what content duplication is exactly and how it affects SEO and Google rankings, then you’re in the right place.

 

Whether you think your site is affected by this issue or just want to learn about it, in this article you will find everything you need to know about duplicate content. From what it is to how you can fix it in specific cases, here you have it all, so keep reading.

 

Duplicate Content_-_The_Ultimate_Guide

 

  1. What Is Duplicate Content
  2. How Google Handles Duplicate Content
  3. The Myth of the Duplicate Content Penalty
  4. Why Doesn’t Google Like Duplicate & Very Similar Content?
  5. How Much Copy/Paste Is Considered Duplicate Content
  6. The Problems Caused by Duplicate Content
    1. Burning Crawl Budget
    2. Link Signal Dilution
    3. Not SEO Friendly URLs
    4. Bad User Experience
  7. Internal Duplicate Content Issues
    1. HTTP / HTTPS & WWW / non-WWW
    2. Hierarchical Product URLs
    3. URL Variations (Parameters & Session IDs)
    4. Bad Multilingual Implementation
    5. Indexed Landing Pages for Ads
    6. Boilerplate Content
  8. External Duplicate Content (Plagiarism)
    1. Someone steals your content
    2. You steal someone else’s content
    3. Content Curation
    4. Content Syndication
  9. How To Identify Duplicate Content Issues
  10. How to Fix Duplicate Content Issues
    1. Using 301 Redirects
    2. Using Canonical Tags (probably best)
    3. Using Noindex
    4. Using a mix of all
 

1. What Is Duplicate Content

 

Duplicate content is content that has been already written by someone, somewhere else. So, if you take a piece of content off one website with the infamous Copy/Paste and then publish it on your website, then you have duplicate content.

 

Duplicate content has many sides and can be caused by many things, from technical difficulties or unintentional mistakes to deliberate action. Before we get into more technical aspects, we must first understand what content duplication actually is. 

On the web, duplicate content is when the same (or very similar) content is found on two different URLs.

 

Another key thing to remember here is that this only applies to content that is already indexed by Google. If Google doesn’t have the original version of the copied content in its index, then it can’t really treat the copy as duplicate content, even though it is!

 

Around 5 years ago, I was actually contemplating scanning old news magazine pages and, using software, turning the images into text and then use it for PBNs or whatever worked at that time. While that might be illegal from a copyright point of view, it should pass Google’s duplication filters even today.

 

I would actually recommend that publications moving from print to digital repurpose the old content from their magazines on their websites.

 

We all know Google likes to see quality content on your site, and not thin content. If you have it but it’s not on Google yet, it still is new and original, so why not take this opportunity? Sure, some news might be irrelevant today, but I’m sure magazines also featured evergreen content such as “How to lose weight fast”.

 

An article could even be modified into something like How people used to do fitness in the 80s’. You can keep the content identical this way (although a small original introduction might be required).

 

However, things are a little bit more complex than that. There’s a big discussion on what exactly makes for duplicate content in Google’s eyes. Is a quote duplicate content?

 

Will my site be affected if I publish someone else’s content but cite the source?

 

Duplicate Content SEO Google

 

Also, there isn’t one single solution for fixing duplicate content issues. Why? Because there are very many scenarios. There are multiple solutions and one of them might be better than the other. There are many things to be discussed and, hopefully, by the end of this article you’ll have all your questions answered.

 

However, we must first get some other things clear to better understand the nature of duplicate content. Then we will analyze different scenarios and give solutions for each and every one of them. 

 

2. How Google Handles Duplicate Content

 

There’s a lot of content out there in the world. Compared to that, Google knows only about a small part of it. To be able to truly say if the content on your site has been copied, Google would have to know every piece of paper that has ever been written, which is impossible.

 

When you publish something on your website, it takes a while for Google to crawl and index it. If your site is popular and you publish content often, Google will crawl it more often. This means it can index the content sooner.

 

If you publish rarely, Google will probably not crawl your site so often and it might not index the content very quickly. Once a piece of content is indexed, Google can then relate other content to it to see if it’s duplicate or not.

 

The date of the index is a good reference source for which content was the original version.

 

So what happens when Google identifies a piece of content as duplicate? Well, it has 2 choices:

 

  • Display it: Yes, Google might choose to display duplicate content in its search results if it finds it to be actually relevant to a user. A good example might be news publications making the same statements over and over again when something happens.
  • Don’t display it: Google throws your content into something often called Google Omitted Results. If you SPAM the web all the time, it might even consider not indexing your site anymore. 

 

what are google omitted results

 

3. The Myth of the Duplicate Content Penalty

 

Will you be penalized for duplicate content? No.

Is duplicate content hurting your site? Well, that’s another story.

 

Because Google doesn’t like duplicate content very much, people have assumed that it’s a bad practice which gets punished by Google. With a Penalty!

 

Despite popular belief, and although duplicate content does cause issues, there’s no such thing as a duplicate content penalty!

 

At least not in the same way that we have other penalties, be they manual or algorithmic. Or, at least, that’s what Gary Illyes said in a tweet.

 

 

This comes in contradiction with Google’s official page on duplicate content on the webmaster guidelines which states that:

 

“In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we’ll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.” – Google

 

So while there’s no duplicate content penalty, if you ‘try to manipulate search results’ you might end up losing rankings or even getting deindexed. Here’s Google at its best again, contradicting itself, at least a little bit.

 

However, I tend to take Gary at his word. Duplicate content isn’t something that you should avoid just because Google might hit you in the head. Also, Google won’t hit you just because you have duplicate content.

 

duplicate content penalty seo

 

It’s a different story with those who use content scrapers, deliberately steal content and try to get it ranked or use mass content syndication only for links. It’s not only about content duplication but actually about stealing content and filling the internet up with garbage.

 

The fact that there’s just so much ‘innocent’ duplicate content out there makes it even harder for Google to detect the evil-doers with a 100% success rate.

 

But even though Google won’t penalize you, it doesn’t mean that duplicate content can’t affect your website in a negative way.

 

Talking about duplicate content penalties, here’s what is written in the Google Search Quality Evaluator Guidelines from March 2017:

The Lowest rating is appropriate if all or almost all of the MC (main content) on the page is copied with little or no time, effort, expertise, manual curation, or added value for users. Such pages should be rated Lowest, even if the page assigns credit for the content to another source.

Also, you can check out the video below, in which Andrey Lipattsev, senior Google search quality strategist, reiterated that a content duplication penalty doesn’t exist, and also said that:

 

  • Google rewards unique content and correlates it with added value;
  • The duplicate content is filtered;
  • Google wants to find new content, and duplicates slow the search engine down;
  • If you want Google to quickly discover your new content, you should send XML sitemaps;
  • What the search engine wants us to do is to concentrate signals in canonical documents, and optimize those canonical pages so they are better for users;
  • It is not duplicate content that is hurting your ranking, but the lack of unique content.

 

 

Here’s even more about the duplicate content penalty.

 

4. Why Doesn’t Google Like Duplicate & Very Similar Content?

 

Well, the answer to that is very simple:

 

When you search something on Google, would you like to see the exact same thing 10 times? Of course not! You want different products, so that you may choose. You want different opinions, so that you can form your own.

 

Google wants to avoid SPAM and useless overload of its index and servers. It wants to serve its users the best content.

 

As a general rule of thumb, Google tries to display only 1 version of the same content.

 

However, sometimes, Google fails to do this and multiple or very similar versions of the same pages, many times even from the exact same website get shown.

 

For example, in Romania, the biggest eCommerce website, eMAG, generates pages dynamically from nearly all the searches that happen on their site. In the following image, you can see 3 top listings for keyword, keyword plural and keyword + preposition. All of these were searched internally on eMAG’s website so they automatically generated these pages and sent them to the index.

 

Google Duplicate Content Fail eCommerce

Romanian site featuring 3 duplicate pages in Google search for “damasc bed linen”

 

You can see the titles & descriptions are very similar and the content on those pages is identical.

 

Now this is a very smart move from this eCommerce site. Normally, Google shouldn’t allow this to happen. Multiple complaints are emerging in the Romanian eComm community regarding this issue but it seems to keep going (some requests even reached John Mueller).

 

Although I highly doubt it, it is possible that those results are actually the most relevant. But this doesn’t happen for every keyword out there. Some keyword searches are unique and, most of the time, Google only displays one page from eMAG’s website on the first page. 

 

In my opinion, although this site could canonicalize these versions to a single page, it’s not their fault that they get 3 top listings. It’s Google’s job to rank the pages, not theirs.

 

This is a classic example of duplicate content issue. From a user’s perspective, this might not be a very good thing. Maybe the user wants to see other websites. Maybe they’ve had a bad experience with the site in the past. Maybe they just want to see if it’s cheaper somewhere else.

 

Google is still trying to figure out ways to detect when this is an issue and when not. It’s not quite there, but it’s getting better and better.

 

I’ve encountered some queries where the first 3 pages were all occupied by eMAG results. I can tell you, it’s a scary sight! I highly doubt that they were the only ones selling that type of product, because eMAG is actually a retailer. They sell other people’s products and most of them have their own websites.

 

5. How Much Copy/Paste Is Considered Duplicate Content (What About Quotes?)

 

According to Matt Cutts, about 25-30% of the entire internet is made up of duplicate content. That figure might have changed in recent years, since the video is pretty old. Considering the expansion of the internet and the growing number of new websites (especially eCommerce ones, where content duplication is thriving), it has likely increased.

 

 

So what we get from the video above is that not all duplicate content is bad. Sometimes people quote other people for a reason. They bring quality to their content by doing that and it isn’t something bad.

 

In essence, think about it like this:

 

Duplicate content is when content is identical or very similar to the original source.

 

Now, of course, “very similar” is open to interpretation. But that’s not the point. If you’re thinking in terms of exact percentages, then you’re obviously up to something bad. If you’ve contemplated deliberately copying or stealing content to claim it as your own, then it’s duplicate content. You can also get into legal copyright issues.

 

A popular type of duplicate content that is mostly harmless is eCommerce product descriptions.

 

Either because they’re lazy or because they have so many products to list, eCommerce site owners and editors simply copy-paste product descriptions. This creates a ton of duplicate content, but users might still want to see it on the web because of differences in price or service quality.

 

What ultimately sells a product though is its copy. So don’t just list a bunch of technical specifications. Write a story that sells.

 

Many eCommerce website owners are complaining that other websites are stealing their description content. As long as they don’t outrank you, I’d see it as a good thing. If they outrank you, simply sue them due to copyright. However, make sure that you have an actual basis on which you can sue them. Some technical product specifications aren’t enough.

 

Another one is boilerplate content. Boilerplate content is content that repeats itself over and over again on multiple pages, such as the header, navigation, footer and sidebar content.

 

As long as you’re not trying to steal someone else’s content without their permission and claim it as your own, you’re mostly fine with using quotes or rewriting some phrases. However, if your page has 70-80% similarity and you only replace some verbs and subjects with synonyms… that’s not actually quoting.

 

Did You Know

Google Search Console no longer allows you to see your duplicate content issues. Some time ago, this was possible, but Google ‘let go’ of this old feature.

So how can you know if you have duplicate content issues?

You can use the cognitiveSEO Site Audit Tool for that. The tool has a special section for that, where it automatically identifies any duplicate content issues. Therefore, you can quickly take a look at your duplicate pages, duplicate titles, descriptions, etc.

More than that, the tool has a section that identifies near duplicate pages and tells you the level of similarity between them.

duplicate content screenshot

 

6. The Problems Caused by Duplicate Content

 

As Gary Illyes pointed out above, the actual issues caused by duplicate content are that it burns up crawl budget (which especially happens to big sites) and that it dilutes link equity, because people will be linking to different pages which hold the same content.

 

6.1 Burning Crawl Budget

 

Google has to spend a lot of resources to crawl your website. This includes servers, personnel, internet and electricity bills and many other costs. Although Google’s resources seem unlimited (and probably are), the crawler does stop at some point if a website is very, very big.

 

If Google crawls your pages and keeps finding the same thing over and over again, it will ‘get bored’ and stop crawling your site.

 

This might leave important pages uncrawled, so new content or changes might be ignored. Make sure all of your most important pages are crawled and indexed by reducing the number of irrelevant pages your site is feeding to Google.

 

Since duplicate content is usually generated by dynamic URLs from search filters, it ends up being duplicated not once, but thousands of times, depending on how many filter combinations there are.

 

One example is the one I gave above with eMAG. In reality, Google filtered a lot more results as doing a search for site:emag.ro + the keyword returns over 40.000 results. Many of those are probably very similar and some might be identical. For example, another variation is keyword + white. However, the landing page doesn’t only list white items, which makes it also irrelevant.

 

6.2 Link signal dilution

 

When you get backlinks, they point to a specific URL. That URL gets stronger and stronger the more links it gets. However…

 

If you have 10 versions of the same page and people can access all of them, different websites might link to different versions of that page.

 

While this is still helpful for your domain overall, it might not be the best solution for your website or for specific, important pages that you want to rank high.

 

We’ll talk soon about this issue, what causes it and how to fix it.

 

6.3 Not SEO Friendly URLs

 

The URL examples I gave above are rather search engine optimization friendly, but some filters might not look so friendly. We all know Google recommends that you keep your URLs user friendly. Weird long URLs are associated with viruses, malware and scams.

 

A while ago, we even did research on thousands of websites, and the conclusion was that the more concise the URL, the greater the chance of ranking higher.

 

Example of not friendly URL: https://domain.com/category/default.html?uid=87YHG9347HG387H4G&action=register

Example of friendly URL: https://domain.com/account/register/

 

Try to keep your URLs short and easy to read, so that they would help and not hurt your sites. For example, people will figure out what those filters mean if you say order=asc&price=500&color=red. But, unless you’re a very big and trustworthy brand, like Google, they won’t be so sure what’s happening if the URL parameter extension is ei=NgfZXLizAuqErwTM6JWIDA (that’s a Google search parameter suffix).

 

6.4 Bad user experience

 

As I said previously, sometimes the duplication of a page can result in a bad user experience, which will harm your website in the long run.

 

If you end up ranking a page to the top of Google when it’s not actually relevant, users will notice that immediately (ex. indexing a search page with color xyz when you have no items with that color). 

 

7. Internal Duplicate Content Issues

 

It’s finally time to list the most popular scenarios of how duplicate content gets created on the web. To check some SEO basics, let’s start with how it happens on websites internally, because it’s by far the most common issue.

 

7.1 HTTP / HTTPS & WWW / non-WWW

 

If you have an SSL certificate on your website, then there are two versions of your website. One with HTTP and one with HTTPS.

 

  • http://domain.com
  • https://domain.com

 

They might look very similar, but in Google’s eyes they’re different. First of all, they’re on different URLs. And since it’s the same content, it results in duplicate content. Second, one is secure and the other one is not. It’s a big difference regarding security.

 

If you’re planning to move your site to a secure URL, make sure to check this HTTP to HTTPS migration guide.

 

There are also two more versions possible:

 

  • domain.com
  • www.domain.com

 

It’s the same thing as above, whether they’re running on HTTP or HTTPS. Two separate URLs containing the same content. You might not see a big difference between those two, but www is actually a subdomain. You’re just so used to seeing them as the same thing because they display the same content and usually redirect to a single preferred version.

 

While Google usually manages to display a single version on its results pages, it doesn’t always pick the right one.

 

I’ve encountered this many times. It’s a basic technical SEO thing that every SEO should check, yet very many make this mistake and forget to set a preferred version. On some keywords, Google displayed the HTTP version, and on others, the HTTPS version of the same page. Not very consistent.

 

So how can you fix this?

 

Solution: To solve this issue, make sure you’re redirecting all the other URL versions to your preferred version. This should be the case not only for the main domain but also for all the other pages. Each page of non-preferred versions should redirect to the proper page’s preferred version:

 

For example, if your preferred version is https://www.domain.com/page1 then…

 

  • http://domain.com/page1/
  • http://www.domain.com/page1/
  • https://domain.com/page1/

…should all 301 redirect to https://www.domain.com/page1/
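On an Apache server, for example, a minimal sketch of these redirects in .htaccess might look like the rules below. This assumes mod_rewrite is enabled and that https://www.domain.com is your preferred version, so adapt it to your own setup or ask your developer; most CMSs and hosting panels can do the same thing with a setting or a plugin.

RewriteEngine On
# Anything still on HTTP goes to the HTTPS + WWW version of the same URL
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.domain.com/$1 [R=301,L]
# Anything on the non-WWW host goes to the WWW version
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.domain.com/$1 [R=301,L]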

 

Just in case you’re wondering, a WWW version will help you in the long term if your site gets really big and you want to serve cookieless images from a subdomain. If you’re just building your website, use WWW. If you’re already on the root domain, leave it like that. The switch isn’t worth the hassle.

 

You can also set the preferred version from the old Google Search Console:

 

Preferred Domain Search Console

 

However, 301 redirects are mandatory. Without them, the link signals will be diluted between the 4 versions. Some people might link to you using one of the other variants. Without 301 redirects, you won’t take full advantage of those links and all your previous link building efforts will vanish.

 

Also, we’re not yet sure of the course the GSC is taking with its new iteration, so it’s unclear if this feature will still be available in the future.

 

7.2 Hierarchical Product URLs

 

One common issue that leads to duplicate content is using hierarchical product URLs. What do I mean by this?

 

Well, let’s say you have an eCommerce store with very many products and categories or a blog with very many posts and categories.

 

On a hierarchical URL structure, the URLs would look like this:

 

  • https://www.domain.com/store/category/subcategory/product
  • https://www.domain.com/blog/category/subcategory/article

 

At a first look, everything seems fine. The issue arises when you have the same product or article in multiple categories.

 

Now, one thing to note first:

 

As long as you are 100% certain that your product/article won’t be in two different categories, you’re safe using hierarchical URLs.

 

For example, if you have a page called services and have multiple unique services with categories and subcategories, there’s no issue in having hierarchical URLs.

 

  • https://www.domain.com/services/digital-marketing/
  • https://www.domain.com/services/digital-marketing/seo
  • https://www.domain.com/services/digital-marketing/ppc
  • https://www.domain.com/services/digital-marketing/email-marketing
  • https://www.domain.com/services/website-creation/
  • https://www.domain.com/services/website-creation/presentation
  • https://www.domain.com/services/website-creation/blog
  • https://www.domain.com/services/website-creation/ecommerce

 

Solution: If you think your articles or products will be in multiple categories, then it’s better to separate post types and taxonomies with their own prefixes:

 

  • https://www.domain.com/store/category/subcategory/
  • https://www.domain.com/store/products/product-name/

 

Category pages can still remain hierarchical as long as a subcategory isn’t found in multiple root categories (one scenario would be /accessories, which can be in multiple categories and subcategories, but it’s only the name that’s the same, while the content is completely different, so it’s not duplicate content).

 

Another solution would be to specify a main category and then use canonical tags or 301 redirects to the main version, but it’s not such an elegant solution and it can still cause link signal dilution.

 

Warning: If you do plan on fixing this issue by changing your URL structure, make sure you set the proper 301 redirects! Each old duplicate version should 301 to the final and unique new one.
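If you do end up keeping the same product under two category paths, the canonical tag is the usual patch. Assuming (hypothetically) that the version under /laptops/gaming/ is the main one, both duplicate URLs would carry the same tag in their <head>:

<!-- on https://www.domain.com/store/laptops/gaming/some-laptop/
     and on https://www.domain.com/store/deals/some-laptop/ -->
<link rel="canonical" href="https://www.domain.com/store/laptops/gaming/some-laptop/" />

Keep in mind that, unlike a 301 redirect, the canonical tag is a hint rather than a hard rule, so a clean URL structure is still the better long-term fix.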

 

7.3 URL Variations (Parameters & Session IDs)

 

One of the most common causes of content duplication is URL variations. Parameters and URL extensions create multiple versions of the same content under different URLs.

 

They are especially popular on eCommerce websites, but can also be found on other types of sites, such as booking websites, rental services and even blog category pages.

 

For example, on an eCommerce store, if you have filters to sort items by ascending or descending price, you can get one of these two URLs:

 

  • domain.com/category/subcategory?order=asc
  • domain.com/category/subcategory?order=desc

 

These pages are called faceted pages. A facet is one side of an object with multiple sides. In the example above, the pages are very similar, but instead of being written A to Z they’re written Z to A. 

 

Some people will link to the first variant, but others might link to the second, depending on which filter they were on last. And let’s not forget about the original version without any filters (domain.com/category/subcategory). On top of that, these are only two filters, but there might be a lot more (reviews, relevancy, popularity, etc.).

 

This results in link signal dilution, making each version a little bit stronger instead of making a single version of that page really strong. Eventually, this will lead to lower rankings overall.

 

Duplicate content facet parameters

Walmart having brand facets on its baby wipes category

 

Sure, you might argue that because of pagination, the pages will actually be completely different. That’s true if you have enough products in a category to fill multiple pages.

 

However, I could also argue that the first page of “?order=desc” is a duplicate of the last page of domain.com/category/subcategory?order=asc and vice versa. One of them is also a duplicate of the main version, unless the main version orders them randomly.

 

I could also argue that Google doesn’t really care about pagination anymore. In fact, it cares so little that it forgot to tell us that it doesn’t care about them.

 

Google still recommends using pagination the same way as you did before (either with parameters or subdirectories).

 

However, you should also make sure now that you properly interlink between these pages and that each page can ‘kind of’ stand on its own. Mihai Aperghis from Vertify asked John Mueller about this and this was his response:

 

https://youtu.be/1xWLUoa_YIk?t=876

 

Just because parameters create duplicate content issues it doesn’t mean you should never index any pages that contain parameters. 

 

Sometimes it’s a good idea to index faceted pages, if users are using those filters as keywords in their search queries.

 

For example, some bad filters which you should not index could be sorting by price as shown above. However, if your users search for “best second hand car under 3000” then filters with price between X and Y might be relevant.

 

Another popular example are color filters. If you don’t have a specific color scheme for a product but the filter exists, you don’t want to index that. However, if filtering by the color black completely changes the content of the page, then it might be a relevant page to index, especially if your users also use queries such as “black winter coats”.

 

The examples above are for some general eCommerce store, but try to adapt them to your particular case. For example, if you have a car rental service, people might not necessarily search for color but they might search for diesel, petrol or electric, so you might want to index those.

 

One thing to mention is that anchor-like extensions & suffixes (#) are not seen as duplicate URLs. Google simply ignores fragments.

 

Ian Lurie from Portent talks about fixing a HUGE duplicate content issue (links with parameters to contact pages on every page of the site) in exactly this way. The solution was to use # instead of ? as an extension to the contact page URL. For some reason, Google completely ignores links with anchors.
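To make that concrete, the difference is simply in how the link is built (the /contact URLs below are just an illustration):

<!-- creates a new crawlable URL, which can pile up as duplicate content -->
<a href="https://domain.com/contact?source=footer">Contact us</a>

<!-- the part after # is a fragment, which Google ignores when indexing -->
<a href="https://domain.com/contact#footer">Contact us</a>

The fragment version can still be read by your own scripts on the client side, but it doesn’t create a separate URL in Google’s eyes.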

 

However, in this article Ian mentions that he hasn’t even tried rel=canonical to fix the issue. While rel=canonical would probably not harm at all, in this case it might have not been helpful due to the scale of the issue.

 

Solution:  The best solution here is to actually avoid creating duplicate content issues in the first place. Don’t add parameters when it’s not necessary and don’t add parameters when the pages don’t create a unique facet, at least to some extent.

 

However, if the deed is done, the fix is to either use rel=canonical and canonicalize all the useless facets to the root of the URL, or to noindex those pages completely. Remember, though, that rel=canonical is only a hint, so Google is the one that decides whether to follow it; from my experience, canonical tags work pretty well. A noindex meta tag is a stronger signal, and robots.txt only controls crawling, not indexing, so pick the mechanism that matches what you’re trying to achieve.

 

Remember to leave the important facets to be indexed (self referencing canonical), especially if they have searches. Make sure to also dynamically generate their titles.

 

The facet should not keep the same title as the main category page. Its title should be dynamically generated depending on the filters. So, if my category is Smartphones, the title is “Best Smartphones You Can Buy in 2019”, and the user filters by color and price, then the title of the facet should be something like “Best Blue Smartphones Under $500 You Can Buy in 2019”.
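As a rough illustration of this approach (the URLs and titles are invented for the example), a useless sorting facet would point back to the main category, while a facet worth indexing keeps a self-referencing canonical and its own dynamically generated title:

<!-- on /smartphones?order=asc : not worth indexing on its own -->
<link rel="canonical" href="https://www.domain.com/smartphones/" />

<!-- on /smartphones?color=blue : a facet people actually search for -->
<title>Best Blue Smartphones You Can Buy in 2019</title>
<link rel="canonical" href="https://www.domain.com/smartphones?color=blue" />

For the useless facets, a noindex robots meta tag is an alternative to the canonical, depending on which signal you’d rather send.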

 

7.4 Bad Multilingual Implementation

 

Another issue that can result in content duplication is a bad hreflang implementation.

 

Most multilingual websites have a bad hreflang implementation. That’s because most plugins out there implement the hreflang wrong.

 

Even I use some because I couldn’t find an alternative. I’ll present the issue:

 

When you have 2 languages and a page is translated into both, everything is fine. Each page has 2 hreflang tags pointing correctly to the other version. However, when a page is untranslated, the hreflang for the other language points to the root of that language version, when in fact that tag shouldn’t exist at all. This basically tells Google that the French version of domain.com/en/untranslated-page/ is domain.com/fr/, which isn’t true.

 

Polylang has this issue. I know WPML also had it, but I’m not sure if they’ve addressed it yet.

 

However, it’s not the hreflang tag itself that causes duplicate content issues, but the links to these pages from the language selector.

 

The hreflang issue only confuses search engines into which page to display where. It doesn’t cause duplicate content issues. But while some plugins are smarter, others also create the pages and links to those other versions in the menu of the website. Now this is duplicate content.

 

When the pages aren’t translated, qTranslate (which has since been abandoned) creates links to the other variants but lists them as empty or with a warning message saying something like “This language is not available for this page”. This creates a flood of empty pages with similar URLs and titles (taken from the original language) that burn crawl budget and confuse search engines even more.

 

Now you might think a fix is easy, but merging from one plugin to another isn’t always the easiest thing to do. It takes a lot of time and effort to get it right.

 

Solution: The simple solution is to not create any links to untranslated variants. If you have 5 languages, a page which is translated to all 5 languages should include the other 4 links in the menu (under the flag drop down let’s say) and also have the appropriate hreflang tags implemented correctly.

 

However, if you have 5 languages but a particular page is only translated in 2 languages, the flags dropdown should only contain 1 link, to the other page (maybe 2 links to both pages, a self-referencing link isn’t really an issue). Also, only 2 hreflang tags should be present instead of all 5.
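So, for a hypothetical page that exists only in English and French out of your five languages, the complete hreflang set on both versions should be just these two lines, and nothing more:

<link rel="alternate" hreflang="en" href="https://domain.com/en/some-page/" />
<link rel="alternate" hreflang="fr" href="https://domain.com/fr/une-page/" />

No tags for the missing languages, and definitely no tag pointing to /de/ or /es/ just because a plugin insists on printing one.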

 

If you want to know more about SEO & multilingual websites you should read this post on common hreflang mistakes.

 

7.5 Indexed Landing Pages for Ads

 

While it’s good to have landing pages that are focused on conversions everywhere, sometimes they’re not the best for SEO. So it’s a very good idea to create customized pages only for ads.

 

The thing here is that many times, they are very similar and offer the same content. Why similar and not identical? Well, there can be many reasons. Maybe you have different goals for those pages, or different rules from the advertising platform.

 

For example, you might only change an image because Adwords rules don’t let you use waist measuring tools when talking about weight loss products. However, when it comes to organic search, that’s not really an issue.

 

Solution: If your landing pages have been created specifically for ads and provide no SEO value, use a noindex meta tag on them. You can also try to canonicalize them to the very similar version that is actually targeted to organic search.
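In practice, that’s a single line in the <head> of each ad-only landing page; the canonical alternative is shown as well, assuming (hypothetically) that /landing-page-ads/ is the ad version and /landing-page/ is the organic one:

<!-- option 1: keep the ad landing page out of the index entirely -->
<meta name="robots" content="noindex">

<!-- option 2: point the ad version to the organic page instead -->
<link rel="canonical" href="https://www.domain.com/landing-page/" />

Either way, your ads keep working; noindex only affects organic search, not paid traffic.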

 

7.6 Boilerplate Content

 

Boilerplate content is the content that is found on multiple or every page of your site. Common examples are Headers, Navigation Menus, Footers and Sidebars. These are vital to a site’s functionality. We’re used to them and without them a site would be much harder to navigate.

 

However, it can sometimes cause duplicate content, for example when there is too little content. If you have only 30 words on 50 different pages, but the header, footer and sidebar have 250 words, then that’s about a 90% similarity. It’s mostly caused by the lack of content rather than the boilerplate.

 

Solution: Don’t bother too much with it. Just try to keep your pages content-rich and unique. If some facet pages generated by your filters list too few products, then the boilerplate will make up most of the content. In that case, you want to use the solution mentioned above in the URL Variations section.

 

It’s also a good idea if you keep it a little bit dynamic. And by dynamic I don’t mean random. It should still be static on each page, just not the same on every page.

 

For example, Kayak uses a great internal linking strategy in its footer. Instead of displaying the same cities over and over again, it only displays the closest or more relevant ones. So while the homepage displays the most important cities in the US, New York only displays surrounding cities. This is very relevant for the user and Google loves that!

 

8. External Duplicate Content (Plagiarism)

 

Duplicate content can also occur cross-domain. Again, Google doesn't want to show its users the same thing 6 times, so it has to pick just one, most of the time the original article.

 

There are different scenarios where cross-domain content duplication occurs. Let’s take a look at each of them and see if we can come up with some solutions!

 

8.1 Someone steals your content

 

Generally, Google tries to reward the original creator of the content. However, sometimes it fails.

 

Contrary to popular belief, Google might not look at the publication date when trying to determine who was first, because that can be easily changed in the HTML. Instead, it looks at when it first indexed it.

 

Google figures out who published the content first by looking at when it indexed the first iteration of that content.

 

People often try to trick Google into thinking they published the content first. This has even happened to us, here at CognitiveSEO. Because we publish rather often, we don’t always bother to tell Google “Hey, look, new content, index it now!”

 

This means that we let the crawler do its job and get our content whenever it thinks it suitable. But this allows others to steal the content and index it quicker than us. Automatic blogs using content scrapers steal the content and then tell Google to index it immediately.

 

Sometimes they take the links as well, and then Google is able to figure out the original source if you do internal linking well. But often they strip all links and sometimes even add links of their own.

 

Because our domain is authoritative in Google’s eyes and most of those blogs have weak domains, Google figures things out pretty quickly.

 

But if you have a rather new domain and some other bigger site steals your content and gets it indexed first, then you can’t do much. This happened to me once with my personal blog in Romania. Some guys thought the piece was so awesome they had to steal it. Problem was they didn’t even link to the original source.

 

Solution: When someone steals your content the best way to protect yourself is to have it indexed first. Get your pages indexed as soon as possible using the Google Search Console.

 

Request indexing GSC

 

This might be tricky if you have a huge website. Another thing you can do is to try and block the scrapers from crawling you from within your server. However, they might be using different IPs each time.
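As a rough sketch, assuming an Apache server (the user agent and IP below are made up for illustration), blocking a known scraper from .htaccess could look like this:

# Block a scraper by its (hypothetical) user agent
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} badscraperbot [NC]
RewriteRule .* - [F,L]

# Or deny one of its IP addresses (Apache 2.4+ syntax)
<RequireAll>
Require all granted
Require not ip 203.0.113.42
</RequireAll>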

 

You can also file a DMCA report to Google and let them know you don't agree with your content being stolen/scraped. The chances that this will help are slim, but you never know.

 

8.2 You steal someone else’s content

 

Well… I can't say much about this. You shouldn't be stealing other people's content! It should be written somewhere in every beginner's guide to SEO that stealing content is not a content marketing strategy.

 

Content Thief

 

In general, having only duplicate content on your website won’t give you great results with search engines. So using content scraping tools and automated blogs isn’t the way to go for SEO.

 

However, it is not unheard of for sites to make a decent living out of scraping content. They usually promote it through ads and social media, but sometimes also get decent search engine traffic, especially when they repost news stories.

 

This can result in legal action, such as lawsuits and fines, which might even get your site shut down. However, if you have permission from the owners to scrape their content, I wouldn't see an issue with that. We all do what we want, in the end.

 

While Google considers that there's no added value for its search engine and tries to reward the original creator whenever possible, you can't say that a news scraping site is never useful. Maybe a natural disaster warning reaches some people through that site and saves some lives. You never know.

 

What if you don’t actually steal their content?

 

What about those cases in which you really find a piece of content interesting and want to republish it so that your audience can also read it? Or let’s say you publish a guest post on Forbes but also want to share it with your audience on your site?

 

Well, in that case there’s not really an issue as long as you get approval from the original source.

 

Solution: It’s not really an issue from an SEO point of view if you publish someone else’s content. However, make sure you have approval for this to not get into any legal issues. They will most probably want a backlink to their original post in 99.9% of the cases.

 

content marketing

 

You can even use a canonical link to the original post. Don't use a 301, as this will send people directly to the source, leaving you with no credit at all.

 

Don’t expect to rank with that content anywhere, especially if your overall domain performance is lower than the original source. That being said, don’t let other big websites repost your content without at least a backlink to your original source in the beginning of the article or, preferably, a full fledged canonical tag in the HTML source.

 

8.3 Content Curation

 

Content curation is the process of gathering information relevant to a particular topic or area of interest. I didn’t write that. Wikipedia did.

 

Curating content rarely leads to duplicate content. The difference between content curation and plagiarism is that in plagiarism, people claim to be the original owner of the content.

 

However, the definition has its issues, as things can be very nuanced. What if I actually come up with an idea, but I don’t know that someone has written about it before? Is that still plagiarism?

 

In fact, what did anyone ever really invent? Can you imagine/invent a new color? Even Leonardo Da Vinci probably drew the inspiration for his helicopter from the maple seed.

 

This post, for example, is 100% curated content. I’ve gathered the information from around the web and centralized it here. Brian Dean calls this The Skyscraper Technique. While people have done this for ages, he gave it a name and now he’s famous for that.

 

Skyscraper Technique Backlinko

Skyscraper Technique by Backlinko

 

However, I didn’t actually steal anything. I’ve gathered the information in my head first and then unloaded it here in this article. I didn’t use copy paste and I didn’t claim to have invented these techniques or methods. I cited people and even linked to them. All I did was put all this information into one place by rewriting it from my own head.

 

Solution: When you write content using the Skyscraper Technique or by curating content or whatever you want to call it, make sure you don’t copy paste.

 

Make sure you look at the topic from a new perspective, from a different angle. Make sure you add a personal touch. Make sure you add value. That’s when it’s going to help you reach the top.

 

8.4 Content Syndication

 

After web scraping, content syndication is the second most common source of duplicate content around the web. The difference between the two is that syndication is a willful action.

 

So the question arises! Will syndicating my content over 10-20 sites affect my SEO?

 

Content syndication SEO

 

In general, there’s no issue in syndicating content, as long as it’s not your main content generation method (which kind of looks like web scraping).

 

When syndicating content, it’s a great idea to get a rel=canonical to the original source, or at least a backlink.

 

Again, Google doesn’t like seeing the same thing over and over again. That’s actually the purpose of the canonical tag, so use it properly!

 

Solution: Make sure content syndication isn't your main thing, as it might be picked up by Google as content scraping. If you want to syndicate your content, do your best to get a canonical tag pointing to the original URL on your site. This greatly reduces the chances of other sites outranking you with your own content.

 

If rel=canonical isn’t an option, then at least get a backlink and try to get it as close to the beginning as possible.

 

9. How To Identify Duplicate Content Issues

 

If you’ve purposely reached this article, then you most probably already have a duplicate content issue. However, what if you don’t know you have one?

 

Some time ago, you could also spot duplication issues in the old version of the Google Search Console. Unfortunately, this section now returns "This report is no longer available here." In their official statement, Google said that they will 'let go' of this old feature.

 

 

Duplicate content report in the old Google Search Console

Image from SerpStat

 

Well, since that’s no longer available, the CognitiveSEO Site Audit Tool is a great way to easily identify duplicate content issues:

 

SEO Duplicate Content Issues

 

You can take a look at the content duplication or the Title, Heading and Meta Description duplication.

 

Duplicate content

 

The tool also has an awesome advanced feature that identifies near duplicate pages and tells you the level of similarity between them!

 

Every page in the tool has hints on how you can fix different issues, so if you’re trying to fix any duplicate content issues, the Site Audit Tool can definitely help you out.

 

10. How to Get Rid Of or Fix Duplicate Content Issues

 

In the examples I gave above for each scenario, you’ve seen that there are different solutions, from not indexing the pages to 301 redirects and canonical URL tags.

 

Let’s take a look at what each solution does so that you may better understand why each one might be better for particular cases. This way, you’ll be able to make the right choice when faced with your unique scenario.

 

First of all, remember that:

 

The best way to fix duplicate content is to avoid having it in the first place.

 

Duplicate content issues can escalate very quickly and can be very difficult to fix. That's why it's always a good idea to talk to an SEO specialist before you even launch a website. Make sure the specialist gets in contact with the development team. Preventing an illness is always better and cheaper than treating it!

 

10.1 Using 301 Redirects

 

The 301 redirect can fix a duplicate content issue. However, this also means that the page will completely vanish and redirect to a new location.

 

If you users don’t need to access that page, the 301 is the best way of dealing with duplicate content. It passes link equity and Google will always respect it. However, it’s not a good use case for facets, for example, because you want your users to be able to access those pages.

 

You can use a plugin to set up a 301 redirect, or redirect from the .htaccess file if you're on an Apache server. There are probably 1,000 other ways of setting up a 301, but you have Google for that.
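Just as a sketch, assuming an Apache server and made-up URLs, a single 301 in .htaccess can be as simple as:

Redirect 301 /old-duplicate-page/ https://www.site.com/main-page/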

 

10.2 Using Canonical Tags (probably best)

 

The canonical tag was actually introduced by Google as a solution to content duplication.

 

Whenever you need users to be able to access those pages (from within the website or anywhere else), but you don't want to confuse search engines about which page to rank since the pages are very similar, you can use the canonical tag to tell search engines which page should be displayed in the search results.
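As a minimal sketch with a hypothetical URL, a filtered or parameterized page would carry this tag in its <head>, pointing to the main version you want ranked:

<link rel="canonical" href="https://www.site.com/category/shoes/" />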

 

10.3 Using Noindex

 

Not indexing some pages can also fix duplicate content issues. However, you have to make sure these pages are actually useless.

 

A big warning sign that you should not remove them from the index is backlinks. If these pages have backlinks, then a 301 or a canonical tag might be the better choice, since those pass link signals.

 

You can either block the pages from being crawled through your robots.txt file (keep in mind that robots.txt prevents crawling; a blocked page can still end up indexed if other pages link to it):

 

User-agent: *
Disallow: /page-you-want-noindex-on/

 

Or you can add a noindex meta tag directly on the pages you don’t want Google to index:

 

<meta name="robots" content="noindex">

 

However, Google may treat some of these signals as hints rather than strict directives, so make sure you test the effectiveness of your changes in your particular case.

 

10.4 Using a mix of all

 

Sometimes, you can’t fix them all. You might want to noindex some facets of your filtered pages, while you might want to canonicalize others. In some instances, you might even want to 301 them since they don’t have value anymore. This really depends on your particular case!

 

An awesome guide on how different duplicate content fixes affect pages and search engines can be found here. This is a screenshot of the table in that article presenting the methods and their effects:

 

Fix duplicate content

Screenshot from practicalecommerce.com

 

Conclusion

 

Duplicate content is an issue affecting millions of websites and approximately 30% of the entire internet! If you’re suffering from duplicate content, hopefully now you know how you can fix it.

 

We tried to create a complete guide where you can find anything you want about the subject “duplicate content and SEO.” This advanced guide to SEO and duplicate content will surely help you whether you’re doing some technical SEO audits or if you’re planning an online marketing campaign.

 

Have you encountered content duplication issues in your SEO journey? Let us know in the comments section below!

The post How Duplicate Content Affects SEO & Google Rankings | The Complete Guide appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

]]>
https://cognitiveseo.com/blog/22150/duplicate-content-seo/feed/ 5
How to Get 100% Year-Over-Year Organic Traffic Growth with Lukasz Zelezny https://cognitiveseo.com/blog/21693/lukasz-zelezny-organic-traffic/ https://cognitiveseo.com/blog/21693/lukasz-zelezny-organic-traffic/#comments Wed, 27 Mar 2019 10:25:27 +0000 https://cognitiveseo.com/blog/?p=21693 Within this cognitiveSEO Talks episode you’ll get the chance to get inspired by Lukasz Zelezny, a prolific keynote speaker, SEO consultant, and author. He started working in the SEO industry around 20 years ago while living in Poland. Every year he is actively participating in 10 to 20 events as a keynote speaker and he […]

The post How to Get 100% Year-Over-Year Organic Traffic Growth with Lukasz Zelezny appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

]]>
Within this cognitiveSEO Talks episode you'll get the chance to get inspired by Lukasz Zelezny, a prolific keynote speaker, SEO consultant, and author. He started working in the SEO industry around 20 years ago while living in Poland. Every year he actively participates in 10 to 20 events as a keynote speaker and he has constantly worked for mid-size and large companies such as HomeAway, Thomson Reuters and The Digital Property Group, to mention just a few. 

 

 

subscribe to cognitiveSEO youtube channel

 

As Lukasz himself mentions, he is a hands-on person, spending lots of his time keeping up to date with the changes in the technology of online marketing. He started his professional career in 2005 and has since been responsible for the organic performance of a number of companies including HomeAway, Thomson Reuters, The Digital Property Group and Fleetway Travel.

Lukasz traveled 75,000 km speaking at many SEO and social media conferences including ClickZ Shanghai China, ClickZ Jakarta Indonesia, SiMGA Malta, SES London in the United Kingdom, as well as conferences held in Europe – Marketing Festival in Brno, Brighton SEO in Brighton, UnGagged in London.

Additionally, whenever he has the chance, he organizes workshops where he shares tips around SEO, Social Media and Analytics. And speaking of SEO tips and tricks, we hope you enjoy the list of SEO tips Lukasz shared with us within this interview. 

 

Tackled Topics:

  • How Lukasz has implemented strategies for organic traffic growth
  • The importance of page speed in SEO
  • The process of gap analysis used for his clients
  • The status of on-page and off-page SEO in 2019
  • Social signals vs. traffic influence on website rankings
  • The impact of brand tracking in SEO
  • Best use cases for SEO
  • How the SEO landscape will change in the next 10 years

 

Top 10 Marketing Nuggets:  

 

  1. Eliminating blockers in the organic strategy such as page speed, or situations when the website is keeping the blog or forum on subdomains (which I’m not a fan of) can achieve 100% growth YoY on organic traffic, as well on conversions. 3:15 
  2. Within the gap analysis process: first, find out who your competitors are; second, use VLOOKUP or a proper tool to find the keywords that your website is not ranking for at all while, at the same time, the first 3 competitors rank together in the first 20 positions. You'll get lots of keywords potentially relevant to your business. 10:40
  3. A unique strategy is the Snapshot strategy – this means utilizing the content that already exists; generally, you should identify 2, 3 or 4 keywords per URL that already deliver traffic and make those keywords perform better, rank higher and deliver a better quality of traffic. If you're able to do this at scale and repeat the process for 20 days you'll experience major growth. 12:21
  4. All forms of digital marketing such as free webinars, videos, courses are great strategies for SaaS companies to make them acquire their first 100 users. 16:37
  5. Gamification works great for getting more visitors to your website. There are websites that give lifetime access for free if you attract enough users to the website in a certain time, acting like an MLM. 18:07

  6. When you have a very specific problem (like brand tracking, for example) it's good not to frame the problem too tightly, but rather start writing around it and providing great content, not focusing only on the product. 19:17

  7. In my opinion, there's no correlation between social shares and rankings, because I don't think Google likes to include signals from third-party platforms in its algorithm; shares are much easier to get than links, so ranks would become too easy to gain. 21:29
  8. Off-page in 2019 is not as critical as it was 10 years ago, because nowadays people are thinking mostly about on-page SEO, especially page speed. Personally, I'm a fan of on-page optimization. 27:34

  9. I find brand tracking very important. When you find a mention about your brand you can always approach the person to thank them and ask them to link back to your website. 30:50

  10. I don’t think voice search will dramatically change in the next 10 years. 38:15

The post How to Get 100% Year-Over-Year Organic Traffic Growth with Lukasz Zelezny appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

]]>
https://cognitiveseo.com/blog/21693/lukasz-zelezny-organic-traffic/feed/ 2
Technical SEO Checklist – The Roadmap to a Complete Technical SEO Audit https://cognitiveseo.com/blog/17963/technical-seo-checklist/ https://cognitiveseo.com/blog/17963/technical-seo-checklist/#comments Wed, 20 Feb 2019 09:27:47 +0000 https://cognitiveseo.com/blog/?p=17963 While technical SEO is a topic that only some of us make use of rigorously, it is a part of everybody’s life. Well, which part of SEO is not technical if we were to look at it thoroughly?   SEO issues, mistakes, tips and recommendations are all included in today’s technical checklist. We wanted to […]

The post Technical SEO Checklist – The Roadmap to a Complete Technical SEO Audit appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

]]>
While technical SEO is a topic that only some of us make use of rigorously, it is a part of everybody’s life. Well, which part of SEO is not technical if we were to look at it thoroughly?

 

SEO issues, mistakes, tips and recommendations are all included in today’s technical checklist. We wanted to cover, in the most effective way possible, all the elements that are important for making your website user-friendly, efficient, visible in SERP, functional and easy to understand. Therefore, gather all the information you have on your site and let’s get better. 

 

Technical SEO the Complete List

 

I. Website Loading Speed Time

  1. Improve Server Response Time
  2. Optimize & Reduce Image Size Without Affecting the Visual Appearance
  3. Minimize the Render-Blocking Javascript and CSS
  4. Limit the Number of Resources & HTTP Requests
  5. Set a Browser Cache Policy
  6. Reduce the Number of Redirects & Eliminate Redirect Loop
  7. Avoid Loading Your Site With Too Much Stuff

 

II. Website Functionality & Usability

  1. Make Sure Your Website Is Mobile Friendly
  2. Build Search Engine Friendly URLs
  3. Use the Secure Protocol – HTTPs
  4. Set Preferred Version
  5. Set up Correctly the 301 Redirects After Site Migration
  6. Make Sure Your Resources Are Crawlable
  7. Test Your Robots.Txt File to Show Google the Right Content
  8. Verify the Indexed Content
  9. Review Your Sitemap to Avoid Being Outdated
  10. Review Blocked Resources (Hashbang URLs) with Fetch as Google
  11. Optimize Your Crawl Budget
  12. Avoid Meta Refresh for Moving a Site
  13. Use Redirect for Flash Site to the HTML Version
  14. Use Hreflang for Language and Regional URLs
  15. Make Sure Your Tracking Is Working Properly

 

III. Content Optimization

  1. Redirect/Replace Broken Links & Resources
  2. Audit Internal Links to Improve Your Chances to Rank Higher
  3. Get Rid of Duplicate Content
  4. Use Structured Data to Highlight Your Content
  5. Keep a Reasonable Number of Links On-Page
  6. Avoid Canonicalizing Blog Pages to the Root of the Blog

 

IV. User-Friendlier Website

  1. Set up Your AMP the Right Way – Mobile Friendlier
  2. Add Breadcrumbs for a Better Navigation
  3. Test On as Many Platforms and Devices as Possible
 

I. Website Loading Speed Time

 

On the web, time is of the essence. Websites all around the world load pretty slow, with an average of 19 seconds to load on a 3G mobile network. Testing has confirmed that around 50% of users abandon a website if it doesn’t load faster than 3 seconds, on average.

 

If your website loads slowly, you can lose a lot of visitors.

 

Disclaimer & Warning: Playing with PHP, servers, databases, compression, minification and other similar things can really mess up your website if you don’t know what you’re doing. Make sure you have a proper backup of the files and the database before you start playing with these options.

 

When we talk about speed, there are a few things we need to consider for making your site efficient and easy to access for your users. A faster loading speed time means higher conversion and lower bounce rates. For that, we’ve selected some mandatory speed optimization suggestions. Using Google’s Speed Test, you can perform easy and short analyses of your website’s loading speed time.

Audit & Fix Your Site Now

The tool has improved over the years and now you can see helpful charts for large websites to understand how each website is performing. One example is the Page Load Distributions.

 

Page load distribution

 

The Page Load Distribution uses two user-centric performance metrics: First Contentful Paint (FCP) and DOMContentLoaded (DCL). The First Contentful Paint marks the moment the first bit of content appears on the screen, when the browser starts to render pixels. The DOMContentLoaded marks the moment when the DOM is ready and there are no stylesheets blocking JavaScript execution. Following the chart, these two metrics show which percentage of your pages load fast and which ones, with average or slow speed, need improvement.

 

Another example includes the speed and optimization indicators, which show where each website stands. In the picture shown below, we can see the FCP and DCL scores. These two metrics use data from the Chrome User Experience Report. It indicates that the page's median FCP (1.8s) and DCL (1.6s) rank it in the middle third of all pages. That also means this page has a low level of optimization, because most of its resources are render-blocking. 

 

Speed and optimization

 

1. Improve Server Response Time

 

Server response time refers to the time it takes your server to deliver the HTML code needed to begin rendering the page. Basically, when you access a page, a request is sent to the server, and the time it takes to send back that information is considered to be the server response time.

 

There are lots of reasons why a website has a slow response time. Google lists just some of them:

There are dozens of potential factors which may slow down the response of your server: slow application logic, slow database queries, slow routing, frameworks, libraries, resource CPU starvation, or memory starvation.
Google logo Google Developers
 

The server response time depends on how much time the Googlebot needs to access the data. Be it 1, 2, 3 seconds or more, it will determine whether your visitor converts or not. Google says that you should keep the server response time under 200ms.

 

There are 3 steps you need to follow to test and improve the server response time:

  1. Firstly, you need to collect the data and inspect why the server response time is high.
  2. Secondly, measure your server response time to identify and fix any future performance bottlenecks.
  3. Lastly, monitor any regression.

 

Many times, the reason why a website loads slowly is the server itself. It's very important to choose a high quality server from the beginning. Moving a site from one server to another might sound easy in theory, but it can be accompanied by a series of problems such as file size limits, wrong PHP versions and so on.

 

Choosing the right server can be difficult because of pricing. If you’re a multinational corporation, you probably need dedicated servers, which are expensive. If you’re just starting out with a blog, shared hosting services will probably be enough, which are usually cheap.

 

However, there are good shared hosting servers and bad dedicated ones and vice versa. Just don’t go after the cheapest or the most renowned. For example, Hostgator has excellent shared hosting services for the US, but not so excellent VPS ones.

 

2. Optimize & Reduce Image Size Without Affecting the Visual Appearance

 

If a website is loading really slow, one of the first things that come in mind are images. Why? Because they’re big. And we’re not talking in size on screen but in size on disk.

 

Besides all the information an image carries, as mentioned before, it also adds lots of bytes to a page, making the server take more time than it should to deliver all the information. If we optimize the images instead, the page will load faster because we've removed the additional bytes and irrelevant data. The fewer bytes the browser has to download, the faster it can download and render content on the screen.

 

Since GIF, PNG, and JPEG are the most used types of extension for a picture, there are lots of solutions for compressing images.

 

image-compressor

Source: www.cssscript.com

Here are a few tips and recommendations to optimize your images:

  • Use PageSpeed Insights;
  • Compress images automatically in bulk with dedicated tools (tinypng.com, compressor.io, optimizilla.com) and plugins (WP Smush, CW Image Optimizer, SEO Friendly Images) and so on;
  • Use GIF and PNG formats because they are lossless. PNG is the desired format. The best compression ratio with a better visual quality can be achieved by PNG formats;
  • Convert GIF to PNG if the image is not an animation;
  • Remove transparency if all of the pixels are opaque for GIF and PNG;
  • Reduce quality to 85% for JPEG formats; that way you reduce the file size and don’t visually affect the quality;
  • Use progressive format for images over 10k bytes;
  • Prefer vector formats because they are resolution and scale independent;
  • Remove unnecessary image metadata (camera information and settings);
  • Use the option to “Save for Web” from dedicated editing programs.

 

Compress images

Source: www.quora.com

 

If you’re using WordPress, you can choose a simple solution, such as the Smush Image Compression Plugin.

 

Update: As of 2019, Google PageSpeed Insights recommends using new format images such as JPEG2000 or WEBP. However, not all browsers and devices display these formats well yet, so regular image compression is still recommended, despite Google making efforts to push this.

 

You can see which images are the biggest on your website with the Site Audit by CognitiveSEO. Simply head to the Images section, under Content. There you can see a list of images over 500kb (consider that for a photographer website, these images might be relatively small in size; however, it's a good idea to display the full HD version under a separate download link).

 

The only real issue with PageSpeed Insights is that you can only check one page at a time.

 

We, here at CognitiveSEO, know that many of you want to check the PageSpeed Insights in bulk. So that’s why we’ve developed our tool to be able to bulk check the Page Speed Insights scores on multiple pages at the same time:

 

Check PageSpeed Insights in Bulk

 

However, note that if you have a very big website, this process might take a very long time. It’s better if you opt out of this process at first before the first analysis is done (so that you may have all the data and start fixing some of the issues) and start the PageSpeed process later. It can take up to 10 seconds per page, so if you have 60,000 pages it can take a week.

 

 

3. Minimize the Render-Blocking Javascript and CSS & Structure HTML Accordingly

 

When you perform a speed test with Google’s PageSpeed Insights, you will see this message: Eliminate render-blocking JavaScript and CSS in above-the-fold content in case you have some blocked resources that cause a delay in rendering your page. Besides pointing out the resources, the tool also offers some great technical SEO tips regarding:

  • Removing render-blocking JavaScript;
  • Optimizing CSS delivery.

 

You can remove render-blocking JavaScript by following Google’s guidelines and avoid or minimize the use of blocking JavaScript using three methods: 

  • Inline JavaScript;
  • Make JavaScript Asynchronous;
  • Defer loading of JavaScript.
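As a small illustration (the file names are made up), the last two methods usually come down to adding the async or defer attribute to your script tags so they stop blocking the HTML parser:

<script async src="/js/analytics.js"></script> <!-- async: downloads in parallel, runs as soon as it's ready -->
<script defer src="/js/slider.js"></script> <!-- defer: downloads in parallel, runs only after the HTML is parsed -->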

 

If Google detects a page which delays the time to first render because it contains blocking external stylesheets, then you should optimize CSS delivery. In this case, you have two options:

  • For small external CSS resources, it is recommended to inline a small CSS file and help the browser to render the page;
  • For large CSS files, you have to use Prioritize Visible Content to reduce the size of the above-the-fold content, inline CSS necessary for rendering it and then defer loading the remaining style.

 

PageSpeed shows which files need to be optimized through the minifying technique. When we talk about resources, we understand HTML, CSS, and JavaScript resources. Basically, the tool will indicate a list of HTML resources, CSS resources, and JavaScript resources, depending on the situation. Below you can see an example of such kind:

 

Minify JavaScript Resources

 

For each kind of resource, you have individual options.

 

Below you can see an example of how to minify your CSS:

css minifier

Source: www.keycdn.com

 

There are 3 steps to follow in the minifying process, explained by Ilya Grigorik, Web performance engineer at Google:

  1. Compress the data. After you eliminate the unnecessary resources, you need to compress the ones that the browser needs to download. The process consists in reducing the size of the data to help the website load the content faster.
  2. Optimize the resources. Depending on what sort of information you want to provide on your site, make an inventory for your files and keep only the one that is relevant, to avoid keeping irrelevant data. After you decide which information is relevant to you, you’ll be able to see what kind of content-specific optimizations you’ll have to do.

Let’s take, for example, a photography website that needs to have pictures with a lot of information, such as camera settings, camera type, date, location, author and other information. That information is crucial for the particular website, while for another website it might be irrelevant.

  3. Gzip compression is best used for text-based data. In the process, you are able to compress web pages and style sheets before sending them to the browser. It works wonders for CSS files and HTML because these types of resources have a lot of repeated text and white spaces. The nice part of Gzip is that it temporarily replaces the similar strings within a text file to make the overall file size smaller.
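For reference, on an Apache server a minimal Gzip setup in .htaccess (assuming mod_deflate is available) could look something like this:

<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>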

 

 

For WordPress users there are simpler solutions:

  1.  Autoptimize plugin to fix render blocking scripts and CSS. You need to install the plugin and afterward you can find it in Settings » Autoptimize to configure the settings. All you have to do is check the box for JavaScript and CSS, in our case, and click on Save Changes.

 

Autoptimize CSS and JavaScript

Source: www.webid-online.com

 

  2. W3 Total Cache to fix render-blocking JavaScript. This is another tool provided for WordPress users and it requires a little more work. After you install it, you need to go to Performance » General Settings and look for the Minify section.

 

Minify scripts with W3 Total Cache

Source: www.factoriadigital.com

 

Check the Enable box in the Minify section and then select Manual mode. In the end, click on Save all settings and add the scripts and CSS that you want to minify. After that, you're set.

 

However, don’t get tricked by Google. The truth is that PageSpeed Insights is just a guideline. For example, PageSpeed Insights shows Analytics and Tag Manager as being JS that blocks the loading of important content. Yet they force you to put it in the <head> section.

 

You can follow this guide to better set up the W3 Total Cache Plugin.

 

Never remove something that is essential for tracking or for your site’s functionality just to get 100% score on PageSpeed Insights or GT Metrix.

 

4. Limit the Number of Resources & HTTP Requests

 

One of the first actions that come to mind when we talk about website speed is reducing the number of resources. When a user enters your website, a call is made to the server to access the requested files. The larger those files are, the longer it will take to respond to the requested action.

 

Rapid, multiple requests always slow down a server. It's a combination of multiple factors that lead to this, but you can compare it to copying 1 large file on a hard disk versus copying a very large number of small files. Usually, the small files take longer to copy because the disk's read/write head has to keep moving. This is different with SSD technology, where there are no moving heads, but there's still a lot more work to do to copy multiple files than to copy a single larger file.

 

To check your HTTP requests, you can open an Incognito tab in Chrome (to make sure cached requests don't skew the results), right click and hit Inspect (at the bottom). Then you need to find the Network subtab and hit F5 to refresh the page. This will start monitoring the requests, and at the end you'll see the total number of requests.

 

Check HTTP Requests Chrome

 

There’s no general number, we can say that you should try to keep this number under 100. This really depends on the page. If it’s a HUGE page, then it can have more requests. Then again, it could be a good idea to paginate it.

 

The best thing you can do is delete unnecessary resources (like sliders) and then minimize the overall download size by compressing the remaining resources.

 

Another thing you can do is combine the CSS and JS files in a single file so that 1 single request is being made. Plugins such as Autoptimize and W3 Total Cache (both mentioned above) can do this. Through the combine option, the plugin basically takes all the CSS and JS files and merges them into a single file.

 

This way, the browser will only have to make one request to the server for all those files instead of one request for each file.

 

However, be careful! This option can usually break an entire site or make it display really messed up, so make sure you have a proper backup of the files and database before you start making any changes.

 

5. Set a Browser Cache Policy

 

The browser cache automatically saves resources on the visitor's computer the first time they visit a new website. When users then enter the site a second time, those saved resources help them get the desired information at a faster speed. This way, the page load speed is improved for returning visitors.

 

For visitors that want to return to a page or visit a new page that in a specific moment can’t be accessed, there’s the option to view the cached version directly from SERP.

 

Cached website in SERP

 

The best way to significantly improve the page speed load is to leverage the browser cache and set it according to your needs.
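If you'd rather set it by hand on an Apache server, a minimal sketch of a cache policy in .htaccess (the lifetimes below are just an assumption; adjust them to how often your files change) could be:

<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType text/css "access plus 1 week"
ExpiresByType application/javascript "access plus 1 week"
</IfModule>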

 

Most of the Minification, Compression and Combination plugins are actually cache plugins, so they all have this function. You can use W3 Total Cache or any other caching plugin that suits you best. However, a combination between W3 Total Cache's caching and Autoptimize's compression and combining works best.

 

Using a cache will also make changes harder to spot. If you make a change to your website, open an Incognito tab to see the changes and go to the plugin settings from time to time to reset the cache.

Audit & Fix Your Site Now

 

6. Reduce the Number of Redirects & Eliminate Redirect Loop 

 

Redirects can save you from a lot of trouble regarding link equity/juice and broken pages, but they can also cause you lots of problems if you have tons of them. A large number of redirects will make your website load slower. The more redirects, the more time a user must spend to get to the landing page.

Plain and simple, WordPress redirects slow down your site. That’s why it’s worth taking the time to minimize the number of redirects visitors to your site experience. There are times that it’s appropriate to intentionally create and use redirection, but limit the use of redirection to necessary instances and make sure your visitors have the fastest experience possible when browsing your WordPress website.
Jon Penland Jon Penland
 Support Engineer at Kinsta@jonrichpen

 

One other thing worth mentioning is that you should have only one redirect per page, otherwise you risk creating a redirect chain or even a redirect loop. A redirect loop is a chain of redirects that eventually points back to the same page, which is misleading because the browser won't know which page to show and will end up giving a pretty nasty error.

 

Redirect loop

Source: www.matrudev.com

 

In case you have 404 pages, there are lots of ways to customize the page and give some guidelines to the users so you won’t lose them. Design a friendly page and send the user back to your homepage or to another relevant and related piece of content.

 

For finding the broken pages for your website, you can use the Google Search Console by looking at Crawl » Crawl Errors, then click on Not found (if any).

 

Lots of crawl errors (Not found report)

 

Site Explorer offers a similar feature, pointing out the link juice you are losing (the number of referring domains and links for each broken page).

 

Broken pages

 

You can also use the new Technical SEO Site Audit Tool to analyze all your site’s redirects. After you set up the campaign and the tool finishes crawling and analyzing your site, simply head to Architecture > Redirects.

 

Fix 301 Redirects

 

7. Avoid Loading Your Site With Too Much Stuff

 

Over time, sites tend to get clogged up with useless images, plugins and functions that are never used. Why?

 

If you use WordPress, for example, you might test a lot of plugins and install them on your website, only to find out that you don’t really need them. Sure, you can disable them and eventually uninstall them but the problem with WordPress uninstalls is that they’re often dirty, leaving traces in your Database which can make it a little slower.

 

websites with bad UX and very many ads usually load slow

Try not to get your site looking like this, it’s probably not the best UX.

 

Another very common type of plugin that webmasters use is sliders. Sliders used to be popular, but recent testing has shown over and over again that they kill conversions.

 

Not only that, but Sliders also usually load your site with a lot of things you don’t need. The first one is usually the Javascript file which tends to load on all pages (either in the footer or the head section of your HTML). However, the slider is most probably used only on the homepage.

 

Also, if you have 6 slides on your homepage, with 6 big pretty images, your site can be 2 or 3 times slower because of the size in bytes of the images. Unfortunately, nobody is probably going to look past the second image, if it auto-slides, of course.

 

A good workaround is having some sort of development environment where you can test 5-10 plugins until you find exactly what you need. Then, make a plan of implementation so that you know only the essentials you need to install on the live version.

 

After that, you can reset the development version by deleting it and copying the updated live version over it. This way, the development version will not get clogged either and will resemble the live version more.

Audit & Fix Your Site Now

 

II. Website Functionality & Usability

 

After you make sure your website can load fast for your users, it’s time to see what you can do to improve your visibility in the search engines. There are very many aspects that go into this, but the following ones are a mixture between the most important ones and the most common mistakes that webmasters make.

 

8. Make Sure Your Site Is Mobile Friendly

 

There’s nothing much to say here. Since more than 50% of all the users worldwide are using their mobile devices to browse the internet, Google has prioritized mobile indexation. You should make sure that your website is optimized for mobile devices.

 

This is usually meant in terms of design, but also in terms of speed and functionality. Generally, it’s preferred to have a responsive design rather than a fully separate mobile version, as the m.site.com subdomain requires extra steps to be implemented correctly using rel=alternate tag.
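If you do run a separate m. subdomain, the usual pattern is sketched below with a hypothetical URL: a rel=alternate tag on the desktop page and a rel=canonical on the mobile page.

<!-- on https://www.site.com/page/ -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.site.com/page/">
<!-- on https://m.site.com/page/ -->
<link rel="canonical" href="https://www.site.com/page/">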

 

You can ensure that your site is mobile friendly by testing it on Google’s Mobile Friendly Test Page.

 

Page is mobile-friendly

 

 

9. Build Search Engine Friendly URLs

 

URLs are very important, and it's best not to change them once they're set. This means you have to get them right the first time. It's useful for users and search engines to have URLs that are descriptive and contain keywords.

 

However, many people often forget this and build websites with dynamic URLs that aren’t optimized at all. It’s not that Google doesn’t accept them. They can rank but, eventually, you’ll get to the point where you’ll have to merge to new ones to improve performance, UX and search engine visibility and it’s going to be a struggle.

 

Changing page URLs very often results in issues with search engines. It’s always better if you get them good the first time.

 

We’ve talked on this topic various times before because it is important to have easy-to-follow URLs. Avoid having query parameters in URL. You can’t keep track of that URL in Analytics, Search Console and so on. Not to mention it is difficult to do link building. You might lose linking opportunities because of your URLs appearance.

 

URL-structure-query-parameter

Source blogspot.com

 

If you’re a WordPress user, you have the option to personalize and set up your permalink structure. If you take a look at the next picture, you can see the options you have for your URL structure.

 

wordpress-permalink-settings

 

Building friendly URLs is not so hard, you can follow the next 3 tips:

  • use dashes (-) instead of underscores (_);
  • make it shorter;
  • use the keyword (focus keyword).

 

By building easy-to-read URLs around a focus keyword, you are thinking about your users and therefore focusing on user experience. David Farkas has the same vision on the matter:

If you focus on user experience, you’ll be building sustainable links – and building trust with users. To build a truly great link, you have to look at every aspect of the link from the user’s perspective.
David Farkas David Farkas
Founder & CEO TheUpperRanks

 

You can always check your ‘unfriendly’ URLs using the CognitiveSEO Site Audit. After you set up your campaign, you just have to go to Architecture > URLs.

 

user friendly URLs for SEO

 

Then you’ll be able to see a list of your URLs that don’t contain any keywords. You can also identify other issues using this feature. For example, in the following screenshot, although the URLs are blurred  in order to protect the client’s identity, we’ve identified a hreflang problem. The titles and content for some secondary languages were generated in the main language when proper content in the secondary language was not provided.

 

Descriptive URLs for SEO

 

This means that the URLs were actually OK, just not descriptive due to the content being generated in the wrong language.

 

10. Use the Secure Protocol – HTTPS

 

On August 6, 2014, Google announced that the HTTPS protocol had been added to their ranking factors list and recommended that all sites move from HTTP to HTTPS.

 

HTTPS (Hypertext Transfer Protocol Secure) encrypts the data and doesn't allow it to be modified or corrupted during transfer, protecting it against man-in-the-middle attacks. Besides the improvement in data security, it has other benefits, such as:

 

  • It helps your website have a boost in rankings, since it is a ranking factor.
  • It preserves referrer details that would otherwise be lumped under the "Direct" traffic source in Google Analytics.
  • It assures the users that the website is safe to use and that the data provided is encrypted for avoiding hacking or data leaks.

 

If you use HTTPS, you will see a lock before the URL in the navigation bar:

 

https protocol

 

In case your website doesn’t use the HTTPS protocol, you’ll see an information icon and if you click on it, a new message will alert you that the connection is not safe, therefore the website is not secure.

 

http protocol

 

While it is best to move from HTTP to HTTPS, it is crucial to find the best way to recover all your data after moving your website. For instance, lots of users complained they lost all of their shares after moving the website and the same thing happened to us.

 

After we experienced the same issue, we’ve created a guideline on how to recover Facebook (and Google+) shares after an https migration that you could easily follow:

  • Find out how many Facebook shares you have at a URL;
  • Set both your HTTP and HTTPs social shares to zero;
  • Update rel=”canonical”;
  • Identify Facebook’s Crawler.

 

Again, this issue is related to URLs so, every time you need to do mass redirects, issues can occur. It’s always a good idea to have your URLs well set up from the beginning. However, if you really need to migrate the site from HTTP to HTTPS, you can check out this HTTP to HTTPS migration guide.

 

11. Set Your Preferred Version

 

You also want to make sure that all your other versions are pointing to the correct, preferred version of your site. If people access one version they should automatically be redirected to the correct version.

 

These are all the versions:

  • http://site.com
  • https://site.com
  • http://www.site.com
  • https://www.site.com

 

So, if your preferred version is https://www.site.com, all other versions should 301 directly to that version. You can also test if this is alright in the SEO Audit Tool. Simply head to Indexability > Preferred Domain. Look for the Everything is OK message. If you can’t find it, then guess what: not everything is OK.

 

Migrate HTTP to HTTPS
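As a minimal sketch, assuming an Apache server and https://www.site.com as the preferred version, the .htaccess rules that funnel every other variant to it could look like this:

RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.site.com/$1 [R=301,L]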

 

12. Set up Correctly the 301 Redirects After Site Migration

 

Site migration is a recommended operation when the website is changed completely and the same domain won't be used anymore. Setting up 301 redirects also applies when you make a switchover from HTTP to HTTPS and want to preserve the link equity.

 

In case of a site migration, it is crucial to set up the redirects correctly. To avoid losing lots of links and having broken pages on your site, it is best to follow a correct 301 redirection procedure. For that, you need to take into consideration the next recommendations; we've already covered most of them in the previous steps:

  • Set up the 301 redirect code from the old URLs to the new URLs;
  • Avoid redirection loops;
  • Remove invalid characters in URLs;
  • Verify the preferred version of your new domain (www vs. non-www);
  • Submit a change of address in Search Console;

 

Google-Search-Console-Change-address

 

  • Submit the new sitemap in Google; 
  • Check for broken links and resources.

 

 

13. Make Sure Your Resources Are Crawlable

 

Having non-crawlable resources is a critical technical SEO issue. Crawling is the first step, right before indexing, which is what eventually puts your content in front of users. Basically, Googlebot crawls the data and then sends it to the indexer, which renders the page; after that, if you're lucky, you'll see that page ranking in the SERP.

 

how search works

www.slideshare.net/ryanspoon

 

It is very important that the users see the same content that the Googlebot does.

 

If your CSS files are blocked from crawling, Google won't be able to see the pages like a user does. The same goes for JavaScript, if it isn't crawlable. With JavaScript it is a little bit more complicated, especially if your site is heavily built using AJAX. It may be necessary to write code for the server to send an accurate version of the site to Google.

 

If you’re not blocking Googlebot from crawling your JavaScript or CSS files, Google will be able to render and understand your web pages like modern browsers.

 

Google recommends using Fetch as Google to let Googlebot crawl your JavaScript.

 

Fetch as Google in Google Webmaster Tools

Source www.webnots.com

 

Update: As of 2019, the Google Search Console has launched a new version which doesn’t have many of the features the old version had. Luckily, you can still access the old version if you need those features. However, they are likely to be completely removed at some point, who knows.

 

New Google Search Console

 

14. Test Your Robots.Txt File to Show Google the Right Content

 

Crawlability issues are usually related to the robots.txt file. The robots.txt file helps Googlebot by telling it which pages to crawl and which not to crawl, and testing it makes sure you're actually giving Google access to the right data.

 

You can view your robots.txt file online if you go to http://domainname.com/robots.txt. Make sure the order of your directives is right. It should look similar to what you can see in the following picture:

 

View robots.txt file online
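For reference, a minimal robots.txt for a typical WordPress site (the sitemap URL is hypothetical) looks something like this:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.site.com/sitemap.xml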

 

Use the robots.txt Tester tool from Search Console to write or edit robots.txt files for your site. The tool is easy to use and shows you whether your robots.txt file blocks Google web crawlers from specific URLs. The ideal situation would be to have no errors:

 

robots.txt Tester no errors

 

The errors appear when Google is unable to crawl the specific URL due to a robots.txt restriction. There are multiple reasons for that and Google names just some of them:

 

For instance, your robots.txt file might prohibit the Googlebot entirely; it might prohibit access to the directory in which this URL is located; or it might prohibit access to the URL specifically.
Google logo Google
 

 

The common issues that appear when Googlebot is blocked to access your website happen because:

  • There are DNS issues and Google can’t communicate with DNS server;
  • The firewall or DoS protection system is misconfigured;
  • The Googlebot is intentionally blocked from reaching the website.

 

After you’ve checked the issues and found out which are the blocked resources pointed in the Tester tool, you can test again and see if your website is ok.

 

The site’s crawlability can be verified better on a larger scale using the CognitiveSEO Audit Tool. You simply have to go to Indexability > Indexable Pages and look for the Disallowed in Robots.txt links. Click on the red line and it will show you a list of URLs that have been disallowed.

 

Crawlability Issues

 

15. Verify the Indexed Content

 

James Parsons, expert in content marketing and SEO, explains in an article on AudienceBloom the crucial significance of the indexing phase for a website.

Indexed pages are those that are scoured by Google search engines for possible new content or for information it already knows about. Having a web page indexed is a critical part of a website’s Internet search engine ranking and web page content value.
James Parsons James Parsons
Blogger at JamesParsons.com

Search Console can provide lots of insightful information regarding the status of your indexed pages. The steps are simple: go to Google Index, then to Index Status, and you'll be able to see a chart similar to the one shown below:

 

Index status in Search Console

 

The ideal situation would be that the number of indexed pages is the same as the total number of pages within your website, except the ones you don't want indexed. Verify that you've set up proper noindex tags. In case there is a big difference, review them and check for blocked resources. If that concluded with an OK message, then check whether some of the pages weren't crawled, and therefore not indexed.

 

In case you didn’t see something that was out of the ordinary, test your robots.txt file and check your sitemap. For that check the following steps (9 and 10).

 

You can also use the Site Audit tool to view the URLs that have been marked up with No-Index tag. They’re in the same section as the URLs blocked by Robots.txt (Indexability > Indexable Pages).

 

 

16. Review Your Sitemap to Avoid Being Outdated

 

An XML Sitemap explains to Google how your website is organized. You can see an example in the picture below:

 

Sitemap example

Source: statcounter.com

 

Crawlers will read and understand how a website is structured in a more intelligible way. A good structure means better crawling. Use dynamic XML sitemaps for bigger sites. Don’t try to manually keep all in sync between robots.txt, meta robots, and the XML sitemaps.
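Stripped down to its essentials, a sitemap file is just an XML list of URLs, something like the sketch below (the URLs and dates are made up):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.site.com/</loc>
    <lastmod>2019-02-20</lastmod>
  </url>
  <url>
    <loc>https://www.site.com/blog/</loc>
    <lastmod>2019-02-18</lastmod>
  </url>
</urlset>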

 

Search Console comes to rescue once again. In the Crawl section, you can find the Sitemap report, where you can add, manage and test your sitemap file.

 

Up to this point, you have two options: test a new sitemap or test a previously added one. In the first case:

  • Add the Sitemap;
  • Enter the URL of the sitemap;
  • Click on Test sitemap and then refresh the page if needed;
  • When the test is completed, click Open Test Results to check for errors. Fix your errors;
  • After you fix your errors, click Submit Sitemap.

 

Google-Webmaster-Tools-Add-a-Sitemap

 

In the second case, you can test an already submitted sitemap; click on Test and check the results.

 

Sitemap Tester in Search Console

 

There are three things you need to do in the second situation:

  • Update the Sitemap when new content is added to your site or once in a while;
  • Clean it from time to time, eliminating outdated and bad content;
  • Keep it shorter so that your important pages get crawled more frequently or break the sitemap into smaller parts. A sitemap file can’t contain more than 50,000 URLs and must not be larger than 50 MB uncompressed.

 

Using a sitemap doesn’t guarantee that all the items in your sitemap will be crawled and indexed, as Google processes rely on complex algorithms to schedule crawling. However, in most cases, your site will benefit from having a sitemap, and you’ll never be penalized for having one.
Google logo Google
 

 

 

17. Review Blocked Resources (Hashbang URLs) with Fetch as Google

 

Hashbang URLs (URLs that have the #! in them) can be checked and tested in Fetch as Google now. John Mueller acknowledged that Google has the ability to fetch & render hashbang URL’s via the Search Console.

 

Google stopped supporting them on March 30, 2014, and then announced on October 14, 2015 that it was deprecating its AJAX crawling system. At the moment, hashbang URLs can be tested.

 

Below you can see two situations for the same website. In the first picture, you can see the list of resources before using the fetch and render feature with the hashbang URL and in the second one you can see the situation after the fetch and render action was performed.

 

Before and After crawling hashbang

Source: www.tldrseo.com

 

18. Optimize Your Crawl Budget

 

The term “crawl budget” started to gain more attention when Gary Illyes explained, on January 16, 2017, how Google uses it.

 

Crawl budget refers to how many resources a server allocates for crawling, or how many pages the search engines crawl on a site in a specific period of time. Google says there is nothing to worry about if new pages tend to be crawled the same day they're published; the issues appear on bigger sites, which is why it is very important to optimize your crawl budget.

 

Maria Cieślak, search engine optimization expert, explains in an article on DeepCrawl the importance of optimizing your crawl budget.

Google is crawling only a particular number of pages on your website, and may sort the URLs incorrectly (I mean differently than you wish). For example, the “About us” page (that doesn’t drive sales) can gain more hits than the category listings with the new products. Your aim is to present to Google the most relevant and fresh content.
Maria Cieślak
SEO Specialist at Elephate

 

The crawl rate limit also comes into discussion: it caps the maximum fetching rate for a given site.

 

The actions recommended for optimizing the crawl budget are:

  • Check the soft 404s and fix them using a personalized message and a custom page;
  • Get rid of duplicate content to avoid wasting crawl budget;
  • Remove hacked pages;
  • Prevent indexation for low quality and spam content;
  • Keep your sitemap up to date;
  • Correct infinite space issues;

 

 

19. Avoid Meta Refresh for Moving a Site

 

Since we’ve talked about the redirection plan for migrating a site, it is best to understand why Google doesn’t recommend using meta refresh for moving a website. There are three ways to define redirects:

  • HTTP responses with a status code of 3xx;
  • HTML redirections using the <meta> element;
  • JavaScript redirections using the DOM.

 

Aseem Kishore, owner of Help Desk Geek.com, explains why it is better not to use this meta refresh technique: 

Although not particularly dangerous, Meta Refreshes are often used by unscrupulous webpage programmers to draw you into a web page using one piece of content and then redirect you to another page with some other content. Referred to as a black hat technique, most of the major search engines are smart enough not to fall for this method of “cloaking” web content.
Aseem Kishore
Owner and Editor-in-Chief at Help Desk Geek.com

When possible, always use HTTP redirects and don't use a <meta> element. HTTP redirection is the preferred option, but sometimes the web developer doesn't have control over the server and must use other methods. Although HTML redirection is one of them, Google strongly discourages web developers from using it.
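
For reference, this is roughly what the discouraged client-side redirects look like in markup (a sketch only; the target URL is a placeholder). Whenever you control the server, a 301 HTTP redirect should be used instead:

<!-- HTML meta refresh redirect: discouraged for moving a site -->
<meta http-equiv="refresh" content="0; url=https://www.example.com/new-page/">

<!-- JavaScript redirect: also a last resort -->
<script>window.location.replace("https://www.example.com/new-page/");</script>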

 

If a developer uses both HTTP and HTML redirects and forgets to keep them identical, they might end up with an infinite loop, which leads to other problems.

 

If you want to move a site, Google's guidelines recommend following these steps:

  • Read and understand the basic knowledge of moving a website;
  • Prepare the new site and test it thoroughly;
  • Prepare a URL mapping from the current URLs to the new ones;
  • Correctly configure the server to perform the redirects that move the site;
  • Monitor the traffic for both the old and the new URLs.
 

20. Use a Redirect from the Flash Site to the HTML Version

 

Creating a flash site without a redirect to the HTML version is a big SEO mistake. Flash content might have an appealing look, but just like JavaScript and AJAX, it is difficult to render. The crawler needs all the help it can get to crawl the data and send it to the indexer. The Flash site must have a redirect to the HTML version.

 

If you have a pretty site, what's the point if Google can't read it and show it the way you want? Flash websites might tell a beautiful story, but it's all for nothing if Google can't render them. HTML is the answer! Build an HTML version with SWFObject 2.0, a tool that helps you embed Flash content while providing an HTML alternative.

 

21. Use Hreflang for Multi-Language Websites

 

Hreflang tags are used for language and regional URLs. It is recommended to use the rel="alternate" hreflang="x" attributes to serve the correct language or regional URL in search results in the following situations:

 

  • You keep the main content in a single language and translate only the template (navigation and footer). Best used for user-generated content.
  • You have small regional variations with similar content in a single language – for example, an English-language website targeted at the US, GB, and Ireland.
  • Your site content is fully translated – websites where you have multiple language versions of each page.

 

Maile Ohye, former Developer Programs Tech Lead, explains how site owners can expand to new language variations while staying search engine friendly:  

 

 

Based on these options, you can apply multiple hreflang tags to a single URL. Make sure, though, that the hreflang you provide is valid (a markup sketch follows the list):

  1. It doesn’t have missing confirmation links: If page A links to page B, page B must link back to page A.
  2. It doesn't have incorrect language codes: language codes must be in ISO 639-1 format, and the optional region codes in ISO 3166-1 Alpha 2 format.
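
A minimal sketch of valid, reciprocal annotations (the URLs are placeholders) would look like this, placed in the <head> of both language versions:

<!-- Included on https://www.example.com/en-us/ and on https://www.example.com/en-gb/ alike -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />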

 

We've documented a complete guide on the vital hreflang & multi-language website mistakes that most webmasters make, which we recommend you follow.

 

Also, you can use the Site Audit to quickly analyze and identify hreflang issues on your website. Simply head to Content > Hreflang/Languages to get a list of your implementation issues. In the following screenshot you can see that this site has a lot of missing confirmation links, which means that Language A Page points to the Language B Page but Language B Page doesn’t point back to Language A Page.

 

Hreflang Technical Issues

 

22. Make Sure Your Tracking Is Working Properly

 

Tracking your website is really important. Without tracking your results, you won’t be able to see any improvements.

 

Tracking issues are common after migrating from HTTP to HTTPS or after minifying and combining JS files; both can break the tracking code, resulting in a loss of data.

 

You need to make sure that everything is working properly so that you can track the results of the improvements you’re making over time.
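
As a quick sanity check after a migration or a JS refactor, make sure the analytics snippet is still present and unmodified on every template. A typical Google Analytics (gtag.js) tag looks roughly like the sketch below – the G-XXXXXXX measurement ID is a placeholder, and your own setup may differ:

<!-- Global site tag (gtag.js) – placed just before </head> -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXX'); // replace with your own measurement ID
</script>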

Audit & Fix Your Site Now

 

III. Content Optimization

 

Now that you’ve fixed the general issues that can create crawlability and indexability issues, you can focus more on specific issues regarding your content, such as broken pages, internal linking and so on.

 

This is very important if you really want to surpass your competition, especially in highly competitive markets.

 

23. Redirect/Replace Broken Links & Resources

 

Sometimes the images on a webpage aren't available, so a broken image is displayed in the visitor's browser. It can happen to anybody, for lots of reasons, and it is not a pretty situation. You know the saying: a picture is worth a thousand words – and a missing picture, replaced by an ugly icon and a message, says something as well…

 

Broken images

 

A solution would be to add an error handler on the IMG tag: 

<img src="http://www.example.com/broken_url.jpg" onerror="this.src='path_to_default_image'" />

Some webmasters say that Chrome and Firefox detect when images fail to load and log it to the console; others disagree.

 

Sam Deering, a web developer specializing in JavaScript & jQuery, offers some great steps for resolving these issues:

  1. First, gather some information on the images currently on the page;
  2. Then, use AJAX to test whether each image actually exists;
  3. Refresh the missing images;
  4. Fix the broken images using AJAX;
  5. Check the non-AJAX version of the function as a fallback.
In most browsers, the ALT tag is shown if the image is not found. This could be a problem if the image is small and the ALT tag is long as it seems the output width of the element is not forced by the length of the alt tag.
Sam Deering
Front-end Web Developer
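
If you prefer a page-wide safety net instead of per-image handlers, a small script can swap a fallback image into any <img> that failed to load. This is only a sketch of that idea – the fallback path is a placeholder, not a real asset:

<script>
  // After the page loads, replace any image that finished loading with zero
  // natural width (i.e. it failed) with a placeholder fallback image.
  document.addEventListener('DOMContentLoaded', function () {
    document.querySelectorAll('img').forEach(function (img) {
      if (img.complete && img.naturalWidth === 0) {
        img.src = '/images/fallback.png';
      }
    });
  });
</script>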

 

The same goes for broken URLs: nothing strange will be displayed on the site, but if a user clicks on a broken link, it leads to a bad experience. You can see which resources are broken on your website by heading to the Architecture section of the Site Audit tool.

 

Broken URLs and Images are bad for SEO

 

 

24. Audit Internal Links to Improve Your Chances to Rank Higher

 

Internal links are the connections between your pages; through them, you can build a strong website architecture by spreading link juice – or link equity, as others refer to it.

 

Creating connections between similar pieces of content is what the term silo content refers to. This method presumes creating groups of topics and content based on keywords, and it defines a hierarchy.

 

benifits-intrernal-linking

Source: www.seoclarity.net

 

There are a lot of advantages to building internal links (see the sketch after this list), because internal linking:

  • opens the road for search engine spiders by making your content accessible;
  • transfers link juice;
  • improves user navigation and offers extra information to the user;
  • organizes the pages based on the keyword used as an anchor text;
  • highlights the most important pages and transfers this information to the search engines;
  • organizes site architecture.
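
In practice, a good internal link is simply a contextual link with a descriptive, keyword-relevant anchor, as in this small sketch (the URL and anchor text are hypothetical):

<p>
  If your pages load slowly, start with our
  <a href="/blog/page-speed-guide/">guide to improving page speed</a>
  before auditing anything else.
</p>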

 

The more relevant pages are combined with each other when crawled repeatedly, and as the crawling frequency rises, so does the overall rank in search engines.

Kasia Perzyńska
Content Marketer Unamo

 

When you audit internal links, there are four things that need to be checked:

  • Broken links;
  • Redirected links;
  • Click depth;
  • Orphan pages;

 

You can easily do all of those using the CognitiveSEO Site Audit Tool under Architecture > Linking Structure.

 

Internal Linking Structure Audit Tool for SEO

 

 

25. Get Rid of Duplicate Content

 

When we talk about technical SEO, we also think of duplicate content, which is a serious problem. Be prepared and review your HTML Improvements from Search Console to remove the duplicates.

 

Keep title tags and meta descriptions unique and relevant across your website by looking into Search Console at Search Appearance » HTML Improvements.

 

HTML-Improvements-Google-Webmaster-Tools

 

In Search Console, you can find a list of all the duplicate elements, leading you to the pages that need improvement. Review each one and craft new titles and meta descriptions. Google loves fresh and unique content – the Panda algorithm confirms it.

 

Another option is to apply the canonical tag to pages with duplicate content. The rel=canonical tag shows the search engines which URL is the original source. Canonicalizing such duplicate URLs to the preferred version is a recommended practice.
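
A minimal sketch of the tag (the URLs are placeholders): on a duplicate or parameterized page, the canonical points to the preferred version:

<!-- In the <head> of https://www.example.com/product?color=red -->
<link rel="canonical" href="https://www.example.com/product" />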

 

Jayson DeMers, Founder & CEO of AudienceBloom, considers that duplicate content can hurt your website and discourage search engines from ranking it, and that it can also lead to a bad user experience, as he says on Forbes.  

Just a few instances of duplicate content can trigger Google to rank your site lower in search results, leaving you unable to recover until those content duplication issues are addressed. Duplicate content can also interfere with your user experience, leaving your site visitors feeling that your site is more fluff than substance.
Jayson DeMers
 Founder & CEO of AudienceBloom

 

The CognitiveSEO Site Audit Tool can not only identify duplicate content easily, but it also has a feature to identify near-duplicate content – pages that are very similar in content but not quite the same.

 

SEO Duplicate Content Issues

 

Fixing duplicate content issues is critical, especially for eCommerce websites, where the issue is common. The tool makes it very easy to fix.

 

 

26. Use Structured Data to Highlight Your Content

 

Structured data is a way to help Google understand your content and help users land directly on the page they are interested in, through rich search results. If a website uses structured data markup, Google might display it in the SERPs as you can see in the following picture:

 

Rich snippets example

 

Besides rich snippets, structured data can be used for:

  • Getting featured in the Knowledge graph;
  • Gaining beta releases and having advantages in AMP, Google News, etc.;
  • Helping Google offer results from your website based on contextual understanding;

 

Structured data

Source: www.link-assistant.com

 

The vocabulary for structured data is schema.org. You can highlight your content using it: schema.org helps webmasters mark up their pages in ways that can be understood by the major search engines.

 

If you want to appear in rich search results, your site's pages must use one of the three supported formats (a JSON-LD sketch follows the list):

  • JSON-LD (recommended);
  • Microdata;
  • RDFa.
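
A minimal JSON-LD sketch for an article is shown below; the values are placeholders, and the properties your own content type needs may differ:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO Checklist",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  },
  "datePublished": "2018-01-15"
}
</script>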

 

After you mark up your content with structured data, it is recommended to test it using the Google Structured Data Testing Tool. Testing will show you whether you set it up right and whether you comply with Google's guidelines, because you can get penalized for spammy structured markup.

 

Google doesn't guarantee that every piece of content marked up with structured data will appear as a rich result.

 

27. Keep a Reasonable Number of Links On-Page

 

People in the web community often associate pages with 100 links or more with "link farms". The number of links on a single page also has a significant impact on UX: a piece of content overloaded with links distracts users and fails to offer them any real information, because most of it is linked. Add links only where they are relevant, where they offer extra information, or where you need to cite a source.

 

Patrick Sexton, Googlebot Whisperer, explains in an article on Varvy why it is important to keep a reasonable number of links per page:

You may also wish to consider how well your webpage is linked to. If a webpage has many quality links pointing to it, that webpage can have many links (even more than 100) but it is important to remember the reasons why you shouldn’t have a huge amount of links on any given page.
Patrick Sexton
Googlebot Whisperer at Outspoken Media

In general, the more links on a page, the more organized that page needs to be so that users can find the information they came for. Also, be careful to look for natural ways to add links and don't violate Google's guidelines for link building. The same recommendation applies to internal links.

 

28. Avoid Canonicalizing Blog Pages to the Root of the Blog

 

John Mueller said in a Google Webmaster Hangout that Google doesn't encourage canonicalizing blog subpages to the root of the blog as the preferred version. Subpages aren't a true copy of the blog's main page, so doing that makes no sense.

 

You can listen to the whole conversation from minute 16:28:

 

 

Even if Google sees the canonical tag, it will ignore it because it thinks it’s a webmaster’s mistake.  

 

Setting up blog subpages with a canonical blog pointed to the blog’s main page isn’t a correct set up because those pages are not an equivalent, from Google’s point of view.
John Mueller
Webmaster Trends Analyst at Google

 

Canonical links are often misunderstood and incorrectly implemented, so make sure you check all your URLs for bad canonical implementation. You can do this easily with the Technical SEO Audit tool by CognitiveSEO.

 

Canonical URLs Technical SEO

 

Remember that it's a good idea to always have a self-referencing canonical tag pointing to your page. This will ensure there are fewer duplicate content issues.

Audit & Fix Your Site Now

 

IV. User-Friendlier Website

 

Google cares about UX, so why wouldn’t you? Many experts think that UX is crucial in the future of SEO, especially with all the evolution of machine learning technology. David Freeman, Search Engine Land Columnist, has a strong opinion on the role of UX:

 

Part of Google’s philosophy has always been focused on delivering the best user experience. With recent technological advances, Google and other search engines are now better placed than ever to deliver this vision.
David Freeman
Group Head of SEO at Treatwell & Search Engine Land Columnist
 

29. Set up Your AMP the Right Way – Mobile Friendlier

 

Google recommends using AMP (Accelerated Mobile Pages) to improve the UX it values so highly. Since the Google AMP change affects a lot of sites, it is best to understand how it works and the right way to set it up on different platforms – WordPress, Drupal, Joomla, Concrete5, OpenCart – or to generate a custom AMP implementation.

 

On this topic, we created a guide on how to implement AMP, because it is a process that needs to be fully understood. Google AMP doesn't directly affect SEO, but indirect factors that result from AMP can.
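
At a minimum, the regular page and its AMP version have to reference each other; a hedged sketch (the URLs are placeholders):

<!-- On the regular (canonical) page -->
<link rel="amphtml" href="https://www.example.com/article.amp.html">

<!-- On the AMP page -->
<link rel="canonical" href="https://www.example.com/article.html">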

Historically, Google has acted as an index that points people away from Google to other websites. With its AMP search results, Google is amassing content on its own servers and keeping readers on Google.
Klint Finley
Content writer at Wired Business / @klintron

 

AMP is pretty difficult to implement correctly. You can always run into issues. Miss one tag closing bracket and you risk your AMP version not displaying at all. You can test the format of all your site’s AMP pages quickly using the CognitiveSEO Tool.

 

In the following example there are no AMP pages set up, but if you have any, you may want to take a look at the Incorrectly set up AMP Pages section to identify the problematic ones:

 

Test AMP pages SEO validator

 

 

30. Add Breadcrumbs for a Better Navigation

 

Breadcrumbs – used by Hansel and Gretel to find their way back home – are implemented on websites with the same purpose: to guide the user through the site. They help visitors understand where they are on the website and make it easier to find their way around.

 

location-based-breadcrumb-example-sitepoint

Source: www.smashingmagazine.com

 

Breadcrumbs can improve the user experience and give search engines a clear picture of the site structure. They fulfill the need for a secondary navigation on the website, but they shouldn't replace the primary navigation.

 

Another advantage is that they reduce the number of actions and clicks a user must take on a page. Instead of going back and forth, users can simply use the level/category links to go where they want. The technique is especially useful for big websites and e-commerce sites.

 

W3Schools exemplifies how to add breadcrumbs in two steps.

  1. Add HTML

<ul class="breadcrumb">
  <li><a href="#">Home</a></li>
  <li><a href="#">Pictures</a></li>
  <li><a href="#">Summer 15</a></li>
  <li>Italy</li>
</ul>

 

  2. Add CSS

/* Style the list */
ul.breadcrumb {
	padding: 10px 16px;
	list-style: none;
	background-color: #eee;
}

/* Display list items side by side */
ul.breadcrumb li {
	display: inline;
	font-size: 18px;
}

/* Add a slash symbol (/) before/behind each list item */
ul.breadcrumb li+li:before {
	padding: 8px;
	color: black;
	content: "/\00a0";
}

/* Add a color to all links inside the list */
ul.breadcrumb li a {
	color: #0275d8;
	text-decoration: none;
}

/* Add a color on mouse-over */
ul.breadcrumb li a:hover {
	color: #01447e;
	text-decoration: underline;
}
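
Besides the visible markup, you can also describe breadcrumbs to search engines with BreadcrumbList structured data. A minimal sketch (names and URLs are placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Pictures", "item": "https://www.example.com/pictures/" },
    { "@type": "ListItem", "position": 3, "name": "Summer 15" }
  ]
}
</script>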

 

If you want a simpler solution, you can use plugins for WordPress, such as Breadcrumb NavXT Plugin or Yoast SEO.

 

yoast-seo-breadcrumbs

 

Go to SEO in the Dashboard, then click on Advanced and select Enable Breadcrumbs » Save changes. This will apply the default settings for your breadcrumbs.

 

31. Test On as Many Platforms and Devices as Possible

 

People use different devices. If you want your users to have a good experience, you need to test on multiple devices. As many as you can!

 

You can start with Chrome by right clicking and hitting Inspect. Then you can toggle the device toolbar and select the type of device you want to view your site on.

 

SEO Testing on multiple devices

 

You can also use 3rd party tools, such as ScreenFly.

 

However, keep in mind that these tools only take the screen width into consideration. For example, if you don’t own an iOS device, you’ll never know that WEBM format videos don’t play on Safari Browser.

 

You really need to test on different devices and browsers. Test on Windows, iOS, Linux, Safari, Firefox, Edge, Chrome, Opera and even the sad, old and forgotten Internet Explorer.

 

If you don’t own an Android or an iPhone/iPad, go to a store if needed or find a friend. Whenever you can get your hands on a new device, take a minute or two to browse your website.

Audit & Fix Your Site Now

Conclusion

 

Firstly, this SEO guide offers solutions and points out directions on how to make a website fast and decrease the loading time by following the recommendations on Google Speed Insights and Google developers’ guidelines.

 

Secondly, we went through the functional elements of a website by checking and resolving issues related to crawling errors, indexing status and redirects, and by making the website accessible to Google. 

 

Thirdly, we looked at improving and optimizing the content by resolving critical technical SEO issues: removing duplicate content, replacing missing information and images, building a strong site architecture, and highlighting content so that it is visible to Google.  

 

Lastly, we pointed out a few issues regarding mobile-friendliness and website navigation.

The post Technical SEO Checklist – The Roadmap to a Complete Technical SEO Audit appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

]]>
https://cognitiveseo.com/blog/17963/technical-seo-checklist/feed/ 19
27 Unique Techniques on How to Effectively Spy On Your Competitors https://cognitiveseo.com/blog/10591/27-techniques-spy-competitors/ https://cognitiveseo.com/blog/10591/27-techniques-spy-competitors/#comments Fri, 16 Nov 2018 11:04:09 +0000 http://cognitiveseo.com/blog/?p=10591 It’s almost impossible to come up with a digital marketing strategy to spy on your competitors without consistent information regarding the trends in the niche. And, if you think that it’s only what works that matters, you may not get too far.   In order to develop a way to leverage customers’ and influencers’ behavior […]

The post 27 Unique Techniques on How to Effectively Spy On Your Competitors appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

]]>
It’s almost impossible to come up with a digital marketing strategy to spy on your competitors without consistent information regarding the trends in the niche. And, if you think that it’s only what works that matters, you may not get too far.

 

In order to develop a way to leverage customers’ and influencers’ behavior to your advantage, you’ll have to make yourself comfortable in the office chair and start to seriously monitor the entire niche. That’s the wonder of a free market – it has made it compulsory to weigh everything on the scale of competitive advantage.

 

27 Unique Techniques on How to Effectively Spy On Your Competitors

 

However, it’s impossible to know what unique feature of your brand will make prospects convert unless you’re up to date with everybody else’s special ingredient. Moreover, there are also a series of tactics and free ways that aren’t brand specific and can boost your image regardless of your business name. Let’s browse them all for a complete competitive research.

 

I. Identify Missed Opportunities

 

  1. Websites’ History – The Way It Used to Be
  2. Spot Your Competitors’ Evergreen Content
  3. Get Email Notifications When Your Competitors Rankings Change
  4. Link Reclamation – Recover 404 Pages Link Juice

 

II. Find & Track Your Competitors’ Weaknesses

 

  1. Track Your Competitors’ Social Mentions
  2. Research and Track Fresh Web Mentions
  3. Track Competitors’ New Links
  4. How to Take Advantage of Your Competitors’ Google Penalty Risk

 

III. Comprehend Your Competitors’ Strategies

 

  1. Competitor Content Strategy Analysis to Boost Your Content Marketing Campaigns
  2. Content Marketing – Track the Social Evolution of Your Competitors’ Content
  3. Content Marketing – Monitor Fresh Topical Content in Your Niche & Create Better Content
  4. Track the Speed of Your Competitor’s Website to Improve Your Rankings
  5. Competitive Intelligence – Spy on the Technologies Used on Your Competitors’ Sites
  6. Check Your Competitors’ Link Velocity Trend to Understand Their Business Evolution
  7. Spy on Your Competitors Traffic Trend & Learn How Good They Really Are
  8. Check Your Competitor’s Traffic Sources
  9. Track Your Competitors’ PPC Campaigns
  10. Track Your Competitors’ Organic Keywords
  11. Monitor Your Competitor’s Site Indexation
  12. Track Your Competitors’ A/B Testing on Prices to Learn What Works Best for Them
  13. Stay Ahead of the Game by Tracking Your Competitors’ Facebook Pages
  14. Track Your Competitors’ Twitter Followers to Learn their Follower Growth Strategies
  15. Track Your Competitors Youtube Views & Subscribers
  16. Study the Historical SEO Visibility of Your Competitors & Understand Their Growth Trends

 

IV. Beat Your Competitors at Their Own Game

 

  1. Dominate Your Niche by Running In-Depth SEO Visibility Audits
  2. Identify Link Building & Growth Strategies Using Advanced Backlink Fingerprinting Techniques
  3. Advanced Content Marketing Full Niche Audit Strategy
 

Identify Missed Opportunities

 

1. Websites’ History – The Way It Used to Be

 

The Wayback Machine is a virtual space dedicated to archiving the web content and presenting it the way it used to look at different moments in the past. This ambitious project isn’t only a good deed for the online community, but it also has its practical purposes. You can analyze the previous design, map, content and structure of your competitors’ sites, along with the moments the major changes happened.

 

Wayback Machine - Website Changes Over Time

 

When you have access to how a site used to look over time and to the changes that were performed on it, it's easier to identify the types of improvements that your competitors thought were most relevant to their business. Unlike exclusively having access to the current interface of a site, monitoring its step-by-step evolution puts your competitors' overall marketing efforts into perspective.

 

2. Spot Your Competitors’ Evergreen Content

 

Evergreen content is a goldmine for any content marketer, regardless of the domain. See what kind of content works in the niche. Shareability is a matter of sustainable effort, and timeless articles are your best friend on this journey.

 

You can discover your competitors' top pages. Once you've understood what makes people endlessly share a piece of content, it's highly beneficial to monitor the ideas behind outstanding content, even if you're not going to use the skyscraper technique, a term coined by Brian Dean.

 

For example, if you have 4 competitors, the first step is documenting what kind of posts seem to work on their pages in terms of social media engagement, so you can create similarly successful content yourself. You can always start brainstorming from what you see them doing, which gives you all the premises to create excellent articles of your own.

 

We use Social Visibility to monitor the competition and get insights on what type of content works best for them and which social platform gets more traction. You can look at the pages with the highest share increase to identify competitors' evergreen content. 

 

Pages with Shares Increase

 

We looked at the data for the last 3 months to find the pages that are truly valuable. If you want to spot evergreen content, check the data over a longer period.

 

3. Get Email Notifications When Your Competitors Rankings Change

 

Triggering alerts to be notified on the spot when relevant ranking changes happen for keywords of your choice – and your competitors' – gives you a good sense of what works in the niche as keyword-centric content. Of course, the alerts can be set on any keyword, for any changes you wish to monitor.

Intelligence is the ability to adapt to change. Stephen Hawking

 

Of course, when saying keywords we actually refer to idea-specific articles, documented in-depth and relevant in the industry; the keywords are simply the instrument for measuring content efficacy.

 

Using an alert system can save a lot of work. In cognitiveSEO campaigns, you can create new alerts and get email notifications on any keyword rank fluctuation. For example, you can get an email the moment your site's or your competitors' rankings drop by at least 3 positions in Google.

 

Trigger email notification for new rankings

 

While it goes without saying that your strategy should be based on clear marketing objectives, knowing when to adjust them as to correspond to the trend is as imperative as having a plan in the first place. Consequently, you can choose to set alerts on cognitiveSEO tool for other areas that interest you as well, such as link building and changes on your backlink profile.

 

4. Link Reclamation – Recover 404 Pages Link Juice

 

There is all kinds of talk about the opportunities you miss when you don't monitor your competitors. Here's one of the best examples: in certain industries, there are sites whose products and services make up the best part of the market share pie chart.

Monitor the broken pages of your competitors and contact the influencers. Link to their content and be a good Samaritan. This is a great chance for networking success and a link building opportunity. You know the saying – sharing is caring. 

 

Site Explorer is a great help in this case: you can find broken pages there in real time. 

 

Broken pages from microsoft

 

Of course, whenever it’s possible, you can also present good quality, topically related content of your own for them to easily recover from the mistake. Needless to say, monitoring broken pages of your own is a great asset in terms of preventing these unfortunate situations and Link Reclamation is an instrumental tool for your needs.

 

Most of the time, it doesn't make a difference whether you own an internationally popular site or a small, local business. The bottom line is that you're just as susceptible to losing valuable link juice, as you can see in the example above. However authoritative and reputable – from sites like Microsoft's to the fresh start-up you've funded – you can't afford to lose important links.

 

II. Find & Track Your Competitors’ Weaknesses

 

5. Track Your Competitors’ Social Mentions

 

Tracking what's being said about you on social media is paramount. Approximately half of the people engaging on social media expect to get an answer in less than an hour. Being prompt means being dedicated, and this is how loyalty is triggered. Of course, the nature of competitive advantage makes it impossible to offer the best there is in your niche on social media without constantly monitoring your competitors' social media activity.

 

Track social mentions

 

Moreover, monitoring the mentions of your competitors also allows you to respond to all kinds of questions addressed to them, in case they’re not as prompt.

Try being as impartial and objective as possible. Under no circumstance should you advertise your product while at it, just let your implication speak for itself.

This way, the prospects interested in the niche will get the idea that you’re serious and devoted to your mission. Although this sounds like a long shot, it actually converts if you use it wisely.

 

6. Research and Track Fresh Web Mentions

 

The same rules apply to web mentions. Whenever your brand is brought up on the web, a marketing opportunity arises – whether the mention is positive or negative, you should manage it on the spot using BrandMentions.

 

Track web mentions

 

As far as your competitors are concerned, demystifying their online appearances gives you valuable insight into their content marketing strategies.

Generate top of mind awareness by implementing useful points for your own company’s objectives.

 

Of course, it's not only influencers who mention brands on the web – many of the people doing so are actually consumers. This is a great chance for you to better understand consumer behavior in an environment they consider familiar and which, as a consequence, stimulates their honesty. Finding unlinked brand mentions is also a gold mine of an opportunity. Finding these link opportunities isn't a walk in the park: it requires perseverance, attention, thinking outside the box, and also the right analysis tools.

 

Moreover, if you’d like to monitor the fresh web mentions for different keywords, you can always create an alert and be notified via mail on the most recent updates. You can turn this into a huge outreach opportunity.

 

7. Track Competitors’ New Links

 

Being instantly notified, via cognitiveSEO alerts, whenever you or your competitors get new links increases the chances that your marketing efforts will be doubled by great strategic insight.

 

Track-Competitors-New-Links

 

Knowing what kind of content influencers are interested in mentioning on their posts allows you to develop a functional strategy that can always be updated according to the trend.

 

8. How to Take Advantage of Your Competitors’ Google Penalty Risk

 

The Unnatural Link Detection tool is used to constantly monitor the profile naturalness of the sites – both yours and your competitors’.

Besides helping in detecting whenever you’re in danger of being penalized by the search engine, this tool is also useful to show you what’s working in the industry. You can either prevent or recover from penalties, and monitor your competitors’ strategies to know how they’re performing. Identifying their mistakes on time helps you avoid them in your approach.

 

Below you can see an example of a website that received lots of unnatural links and it’s at risk of getting penalized, if it hasn’t been already. 

Track unnatural links in cognitiveseo

 

Different approaches, although sometimes very similar, may end in totally different results in terms of SERPs – keeping an eye on the numbers constantly provides you with practical insights on the trends.

 

Knowing the exact distribution of ok/suspect/unnatural links is especially useful when monitoring spikes and tendencies. See what it is that your competitors are doing right (or wrong) and monitor their practices so that you can debunk their strategies and borrow what's advantageous.

 

III. Comprehend Your Competitors’ Strategies

 

9. Competitor Content Strategy Analysis to Boost Your Content Marketing Campaigns

 

I love that feeling when you enter a blog and notice that one piece of content has a particularly spectacular performance. Discovering the most efficient social media channel can give you an idea where your audience is. Also, you can find out what your next blog post should look like, with the Social Visibility module mentioned in the second strategy of this list. 

 

But observing the overall picture of what works and what doesn’t in terms of social media shareability allows you to understand three important things:

  • what readers are most interested in, which means they are willing to come back for more if you happen to give them actionable information two or three times in a row;
  • what readers think their network may be interested in, which is an opportunity for you to exponentially grow on social media, supposing they know their contacts’ preferences;
  • what content marketing development ideas you can borrow from your competitors and implement on your own site, including the general style of the articles in question.

 

social visibility analysis on competitors

 

 

10. Content Marketing – Track the Social Evolution of Your Competitors’ Content

 

Besides having a good standing point for an overview, it can also come in handy to look at some pieces of content contextually. You don’t have to be a mammoth in the industry to realize that there’s no easy way to outline the success or failure of your campaigns.

We all get that at some point, usually when it’s frustrating how your very in-depth approach has 5 Facebook shares and your competitors’ superficial article of less than 1500 words won the Internet. Context matters.

 

Share increase for a specific URL

 

Spotting the VIPs of the latest reads grants you knowledge on the volatile trend – which, admittedly isn’t very easy to get.

 

11. Content Marketing – Monitor Fresh Topical Content in Your Niche & Create Better Content

 

Aside from monitoring web mentions on different keywords, you can also track new topical content – not because you wouldn’t have the greatest ideas for successful articles, but because BrandMentions provides you with fresh data.

 

It would be a shame to miss an opportunity to treat something extensively and create a massive, evergreen blog post that will still get a good social media score and links 3 years after you’ve written it.

 

Track an industry by monitoring fresh topical content

 

This topical comprehension can be used in crafting a creative road map that’s also going to be successful, following the latest trends.

 

12. Track the Speed of Your Competitor’s Website to Improve Your Rankings

 

Web design is of utmost relevance to user experience, and it always has the last word when two similar services are otherwise tied (in terms of speed). But let's not forget practicality: technical data such as load time and the Speed Index matter even more when it comes to building the loyalty of your website visitors. PageSpeed Insights is a great tool for monitoring a site's performance.

 

Speed Test Example

 

Your site is eye candy, the users won’t be tempted to close it, which would explain a good bounce rate, for instance. But when it comes to the entire experience, as a process, the website speed has a decisive role.

 

13. Competitive Intelligence – Spy on the Technologies Used on Your Competitors’ Sites

 

There are tracking platforms that allow you to see what’s behind popular, successful sites in your industry. Finding out what’s different can answer a dramatic lot of questions regarding the user experience and the functionality of the web page.

 

Competitive intelligence

 

You want to always be in touch with what's bringing success around you, because this is key when it comes to building business intelligence. Use BuiltWith to monitor your competition in this matter. It offers insights on internet technologies, including analytics, advertising, hosting, CMS and many more. Plus, it shows you internet technology usage changes on a weekly basis.

By unmasking the strategies of your competitors, you can merge different tips into a better formula that best answers the needs of your audience.

 

14. Check Your Competitors’ Link Velocity Trend to Understand Their Business Evolution

 

Just as relevant as knowing about new mentions, keeping in touch with the new referring domains pointing to your competitors' content uncovers fresh opportunities for your content marketing strategy.

 

The Site Explorer helps you gain a clear image of your link profile in seconds, so that you can create further content that's topically related and more in-depth, and then get in touch with editors on the sites that have the best authority. Knowing which editors from which sites are willing to share topics from your niche also frames your content approach and increases your chances of getting high-quality links. It is also a great competitor analysis tool, because it allows you to run the same investigation on any competitor's website as on your own. 

 

New monthly referring domains

 

Link building helps you boost your image in the SERPs, which is why having a monthly overview of your referring domains is beneficial both for monitoring your overall performance and for debunking competitors' SEO strategies. One more use of finding fresh referring domains linking to your competitors is spotting untapped opportunities, and maybe getting new ideas for influencers to reach out to.

Moreover, this can transform your online marketing efforts of getting links – instead of taking aimless steps towards no-name sites to get maybe dozens of links, you can get a few from high-authority, high-performance domains.

 

15. Spy on Your Competitors Traffic Trend & Learn How Good They Really Are

 

Traffic growth itself is a gold mine, as you can use it to compare content against traffic performance and analyze your competitors. You'll get a better understanding of the best practices and topics that could work in your industry.

Search traffic trend and global position

 

Aside from this, being able to compare performances gives a solid reality check, positioning your performance objectively, while also indirectly giving you insight on the strategies that best convert. Alexa helps you monitor your rankings and audience while also displaying a trend tendency compared to the global position.

 

16. Check Your Competitor’s Traffic Sources

 

There’s more than just readers and shares that make a site popular. If we’ve already established that contextualized information matters most, then where the traffic comes from is crucial when you’re trying to understand which strategic moves you should keep developing.

For instance, if bloggers in your industry seem to share a lot of topically related content and be open about tips and practices, you’re going to want to approach them. This would bring you a positive engagement rate on the web as well as social media.

But if you see that your competitors are cited in the press quite often and seem to make PR efforts to be as visible there, thus being cited by high authority sources and having a satisfactory ranking situation, maybe you’d like to try that as well.

 

Traffic sources analysis

 

Of course, this gives you competitive intelligence as it debunks what the competition is doing. If they’ve got lots of readers coming straight from the in-mail newsletter, maybe you should get some inspiration from there. SimilarWeb is a similar tool to spy on your competition but it can be used as a complementary resource when it comes to connecting the general performance of a site with its traffic sources such as direct referrals, search, social, mail and display networks.

 

17. Track Your Competitors’ PPC Campaigns

 

Unravel the PPC strategy of the niche influencers to gain keyword-specific insights. It depends entirely on you how you decide to use that; either to copy that strategy for your own PPC or, depending on your position, to create a suitable strategy.

 

There are lots of tools for spying on your competitors' rankings. SpyFu is one example: it gives you the most profitable keywords and ads for paid and organic search, and lets you keep track of your competitors' keywords. Below you can see a screenshot from the tool. You can get a list of all your competitors' organic and paid keywords. Besides that, there is an interesting feature called Google AdWords History which shows every ad campaign for all the keywords tracked, plus information about costs.  

 

Track Your Competitors PPC Campaigns

 

For instance, if you’re not ranking well, maybe you should try to move towards more popular keywords. If you’re already ranking top in your industry, perhaps it’s better to avoid association that would produce cannibalization, avoiding certain keywords that may steal your thunder.

 

18. Track Your Competitors’ Organic Keywords

 

Unlike paid keywords, organic ranking speaks for itself about the content and link building strategy of your competition. Identifying keywords and competitors separately helps you monitor the way each of them evolves in the niche.

 

It's essential not to create promotion tactics without consulting the market and getting a good overview of competitors' performance and its reasons, so that you can leverage them to your advantage. Site Explorer shows you the top 20 keywords ranking in the SERP. 

 

Top 20 rankings keywords

 

19. Monitor Your Competitor’s Site Indexation

 

Monitoring the way your site is indexed allows you to understand which content helps most and improve both your on-page and your off-page endeavors in order to gain popularity in SERPs.

 

Site Indexation

 

Of course, you'll also know which pages are indexed and which are not. Search for your site on Google and see how many results are returned. This may not seem clear enough until you compare it with your competitors' performance, using the same method. While page indexing follows complex algorithms, including a sitemap, bookmarks and offsite content, spying on your competitors' strategies can help you perform better.

 

For instance, blog content is crawled more often than a static website. Maybe seeing your competitor writing a blog post about their new feature on the site and linking to it will give you the idea to do the same in order to help the crawler easily find your content.

About Us and Contact pages start being more comprehensive and more customer-centric, as the search engines find it relevant to provide useful information to their users, therefore influencing the way your site is being crawled and indexed.

 

20. Track Your Competitors’ A/B Testing on Prices to Learn What Works Best for Them

 

Being a detective doesn’t always require a pipe, an old school hat and a long, dark coat. It mostly takes common sense and dedication towards your objectives. Of course, this and not rushing to a conclusion. I’ve mentioned before that we should be reluctant to hurried conclusions, as they may be the result of superficial analyses.

Instead, monitor all the aspects of your competition’s business and you can gain business intelligence and benchmark your results.

 

 

Track-Competitors-Prices

 

For instance, knowing competitors' online prices allows you to get a good picture of what a sensible pricing policy for your site should look like. Have they increased their prices? Maybe it's time you offered a discount to the next 100 customers. Be agile!

 

Using a dedicated crawler may help you be more specific and thorough when it comes to finding the most recent and exhaustive web pages. While using the search engine’s crawler can bring you great insight, when it comes to specific niches, it’s better to be more detail-oriented in your approach.

 

21. Stay Ahead of the Game by Tracking Your Competitors’ Facebook Pages

 

Insight on the social media progress can help you monetize your content marketing investment.

 

From posting at the right hours to only sharing interesting content that engages your audience, everything you want to know about Facebook performance lies in tracking Facebook pages whenever you think their strategies may divulge something of crafting successful businesses.

 

Pages to Watch by Facebook is an opportunity for businesses that have a page with more than 100 likes to analyze a ton of great suggested pages in their industry. It is a good practice for strengthening any social media strategy and getting content ideas from direct competitors. 

 

Facebook Pages to Watch

 

Additionally, you can compare the Facebook performance of different competitors and match it against a set of reasons why their visibility progress looks a certain way.

 

You can get compared statistics on the pages’ performance in a time interval of your choice. Their rate of success can be associated with the posts they shared in that time frame. Correlating these two sets of data can give you fresh ideas for strategic promotion.

 

22. Track Your Competitors’ Twitter Followers to Learn Their Follower Growth Strategies

 

Monitoring Twitter performance can be just as relevant as monitoring your competitors' Facebook pages – of course, this can vary from one industry to another and from one geographic area to another.

 

Klear – Social Analytics tracks the fan base growth on Twitter and other social networks over time for brands and profiles, both for you and your competitors. You can use it to keep track of your social media performance during a time frame.

 

Track Competitors Twitter Followers

 

Depending on your niche, you might even weigh a site’s Twitter performance more than the one on Facebook, as in some niches it seems to be preferred for industry networking. From a marketing point of view, it’s there where most of the juicy conversations happen.

 

 

23. Track Your Competitors Youtube Views & Subscribers

 

As we’ve addressed the potential issues of different platforms, it’s quite necessary to talk about Youtube.

 

Whether you’ve got a channel through which you’re giving away great tutorials or your entire business is centered on Youtube, you should be up to date with your competition’s performance. Take a look at your competitor’s daily evolution and better understand the reasons behind their success – and maybe get an impulse to implement them yourself.

 

youtube-analytics-channelmeter

 

ChannelMeter provides you with YouTube analytics on different types of performance, from cumulative views to the number of subscribers and the daily evolution of a channel. Comparing channel performances gives you pragmatic insight on what goes best in your niche.

 

24. Study the Historical SEO Visibility of Your Competitors & Understand Their Growth Trends

 

What’s most useful about this widget is that we most often know what the former strategies of our competitors used to be.

Associating a performance level to their tactics may help you understand what worked and what not and, of course, establish which factors were temporary and which permanent.

After this, trying out some complementary steps that would work with your ongoing plans could help us boost our visibility and results.

 

Search visibility for skillshare in US

 

The Search Visibility metric measures the performance of a site during the last two years, which makes it a very powerful chart if you’re tempted to debunk your competitors’ ups and downs, along with their strategies.

 

The results of this chart work wonders when paired with the Wayback Machine, overlapping the content, link building and digital marketing strategies used by your competitors with their associated results.

 

IV. Beat Your Competitors at Their Own Game

 

25. Dominate Your Niche by Running In-depth SEO Visibility Audits

 

It’s not advised to make content marketing strategies at the expense of context, regardless of how individual you wish your brand to be seen as. It’s always best to know how every competitor in the niche is performing to have a consistent viewpoint.

 

SEO-Visibility-Audit-on-an-Entire-Niche

 

As you can see above, in the comparison of sites in the UK tire niche, unmasking the performance of your direct and indirect competitors helps you build solid and pragmatic strategies to boost web performance.

It’s impossible to be top-line when you haven’t demystified the links and ties that dictate who the influencers are.

 

26. Identify Link Building & Growth Strategies Using Advanced Backlink Fingerprinting Techniques

 

The mechanism of ranking relies, among many other factors, on being able to identify what works in terms of link building – a natural profile with social media shares and good authority doesn’t grow on trees.

 

Niche-Link-Fingerprinting

 

That is why looking over the fence allows you to see which eccentric and unconventional strategies are used, depending on the marketing objectives. As mentioned above, being linked by blogs usually comes with a higher social media engagement rate, which you can later analyze in our InBound Link Analysis module.

 

Clearly, there’s no universal rule, nor is there a recipe, just a bunch of approaches that work better than others.

 

27. Advanced Content Marketing Full Niche Audit Strategy

Relativity works in physics, love and copywriting equally.

 

Your social media premise should always be flexible, to the extent where content is almost exclusively delivered to correspond to the readers’ interest, the top hours when they’re most active and, of course, identifying patterns.

 

Social-Visibility-per-Subdomains-Niche-Content-Audit

 

However, it’s not impossible to outline generic tendencies and try to leverage them to your advantage.

 

Conclusion

 

There’s no easy way to spy on your competition and it’s not as easy as math – because, unlike math, a pattern is never objectively defined when it comes to social media trends. It’s empirically observed and compared to a ton of criteria that spin the wheel of your audience.

 

What’s your source of inspiration when it comes to getting insights from your competitors?

 

Note: This article is an improved version of an older one we had. We considered this article to be valuable for those interested in outranking their competitors and the information needed a refresh because the online market is evolving very quickly. 

The post 27 Unique Techniques on How to Effectively Spy On Your Competitors appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

]]>
https://cognitiveseo.com/blog/10591/27-techniques-spy-competitors/feed/ 4