How We Increased Organic Traffic by Over 50% With Technical SEO Updates


I must admit: we made several mistakes while developing and optimizing our own website for search. For a company that is supposed to be at the forefront of online marketing and SEO, that is nothing to be proud of. But even experts make mistakes, don't they?

Instead of fixing things, forgetting about them and moving on, we decided to focus on those mistakes. In this article we will explain where we went wrong and what you can do to avoid the same problems. We will also show exactly what we fixed, how we fixed it, and how those fixes resulted in a more than 50% increase in our organic traffic in just one month.


What is technical SEO?

Technical SEO covers the questions you ask yourself while building your website. The checklist looks something like this:

Only one H1 per page.
Linking to your most important pages from the main navigation.
Adding alt attributes to all your images.
Generating clean URLs without dynamic parameters.
Minimizing page load time.

While all of these factors are common knowledge and seem obvious, you would be surprised how many websites skip them. Ongoing technical updates can work wonders for organic traffic growth (read: Marketing your online presence), yet many companies build a website once and never come back to it.

When I joined seo10ex last year, I had the chance to take a fresh look at everything we had done up to that point. I began to ask myself more and more questions about our direction and site strategy: how do we analyze and identify errors, and what conclusions do we actually draw from them? I decided to find out. (By the way, take a look at Optimizing video content for search.)

The first thing I looked at was the set of factors behind the success of some of our best content. From there, I ran a detailed SEO audit of our website to identify potential problems and areas with room for growth. I soon realized that we were not as perfect as we thought...

Here are several technical SEO problems (and solutions) that we learned about the hard way.

Problem 1: Broken links, redirects and the 404 page

Whenever users try to reach a URL that does not exist on our website, they are redirected to a 404 page. For example, if you try to go to https://seo10ex.com/en/course/165/what-seo, you are redirected to that error page.

It's a big problem if you have a lot of such non-existent pages on your site. Instead of crawling the real pages, Google wastes its time crawling pages that don't exist. But that's not all. An even bigger problem is that this particular URL is referenced by 80 domains, whose 380 backlinks (What are backlinks?) point to something that isn't there. Google, of course, does not like this at all. Does it affect your rankings and your site's PageRank? Definitely yes!

How did we deal with this?

The first thing we did was remove the 301 redirect that sent any unknown page on the offer.seo10ex.com or www.seo10ex.com subdomains to the "/not-found" page. Now, unless a 301 redirect to the correct page has been configured for a wrong URL, the request returns a 404 and tells Google not to include it in the index.

The second thing we did was remove the "/not-found" page itself, so that the request resolves to a real 404 response instead of being treated as a regular page on our website.

The third and final thing we did was set up a 301 redirect for every invalid URL that had inbound links, pointing it to the most relevant correct URL.
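
To sanity-check work like this, it helps to verify that every backlinked legacy URL now returns either a single 301 hop to a live page or an honest 404. Here is a minimal Python sketch of that check, with placeholder URLs rather than our real redirect map:

```python
# Minimal sketch: confirm that legacy URLs either 301 to a live page or return 404.
# The URLs below are placeholders, not our actual redirect map.
import requests

URLS_TO_CHECK = [
    "https://seo10ex.com/en/course/165/what-seo",  # the broken URL mentioned above
    "https://seo10ex.com/some-old-offer",          # hypothetical legacy URL
]

for url in URLS_TO_CHECK:
    # Don't follow redirects automatically, so each hop can be inspected.
    response = requests.get(url, allow_redirects=False, timeout=10)
    if response.status_code in (301, 308):
        print(f"{url} -> permanent redirect to {response.headers.get('Location')}")
    elif response.status_code == 404:
        print(f"{url} -> 404 (will drop out of the index)")
    else:
        print(f"{url} -> unexpected status {response.status_code}")
```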

What is the positive effect?

The number of junk pages indexed by Google should drop dramatically; Googlebot will crawl the pages that matter to us more often and spend far less time on large volumes of non-existent URLs.
On top of that, all of that PageRank now flows into the content we actually want to rank, giving it a solid boost from the links that finally point to it correctly. (Read: This is how websites work.)

Problem 2: Blog pagination

One thing that directly affected our blog content was how pagination worked on the list pages. We had problems with how we linked to the blog listing pages, e.g. https://seo10ex.com/blogs?page=1, https://seo10ex.com/blogs?page=2, and so on.

All we had were Next and Previous buttons, and those buttons weren't even properly styled. The real problem wasn't how the buttons looked, though; it was what they meant for search engines.

When Google crawls a site for content, it follows links from page to page until it finds the page it is looking for. To reach a post written, say, a year ago, the bot had to go to the blog and then follow each "Next" link until it landed on the list page that finally linked to that article.

Every time Googlebot (or any other search bot) follows a link, it goes one level deeper into the architecture of the website. The deeper it goes, the less authoritative the page is in the eyes of search engines and the less often it gets crawled. In some cases, if a page is very deep in the architecture, it may not be crawled at all (read: Key Website Ingredients).
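
To make "depth" concrete: it can be measured with a small breadth-first crawl from the homepage that records how many clicks away each internal page is. A rough sketch, assuming the requests and beautifulsoup4 libraries, with our homepage used only as a placeholder start URL:

```python
# Rough sketch: breadth-first crawl that records each internal page's click depth
# from the homepage. Assumes requests and beautifulsoup4; START_URL is a placeholder.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://seo10ex.com/"
MAX_PAGES = 200  # keep the sketch small


def crawl_depths(start_url, max_pages=MAX_PAGES):
    domain = urlparse(start_url).netloc
    depths = {start_url: 0}
    queue = deque([start_url])
    while queue and len(depths) < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue  # skip pages that fail to load
        for anchor in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, anchor["href"]).split("#")[0]
            if urlparse(link).netloc == domain and link not in depths:
                depths[link] = depths[url] + 1  # one click deeper than the parent
                queue.append(link)
    return depths


if __name__ == "__main__":
    for page, depth in sorted(crawl_depths(START_URL).items(), key=lambda item: item[1]):
        print(depth, page)
```

Pages that only show up at depth five or six in a crawl like this are exactly the ones that deep pagination buries.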

How did we address this?

We wanted to redesign the paginated blog navigation so that Google could crawl multiple list pages at once and reach most of our blog posts much higher in the website architecture.
To do this, we implemented navigation that exposes links to many list pages directly, instead of just "Next" and "Previous".
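
Our exact markup isn't reproduced here, but as a hedged sketch, numbered pagination that always links the first and last page plus a small window around the current page achieves the same effect:

```python
# Hedged sketch of numbered pagination: always link the first and last page,
# plus a small window around the current page. Window size is an assumption.
def pagination_links(current_page, total_pages, window=2):
    pages = {1, total_pages}
    pages.update(range(max(1, current_page - window),
                       min(total_pages, current_page + window) + 1))
    links, previous = [], 0
    for page in sorted(pages):
        if page - previous > 1:
            links.append("…")  # gap between page groups
        links.append(f'<a href="/blogs?page={page}">{page}</a>')
        previous = page
    return links


print("\n".join(pagination_links(current_page=7, total_pages=40)))
```

With links like these, even the oldest list page sits only a couple of clicks below the blog's front page.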

What came of it?

This decision was inspired by my talented colleague Val Roman, who is spearheading another project to republish old blog content, with the goal of pushing it higher in the site architecture and, ultimately, higher in the rankings. (Read the article: The Ultimate Guide to Google Ranking Factors in 2021.)

If this works, a good amount of blog content should receive a ranking boost as a result. It's a small and simple change, but it can deliver a lot of benefit. Either way, it is a clear improvement to our blog's content architecture and overall user experience.

Problem 3: Blog schema markup

Until now, we had not applied Schema.org markup to our blog content (or to any of the site's content, for that matter) to help Google identify specific elements on our web pages. So what is Schema.org markup?

Schema.org is a markup vocabulary that lets you structure the data on a site for search engines. It tells search engines which data to use when building a rich snippet, and a snippet is the summary of a page shown in search results.
In simple terms, Schema.org markup helps search engines understand what type of content is on your web page.

What have we done about it?

In the case of our blog, we marked up the code of all our articles to give Google the following information about each one (a sketch of the resulting markup follows the list):

1. This is a blog post.
2. This is a featured image of the blog post.
3. This is the date and time of publication.
4. This is the title of the article.
5. This is the main content of the article.
6. This is the category the article belongs to.
7. This was posted by seo10ex.com.
8. This is the name of the author of the post.
9. This is the URL of the author's page.
10. This is a picture of the author.
11. Here is a short description of the article.
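
As an illustration, markup carrying these fields can be expressed as a JSON-LD BlogPosting object. This is a minimal sketch with placeholder values, not our production template:

```python
# Minimal sketch of JSON-LD BlogPosting markup with the fields listed above.
# All values are placeholders; in practice they come from the CMS.
import json

blog_posting = {
    "@context": "https://schema.org",
    "@type": "BlogPosting",
    "headline": "How We Increased Organic Traffic by Over 50% With Technical SEO Updates",
    "image": "https://seo10ex.com/images/featured-example.jpg",  # placeholder
    "datePublished": "2021-06-01T09:00:00+00:00",                # placeholder
    "articleSection": "Technical SEO",                           # category
    "articleBody": "The main content of the article goes here.",
    "publisher": {"@type": "Organization", "name": "seo10ex.com"},
    "author": {
        "@type": "Person",
        "name": "Author Name",                                   # placeholder
        "url": "https://seo10ex.com/authors/author-name",        # placeholder
        "image": "https://seo10ex.com/images/author.jpg",        # placeholder
    },
    "description": "A short description of the article.",
}

# The JSON string is then embedded in a <script type="application/ld+json"> tag.
print(json.dumps(blog_posting, indent=2))
```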

You can check your structured data with Google's rich results test: open the test, enter your blog post's URL, and run the check. Once it completes, the tool lists everything it found under the BlogPosting section. It's a very handy way to catch any errors in the markup code.

What does it do?

The benefit is that markup lets you explicitly tell search bots what is on a page, for example that a page on seo10ex.com contains a product, and pass its basic parameters: name, price, SKU, manufacturer, and so on. Based on this data, Google generates rich snippets in the search results.
It won't have a revolutionary impact, but it's worth doing.
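
For instance, a hedged sketch of Product markup carrying those parameters (the values are invented, not an actual seo10ex.com product):

```python
# Hedged sketch of Product markup with name, price, SKU and manufacturer.
# All values are invented placeholders.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example SEO Course",                                  # placeholder
    "sku": "SEO-165",                                              # placeholder
    "manufacturer": {"@type": "Organization", "name": "seo10ex"},
    "offers": {"@type": "Offer", "price": "99.00", "priceCurrency": "USD"},
}

print(json.dumps(product, indent=2))
```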

 

Problem 4: XML sitemap

Our website seo10ex.com hosts all of our offer content. In short, this is the content we use to generate leads: our ebooks, templates, webinars, and so on. It is the content we most want search engines to find.
Did we have a sitemap? We didn't even have an XML sitemap.

What have we done about this?

We reviewed and created a new XML sitemap for the content of the entire site, including our offerings, and submitted it to Google.
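
As a rough sketch of what that generation step can look like (the page list here is a placeholder; in practice it is pulled from the CMS or a crawl):

```python
# Rough sketch: build sitemap.xml from a list of (URL, last-modified) pairs.
# The PAGES list is a placeholder; real data comes from the CMS or a crawl.
from xml.etree import ElementTree as ET

PAGES = [
    ("https://seo10ex.com/", "2021-06-01"),
    ("https://seo10ex.com/blogs?page=1", "2021-06-01"),
    ("https://seo10ex.com/offers/example-ebook", "2021-05-20"),  # placeholder offer URL
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in PAGES:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = loc
    ET.SubElement(entry, "lastmod").text = lastmod

# Write the file that gets submitted to Google via Search Console or robots.txt.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```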

What are the results?

The site architecture, including keyword targeting, still needs a lot of work, but the sitemap helps Google discover any new content we publish and get it indexed faster.

Results

The above list really speaks for itself. The moral of the story: don't underestimate the impact of effective technical SEO changes.

