
How to Find and Fix 14 Technical SEO Problems That Can Be Damaging Your Site Now

Who doesn’t love working on low-hanging-fruit SEO problems that can dramatically improve your site?

Across all companies and industries, low-effort, high-reward projects should jump to the top of the list of things to implement. And nowhere is that more relevant than in tackling technical SEO issues on your website.

Let’s focus on easy-to-identify, easy-to-fix problems. Most of these issues can be uncovered in a day, and it’s possible they can resolve months’ worth of traffic problems. These aren’t groundbreaking, complex issues that will fix SEO once and for all, but they are simple things worth checking right now. If your site already checks out for all of them, then you can go home today and start decrypting RankBrain tomorrow.

Real quick: the definition of technical SEO is a bit fuzzy. Does it include everything that happens on a website other than content production? Or is it limited strictly to code and truly technical items?

I’ll define technical SEO here as the more technical problems on a website that the average marketer wouldn’t identify, and that take a bit of experience to uncover. Technical SEO problems are also often, though not always, site-wide rather than page-specific, so fixing them can improve your site as a whole rather than just isolated pages.

You’d think that, with all the information out there on the web, many of these would be common knowledge. I’m sure my car mechanic thought the same thing when I busted my engine because I forgot to put oil in it for months. Simple oversights can destroy your machine.

The target audience for this post is beginning-to-intermediate SEOs and website owners who haven’t inspected their technical SEO for a while, or who are doing it for the first time. If even one of the 14 technical SEO problems below is harming your site, I think you’ll consider this a valuable read.

This isn’t a complete technical SEO audit checklist, but a summary of some of the most common and damaging technical SEO problems, ones you can fix now. I highlighted these based on my own real-world experience analyzing dozens of client and internal websites. Some of these issues I thought I’d never run into… until I did.

This isn’t a substitute for a full audit, but fixing these right now can save you thousands of dollars in lost sales, or worse.

1. Check indexation immediately

Have you ever heard (or asked) the question: “Why aren’t we ranking for our brand name?”

To the website owner, it’s a head-scratcher. To the seasoned SEO, it’s an eye-roll.

Can you get organic traffic to your site if it doesn’t show up in Google search? No.

I love it when complex issues are simplified at a higher level. Sergey Stefoglo at Distilled wrote an article that broke the complex process of a technical SEO audit down into two buckets: indexing and ranking.

The idea is that, instead of going crazy with a 239-point checklist of varying priorities, you sit back and ask the first question: are the pages on our site indexing?

You can get those answers fairly quickly with a quick site: search directly in Google.

What to do: Type site:yoursitename.com into Google search and you’ll immediately see how many pages on your site are ranking.

What to ask:

  • Is that roughly the number of pages we’d expect to be indexing?
  • Are we seeing pages in the index that we don’t want?
  • Are we missing pages in the index that we want to rank?

What to do next:

  • Go deeper and check different buckets of pages on your site, such as product pages and blog posts
  • Check subdomains to make sure they’re indexing (or not)
  • Check old versions of your site to see if they’re mistakenly being indexed instead of redirected
  • Look out for spam in case your site was hacked, going deep into the search results to look for anything unusual (like pharmaceutical or gambling SEO site-hacking spam)
  • Figure out exactly what’s causing any indexing problems
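If you keep a list of the URLs you expect to rank (your XML sitemap is a good source), the gap check above can be sketched in a few lines. This is a minimal illustration, not a crawler: the example.com URLs are placeholders, and you would fill both lists yourself, e.g. from your sitemap and from notes taken during a manual site: search.

```python
def indexation_gaps(expected_urls, indexed_urls):
    """Compare URLs you expect to rank against what a site: search shows."""
    expected, indexed = set(expected_urls), set(indexed_urls)
    return {
        "missing_from_index": sorted(expected - indexed),   # pages Google should have but doesn't
        "unexpected_in_index": sorted(indexed - expected),  # staging hosts, old URLs, hacked spam
    }

gaps = indexation_gaps(
    ["https://example.com/", "https://example.com/products/"],   # e.g. from your sitemap
    ["https://example.com/", "https://staging.example.com/"],    # e.g. from a site: search
)
assert gaps["missing_from_index"] == ["https://example.com/products/"]
assert gaps["unexpected_in_index"] == ["https://staging.example.com/"]
```

The two output buckets map directly onto the questions above: what’s missing that should rank, and what’s indexed that shouldn’t be.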

2. Robots.txt

Perhaps the single most damaging character in all of SEO is a simple “/” improperly placed in the robots.txt file.

Everybody knows to check the robots.txt, right? Sadly not.

One of the biggest offenders at ruining your site’s organic traffic is a well-meaning developer who forgot to change the robots.txt file after redeveloping your website.

You’d think this would be solved by now, but I still regularly run into random sites that have their entire website blocked because of this one problem.

What to do: Go to yoursitename.com/robots.txt and make sure it doesn’t show “User-agent: * Disallow: /”.

What to do next:

  • If you see “Disallow: /”, immediately talk to your developer. There could be a good reason it’s set up that way, or it could be an oversight.
  • If you have a complex robots.txt file, as many ecommerce sites do, you should review it line-by-line with your developer to make sure it’s correct.
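The manual check above can also be scripted. Here is a deliberately simplified sketch that flags a blanket “Disallow: /” under “User-agent: *”; it ignores per-bot groups and Allow lines, so treat it as a smoke test rather than a full robots.txt parser.

```python
def blocks_everything(robots_txt: str) -> bool:
    """True if a 'User-agent: *' group contains a bare 'Disallow: /'."""
    applies_to_all = False
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()      # drop comments and whitespace
        if not line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            applies_to_all = value == "*"
        elif field == "disallow" and applies_to_all and value == "/":
            return True
    return False

assert blocks_everything("User-agent: *\nDisallow: /")
assert not blocks_everything("User-agent: *\nDisallow: /admin/\n\nUser-agent: BadBot\nDisallow: /")
```

You would feed it the text fetched from yoursitename.com/robots.txt; anything that makes it return True warrants that conversation with your developer.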

3. Meta robots NOINDEX

NOINDEX can be even more damaging than a misconfigured robots.txt at times. A mistakenly configured robots.txt won’t pull your pages out of Google’s index if they’re already there, but a NOINDEX directive will remove all pages carrying it.

Most commonly, NOINDEX is set up while a website is in its development phase. Since so many web development projects run late and get pushed live at the last hour, this is where the mistake can happen.

A good developer will make sure this is removed from your live site, but you should verify that’s the case.

What to do:

  • Manually spot-check by viewing the source code of your page, looking for a tag like <meta name="robots" content="NOINDEX, NOFOLLOW"> or <meta name="robots" content="NOINDEX, FOLLOW">
  • 90% of the time you’ll want it to be either “INDEX, FOLLOW” or nothing at all. If you see one of the above, you need to take action.
  • It’s best to use a tool like Screaming Frog to scan all the pages on your site at once

What to do next:

  • If your website is constantly being updated and improved by your development team, set a reminder to check this weekly or after every new site upgrade
  • Even better, schedule site audits with an SEO auditing tool, like the Moz Pro Site Crawl
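For a quick spot-check without a crawler, a short script can do the source-code scan described above. This sketch uses Python’s standard-library HTML parser and only looks at meta robots tags; it won’t catch a noindex sent via the X-Robots-Tag HTTP header.

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect the content of every <meta name="robots"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "meta" and (d.get("name") or "").lower() == "robots":
            self.directives.append((d.get("content") or "").lower())

def is_noindexed(html: str) -> bool:
    finder = RobotsMetaFinder()
    finder.feed(html)
    return any("noindex" in c for c in finder.directives)

assert is_noindexed('<head><meta name="robots" content="NOINDEX, NOFOLLOW"></head>')
assert not is_noindexed('<head><meta name="robots" content="index, follow"></head>')
```

Point it at the HTML of any important page (fetched however you like) and it tells you whether that page is asking to be removed from the index.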

4. One version per URL: URL canonicalization

The average user doesn’t really care whether your home page shows up as each of these separately:

  • http://www.example.com
  • http://example.com
  • https://www.example.com
  • https://example.com
  • http://www.example.com/index.html

But the search engines do, and this configuration can dilute link equity and make your work harder.

Google will usually figure out which version to index, but they may index a mixed assortment of your URL versions, which can cause confusion and complexity.

Moz’s canonicalization guide sums it up perfectly:

“For SEOs, canonicalization refers to individual web pages that can be loaded from multiple URLs. This is a problem because when multiple pages have the same content but different URLs, links that are intended to go to the same page get split up among multiple URLs. This means that the popularity of the pages gets split up.”

It’s likely that no one but an SEO would flag this as something to fix, yet it can be an easy fix with a significant impact on your site.

What to do:

  • Manually enter several variations of your home page in the browser to see whether they all resolve to the same URL
  • Look also for HTTP vs. HTTPS versions of your URLs; only one should exist
  • If they don’t, work with your developer to set up 301 redirects to fix this
  • Use the “site:” operator in Google search to find out which versions of your pages are actually indexing

What to do next:

  • Scan your entire website at once with a scalable tool like Screaming Frog to find all pages faster
  • Set up a schedule to monitor your URL canonicalization on a weekly or monthly basis
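Once you’ve chosen the one version you want (here assumed, purely for illustration, to be https with the www host), every variant should 301 to it. A small normalizer like this sketch tells you what each crawled URL should resolve to, so you can diff that against what your server actually returns:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_form(url, scheme="https", host="www.example.com"):
    """Map an absolute URL variant onto the single version you want indexed."""
    parts = urlsplit(url)
    path = parts.path or "/"
    if path.endswith("/index.html"):      # collapse directory-index duplicates
        path = path[: -len("index.html")]
    return urlunsplit((scheme, host, path, parts.query, ""))

variants = [
    "http://example.com",
    "https://example.com/index.html",
    "http://www.example.com/",
]
assert {canonical_form(v) for v in variants} == {"https://www.example.com/"}
```

The scheme and host defaults are assumptions; swap in whichever single version you’ve standardized on, and remember the actual redirects still have to be implemented server-side.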

5. Rel=canonical

Although the rel=canonical tag is closely related to the canonicalization mentioned above, it deserves its own note because it’s used for more than resolving slightly different URLs for the same page.

It’s also useful for preventing page duplication when you have similar content across different pages — often an issue for ecommerce sites managing categories and filters.

I think the best example of using this properly is how Shopify’s platform uses rel=canonical URLs to manage product URLs as they relate to categories. When a product is part of multiple categories, there are as many URLs as there are categories the product belongs to.

For example, Boll & Branch is on the Shopify platform, and on their Cable Knit Blanket product page we see that, from the navigation menu, the user is taken to https://www.bollandbranch.com/collections/baby-blankets/products/cable-knit-baby-blanket.

But looking at the rel=canonical, we see it’s configured to point to the main URL:

<link rel="canonical" href="https://www.bollandbranch.com/products/cable-knit-baby-blanket" />

And this is the default across all Shopify sites.

Every ecommerce and CMS platform ships with different default settings for how it handles and implements the rel=canonical tag, so definitely look into the specifics for your platform.

What to do:

  • Spot-check important pages to see whether they’re using the rel=canonical tag
  • Use a site scanning tool to list all the URLs on your site and determine whether there are duplicate page problems that can be solved with a rel=canonical tag
  • Read up on the different use cases for canonical tags and when each is best used
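At scale a crawler will report canonicals for you, but for spot-checks a few lines of standard-library Python can pull the rel=canonical out of a page’s source. A minimal sketch:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Grab the href of the first <link rel="canonical"> tag, if any."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "link" and self.canonical is None \
                and (d.get("rel") or "").lower() == "canonical":
            self.canonical = d.get("href")

def canonical_url(html: str):
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical

page = '<link rel="canonical" href="https://www.bollandbranch.com/products/cable-knit-baby-blanket" />'
assert canonical_url(page) == "https://www.bollandbranch.com/products/cable-knit-baby-blanket"
```

Run it against the category-path version of a product page and the main version; on a well-configured site, both should report the same canonical URL.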

6. Text in images

Text in images: it’s such a simple concept, but out in the wild many, many sites hide important content behind images.

Yes, Google can somewhat understand text in images, but it’s not nearly as sophisticated as we’d hope in 2017. The best practice for SEO is still to keep important text out of images.

Google’s Gary Illyes has confirmed that it’s unlikely Google’s crawler can recognize text in images well.

CognitiveSEO ran a great test of Google’s ability to extract text from images, and it found evidence of some stunning accuracy in Google’s technology. Yet the conclusion from the test is that image-to-text extraction isn’t being used for ranking search queries.

The conclusion from CognitiveSEO is that “this search was proof that the search engine doesn’t, in fact, extract text from images to use it in its search queries. At least not as a general rule.”

And although H1 tags are not as critical as they once were, it’s still an on-site SEO best practice to display them prominently.

This is actually most important for large sites with many, many pages, such as big ecommerce sites. It matters most for them because they can realistically rank product or category pages with just a simple keyword-focused main headline and a string of text.

What to do:

  • Manually inspect the most important pages on your site, checking whether you’re hiding important text in your images
  • At scale, use an SEO site crawler to scan all the pages on your site. Look for whether H1 and H2 tags are being found across your site, and use word count as a signal.

What to do next:

  • Create a guide for content managers and developers so they know the best practice in your organization is not to hide text behind images
  • Collaborate with your design and development team to achieve the same design look you had with text embedded in images, but using CSS image overlays instead

7. Broken backlinks

If not properly overseen by a professional SEO, a website migration or relaunch project can spew out countless broken backlinks from other websites. This is a golden opportunity for recovering link equity.

Some of the top pages on your website may have become 404 pages after a migration, so the backlinks pointing to those 404 pages are effectively broken.

Two kinds of tools are great for finding broken backlinks: Google Search Console, and a backlink checker such as Moz, Majestic, or Ahrefs.

In Search Console, you’ll want to review your top 404 errors; it will prioritize the top errors by broken backlinks.

What to do:

  • After identifying your top pages with dead backlinks, 301 redirect those URLs to the best matching pages
  • Also look for links broken because the linking site typed your URL wrong or messed up the link code on their end; this is another rich source of link opportunities

What to do next:

  • Use other tools such as Mention or Google Alerts to keep an eye out for unlinked mentions that you can reach out to for an additional link
  • Set up a recurring site crawl or manual check to look out for new broken links
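Once you’ve exported the 404ing URLs that still have backlinks, plus a list of live URLs, a fuzzy matcher can propose redirect targets for you to review by hand. Here’s a rough sketch using Python’s difflib; the paths are invented for illustration, and the 0.6 cutoff is just a starting point:

```python
import difflib

def suggest_redirects(dead_urls, live_urls, cutoff=0.6):
    """Suggest the closest live page for each 404ing URL that still has backlinks."""
    suggestions = {}
    for dead in dead_urls:
        match = difflib.get_close_matches(dead, live_urls, n=1, cutoff=cutoff)
        suggestions[dead] = match[0] if match else None   # None: needs a manual decision
    return suggestions

live = ["/products/cable-knit-blanket", "/collections/baby-blankets", "/about"]
out = suggest_redirects(["/product/cable-knit-blanket", "/abuot"], live)
assert out["/product/cable-knit-blanket"] == "/products/cable-knit-blanket"
assert out["/abuot"] == "/about"
```

Treat the output strictly as suggestions; string similarity knows nothing about topical relevance, so a human should approve every 301 before it ships.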

8. HTTPS is less optional

What was once only important for ecommerce sites is now becoming a necessity for all sites.

Google recently announced that it would start marking any non-HTTPS site as non-secure if the site accepts passwords or credit cards:

“To help users browse the web safely, Chrome indicates connection security with an icon in the address bar. Historically, Chrome has not explicitly labelled HTTP connections as non-secure. Beginning in January 2017 (Chrome 56), we’ll mark HTTP pages that collect passwords or credit cards as non-secure, as part of a long-term plan to mark all HTTP sites as non-secure.”

What’s even more striking is Google’s plan to eventually label all HTTP URLs as non-secure:

“Eventually, we plan to label all HTTP pages as non-secure, and change the HTTP security indicator to the red triangle that we use for broken HTTPS.”

Going even further, it’s not out of the realm of possibility that Google will start giving HTTPS sites even more of an algorithmic ranking benefit over HTTP.

It’s also not unfathomable that non-secure site warnings will start showing up directly in the search results, before a user even clicks through to the site. Google currently displays such warnings for hacked sites, so there’s a precedent.

This goes beyond just SEO, as it overlaps heavily with web development, IT, and conversion rate optimization.

What to do:

  • If your site currently has HTTPS deployed, run it through Screaming Frog to see how the pages are resolving
  • Make sure all pages resolve to the HTTPS version of the site (same as the URL canonicalization mentioned earlier)

What to do next:

  • If your site isn’t on HTTPS, start mapping out the transition, as Google has made it clear how important this is to them
  • Properly manage the transition to HTTPS by enlisting an SEO migration strategy so as not to lose rankings

9. 301 & 302 redirects

Redirects are a terrific tool in an SEO’s arsenal for managing and controlling dead pages, for consolidating multiple pages, and for making website migrations work without a hitch.

301 redirects are permanent and 302 redirects are temporary. The best practice is to always use 301 redirects when permanently redirecting a page.

301 redirects can be confusing for those new to SEO trying to use them properly:

  • Should you use them for all 404 errors? (Not always.)
  • Should you use them instead of the rel=canonical tag? (Sometimes, not always.)
  • Should you redirect all the old URLs from your previous site to the home page? (Almost never; it’s a terrible idea.)

They’re a lifesaver when used properly, but a pain when you have no idea what to do with them.

With great power comes great responsibility, and it’s vitally important to have someone on your team who truly understands how to strategize the usage and implementation of 301 redirects across your entire site. I’ve seen sites lose up to 60% of their revenue for months, just because these weren’t properly implemented during a site relaunch.

Despite some recent statements that 302 redirects pass authority as efficiently as 301s, relying on that isn’t advised. Recent studies have tested this and shown that 301s remain the gold standard, and Mike King’s striking example shows that the power of 301s over 302s persists.

What to do:

  • Do a full review of all the URLs on your site and look at your redirects at a high level
  • If 302 redirects are being used incorrectly for permanent redirects, change them to 301 redirects
  • Don’t go redirect-crazy on every 404 error; use redirects only for pages receiving links or traffic, to keep your redirects list manageable

What to do next:

  • If you’re using 302 redirects, discuss with your development team why the site uses them
  • Build out a guide for your organization on the importance of using 301s over 302s
  • Review the redirects implementation from your last major site redesign or migration; there are often tons of errors
  • Never redirect all the pages from an old site to the home page unless there’s a really good reason
  • Include redirect checking in your monthly or weekly site scan process
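Given a crawl export of (URL, status code, redirect target) rows, a small script can surface both suspicious 302s and redirect chains for review. This is a simplified sketch; the row shape is an assumption you’d adapt to your crawler’s actual export format:

```python
def audit_redirects(crawl_rows):
    """Flag suspect 302s and redirect chains from (url, status, target) crawl rows."""
    targets = {url: target for url, status, target in crawl_rows if status in (301, 302)}
    issues = []
    for url, status, target in crawl_rows:
        if status == 302:
            issues.append((url, "302 used; confirm it is truly temporary, else switch to 301"))
        if status in (301, 302) and target in targets:
            issues.append((url, f"redirect chain: {url} -> {target} -> {targets[target]}"))
    return issues

rows = [
    ("/old-page", 302, "/new-page"),
    ("/new-page", 200, None),
    ("/a", 301, "/b"),
    ("/b", 301, "/c"),
]
issues = audit_redirects(rows)
assert len(issues) == 2
assert issues[1][1] == "redirect chain: /a -> /b -> /c"
```

Chains are worth flattening (/a should point straight at /c), and every flagged 302 should either be justified by your developers or converted to a 301.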

10. Meta refresh

I thought meta refreshes were gone for good and would never be a problem, until they were. I ran into a client using them on their brand-new, modern website while migrating from an old platform, and I quickly recommended that we turn them off and use 301 redirects instead.

The meta refresh is a client-side (as opposed to server-side) redirect, and it isn’t recommended by Google or professional SEOs.

If implemented, it looks something like this:

<meta http-equiv="refresh" content="0; url=http://example.com/" />

It’s a fairly simple one to check: either you have it or you don’t, and by and large there’s no debate that you shouldn’t be using these.

Google’s John Mueller said:

“I’d strongly recommend not using meta refresh-type or JavaScript redirects like that if you have changed your URLs. Instead of using these kinds of redirects, try to have your server do a normal 301 redirect. Search engines might recognize the JavaScript or meta refresh-type redirects, but that’s not something I’d count on — a clear 301 redirect is always much better.”

And Moz’s own redirection guide states:

“They are most commonly associated with a five-second countdown with the text ‘If you are not redirected in five seconds, click here.’ Meta refreshes do pass some link juice, but are not recommended as an SEO tactic due to poor usability and the loss of link juice passed.”

What to do:

  • Check the source code of your pages (or run a site crawl) for meta refresh tags, and replace any you find with server-side 301 redirects

What to do next:

  • Communicate to your developers the importance of using 301 redirects as the standard, and of never using meta refreshes unless there’s a very good reason
  • Schedule a monthly check to monitor redirect type usage
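The check itself is easy to script. This sketch flags any meta refresh tag in a page’s source using Python’s standard-library parser; note it won’t catch JavaScript redirects, which deserve the same scrutiny:

```python
from html.parser import HTMLParser

class MetaRefreshFinder(HTMLParser):
    """Record the content of any <meta http-equiv="refresh"> tag."""
    def __init__(self):
        super().__init__()
        self.refresh = None

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "meta" and (d.get("http-equiv") or "").lower() == "refresh":
            self.refresh = d.get("content")

def has_meta_refresh(html: str) -> bool:
    finder = MetaRefreshFinder()
    finder.feed(html)
    return finder.refresh is not None

assert has_meta_refresh('<meta http-equiv="refresh" content="0; url=http://example.com/">')
assert not has_meta_refresh('<meta name="robots" content="index, follow">')
```

Any page where this returns True is a candidate for replacing the client-side redirect with a server-side 301.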

11. XML sitemaps

XML sitemaps help Google and other search engine spiders crawl and understand your website. They most often have the biggest impact on large and complex sites that need to give extra direction to the crawlers.

Google’s Search Console Help Guide is quite clear on the purpose and helpfulness of XML sitemaps:

“If your site’s pages are properly linked, our web crawlers can usually discover most of your site. Even so, a sitemap can improve the crawling of your site, particularly if your site meets one of the following criteria:
– Your site is really large.
– Your site has a large archive of content pages that are isolated or not well linked to each other.
– Your site is new and has few external links to it.”

A few of the biggest problems I’ve seen with XML sitemaps while working on clients’ sites:

  • Not creating one in the first place
  • Not including the location of the sitemap in the robots.txt
  • Allowing multiple versions of the sitemap to exist
  • Allowing old versions of the sitemap to exist
  • Not keeping Search Console updated with the freshest copy
  • Not using sitemap indexes for large sites

What to do:

  • Use the list above to verify that you’re not making any of these mistakes
  • Check the number of URLs submitted and indexed from your sitemap within Search Console to get an idea of the quality of your sitemap and URLs

What to do next:

  • Monitor indexation of the URLs submitted in your XML sitemap regularly from within Search Console
  • If your site grows more complex, investigate ways to use XML sitemaps and sitemap indexes to your advantage, as Google limits each sitemap to 10MB and 50,000 URLs
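For illustration, here’s a minimal sketch of generating a sitemap urlset with Python’s standard library, refusing input past the 50,000-URL cap (at which point you’d split into multiple files and list them in a sitemap index). The example URLs are placeholders:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

def build_sitemap(urls):
    """Serialize one <urlset>; past 50,000 URLs you must split into multiple files."""
    if len(urls) > 50000:
        raise ValueError("Google caps each sitemap at 50,000 URLs; use a sitemap index")
    urlset = ET.Element(f"{{{NS}}}urlset")
    for u in urls:
        entry = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(entry, f"{{{NS}}}loc").text = u
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap(["https://example.com/", "https://example.com/products/"])
assert "<loc>https://example.com/products/</loc>" in xml_out
```

Most platforms generate sitemaps for you; a script like this is mainly useful for custom builds, or as a reference for what a valid urlset should contain. Remember the 10MB uncompressed size limit applies as well.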

12. Unnatural word count & page size

I recently ran into this issue while reviewing a site: most pages on the site didn’t have more than a few hundred words, but a scan of the site with Screaming Frog showed nearly every page carrying 6,000–9,000 words.

It made no sense. But upon viewing the source code, I saw that Terms and Conditions text meant to be displayed on only a single page was embedded on every page of the site with a “display: none;” CSS style.

This can slow down the load speed of your pages and could possibly trigger penalty issues if it’s seen as intentional cloaking.

In addition to word count, there can be other code bloat on the page, such as inline JavaScript and CSS. Although fixing these problems falls under the purview of the development team, you shouldn’t rely on developers to be proactive in identifying these types of issues.

What to do:

  • Scan your site and compare the calculated word count and page size against what you expect
  • Review the source code of your pages and recommend areas to reduce bloat
  • Make sure there’s no hidden text that could trip algorithmic penalties

What to do next:

  • There could be a good reason for hidden text in the source code from a developer’s perspective, but it can cause speed and other SEO issues if left unfixed
  • Review page size and word count across all URLs on your site periodically to keep tabs on any issues
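The inflated-word-count symptom above can be approximated with a quick script that counts only visible words, skipping script/style blocks and inline “display: none” elements. This is a simplification that assumes tidy, well-nested markup (and only inline styles); a crawler’s word count is still the production tool, but comparing the two can reveal hidden text:

```python
import re
from html.parser import HTMLParser

class VisibleTextCounter(HTMLParser):
    """Count words, skipping script/style blocks and display:none elements."""
    def __init__(self):
        super().__init__()
        self.words = 0
        self.hide_stack = []          # one flag per open tag; True hides its contents

    def handle_starttag(self, tag, attrs):
        style = (dict(attrs).get("style") or "").replace(" ", "").lower()
        self.hide_stack.append(tag in ("script", "style") or "display:none" in style)

    def handle_endtag(self, tag):
        if self.hide_stack:
            self.hide_stack.pop()     # assumes tidy, well-nested markup

    def handle_data(self, data):
        if not any(self.hide_stack):
            self.words += len(re.findall(r"\w+", data))

def visible_word_count(html: str) -> int:
    counter = VisibleTextCounter()
    counter.feed(html)
    return counter.words

page = ('<body><h1>Product name</h1>'
        '<div style="display: none;">terms and conditions boilerplate</div>'
        '<p>Two words</p></body>')
assert visible_word_count(page) == 4
```

If your crawler reports thousands of words but this visible count is in the hundreds, the gap is a strong hint that hidden text is bloating your pages.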

13. Speed

You’ve heard it a million times, but speed matters, and it definitely falls under the purview of technical SEO.

Google has clearly stated that speed is a small part of the algorithm:

“Like us, our users place a lot of value in speed — that’s why we’ve decided to take site speed into account in our search rankings. We use a variety of sources to determine the speed of a site relative to other sites.”

Even with this clear SEO directive, and the obvious UX and CRO benefits, speed sits at the bottom of the priority list for many site managers. With mobile search now cemented as every bit as important as desktop search, speed is even more critical and can no longer be ignored.

In his awesome Technical SEO Renaissance post, Mike King said speed is the most important thing to focus on in 2017 for SEO:

“I feel like Google believes they’re in a good place with links and content, so they will continue to push for speed and mobile-friendliness. So the best technical SEO tactic right now is making your site faster.”

Moz’s page speed guide is a great resource for identifying and fixing speed issues on your site.

What to do:

  • Audit your site speed and page speed using SEO auditing tools
  • Unless you’re running a smaller site, you’ll want to work closely with your developer on this one. Make your site as fast as possible.
  • Continually push for resources to focus on site speed across your organization.

14. Internal linking structure

Your internal linking structure can have a huge impact on your site’s crawlability by search spiders.

Where does it fall on your list of priorities? It depends. If you’re optimizing a massive site with isolated pages that don’t fall within a clean site architecture a few clicks from the home page, you’ll need to put a lot of effort into it. If you’re managing a simple site on a standard platform like WordPress, it won’t be at the top of your list.

You’ll want to think about these things when building out your internal linking plan:

  • Scalable internal linking with plugins
  • Using optimized anchor text without over-optimizing
  • How internal linking relates to your main site navigation

I built out this map of a fictional site to demonstrate how different pages on a website can connect to each other through both navigational site links and internal links:

Website navigation with internal links diagram.

Source: Green Flag Digital

Even with a rock-solid site architecture, putting a focus on internal links can push some sites higher up the search rankings.

What to do:

  • Manually test how you can move around your site by clicking on in-content, editorial-type links in your blog posts, product pages, and important site pages. Note where you see opportunity.
  • Use site auditor tools to find and organize the pages on your site by internal link count. Are your most important pages receiving sufficient internal links?

What to do next:

  • Even if you build out the perfect site architecture, there’s more opportunity for internal link flow, so always keep internal linking in mind when producing new pages
  • Train content creators and page publishers on the importance of internal linking and how to implement links effectively.
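If you can export an internal-link edge list from a crawl, click depth from the home page is a quick breadth-first search. A minimal sketch (the page paths are invented for illustration):

```python
from collections import deque

def click_depths(links, home="/"):
    """Breadth-first search over {page: [linked pages]}: clicks needed from the home page."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

site = {
    "/": ["/blog", "/products"],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": ["/products/widget"],   # an editorial link rescuing a deep page
    "/products": [],
}
assert click_depths(site) == {
    "/": 0, "/blog": 1, "/products": 1, "/blog/post-1": 2, "/products/widget": 3,
}
```

Pages missing from the result are orphans no internal link reaches at all, and pages with a large depth are the ones most likely to benefit from new in-content links.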

Conclusion

Here’s a newsflash for website owners: it’s very likely that your developer is not monitoring and fixing your technical SEO problems, and doesn’t particularly care about traffic to your site or fixing your SEO issues. So if you don’t have an SEO helping you with technical issues, don’t assume your developer is handling it. They have enough on their plate, and they’re not incentivized to fix SEO problems.

I’ve run into many technical SEO issues during and after website migrations that weren’t properly managed with SEO in mind. I’m compelled to highlight the disasters that can unfold when this isn’t closely handled by an expert. Case studies of website migrations gone terribly wrong are a topic for another day, but I implore you to take technical SEO seriously for the benefit of your company.

Hopefully this post has helped clarify some of the most important technical SEO issues that may be harming your site today, and how to start fixing them. For those who have never looked at the technical side of things, some of these really are easy fixes that can have a massively positive impact on your site.

About Tanjil Abedin
