SEO 101: Part 4 (Technical SEO)

Our last part of the SEO 101 series - technical SEO. We'll go into the things that prevent your site from getting crawled or indexed well, but not so deep that you can't act on it. Plus some suggestions on tracking your SEO performance.
Written by Jennifer Chu
Published on March 7, 2024

This is the last part in our SEO 101 series. We saved Technical SEO for last, because it can get quite technical (surprise!) and make people's eyes glaze over. And then we'll never get you back. BUT, we'll try to keep this as simple as possible and prioritize the most impactful action items for you. If you'd like to (re)visit the other articles in this series, here are the four pillars of a successful SEO strategy:

  1. Content
  2. User Experience
  3. Authority
  4. Technical SEO

Technical SEO is the methodology that ensures a search engine can crawl, index, and rank your website without any issues. Search engines must be able to read and understand the content you’ve created before you can rank well and be discovered.

How Search Engines Work

There are three stages of Google Search [1]:

  1. Crawling: Google crawls through the web to find out what pages and websites exist.  It discovers new sites through links from other sites.
  2. Indexing: Google tries to understand what a page is about by looking at meta tags, backlinks, images, and all the stuff we’ve talked about.  Google then decides if it wants to index the page.
  3. Ranking Results: When a user searches on a keyword, Google will serve up the highest-quality results that best match that query.

Technical SEO affects all three stages, but especially the first two stages.

Crawlability

A website must first be crawled by search engines.  The factors that influence crawlability are:

Website Structure

Website structure and hierarchy for SEO

A good website structure can improve crawlability by search engines. Organizing your pages into a hierarchy of directories or categories reduces the likelihood that you have orphan pages — pages with no internal links pointing to them, which search engines may never crawl.  Good organization also reduces the number of clicks it takes to get from the home page to the destination pages, which helps spread link equity across your pages more effectively.

Take this example of Bench.co.

The home page links to category pages (like Blog), some of which link to sub-category pages (like Tax Tips) which then link to the articles in that sub-category.
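In outline form, that hierarchy looks something like this (the article titles here are made up for illustration):

    Home
    └── Blog
        ├── Tax Tips
        │   ├── Article: Small Business Tax Deductions
        │   └── Article: Quarterly Tax Deadlines
        └── Bookkeeping Basics
            └── Article: How to Reconcile Your Accounts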

URL Structure

If you look at the URL structure, you’ll see that it follows the same organizational structure:

Anatomy of a URL structure

The categories and sub-categories appear as sub-folders. The article itself has the targeted keywords in the URL.  URL format is a minor direct ranking factor, but it can help with user experience.  When Google displays the search results, the search listings include the sub-folders and/or name of the article, so it may help the user determine if the destination content will answer their query.
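For example, an article URL following this structure might look like the below (the article slug is hypothetical):

    https://bench.co/blog/tax-tips/small-business-tax-deductions/

  • /blog/ — the category sub-folder
  • /tax-tips/ — the sub-category sub-folder
  • /small-business-tax-deductions/ — the article slug, containing the target keywords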

One more note about URLs -- the use of sub-folders in the URL is less important than website structure. Sub-folders in the URL can also be more difficult to implement depending on your website hosting platform or CMS. I really like Webflow, because their CMS allows us to do this very easily.  We plan to create a lot of content, so this is important for us, despite Webflow's steeper learning curve. Squarespace is not good at this, but their platform is super easy to use and is a great place to start if you've never built a website and don't plan to have tons of pages.

HTTPS

Let’s go back to the URL structure mentioned above for one more important point:

It is ESSENTIAL that you use the HTTPS protocol. HTTPS stands for Hypertext Transfer Protocol Secure. It is the secure version of HTTP; it protects users’ information and helps prevent data breaches if you collect any kind of information on your site. So many websites are now HTTPS that it almost feels silly to even mention this. In fact, most hosting providers automatically provide a Secure Sockets Layer (SSL) certificate to ensure your domain is served over HTTPS.

HTTPS has been a ranking factor since 2014 [2].

HTTP Status Codes

Have you ever gotten that Page Not Found error? That’s called a 404, one of many HTTP status codes. HTTP status codes are part of the response a web server returns to a request made to it. They identify the result of the request — each status code communicates something different. Here are the ones you should know:

  • 200: OK, a successful request of a URL or page — the page is live.
  • 301: a permanent redirect. The requested page has been permanently moved to a different URL. For example, let’s say that you’ve launched a new and improved version of your product and you are now ready to phase out the old version.  You can add a 301 redirect so that any visits to the old page through historical links out there will be automatically redirected to the new product page. This helps pass the old page’s link equity on to the new page. It’s also a better user experience than someone getting a 404 error.
  • 302: a temporary redirect. The requested page has been temporarily moved to a different URL. You might use a 302 if you have a short-term promotional page, like a holiday offer, that you want users to see instead of another evergreen page. 302s will help you maintain your link equity and search rankings until you remove the 302 and switch back to the original page.
  • 404: page not found. Since inbound links are a ranking factor and internal links help you spread equity across your site, 404s can hurt your SEO performance. If you find 404 errors, 301-redirect them to other relevant pages — or, for internal links, fix the links themselves. (See the sketch after this list for a quick way to check a URL's status code.)
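If you'd like to spot-check status codes yourself, here's a minimal sketch in Python using the requests library (the URL is just a placeholder):

    import requests

    # Request the page without following redirects, so we see the raw status code
    response = requests.get("https://www.example.com/old-product", allow_redirects=False)
    print(response.status_code)                # e.g. 200, 301, 302, or 404
    print(response.headers.get("Location"))    # the redirect target, if any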

Redirect chains

Redirect chains are when a page redirects multiple times between the original request and the final destination page. This can happen if you’ve added a redirect without realizing another redirect is already in place, or you’ve gone through a site migration (e.g., HTTP to HTTPS, or changing CMS platforms). Redirect chains create a poor experience for both users and crawlers by slowing page loads, diluting the link value passed to the final URL, and eating up your crawl budget.
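A quick way to check whether a URL sits in a redirect chain is to follow it and inspect each hop — again a minimal sketch in Python, with a placeholder URL:

    import requests

    # Follow redirects and list every hop from the original URL to the final page
    response = requests.get("https://www.example.com/old-page", allow_redirects=True)
    for hop in response.history:               # one entry per intermediate redirect
        print(hop.status_code, hop.url)
    print(response.status_code, response.url)  # the final destination

If response.history shows more than one hop, you have a redirect chain worth collapsing into a single 301.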

Indexation

Once Google crawls a website, it then decides if it will index the pages.  Some reasons why Google would not index pages are low-quality content, duplicate content, and the technical SEO elements below:

Robots.txt

Robots.txt is your site's map legend for crawlers, telling them where they're welcome and where they should steer clear. It's a file placed at the root of your server that enables you to communicate which parts of your site crawlers should or shouldn't crawl. When a search engine visits your site, it looks for the robots.txt file first.

Example: https://www.bench.com/robots.txt

Robots.txt is not required, and most websites don’t need one. Reasons to have a robots.txt file: you want to block sections or types of content from being crawled, like PDFs or images where you can't add a noindex tag (more below); or you are using up your crawl budget and need to block less important pages from getting crawled so that the search engine bots can crawl the priority ones.
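For illustration, a minimal robots.txt might look like this — the paths here are hypothetical, and yours would depend on what you need to block:

    User-agent: *
    Disallow: /thank-you/
    Disallow: /downloads/

    Sitemap: https://www.example.com/sitemap.xml

The Sitemap line is optional but common: it points crawlers straight to your XML sitemap (covered next).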

XML Sitemaps

XML sitemaps are like the table of contents of your website - a designated roadmap for search engines. Having an organized XML sitemap means that all your pages have a higher chance of getting indexed.

Your sitemap is usually located in the top level directory of your site:

https://www.le-herring.com/sitemap.xml

The contents of the file look like the below and can be auto-generated by most website hosting providers.  In our case, Webflow created this sitemap.

Example of XML sitemap
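Stripped down, the XML looks something like this (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.le-herring.com/blog/sample-article</loc>
        <lastmod>2024-03-07</lastmod>
      </url>
    </urlset>

Each page gets its own <url> entry, with the page address in <loc> and, optionally, the date it was last modified in <lastmod>.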

It’s best practice to submit your sitemap to Google Search Console. If you don’t have a GSC account, you should create one.  

Submitting a sitemap on GSC helps get your site indexed faster.  Through GSC, you can also request Google to crawl or re-crawl your URLs (instead of waiting), if you’ve just made a bunch of edits or added new pages.

The XML sitemap helps with faster indexation, particularly when you have a large site with tens of thousands of pages. It can also help if your pages are not linked very well, requiring 6, 7 or 8 clicks to get to.

Noindex tags

If you want to exclude certain pages from being indexed, you add a noindex tag — an instruction telling search engines to skip the page. The actual tagging is usually done in the page settings of your web hosting platform; on Webflow, there’s an option to disable indexing of a page with the Sitemap indexing toggle. (There’s an example of the tag itself after the list below.)

Examples of when you’d not want to index pages:

  • Paid landing pages
  • Thank you pages
  • Login pages
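Under the hood, a noindex instruction is just a meta tag in the page's HTML head — toggling a setting like Webflow's typically generates something like this for you:

    <meta name="robots" content="noindex">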

In short, the robots.txt is the central location directing bots on where to go and where not to go. XML sitemaps are the detailed atlas they actually read.

Other Technical SEO Factors

There are many more factors, but then we start getting really technical.  Your SEO needs also depend on the complexity of your site, the number of pages you have, and the type of business you’re in. If you’d like to know more about these, you can review the list below, with a short example of each after it:

  • Canonicals - if you have multiple versions of a page, use canonicals to tell search engines which version to index so you don't get penalized for duplicate content.
  • Nofollow links - nofollow links carry tags that tell search engines not to crawl those links or pass equity through them.
  • Schema - schema markup is additional code added to a page to standardize certain attributes so search engines understand the content better. It's used for recipes, products, articles, reviews, and 30+ other types of content.
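To make these concrete, here's roughly what each looks like in a page's HTML (the URLs and values are placeholders):

    <!-- Canonical: points search engines to the preferred version of this page -->
    <link rel="canonical" href="https://www.example.com/products/widget">

    <!-- Nofollow: tells search engines not to pass equity through this link -->
    <a href="https://www.example.com/some-page" rel="nofollow">a link</a>

    <!-- Schema: structured data, often added as JSON-LD -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "SEO 101: Part 4 (Technical SEO)",
      "datePublished": "2024-03-07"
    }
    </script>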

SEO Tools

What good is putting all this effort into your SEO if you don’t know how well you are doing or if you are not measuring improvement? You also need tools to alert you if something breaks, which will inevitably happen.

To monitor and analyze your website and organic performance and diagnose issues at scale, you need the right tools in place.  Here are some popular ones that we recommend:

  • Google Analytics (GA) - there’s really no reason to not set this up.  It’s a free tool, and most web hosting platforms have an integration with GA.  GA reports on all traffic to your site, the channel generating the traffic (organic, paid, direct, email, social), the behavior and path of users who enter your site, and much much more.
  • Google Search Console (GSC) - another must-have, free tool that focuses on organic traffic. Here’s where you get visibility on which keywords users searched on to find your site, your rank position in the results page, and what the click-through rate was for them clicking on your link. GSC can also help monitor certain issues, such as 404 errors, broken internal links, and more.
  • SEMRush - a great and affordable SEO tool with many capabilities, which we’ve mentioned before. For technical SEO, they offer a site auditing tool where you can schedule regular crawls — much like a routine health check with a doctor or dentist. Get alerted to any new issues, or confirm that the improvements you’re making are decreasing the number of errors.

Summary of Technical SEO Best Practices

In summary, here are the most important considerations for Technical SEO:

  • A well-organized website structure where it takes no more than a few clicks to get from the home page to any page
  • Clean URL structure that is readable by search engines and humans
  • HTTPS protocol is required
  • Address 404 errors. Replace internal links to 404s with links to live (200) pages, and 301-redirect 404 URLs that external links point to
  • Submit your XML Sitemap to Google Search Console
  • Use SEO tools to automate measuring your organic and site performance and flagging issues, so that you can diagnose the causes and fix them.

And that's a wrap for the SEO 101 series!

Review our other SEO 101 articles here:

  1. Content
  2. User Experience
  3. Authority
  4. Technical SEO

Related Articles

How to Come Up With Article Ideas

Do I Need a Website?

Media and Entertainment Lead The Way For New Audience Trends

The Business Startup Checklist

References

[1] https://developers.google.com/search/docs/fundamentals/how-search-works

[2] https://www.semrush.com/blog/technical-seo/


