April 29, 2021 • News, Knowledge, Business
SEO’s history dates back to the early 1990s, born out of the necessity to optimize your website so that search engines favor it.
Why do you need to optimize your website?
Because a website that is recognized as “good” by search engines is likely to appear at the top of the engine’s result page! Organically. For free.
Let us tell you what ranking at the top on the Search Engine Result Page (SERP) would mean for your website:
✔ More visits – the higher you rank, the higher the chance your website will be visited. The rate of visits drops off dramatically down the ranks. Dramatically.
✔ More conversions – especially if your website is part of a business. The conversion rate tells you how many visits actually turn into purchases, engagement, or other actions favorable to your business.
✔ More credibility – top-ranking results for a certain keyword WILL appear more trustworthy. Nobody trusts bottom-dwelling websites – or worse, ones relegated to the next page.
In short, search engine users use search engine platforms to find solutions for their problems, and they are more likely to visit, trust, and spend on websites that appear near the top of the result page.
Need we say more?
Now, let’s get into the things you need to pay attention to when optimizing your website!
Role of the Robots in SEO
How do search engines know how to rank your website on their SERP?
By employing tools called “robots” – or spiders, or crawlers, depending on who you ask. True to that last alias, these robots do just that: crawl.
Robots “crawl” the web (or the internet) to venture and find web pages such as the ones you have on your website. Once these robots have explored your website and found the pages, they start to document and index these findings into their search engine’s huge database.
In the documentation phase of the robots’ endeavor, they analyze your web pages’ content to evaluate what each page is about, determine its relevance, and decide how high it should rank for certain keywords and topics.
Each page they’ve analyzed would then be categorized neatly into “catalogs”, which would then enable the search engine to quickly (less than a second) provide you with the result of the keywords you’ve input in the search box.
Automation is pretty neat, huh?
Especially knowing that in the past, webmasters had to manually submit their websites and pages to search engines’ databases. What a bother.
To put it briefly, the fate of your web pages is at the mercy of these robots, so you better understand what pleases them.
On-Site SEO

On-Site SEO deals with the content and structure of your website and pages: keywords, relevant content, Meta Tags, and Meta Descriptions.
Long gone are the days when keywords were about the only thing that mattered.
Webmasters used to deliberately and distastefully stuff their pages with keywords so that the algorithm would place them high on SERP. Remember the days when every other word on a web page or article was your search keyword? Yeah, those days.
Search engine algorithms have since evolved and become more sophisticated. Keywords are still a major factor in rank determination, but no longer the only one – and keyword stuffing now carries a significant penalty on major search engines: your webpage loses its authority.
Try it. Be careless in stuffing your content with the same keywords just for the sake of ranking high and find your web pages ranking first… On the third page of the search results.
What was the reason for the change?
To make browsing the web more pleasant for users. Keywords placed without regard to relevance and context make for a very confusing experience.
The following are some of the best practices when it comes to keyword optimization:
✔ Do use keywords that are relevant to the content’s topic.
✔ You may use words that are semantically related to your topic-relevant keywords throughout your content. Major search engines such as Google are able to recognize and understand related words and will regard them as a factor that would positively contribute to your ranking consideration.
✔ Don’t overuse keywords. Think of your users and how comfortable an experience you’d want them to have when reading your content.
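If you want to sanity-check your own copy against the “don’t overuse keywords” rule, keyword density is easy to measure. The sketch below is our illustration, not an official search engine formula; the function name and the simple tokenizer are our own choices:

```python
import re


def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of total words, as a percentage.

    This is a rough illustration: it lowercases everything and treats
    runs of letters/apostrophes as words.
    """
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)
```

If the number comes out conspicuously high for a keyword, that is usually a sign the copy reads unnaturally to humans too.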
Meta Tags and Meta Data
When users run a search on a major search engine such as Google or Yahoo!, the SERP shows three elements for each search result, known collectively as Meta Data.

The anatomy of typical Meta Data on major search engines consists of the Title Tag, the website’s URL, and the Meta Description.
Some tips on optimizing your Meta Data and Meta Tags:
✔ Being one of the most important aspects to optimize, your page’s title tag should contain keywords that describe the content of the page. Try to place your keywords at the beginning of the Title Tag. Be accurate and concise.
✔ Keep Title Tags to roughly 55–60 characters – longer titles get cut off on the SERP.
✔ The Meta Description, while it bears no significant effect on SEO, can help steer users to visit your website. Don’t rely on the Meta Description automatically generated by the search engine. Write your own, accurate and concise. Using a Call to Action (CTA) here is recommended.
✔ Keep your Meta Description below 160 characters, but get as close to 160 as you can.
✔ Use keywords in your URL. This is SEO common sense.
✔ Don’t change your URL unless necessary, especially if your website or page has been running for some time. The authority and rank associated with your pages are tied to the current URL, and changing it might cause you to lose some.
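To make the tips above concrete, here is roughly what an optimized Title Tag and Meta Description look like in a page’s HTML. The title and description text are invented for illustration:

```html
<head>
  <!-- Title Tag: keywords up front, roughly 55-60 characters -->
  <title>SEO Best Practices: On-Site, Off-Site and Technical SEO</title>

  <!-- Meta Description: under 160 characters, with a call to action -->
  <meta name="description"
        content="Learn on-site, off-site and technical SEO best practices
                 to rank higher on the SERP. Read our step-by-step guide now.">
</head>
```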
Off-Site SEO

As the name suggests, Off-Site SEO is any form of optimization you perform outside of your website.
Optimizing Backlink Density
One of the more classic factors that would influence your web page ranking is the number of backlinks to your website. Just like keywords, backlinks used to be a major determinant of how your webpage would rank.
Webmasters in the olden days tended to abuse this knowledge, too, and stuffed their webpages with backlinks. And it worked! Pages ranked highly on indiscriminate backlinking – for a while, until it didn’t.
As the people behind major search engines got wiser, so did their search engine algorithms. Nowadays, the algorithm would penalize haphazard backlinking just like it does keyword abuse.
Now, backlinks are still pretty much one of the major factors of the robot’s evaluation process. If the quality of backlinks is high, and the link source is relevant and appropriate to your webpage so as to bolster a friendlier user experience, then the robots would reward you by indexing your content as a web page with authority.
Things to mind when it comes to backlinks:
✔ The quality of your page’s content. Yes. Make content as interesting and engaging as possible so that it solves your users’ problems, and let other websites organically link to your pages as references. Deliberately manipulating backlink density to gain rank and authority transfer (also known as link juice) through purchased links violates search engine guidelines and will be penalized.
✔ Use link roundups. Link roundups are blog posts that recommend pages and share the links to said pages. Make a kind and totally not forceful pitch to link roundup bloggers to feature your page on their routine roundups. Make sure you have very usable content that would be appropriate to be featured.
✔ 5W 1H (What, Where, When, Who, Why, How) content is very linkable. Stick to that format.
What is technical SEO?
Well, technical SEO has everything to do with how your site is structured – kind of like the architecture of your webpage.
The purpose of optimizing technical SEO is to make it easier for the robots to crawl, navigate, and index your pages. Anything that pleases the robot is a good thing for webmasters, as a technically optimized page is easier for users to discover.
Aspects of technical SEO include your HTML, XML sitemaps, robots.txt, and common error codes such as 404 and 500.
Sitemaps

The function of sitemaps is to direct search engines to your pages, so that the robots do not overlook them, and to indicate which information search engines should include whenever users are looking for search results.
There are two kinds of sitemaps: HTML and XML sitemaps – both with different looks and specific functions.
To put it simply, an HTML sitemap is a directory of a website’s pages that can be easily understood by human users.

In other words, an HTML sitemap is like a book’s table of contents. You refer to it to find the links that direct you to the pages of a website.
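For illustration, an HTML sitemap can be as simple as a plain list of links on a dedicated page (the page names here are made up):

```html
<!-- A simple HTML sitemap: a human-readable directory of a site's pages -->
<h1>Sitemap</h1>
<ul>
  <li><a href="/about">About Us</a></li>
  <li><a href="/blog">Blog</a></li>
  <li><a href="/blog/seo-basics">SEO Basics</a></li>
  <li><a href="/contact">Contact</a></li>
</ul>
```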
Best practice for HTML optimization:
✔ As previously discussed, optimize your Title Tag using topic-relevant keywords.
✔ Use headers to segment your content. Optimize the headers using topic-relevant keywords.
✔ Use ALT Tags for your images. Robots can’t understand images. The ALT Tags let them understand the relevance of the image to the page’s topic. ALT Tags also provide accessibility for users who use screen readers as the ALT Tags would be read out.
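Putting those three tips together, a content page’s markup might look like this. The headings, file name, and ALT text are invented for illustration:

```html
<!-- Title Tag with topic-relevant keywords -->
<title>Technical SEO Basics: Sitemaps, Robots.txt and Error Codes</title>

<!-- Headers segment the content and carry keywords -->
<h1>Technical SEO Basics</h1>
<h2>Why sitemaps matter</h2>
<p>Sitemaps point crawlers at the pages you want indexed…</p>

<!-- ALT Tag tells robots (and screen readers) what the image shows -->
<img src="/images/crawler-diagram.png"
     alt="Diagram of a search engine crawler indexing web pages">
```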
In contrast to HTML sitemap, XML sitemap exists to be read by search engine crawlers. XML sitemap optimization can be very beneficial if your website is new and uncharted by the robots.
XML sitemaps contain behind-the-scenes information about your pages, such as each URL, how frequently it is updated, when it was last updated, and its significance relative to the rest of your website’s content.
The data provided by XML sitemap lets the search engine robots analyze your content better, in a more thorough and structured fashion. It’s like a guidebook for the crawlers when they’re trying to make sense of your page.
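For reference, here is what a minimal XML sitemap entry looks like in the standard sitemaps.org format. The URL and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The page's full URL -->
    <loc>https://www.example.com/blog/seo-basics</loc>
    <!-- When the page was last updated -->
    <lastmod>2021-04-29</lastmod>
    <!-- How often the page tends to change -->
    <changefreq>monthly</changefreq>
    <!-- Relative significance within the site (0.0 to 1.0) -->
    <priority>0.8</priority>
  </url>
</urlset>
```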
Optimize XML sitemap with the following tips:
✔ Create an XML sitemap using free tools you can find on the internet. Some of the more popular tools among SEO practitioners are xml-sitemaps.com and Screaming Frog. Both allow you to create, crawl, and analyze up to 500 pages in their free versions. Consider purchasing the full version if you have a bigger website, or are managing the optimization for several clients.
✔ Sign up for Google Search Console and submit your XML sitemap. This is a much faster way to be noticed by the robots than letting the crawlers discover you through links.
Robots.txt

This file is the gatekeeper for search engine robots. The robots.txt file basically tells crawlers which pages within your website they may and may not analyze and index.
Yes. There are websites with pages that webmasters do not want the robots to index, and that’s where the robots.txt file comes in handy!
There’s a catch though: the robots.txt file is less of a command and more of a suggestion to robots. Crawlers CAN choose to ignore it and index the information on a page nonetheless, but major search engine crawlers generally respect it.
Now, not all robots are created equal. Some are built with malicious intent by shady programmers. These robots will disregard your robots.txt file precisely because they want to know the things you’re trying to keep from them.
For what purpose?
For hacking of course. Fun!
Sensitive data on your web pages, such as customer data, can be harvested by these thug robots and sold to spammers for cash. That doesn’t mean you shouldn’t take the time to optimize your robots.txt file, though!

How do you optimize robots.txt? Follow these tips:
✔ Provide introductory information in your robots.txt file – for example, a comment noting that the file exists to keep crawlers from major search engines such as Google and Yahoo! from crawling and indexing certain pages on your website.
✔ Provide specific instructions for search engine robots on what and how to crawl. Instructions can include a crawl delay – how long robots should wait between requests, so they don’t overload your site and cause crashes – and “Disallow:” rules for the specific paths you want robots to stay away from.
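Putting both tips together, a simple robots.txt might look like this. The paths and domain are placeholders, and note that support for the Crawl-delay directive varies between crawlers:

```text
# This robots.txt asks crawlers from search engines such as Google
# and Yahoo! to skip the pages listed below.

User-agent: *
Crawl-delay: 10
Disallow: /admin/
Disallow: /checkout/

# Point crawlers at the XML sitemap while we're at it
Sitemap: https://www.example.com/sitemap.xml
```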
Error Codes

Error codes, besides creating an unsavory experience for users, can also impact your SEO negatively.
There are many error codes and each code speaks for specific errors. The purpose of these codes is to tell both web users and search engines about the problems when loading a specific page.
Error codes fall into two main groups: codes that start with the number 4 (client errors) and codes that start with the number 5 (server errors).

The most common 4## status code is 404. It appears when a user tries to access a page that cannot be found – because the page no longer exists or has been removed.
Codes that begin with the number 5 refer to server-related errors. Code 500 is a generic internal server error: something went wrong, but no specific reason is given. It is often transient, though recurring 500s are worth investigating. Other 5## codes can appear when your website is overloaded or is undergoing maintenance.
Now, let’s pay attention to a very important 4## error code.
If a page returns a 404, search engine robots will not index that page.
Okay, so what’s the problem?
The problem is, there’s a thing called the “soft 404”. The “soft 404” means that the page is still available and does not show the 404 error message, but the content of that page has been removed. This is a common occurrence in e-commerce and real estate sites where products and listings are out-of-stock or have been removed.
When you have multiple “soft 404” pages within your website, search engine robots deem those pages duplicates. Duplicates within the same website earn you search engine penalties. Ultimately, your website’s authority takes a hit and your SEO suffers.
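During an audit, a quick heuristic for flagging likely “soft 404” pages is to look for responses that return status 200 but whose body reads like a removal notice. This is a rough sketch with an invented phrase list, not a definitive detector:

```python
def looks_like_soft_404(status_code: int, body: str) -> bool:
    """Heuristically flag a likely "soft 404".

    A soft 404 returns HTTP 200 (the page loads fine) while its content
    says the item is gone. The phrase list below is an illustrative
    assumption; tune it to your own site's wording.
    """
    if status_code != 200:
        return False  # a real 404/500 is not a *soft* 404
    phrases = (
        "not found",
        "no longer available",
        "page does not exist",
        "out of stock",
    )
    text = body.lower()
    return any(phrase in text for phrase in phrases)
```

You would feed this the status code and HTML of each URL from your sitemap; pages it flags deserve a manual look.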
How to manage error codes so that their appearance does not severely affect user experience and SEO:
✔ Do a regular sitewide audit to discover any “soft 404”. You can do this manually or with the help of online tools such as the Google Search Console.
✔ Design 404 pages to reflect the DNA of your business/website aesthetic. This ensures a seamless experience for the users, even for an error page.
✔ Alternatively, you could engineer the error 404 page to redirect users to your homepage. This will create less confusion than displaying a basic 404 error page and your users can navigate other pages on your website.
There you go. Now you know some SEO best practices for your website. Remember: SEO is free but slow to bear fruit – once you get it going, though, it’s there to stay.
Here at Timedoor, we offer website development service with SEO optimization for businesses – such as yours!
Hit us up at this link to get the best consultation and service for all of your website development needs.