What is SEO?

Search engine optimization (SEO) is a crucial aspect of any online business strategy. With billions of searches conducted on search engines like Google every day, having a strong online presence is essential to attract and retain users, increase brand awareness, and drive more traffic to your website. In this article, we will cover the basics of SEO, its importance, how search engines work, and some best practices to help improve your website's ranking and visibility.

Understanding the Role of SEO in Digital Marketing

Search engine optimization (SEO) is a critical component of digital marketing. It involves improving your website to increase its visibility in search engine results pages (SERPs), making it easier for potential customers to find your products, services, or information. SEO is part of search engine marketing (SEM), which combines both organic (SEO) and paid (PPC) strategies to drive traffic and visibility from search engines.

How Search Engines Work: Crawlers, Indexing, and Ranking

Search engines like Google and Bing use crawlers, also known as bots or spiders, to gather information about web content. These crawlers start from a known webpage and follow internal and external links to other pages, helping them understand the content and how it's connected to other pages in their vast index. When users enter a query, search engines use complex algorithms to provide the most relevant and accurate results, taking into account numerous factors that are constantly evolving.

The Importance of SEO for Online Businesses

SEO is essential for businesses looking to harness the potential of search engine traffic. By optimizing your website and targeting specific keywords relevant to your audience, you can rank higher in SERPs and attract more visitors. Studies have shown that higher-ranked pages receive significantly more clicks than lower-ranked ones, with the top three organic results getting over 50% of all clicks. SEO, therefore, plays a crucial role in driving traffic, increasing brand awareness, and ultimately boosting your bottom line.

Focusing on User Experience and Google's Priorities

Creating an optimized website is not just about catering to search engine algorithms; it's also about providing an exceptional user experience. Google aims to deliver the best search experience possible by providing relevant, high-quality results as quickly as possible. By focusing on improving your site's user experience, you'll also be aligning your website with Google's goals and increasing your chances of ranking higher in search results.

The Anatomy of Search Engine Results Pages (SERPs)

SERPs typically consist of paid search results (PPC ads) and organic search results, which are ranked based on their relevance and quality. Depending on the query, Google may also include additional elements like maps, images, or videos. SEO aims to improve your website's ranking in organic search results, which can be a powerful and cost-effective way to drive traffic to your site.

Best Practices for Effective SEO

To make the most of your SEO efforts, it's essential to follow best practices that cater to both search engines and users. This includes creating high-quality, unique content that is relevant to your target audience, optimizing your website for both desktop and mobile devices, and ensuring that your site is easy to navigate and user-friendly. By implementing these strategies, you'll be well on your way to improving your website's visibility and performance in organic search results.

SEO is a vital aspect of any online marketing strategy, helping businesses improve their website's visibility and user experience while driving more traffic and potential customers. By understanding the basics of SEO, how search engines work, and the importance of user experience, you can better optimize your website and stay ahead of the competition. Remember, the key to successful SEO lies in continuous learning, adaptation, and implementation of best practices.

What is Technical SEO?

Technical SEO is the marketing discipline of removing technical barriers within your website so search engines can crawl, index and return your relevant content for their users' search queries. Websites are created in many ways, from bespoke hand-built and coded sites through to a wide variety of open-source and paid Content Management Systems (CMS). All are coded in varying languages, but all have the fundamental HTML, CSS and JavaScript at their core. These languages, and a lot more besides, are the usual cause of crawling and indexing issues, making it harder for search engines to crawl, index and rank your content.

Diagnosis of these issues begins with a Technical SEO audit. Whether it marks the start of an SEO campaign or is commissioned to root out why content is not being discovered, a Technical SEO audit gets to the bottom of the problem and provides the solutions and recommendations to address it.

What Does a Typical Technical SEO Audit Look at?

XML Sitemap - Mobile (if applicable)

A mobile XML sitemap is a file that lists the URLs for the mobile pages of a website, allowing search engines to more easily discover and crawl those pages. Its purpose is to ensure that search engines are able to access and index the mobile version of a website's content. This can be especially important for websites that have a separate mobile version, as it helps search engines understand the relationship between the desktop and mobile versions of the site and can improve the website's ranking in mobile search results.
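
For illustration, a minimal mobile sitemap might look like the sketch below, using Google's (now legacy) mobile sitemap namespace; all URLs are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
      <url>
        <loc>https://m.example.com/page</loc>
        <mobile:mobile/>
      </url>
    </urlset>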

Mobile XML Sitemap(s) Listed in Robots.txt

The robots.txt file is a text file that tells search engines which pages or files they should not crawl, but it can also declare the location of a site's XML sitemaps via the Sitemap directive. Listing the mobile XML sitemap in robots.txt helps search engines find and crawl the mobile pages automatically, without the sitemap having to be submitted through tools such as Google Search Console, so this check confirms the mobile sitemap is referenced there.
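
For illustration, a robots.txt that blocks one directory from crawling and declares the mobile sitemap might look like this (the path and URL are placeholders):

    User-agent: *
    Disallow: /admin/

    Sitemap: https://m.example.com/sitemap-mobile.xml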

HTML Sitemap

An HTML sitemap is a type of sitemap that is written in HTML (Hypertext Markup Language) and provides a list of pages on a website. HTML sitemaps are typically used to help users navigate a website and find the information they are looking for. They can be especially useful for websites with a large number of pages or complex site structures, as they provide an overview of the content available on the site and allow users to easily access different pages. HTML sitemaps can be linked to from the website's footer or other prominent locations, and they are often organized into categories or subcategories to make it easier for users to find what they are looking for.
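
As a simple sketch, an HTML sitemap is just an ordinary page of categorised links (URLs and labels below are placeholders):

    <nav aria-label="HTML sitemap">
      <h2>Products</h2>
      <ul>
        <li><a href="/products/widgets/">Widgets</a></li>
        <li><a href="/products/gadgets/">Gadgets</a></li>
      </ul>
      <h2>Support</h2>
      <ul>
        <li><a href="/support/contact/">Contact us</a></li>
      </ul>
    </nav>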

Found in Google Images?

Found in Google Images refers to whether the images on a website can be found when searching on Google Images. Google Images is a search service owned by Google that allows users to search the Web for image content. When a user searches for an image on Google Images, the search results will include images from a variety of sources, including websites. If a website's images can be found in Google Images, it means that the images are indexed by Google and are being included in the search results. This can be beneficial for the website, as it can drive traffic to the site from users who are looking for images related to the website's content.

XML Sitemap - Images

An XML sitemap is a file that lists the URLs for the pages of a website and helps search engines discover and crawl those pages. An XML sitemap for images is similar to a regular XML sitemap, but it specifically lists the URLs for image pages on the website. By creating an XML sitemap for images, a website can help search engines understand the relationship between the images on the site and the pages that they are used on, which can improve the website's visibility in image search results.
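
A minimal image sitemap entry, using Google's image sitemap namespace, might look like this (URLs are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <loc>https://www.example.com/blue-widget</loc>
        <image:image>
          <image:loc>https://www.example.com/images/blue-widget.jpg</image:loc>
        </image:image>
      </url>
    </urlset>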

Image XML Sitemap(s) Listed in Robots.txt

As well as controlling what should not be crawled, the robots.txt file can declare the location of an image XML sitemap via the Sitemap directive. Listing it there helps search engines discover and crawl the image URLs in the sitemap, so this check confirms the image sitemap is referenced in robots.txt.

Videos

This check looks at whether videos are present on a website, meaning one or more video files are hosted on the site. Videos can be a useful form of content for a website, as they can engage and entertain users, and they can also help to improve the website's visibility in search results. However, it is important for website owners to ensure that their videos are optimized for search engines and are easily accessible to users. This can involve creating transcripts or captions for the videos, providing detailed descriptions, and using appropriate tags and titles.

XML Sitemap - Videos

An XML sitemap is a file that lists the URLs for the pages of a website and helps search engines discover and crawl those pages. An XML sitemap for videos is similar to a regular XML sitemap, but it specifically lists the URLs for video pages on the website. By creating an XML sitemap for videos, a website can help search engines understand the relationship between the videos on the site and the pages that they are used on, which can improve the website's visibility in video search results.
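
A sketch of a single video sitemap entry, using Google's video sitemap namespace (URLs and text are placeholders):

    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
      <url>
        <loc>https://www.example.com/widget-demo</loc>
        <video:video>
          <video:thumbnail_loc>https://www.example.com/thumbs/demo.jpg</video:thumbnail_loc>
          <video:title>Widget demo</video:title>
          <video:description>A two-minute demonstration of the widget.</video:description>
          <video:content_loc>https://www.example.com/videos/demo.mp4</video:content_loc>
        </video:video>
      </url>
    </urlset>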

Video XML Sitemap(s) Listed in Robots.txt

Likewise, the robots.txt file can declare the location of a video XML sitemap via the Sitemap directive. Listing it there helps search engines discover and crawl the video URLs in the sitemap, so this check confirms the video sitemap is referenced in robots.txt.

XML Sitemap - Pages

An XML sitemap is a file that lists the URLs for the pages of a website and helps search engines discover and crawl those pages. A page XML sitemap is the standard form of sitemap: it lists the website's HTML pages, as distinct from the image, video and news sitemaps described above. By creating one, a website can help search engines understand the site's structure and the content its pages contain, which can improve the website's visibility in search results.

Page XML Sitemap(s) Listed in Robots.txt

A robots.txt file is a text file that is used to instruct web robots (also known as "spiders" or "crawlers") about which pages on a website to crawl and which pages to ignore. If a page XML sitemap is listed in the robots.txt file, it means that the sitemap is available for the web robots to crawl and use as a reference when indexing the pages on the website. The sitemap can help the web robots discover pages on the website that might not be linked to from any other pages or that might be difficult to find by following links.

Page XML Sitemap with Broken Links

Broken links can occur for a number of reasons. The page that the link points to might have been deleted or moved, or the link might have been typed incorrectly. When a web robot crawls a sitemap and encounters a broken link, it will be unable to access the page and will report an error. This can make it more difficult for search engines to index the pages on the website and can negatively impact the website's search engine rankings.

XML Sitemap - News/Blog

An XML sitemap for a news or blog website can be especially useful for search engines, as it allows them to more easily discover and index new articles as they are published. The sitemap can also include information about the articles, such as the publication date, title, and author.
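
A sketch of a news sitemap entry, using Google's news sitemap namespace (publication details are placeholders):

    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
      <url>
        <loc>https://www.example.com/news/widget-launch</loc>
        <news:news>
          <news:publication>
            <news:name>Example News</news:name>
            <news:language>en</news:language>
          </news:publication>
          <news:publication_date>2023-01-04</news:publication_date>
          <news:title>Widget Launch Announced</news:title>
        </news:news>
      </url>
    </urlset>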

Found subdomains?

When analyzing a website, we might discover that there are subdomains associated with the main domain. This can occur for a variety of reasons. For example, a company might use subdomains to host different parts of their website, such as a blog, a store, or a support center. Alternatively, a company might use subdomains to target specific regions or languages.

Subdomain XML sitemaps

A subdomain XML sitemap is a sitemap that specifically lists the URLs for a subdomain. It is used to tell search engines about the pages on a subdomain that they should know about and can help them discover and index the pages on the subdomain.

Subdomain robots.txt

A subdomain robots.txt file is a robots.txt file that is specific to a subdomain. It is used to instruct web robots about which pages on the subdomain to crawl and which pages to ignore.

Subdomain Indexation Rate

The indexation rate of a subdomain refers to the percentage of pages on the subdomain that are indexed by search engines. When a page is indexed, it means that it is included in the search engine's database and can be returned as a result for relevant searches.

Robots.txt (root) condition and best practices

A general check on the current index status of files, pages and directories

Crawl vs. Index Status

It is important to understand the difference between crawl and index status because they can have a big impact on how a website is discovered and accessed by users. For example, if a page has a low crawl rate, it might not be discovered by search engines and therefore will not be indexed. On the other hand, if a page has a high crawl rate but is not being indexed, it might be because the page is not considered to be high quality or relevant by the search engine.

Breadcrumb Presence (and links), Visual Representation, & Availability

A general check on the presence, technical implementation and best practice optimisation

Breadcrumbs NOT in JS

It is important to consider the accessibility of your breadcrumbs when implementing them. Using HTML and CSS to create breadcrumbs can make them more accessible to users with disabilities, as they will be able to access the breadcrumbs even if they have JavaScript disabled.
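
A minimal plain-HTML breadcrumb, which works even with JavaScript disabled, might look like this (labels and URLs are placeholders):

    <nav aria-label="Breadcrumb">
      <ol>
        <li><a href="/">Home</a></li>
        <li><a href="/products/">Products</a></li>
        <li aria-current="page">Blue Widget</li>
      </ol>
    </nav>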

Links coded in HTML (NOT JavaScript)

It is generally best to use HTML to create links, as they are simple, lightweight, and widely supported by web browsers. HTML links are also more accessible to users with disabilities, as they do not require JavaScript to work.

Server Uptime

Server uptime is important because it directly affects the availability and reliability of the services that the server is hosting. If a server has a low uptime, it can lead to downtime for the services, which can be frustrating for users and can have a negative impact on the business or organization that is relying on the server.

Image ALT Attributes

ALT attributes, also known as "ALT tags" or "ALT descriptions," are a piece of HTML code that is used to describe the content of an image on a web page. The ALT attribute is added to the IMG tag in the HTML code for the page and is intended to provide a text alternative for the image.
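
For example (the file name and description are placeholders):

    <img src="/images/blue-widget.jpg" alt="Blue widget viewed from the front" width="600" height="400">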

Image file size optimization

Image file size optimization refers to the process of reducing the size of image files without sacrificing quality.

Image redirects

An image redirect is a type of redirect that specifically involves an image file. A redirect is a way to send users and search engines to a different URL than the one they originally requested.

Video detail page and URL

A video detail page is a web page that is specifically designed to display information about a particular video. This information can include the title of the video, a description, the duration, the date it was published, and other details.

General crawl behaviors

General crawl behaviors refer to the way that web robots (also known as "spiders" or "crawlers") interact with and discover the pages on a website. Crawl behaviors are important because they can have a big impact on how a website is discovered and indexed by search engines, and ultimately, how it is ranked in search results.

Multimedia Issues (general accessibility & Indexing)

It is important to ensure that all multimedia content is properly coded and optimized and that it is designed to be accessible to all users. This can help improve the user experience on the website and can also help improve the website's visibility in search results.

Duplicate content

Duplicate content refers to content that is identical or substantially similar to content that is already published on the internet. Duplicate content can be a problem because it can confuse search engines and make it more difficult for them to determine which version of the content is the most relevant and should be ranked higher in search results.

Proper use of canonical tags

Canonical tags are a type of HTML tag that is used to specify the "canonical" or preferred version of a web page. They are used to help search engines understand which version of a page should be indexed and ranked in search results, and which versions of the page are duplicate or substantially similar.
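
As a sketch, a duplicate URL (such as a filtered or tracking-parameter version of a page) would point search engines at the preferred version with a single tag in its head (URL is a placeholder):

    <link rel="canonical" href="https://www.example.com/blue-widget">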

Faceted navigation optimization

Faceted navigation optimization refers to the process of improving the performance and effectiveness of a faceted navigation system. This can involve a number of techniques, such as controlling which filter combinations search engines are allowed to crawl, canonicalising filtered URLs to their parent category, and blocking low-value parameter URLs from being indexed.

Hyphens used as default delimiter in URLs

Hyphens are often used as the default delimiter in URLs because they are easy to read and understand and, unlike underscores, search engines treat them as word separators.

Use of splash or gateway pages

In general, it is best to avoid using splash pages unless they serve a specific and necessary purpose. Instead, you should focus on providing users with direct and easy access to the content that they are interested in.

Architecture Issues

Architecture issues refer to problems with the structure and organization of a website. These issues can have a negative impact on the performance and usability of the website, as well as on its ranking in search results.

Site accessible without JavaScript

While JavaScript can be an important part of a website, it is not essential for all websites. Some users may have JavaScript disabled in their web browser for security or performance reasons, and it is important to ensure that these users can still access and use the website.

TLM Reflects Order in Terms of Relevance of Content

It is important that the TLM elements of a page (its title, link text and meta description) reflect the order of relevance of the content on the page. This means that the title, link text and meta description should accurately and concisely describe the main topics and themes of the page and should prioritize the most important information.

Overall URL optimization (usage of target keywords)

It is important to use target keywords in the URL in a natural and relevant way. Stuffing a URL with irrelevant or spammy keywords can harm the visibility and ranking of the page in search results. It is also important to keep URLs short and concise, as longer URLs can be more difficult to read and understand.

Absolute vs Relative URL References in source code

It is generally best to use relative URLs whenever possible, as they are more efficient and can be more portable. Absolute URLs should be used only when necessary, such as when linking to resources on other websites.
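
For example (both links resolve to the same page; URLs are placeholders):

    <!-- Relative reference: resolved against the current page's location -->
    <a href="/products/blue-widget">Blue widget</a>

    <!-- Absolute reference: the full URL, required when linking to other sites -->
    <a href="https://www.example.com/products/blue-widget">Blue widget</a>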

Footer optimization

Footer optimization refers to the process of improving the design and content of the footer area of a website in order to make it more user-friendly and to improve the overall performance and effectiveness of the website.

Use of HTTPS vs HTTP

In general, it is recommended to use HTTPS whenever possible, especially for websites that handle sensitive information or that require a high level of security. Using HTTPS can help protect the data of your users and can also improve the trust and credibility of your website.

Broken redirects

A broken redirect is a redirect that leads to an error page or a page that no longer exists. This can happen when the destination page of a redirect has been deleted or moved, but the redirect itself has not been updated to point to the new location. Broken redirects can be a problem because they can lead to a poor user experience and can also cause issues with search engine optimization. If a search engine follows a broken redirect, it may not be able to find the intended destination page, which can result in the page not being indexed and ranked appropriately in search results. To avoid broken redirects, it's important to regularly check and update any redirects to ensure that they are pointing to the correct destination pages.

Internal link ecosystem

An internal link ecosystem is a network of links within a website that connect pages within the same website. These links can help to organize the content on the website, making it easier for users to navigate, and they can also help search engines understand the structure and hierarchy of the site. Internal links are especially important for large websites, as they can help distribute link equity (a measure of the value of a link) throughout the site, helping to improve the overall search engine ranking of the site.

Article timestamp and author link (if applicable)

An article timestamp is a date and time that is associated with a particular article or piece of content. The timestamp typically indicates when the article was published or last updated, and it can be displayed in a variety of formats, such as "January 4, 2023" or "4 Jan 2023."
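
In HTML, a machine-readable timestamp and author link might be marked up like this (names and dates are placeholders):

    <article>
      <h1>Widget Launch Announced</h1>
      <p>By <a href="/authors/jane-doe" rel="author">Jane Doe</a>
         on <time datetime="2023-01-04">4 Jan 2023</time></p>
    </article>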

Pagination optimization

Pagination optimization is the process of improving the performance and user experience of paginated pages on a website.

Custom 404 Page

Custom 404 pages are useful because they allow you to provide a more user-friendly experience for your visitors when they encounter a broken link or mistype a URL. A custom 404 page can include a friendly message explaining the error, as well as links to other pages on your site that the user might be interested in.

Odd parameter indexing - Alphanumericals

Odd parameter indexing refers to unnecessary or irrelevant parameters appearing in indexed URLs. Sometimes this is a deliberate attempt to improve the search engine ranking of a page, but it can also arise from the way a site has been engineered and may be completely unintentional.

403 Server Response

A 403 server response indicates that access to content is "forbidden". Sometimes this is intentional, sometimes it isn't; we look at each case on its merits.

Use of Temporary Redirects (302, 307)

Temporary redirects are intended for content that really is temporary in nature, whether it is to be deleted or redirected elsewhere in the future. Sometimes, however, they are used in place of what should be a more permanent (301) redirect. We'll look at each URL in turn to determine why it has been implemented.

Redirect Chains

Redirect chains are a stacking of redirects that accumulates as content is moved. If you have had several site architecture redesigns over a period of time, a single URL can end up passing through multiple redirects; we'll look at each in turn and recommend best practice.

Directive rel="Noindex"

The rel="noindex" is a directive for search engine crawlers not to index content, sometimes this may be intentional, it can also be accidental, we will advise and recommend on a URL by URL basis

Use of meta refreshes

A meta refresh is often used to redirect users when engineering does not offer the ability to use other redirection rules. It can also be used to alert users that they are about to be redirected, displaying a message before redirecting them after a set number of seconds.
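
For example, this tag (placed in the page's head) would show the page for five seconds before redirecting (URL is a placeholder):

    <meta http-equiv="refresh" content="5; url=https://www.example.com/new-page">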

Homepage Root Redirect (www / non-www)

A homepage root redirect (also known as a "www / non-www" redirect) is a type of redirect that is used to ensure that users always access the same version of your website's homepage, regardless of whether they include "www" in the URL or not.
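
One common way to implement this on an Apache server is with mod_rewrite rules such as the sketch below (the domain is a placeholder; other servers have their own equivalents):

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]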

Flash avoidance

Adobe Flash was a popular platform for building interactive websites, games, and video content, but it was officially discontinued at the end of 2020, and modern web browsers no longer support it.

Use of frames

Frames were once a popular way to divide a webpage into multiple sections, each of which could load a separate HTML document. The idea behind frames was to allow developers to create more complex layouts and to reuse common elements (such as headers and footers) across multiple pages.

Content pulled in via iFrames

An iFrame (short for "inline frame") is an HTML element that allows you to embed another webpage within the current webpage. iFrames are commonly used to embed videos, maps, and other types of interactive content on a webpage.
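
For example (the embed URL is a placeholder):

    <iframe src="https://maps.example.com/embed?location=london"
            width="600" height="450" title="Map of our London office"
            loading="lazy"></iframe>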

Use of sessions IDs or other unique identifiers

A session ID (also known as a "session token") is a unique identifier that is used to identify a user's session on a website. Session IDs are typically stored in a cookie on the user's computer, and they are used to keep track of the user's activity as they navigate from page to page on the site.

Use of the rel="canonical" tag

The rel="canonical" tag is an HTML attribute that is used to specify the "canonical" or preferred version of a webpage. The canonical version of a page is the one that you want search engines to treat as the authoritative version of the page.

Use of meta robots tag (X-Robots)

The meta robots tag is an HTML tag that can be used to control how search engines index and crawl a webpage. Its HTTP header equivalent, the X-Robots-Tag, applies the same directives to non-HTML files such as PDFs and images.
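
For illustration, a page-level directive and its header equivalent might look like this:

    <!-- In the <head> of an HTML page -->
    <meta name="robots" content="noindex, nofollow">

    <!-- For non-HTML files, the server can send the equivalent HTTP response header:
         X-Robots-Tag: noindex -->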

Client's business targets: multiple countries or languages?

Language and localization: If the business targets multiple languages, it is important to ensure that the website is translated accurately and that it is easy for users to switch between languages.

Use of Multiple ccTLDS

A ccTLD (country code top-level domain) is a domain extension that is specific to a particular country or territory. For example, ".ca" is the ccTLD for Canada, ".co.uk" is the ccTLD for the United Kingdom, and ".de" is the ccTLD for Germany.

Lang Tags (Rel="Alternate" Hreflang="x")

The HREFLANG attribute is an HTML attribute that helps search engines understand the language and intended audience of a webpage. It tells search engines which language the webpage is written in and which region it is intended for. By using the HREFLANG attribute, you can help search engines properly index and serve the correct version of your webpage to users based on their language and location.
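
A sketch of hreflang annotations in the head of a page, with placeholder URLs; note that each language version should list all the others as well as itself:

    <link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/">
    <link rel="alternate" hreflang="de" href="https://www.example.com/de/">
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/">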

Alternate language directive set in page XML sitemap

In addition to using the HREFLANG attribute in the head of your HTML source code, you can also specify the language and intended audience of your webpages in your XML sitemap. An XML sitemap is a file that lists all the URLs on your website and provides additional information about each URL, such as when it was last updated and how often it changes.
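
In a sitemap, the same annotations are expressed with xhtml:link elements (URLs are placeholders):

    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <url>
        <loc>https://www.example.com/en-gb/</loc>
        <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/"/>
        <xhtml:link rel="alternate" hreflang="de" href="https://www.example.com/de/"/>
      </url>
    </urlset>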

Page Speed Review - Avg. Speed Score (ms)

The speed at which a web page loads can have a significant impact on its search engine rankings. Search engines like Google place a high value on user experience, and slow-loading pages can negatively affect the user experience. As a result, pages that load slowly may rank lower in search engine results.

Time-To-First-Byte (TTFB)

TTFB stands for "Time to First Byte." It is a measure of how long it takes for a web server to send the first byte of data in response to a request from a client, such as a web browser. TTFB is an important metric for evaluating the performance of a web server, as it can have a significant impact on the overall loading time of a web page.
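
As a rough spot-check, curl can report this timing from the command line (the URL is a placeholder):

    curl -s -o /dev/null -w "TTFB: %{time_starttransfer}s (connect: %{time_connect}s)\n" https://www.example.com/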

Download Time

Download time is the amount of time it takes for a web page or other content to be downloaded from a server to a client, such as a web browser. Download time is an important factor in the user experience, as it can affect how quickly a user is able to access and interact with a web page.

Google PageSpeed Insights (desktop) - home page, landing page

Google PageSpeed Insights is a tool that provides performance information about a web page, with a focus on how the page can be improved to load faster. When you run a PageSpeed Insights report, the tool will analyze the page and provide a score based on how well the page performs on both desktop and mobile devices.

Optimized title tags per page

Title tags are HTML elements that specify the title of a web page. They are displayed in the title bar of the web browser and are also used by search engines as a ranking factor in search results.

Page Headings (H1) Issues

Using relevant and descriptive H1 tags can help improve the SEO of a webpage by making it easier for search engines to understand the content and context of the page.

Optimized meta description tags per page

Meta description tags provide a summary of the content on a webpage. They appear in search engine results and can help influence a user's decision to click on a result.
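
As a sketch, an optimised title and meta description pair for a page might look like this (all wording is a placeholder):

    <head>
      <title>Blue Widgets | Example Widget Co</title>
      <meta name="description" content="Hand-built blue widgets with free delivery. Browse the full range and order online.">
    </head>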

Content Readability ("readable chunks")

Content readability refers to the ease with which a reader can understand the information presented in a piece of text. There are many factors that can affect readability, including the length and complexity of sentences, the use of technical terms and jargon, and the overall structure and organization of the content.

What Does a Typical SEO Strategy Look Like?

  1. Conduct keyword research: Identifying the keywords and phrases that potential customers are using to search for your products or services is crucial for creating an effective SEO strategy. By using tools such as Google Keyword Planner and SEMrush, you can determine which keywords are most relevant to your business and have the highest search volume. This information can then be used to optimize your website's content and improve its search engine visibility.
  2. Optimize website content: Once you've identified the keywords that are most relevant to your business, you can use them to optimize your website's content. This includes things like headlines, meta tags, and body text. By incorporating these keywords in a natural and relevant way, you can help search engines understand what your website is about and make it more likely to appear in relevant search results.
  3. Build backlinks: Backlinks are links from other websites that point to your website. These links are important for improving your website's domain authority and search engine ranking. By acquiring backlinks from high-authority websites, you can increase the credibility of your website and make it more likely to rank well in search results.
  4. Use social media: Social media can be a powerful tool for driving traffic to your website and building brand awareness. By promoting your website and its content on social media platforms like Facebook, Twitter, and Instagram, you can reach a wider audience and encourage more people to visit your website.
  5. Monitor and measure: In order to know if your SEO strategy is effective, it is important to monitor and measure your website's traffic and search engine rankings. Tools such as Google Analytics can provide valuable insights into how people are finding your website, what pages they're visiting, and how long they're staying. This information can be used to make adjustments to your strategy and improve your website's performance.
  6. Local SEO: If your business serves a specific geographic area, it's important to optimize your website for local search. This can be done by including your business's address, phone number, and business hours on your website, and by getting listed on local directories such as Google My Business. By doing so, you increase the chances of appearing in relevant local search results.
  7. Optimize website structure: The structure of your website is important for both search engines and users. A well-structured website with clear URLs, easy navigation and internal linking, makes it easier for search engines to crawl and index your website, and also improves user experience.
  8. Create quality content: Creating high-quality, informative, and engaging content is key to establish your website as an authority in your industry. By consistently producing valuable content, you will attract links and social shares, both important ranking factors for search engines.
  9. Mobile optimization: More and more people are accessing the internet on their mobile devices, so it's important to make sure your website is optimized for mobile. This means that your website should load quickly and be easy to navigate on all devices. By optimizing for mobile, you can improve the user experience and increase the chances of appearing in mobile search results.
  10. Stay up-to-date: The world of SEO is constantly evolving, with new trends and algorithm updates being introduced on a regular basis. In order to stay ahead of the competition, it is important to stay up-to-date with the latest developments and adjust your strategy accordingly. This can be achieved by subscribing to industry blogs, attending SEO conferences and keeping track of search engine announcements.

User Experience

Fundamental to every website is how easy it is to use and whether it meets the search intent that brought visitors to it. That means creating easy-to-use websites with an information architecture that makes sense, so that within seconds of landing on any page visitors understand exactly what you offer, whether they arrive from desktop or mobile search.

Quality Content

Satisfying the intent of a search query begins with the quality of content you present, whether this is branded, informational or transactional content.

  • Branded search query - Already familiar with your brand
  • Informational search query - Research before making a decision
  • Transactional search query - Ready to buy or inquire about your service

Every visitor arriving at your site via the Organic traffic channel brings one or a combination of those search query types. Satisfying that intent with content that represents what you offer is the key not only to ranking better but also to satisfying the intent of the other channels that drive traffic to your website, such as Paid Search, Referral Traffic, Social Media and Email Marketing.

How do our SEO campaigns help?

After we get to know why your website exists, we set clear goals within your Google Analytics account (unless they are there already). We then work backwards from those goals to create a definitive SEO campaign strategy and the tactics to deliver it.

In developing an Organic search traffic marketing strategy, we use different techniques to optimise existing content, remove any Technical SEO barriers and create content or assets that attract links.