
Comprehensive On-Page Audit Checklist

Getting ready to audit your site's SEO? You're set to start improving your SEO and discovering the potential customers your website could be reaching.

Every site has weaknesses. The good news is that every problem you find is an opportunity to improve. As an SEO, the point of auditing your site is to find ways to get more out of it, and well-optimized on-page content is a prerequisite for any good SEO strategy.


SET UP TECHNICAL SITE MONITORING AND GATHER AUDITING TOOLSET

Before you start auditing, make sure your website is set up with a few essential tracking tools (such as Google Search Console) so you can monitor and diagnose the technical aspects of your site over the long term.

These tools need to be set up and verified, which is an easy task. It's recommended that you complete setup and verification for both the desktop and mobile versions of your site.

Additionally, consider setting up and getting acquainted with a few crawling and auditing tools, which are especially helpful for a big website with several features and functionalities.

These tools cover most purposes; other, more specialized tools that fulfill specific parts of the checklist are referenced throughout it below.


FIND AND ELIMINATE SEARCH ENGINE CRAWL BARRIERS

Crawl barriers are anything on your site that hinders search engines from crawling and indexing your content into Google's search results.

Removing crawl obstacles from your site is an important SEO step whether or not the site currently shows symptoms.

Even if your website has beautiful content, if you are directly or indirectly preventing search engines from crawling and indexing it, that content will not appear in results until the obstacles are removed.

The most frequent and impactful issues are covered below.

Robots.txt file disallows

Avoid blocking any directories or pages that you shouldn't via your site's /robots.txt file. Disallow directives in this file tell Google which parts of the site it may not crawl.

For instance, if you are blocking a key area, you need to remove the offending directive for crawling to proceed smoothly.
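As a hypothetical illustration (the paths here are made up), a simple robots.txt might look like this:

    # Allow all crawlers, but keep a private area out of the crawl
    User-agent: *
    Disallow: /admin/
    # If a Disallow line covers a key content area, remove it

    Sitemap: https://example.com/sitemap.xml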

The best way to test this is with Google's robots.txt Tester within Google Search Console.


Meta Noindex, Nofollow tag

  • Be cautious when using a meta noindex, nofollow tag within a page's HTML head section, especially on pages that are supposed to be indexed. The noindex, nofollow directive tells Google not to index the URL or follow any of its links (see the example after this list).
  • If you're using this tag on pages that should be indexed, remove it so your site's content can appear in search results.
  • You can check this with Screaming Frog and/or Deep Crawl.
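For reference, the tag in question sits in the page's head section and looks like this (shown purely for illustration):

    <head>
      <!-- Tells search engines: do not index this page, do not follow its links.
           Remove this tag from any page that should appear in search results. -->
      <meta name="robots" content="noindex, nofollow">
    </head>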

Internal linking / breadcrumbs

  • Make sure that you are linking to all pages that need to be indexed, whether via navigation, contextual internal links, or breadcrumbs. Search engines crawl the links on your site to discover content, so if you're not linking to something, they can't find it.
  • If not, maybe you need to reconsider your site’s architecture and/or look for opportunities to link to key pages contextually. The most important pages on your site should be no more than a few clicks away from the homepage.
  • You can check this with Screaming Frog and/or Deep Crawl.

Rel=Nofollow tags

  • Ensure that your site's important links don't carry a rel="nofollow" attribute, which tells search engines not to follow the link (see the example below).
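For illustration (the URL is hypothetical), compare a nofollowed link with a normal one:

    <!-- Search engines are told not to follow this link -->
    <a href="https://example.com/key-page/" rel="nofollow">Key page</a>

    <!-- Remove rel="nofollow" from important links so they pass equity -->
    <a href="https://example.com/key-page/">Key page</a>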

XML Sitemaps

  • Every site must have an XML sitemap.
  • Every site's XML sitemap must be submitted to search engines.
  • Every XML sitemap should be referenced from the site's robots.txt file.
  • XML sitemaps must contain only valid URLs (i.e. URLs that don't return an error).
  • Each XML sitemap must contain fewer than 50,000 URLs and be no larger than 50 MB uncompressed (a minimal example follows this list).
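A minimal sketch of a valid XML sitemap (the URLs and date are hypothetical):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2017-09-08</lastmod>
      </url>
      <url>
        <loc>https://example.com/on-page-seo/</loc>
      </url>
    </urlset>

Reference it from robots.txt with a single line: Sitemap: https://example.com/sitemap.xml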

Javascript, AJAX, iFrames, or Other Accessibility Issues

If you're using JavaScript, AJAX, iFrames, or other code configurations, make sure they aren't preventing your navigation and links from being crawled.

Here are some other potential issues:

  • Blocked resources: Are you blocking your site’s CSS and JS files from being crawled? If so, here’s why you shouldn’t. You can see this in Google Search Console.
  • Broken links or web crawl errors at key access points: Does your site have any broken links within key access points of navigation such as main or secondary navigation? You can check these in Screaming Frog, Deep Crawl, or in Google Search Console’s Web Crawl Errors report.
  • Pages restricted by cookies, logins, or forms: Is any key content hidden behind cookies, logins, or forms?
  • URL parameter handling in Google Search Console: Did you accidentally set the wrong setting for your parameters in Google Search Console?
  • Page rendering: Is Google having trouble rendering your site's content? You can track this using the Fetch and Render tool within Google Search Console.

Any one of these problems can affect your site's ability to be crawled and indexed, so it's crucial to keep tabs on them over the long term.


ENSURE PROPER HTTP STATUS CODES ARE SERVED

Apart from removing crawl barriers, making sure you send the right HTTP status codes to search engines on a page-by-page, resource-by-resource basis is crucial for crawling and indexing to happen correctly.

200 “Okay” Status

All live assets, including HTML pages, CSS and JS files, images, etc., should serve a 200 "Okay" status to search engines. This status tells search engines that the URL or file in question is live and valid.

301 and 302 Redirects

Fundamentally, there are two basic types of redirects in play: the 301 and the 302.

A 301 "Permanent" redirect tells a search engine that the original URL has moved permanently to a new address and will not be coming back.

A 302 "Temporary" redirect tells a search engine that the original URL has moved temporarily to a new location, and that the original URL will be back in use in the near future.

Follow these redirect guidelines to meet SEO best practices (a sample server configuration follows the list):

  • Always use a 301 redirect when a page is moving permanently.
  • Use a 302 redirect only when the original page will come back into use.
  • Avoid redirect chains (e.g. page A > page B > page C), a sequence of redirects with several "hops" before reaching the destination.
  • Avoid redirect loops (e.g. page A > page B > page A), a sequence of redirects that loops infinitely.
  • Avoid meta refresh redirects; use 301 redirects instead.
  • Wherever possible, update internal links that point to redirected URLs so they reference the new location directly; this improves crawl efficiency.
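As a sketch of the first two guidelines, assuming an Apache server with mod_alias enabled (all paths are hypothetical):

    # Permanent move: the old URL is gone for good
    Redirect 301 /old-page/ https://example.com/new-page/

    # Temporary move: the original URL will come back into use
    Redirect 302 /sale/ https://example.com/holiday-sale/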

404 “Not Found” Status

All pages or files that are no longer valid must return a 404 "Not Found" status code. 404 errors most often occur when a page that was once live is deleted without being redirected to a new location.

You can find your site's 404 errors either by running a site crawl in Screaming Frog or Deep Crawl, or by checking the Crawl Errors report in Google Search Console.

Google has said that 404 errors don't hurt your site, and while that's true, technical and process issues around page retirement can still create a bad user experience if they aren't handled correctly.

Also, if a high-value page is deleted, failing to redirect it to a relevant counterpart means the page's historical equity and data are lost rather than passed on.

Here are some things to watch out for with 404’s:

  • Make sure that when a page is missing, it returns a proper 404 response code rather than another status such as 200 (a so-called "soft 404").
  • Always serve errors with a custom 404 error page that tells users what happened to the content they were trying to access and keeps them within the site experience (a sketch follows this list).
  • When a page is removed and a close or exact-match counterpart exists, 301-redirect the old page to the existing one.
  • To improve crawl performance and user experience, find internal links on the site that now hit 404 errors and update them to point to a relevant, accessible URL.
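A sketch of the custom 404 setup, again assuming an Apache server (the file name is hypothetical); note that the custom page itself must still return a 404 status, not a 200:

    # Serve a branded error page whenever a URL is not found
    ErrorDocument 404 /custom-404.html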

500-Level Status Codes

500-level response codes indicate problems at the server level. If your site is returning these codes, something is likely wrong with the server itself.

These errors typically occur when the site, or certain elements of it, are temporarily unavailable. You can check whether any such issues have occurred in Google Search Console's Crawl Errors report.

While these errors are normally unexpected, Google recommends serving a specific 500-level response code when your site is deliberately taken down for maintenance.

They suggest the 503 "Service Unavailable" response, so that crawlers reaching a page during maintenance don't cause problems. This status code tells search engines that the service is down temporarily and the page will be back soon.
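For illustration, a maintenance response might look like this; the Retry-After header tells crawlers when to come back (the value here is hypothetical, in seconds):

    HTTP/1.1 503 Service Unavailable
    Retry-After: 3600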

There are other significant HTTP status codes worth getting acquainted with, but these are the most important ones for SEOs and developers.


OPTIMIZE YOUR ON-PAGE CONTENT'S SETUP

So far, we've covered how to make sure your site's content is monitored, crawled properly, and sending the correct signals to Google. Now it's time to set up the actual pages so they're SEO-friendly.

The following guidelines should be applied on a page-by-page basis.

URL Structure

Your URL structure should be all lowercase; while uppercase characters are technically allowed, URLs are case sensitive and mixed casing can interfere with direct traffic.

  • Use the page's targeted keyword in the URL (e.g. example.com/on-page-seo).
  • Separate keywords with dashes rather than underscores or spaces.
  • Keep your URLs short and simple; overly long URLs can look spammy to users and search engines (e.g. example.com/seo-is-literally-the-best-thing-in-the-history-of-ever-in-2017-you-get-the-picture).
  • Limit the use of parameters and otherwise ugly URLs (e.g. example.com/page=1234&color=blue).
  • Limit the use of unnecessary directories (e.g. example.com/articles/english/2017/09/08/seo-is-the-best).
  • Avoid unescaped non-ASCII characters.

Page Title Tags

Your page's title tag is a crucial element of SEO ranking, and it's also the page title clearly visible in your browser and in search results.

  • Don't leave title tags missing; make sure that every page on the site has a valid title.
  • Ensure that every page has a title tag value that is 100% unique.
  • Title tags should describe the page's content briefly and clearly.
  • It's best practice to include each page's individual targeted keyword(s) in the page title, without stuffing keywords.
  • Place your target keyword(s) at the very beginning of the title tag.
  • Include your brand name in your title tag, preferably at the end.
  • Avoid title tags that are very long; 60 characters is a sensible maximum. Moz offers a great title tag length preview tool.
  • Avoid title tags that are very short (under 30-40 characters); the space is valuable and you want to make the most of it.
  • Avoid title tag values that provide no information.
  • Avoid repetitive or boilerplate titles.
Meta Description Tags

Your page's meta description tag is not a ranking factor, but it's still important to optimize. Why? Search engines use it as the page's description in search results, which means a well-written meta description has a positive impact on user click-through rates from SERPs.

Don't leave meta descriptions missing; make sure that every page on your site has a valid meta description tag.

  • Ensure that every page has meta description content that is 100% unique.
  • Meta descriptions should describe the page in as few as 1 to 3 short sentences.
  • Always include each page's individual targeted keyword(s), but don't stuff keywords.
  • Avoid meta descriptions that are very long; 160 characters is the practical cut-off point.
  • Avoid meta descriptions that are very short (under 100 characters), as the space is valuable and you want to make use of it.
  • Avoid repetitive or boilerplate meta descriptions (see the combined title and description example after this list).
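Putting the title and meta description guidelines together, a page's head section might look like this (a minimal sketch; all values are hypothetical):

    <head>
      <!-- Target keyword first, brand at the end, under ~60 characters -->
      <title>On-Page SEO Audit Checklist | ExampleBrand</title>
      <!-- Unique, 1-3 short sentences, under ~160 characters -->
      <meta name="description" content="A step-by-step on-page SEO audit checklist covering crawlability, status codes, titles, content and site speed.">
    </head>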

Heading Tags

  • There are six heading tags available on a web page (H1, H2, H3, H4, H5, H6). H1 tags are the most important from an SEO perspective.
  • Heading tags carry the most important pieces of content on a web page and break it into chunks of information for readability.
  • All pages must have one H1 tag.
  • Avoid using multiple H1 tags on any single page.
  • Ideally, the page's first heading should be the H1.
  • The page's H1 should include the targeted keyword(s).
  • H2s through H6s can be used several times per page as subheadings; use them to break up and organize the areas of a web page (see the outline sketch after this list).
  • It's acceptable to use target keyword(s) within H2-H6 tags as long as it doesn't become spammy.
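A hypothetical heading outline for a page like this one, with a single H1 and nested subheadings:

    <!-- One H1 per page; H2s and H3s organize the sections beneath it -->
    <h1>Comprehensive On-Page Audit Checklist</h1>
      <h2>Find and Eliminate Crawl Barriers</h2>
        <h3>Robots.txt File Disallows</h3>
        <h3>Meta Noindex, Nofollow Tag</h3>
      <h2>Ensure Proper HTTP Status Codes</h2>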

On-Page Content

Apart from all the other areas of optimization, the most overlooked is the actual on-page content itself. It comprises headings, images, links, text, and more.

  • Longer pages (around 1,900-2,000 words) tend to secure better SEO rankings, according to a recent study.
  • Write engaging, original content for users first and search engines second.
  • Also consider using latent semantic (LSI) keywords within your page's copy.
  • Include images, videos, and graphics within the content for an enriched user experience.

Internal & External Linking

As mentioned previously in our SEO for Beginners guide, links are essential to how search engines crawl the web. Search engines use links to discover content, and pages gain popularity through the links pointing at them.

It's vitally important to ensure that your site and its pages are well linked.

  • All pages should have at least one link pointing to them for discoverability purposes; you want to avoid orphaned pages, which have no inbound links at all.
  • The most critical pages on your site should be no more than a few clicks away from the homepage.
  • Your primary and secondary navigation should be clear and well structured.
  • Include plenty of contextual links within your page's body content.
  • Don't hold back on internal links; 2-3 per article is a reasonable baseline.
  • Link anchor text should be descriptive, contextual, and keyword-rich.

Structured Data

What does structured data mean?

Structured data is a standardized format for describing the information on a page and classifying its content. Google and other search engines work hard to understand what's on a page, and structured data gives them explicit clues about its meaning.

Search engines use structured data to make sense of the content of a page and to collect information about it.

Search engines also use structured data to enable special search result features and enhancements. For instance, a recipe page with structured data is more likely to appear in a graphical (rich) search result.
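To make the recipe example concrete, here's a minimal sketch of JSON-LD structured data using the schema.org Recipe type (all values are hypothetical):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Recipe",
      "name": "Simple Pancakes",
      "author": { "@type": "Person", "name": "Jane Doe" },
      "cookTime": "PT15M"
    }
    </script>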

Social Media Buttons & Tagging

While social signals don't feed directly into Google's search ranking algorithm, making it easy for users to share your content increases its visibility, and that visibility indirectly benefits your rankings.

According to a recent study, social sharing buttons can drive up to 7x more clicks. Sadly, both sharing buttons and social markup are often overlooked, and that neglect has an SEO cost.

  • Make sure all pages have social sharing buttons that are clearly displayed and obvious to use.
  • Ensure all pages use valid Open Graph markup, with every component configured correctly, so you control how your content is presented when shared on social media platforms (see the sketch after this list).
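A minimal sketch of Open Graph markup for a page's head section (the values are hypothetical):

    <meta property="og:title" content="Comprehensive On-Page Audit Checklist">
    <meta property="og:type" content="article">
    <meta property="og:url" content="https://example.com/on-page-audit-checklist/">
    <meta property="og:image" content="https://example.com/images/audit-cover.jpg">
    <meta property="og:description" content="A step-by-step on-page SEO audit checklist.">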

HTML Sitemaps

While they don't directly contribute to SEO success, good HTML sitemaps demonstrably help promote crawlability, distribute equity across key site pages, and tie pieces of content together.

  • HTML sitemaps should be accessible from all pages, typically via a site-wide footer link.
  • They should be well structured by site section.
  • They must include links to top-level sections.
  • They should NOT include links to every page on the site (save that for the XML sitemap), unless the site is very small.
  • They should not contain broken links or links that redirect.

ELIMINATE AND CONSOLIDATE DUPLICATE CONTENT

Duplicate content refers to URLs whose content either matches or closely resembles other content, whether on pages within your own site or on external sites.

Google doesn't outright penalize duplicate content, but it does drag down SEO performance.

Duplicate content can:

  • Potentially water down the performance of a given URL by splitting the page's value across multiple URLs rather than a single page. Hypothetically, if page A has 50 links and a duplicate page B has 50 links, they may not perform as well individually as a single page with 100 inbound links.
  • Eat into Google's crawl budget for your site (the number of URLs Googlebot can and wants to crawl). This is primarily a problem for very large websites rather than small ones.

Here’s how to fix duplicate content across your site:

  • Make sure that all versions of any given URL on your site 301-redirect to a single location rather than several (e.g. the http, https, www, and non-www versions of a URL should all redirect to one preferred location).
  • Set a preferred version of your domain in Google Search Console.
  • Ensure every page has a canonical tag that points to the preferred absolute URL of that page; most canonical tags will be self-referencing (see the example after this list).
  • Remove duplicate page titles across the site.
  • Avoid duplicate meta descriptions across the site.
  • Keep each page to a single H1, and avoid duplicate H1s across the site.
  • Avoid duplicate body content throughout the site.
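For illustration, a self-referencing canonical tag in the page's head (the URL is hypothetical):

    <!-- All duplicate variants of this page should point here -->
    <link rel="canonical" href="https://example.com/on-page-seo/">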

OPTIMIZE ALL DIGITAL MEDIA INCLUDING IMAGES AND VIDEO

Apart from optimizing your site from a crawlability and page-structure perspective, it's critical to optimize your digital media: images, videos, and so on.

The reason optimizing your digital media is important is three-fold:

  1. Google and other search engines cannot look at an image or video and grasp its meaning without some sort of contextual description.
  2. Images and videos must be well optimized to be found and understood.
  3. Images and videos boost user experience and engagement.

Optimizing your site's images and videos is simple:

  • Ensure your images and videos are well described.
  • Make sure file naming follows the same conventions as URL best practices, as the file name becomes part of the URL when published.
  • Make sure every image has an alt attribute that is descriptive, accurate, and provides value (see the sketch after this list).
  • Avoid embedding important text within images.
  • Incorporate short captions for images and videos within your web page.
  • Crawl your site for broken images (e.g. 404 errors) and resolve them.
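A sketch of well-described image markup (the file name, alt text, and caption are hypothetical):

    <figure>
      <!-- Descriptive, dash-separated file name; descriptive alt text -->
      <img src="/images/on-page-seo-checklist.png"
           alt="Flowchart of an on-page SEO audit checklist">
      <figcaption>The on-page audit workflow at a glance.</figcaption>
    </figure>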

OPTIMIZE FOR ALTERNATIVE LANGUAGE/GEO-TARGETING

If your site serves content in several languages or to several regions, you must indicate which language or regional version of the content each URL represents.

How to optimize for language or geo-region:

  • Alongside your primary site, consider housing regional versions on country-code top-level domains (ccTLDs), which signal the specific geographic location a site targets. For example, .au denotes Australia.
  • Use rel="alternate" hreflang="x" tags to indicate when a page on your site has alternative-language versions. This helps Google serve the correct version of the page to each user (see the example after this list).
  • Use Google Search Console's International Targeting settings to tell Google which country a site or URL section should target.
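A minimal hreflang sketch for an English page with hypothetical French and Australian alternatives; the same set of tags should appear on every version of the page:

    <link rel="alternate" hreflang="en" href="https://example.com/page/">
    <link rel="alternate" hreflang="fr" href="https://example.com/fr/page/">
    <link rel="alternate" hreflang="en-au" href="https://example.com.au/page/">
    <link rel="alternate" hreflang="x-default" href="https://example.com/page/">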

HAVE A MOBILE PAGE SETUP STRATEGY

Today, mobile is a must, and having a proper mobile strategy is essential; yet even now, many companies don't have one. From an SEO perspective, Google increasingly rewards mobile-friendly, responsive sites with better rankings.

Here’s how to optimize for mobile at the most basic-level:

  • Make sure your site uses at least one of the three mobile design approaches: responsive design, dynamic serving, or a separate mobile site (a minimal responsive sketch follows this list).
  • Use Google Search Console's Mobile Usability report to find out whether your site's pages suffer from issues such as content wider than the screen, clickable elements too close together, or text too small to read.
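If you go the responsive design route, the viewport meta tag is the foundational piece, as in this minimal sketch:

    <!-- Render at device width instead of a zoomed-out desktop view -->
    <meta name="viewport" content="width=device-width, initial-scale=1">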

“Nice-to-Have” mobile strategies

If you're looking for more sophisticated, "nice to have" strategies, consider the following:

  • Use Accelerated Mobile Pages (AMP).
  • Use App Indexing and deep linking if you have an app, so that app content can appear alongside your web pages in Google SERPs.

IMPROVE SITE SPEED TO 3 SECONDS OR LESS

A website's speed is one of the most neglected aspects of SEO strategy. Google has repeatedly said that users will wait only about 3 seconds; if a page takes longer than that to load, its performance is considered poor and users quit.

Google offers a useful tool to estimate how much traffic and how many valuable customers a site is losing purely because of slow loading; it's well worth the effort to substantially reduce load times.

 

Best site speed tools

Some of the best tools for keeping track of site speed:

  • Google PageSpeed Insights
  • WebPageTest.org
  • GTmetrix
  • Varvy Pagespeed

 

Most common site speed issues

While there are several complexities involved, the most common problems affecting page load on desktop and mobile are:

  • The number of calls made to your server.
  • Not using Gzip compression to transfer files across the network.
  • Your server response time.
  • The overall size of your web page, which is primarily driven by the number of images per page, the size of each image, and how effectively each image is compressed and optimized to reduce load time.
  • How effectively you use browser caching to store static resources on your users' local devices, thus reducing secondary page-load times.
  • How well you defer parsing of render-blocking JavaScript.
  • Use (or lack thereof) of a Content Delivery Network (CDN).
  • Use (or lack thereof) of HTTP/2.

While there are other components that affect page speed, targeting these areas will significantly improve your site's load times.
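Two of the quickest wins from the list above, Gzip compression and browser caching, might be configured like this on a hypothetical Apache server (assuming mod_deflate and mod_expires are enabled):

    # Compress text-based assets in transit
    AddOutputFilterByType DEFLATE text/html text/css application/javascript

    # Let browsers cache static resources for a month
    ExpiresActive On
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType text/css "access plus 1 month"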


ENSURE SITE SECURITY AND PROTECT USER INFORMATION

Google has been pushing for increased site security for several years, an initiative that really came to the forefront with its 2014 "HTTPS Everywhere" presentation.

Good site security is achieved mainly by encrypting information via HTTPS (Hypertext Transfer Protocol Secure), an internet communication protocol that secures the data and transactions passing between a user and a site. Users expect a safe, secure experience when accessing a website.

 

Best practices for implementing HTTPS on your site:

  • Obtain a valid, secure certificate from your hosting or certificate provider (minimum 2048-bit key).
  • Enforce the HTTPS URL preference via server-side 301 redirects across every page of the site (see the sketch after this list).
  • Update all internal links on your site to point to HTTPS rather than HTTP.
  • Ensure you don't block the HTTPS version of your site from being indexed, whether via the robots.txt file or noindex tags.
  • Set up the HTTPS version of your site in Google Search Console.
  • Watch for security issues reported via Google Search Console, including viruses, malware, deceptive pages, harmful downloads, and other threats.
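A sketch of the site-wide HTTPS redirect, assuming an Apache server with mod_rewrite enabled:

    # Force HTTPS everywhere via a permanent redirect
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]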
