On-Page SEO: 55+ On-Page Optimization Techniques For A Perfect SEO Audit

  • December 10, 2016
  • SEO


Are you having a hard time doing on-page SEO for your website or blog? I know I did when I first started doing on-page optimization. Luckily, you won't have to dig here and there to check and improve your on-page SEO. Considering the importance of on-page factors in SEO, here is a checklist of 55+ on-page SEO techniques that will help you perform an SEO audit flawlessly:

Here's What I Am Going To Cover In On-Page SEO Factors

1. Indexing and Crawlability
2. URL Redirection
3. Encoding and technical factors
4. Link Structure Optimization
5. Image Optimization
6. Title Tags and Meta Tags Optimization
7. Body Content
8. Markup
9. Page speed (Desktop)
10. Page usability (Mobile)

Indexing and Crawlability

Resources with 4xx status code: 4xx errors often point to a problem on a website. For example, if you have a broken link on a page and visitors click it, they may see a 4xx error. It's important to regularly monitor these errors and investigate their causes, because they may have a negative impact and lower the site's authority in users' eyes.

Resources with 5xx status code: 5xx error messages are sent when the server is aware that it has a problem or error. It's important to regularly monitor these errors and investigate their causes, because they may have a negative impact and lower the site's authority in search engines' eyes.

404 page setup correctly: A custom 404 error page can help you keep users on the website. In a perfect world, it should inform users that the page they are looking for doesn’t exist, and feature such elements as: HTML sitemap, navigation bar, and a search field.

But most importantly, a 404 error page should return 404 response code. This may sound obvious, but unfortunately it’s rarely so. See why it happens and what it can result in, according to Google Search Console:

“Just because a page displays a 404 File Not Found message doesn’t mean that it’s a 404 page. It’s like a giraffe wearing a name tag that says “dog.” Just because it says it’s a dog, doesn’t mean it’s actually a dog. Similarly, just because a page says 404, doesn’t mean it’s returning a 404…


Returning a code other than 404 or 410 for a non-existent page… can be problematic. Firstly, it tells search engines that there’s a real page at that URL. As a result, that URL may be crawled and its content indexed. Because of the time Googlebot spends on non-existent pages, your unique URLs may not be discovered as quickly or visited as frequently and your site’s crawl coverage may be impacted.

I recommend that you always return a 404 (Not found) or a 410 (Gone) response code in response to a request for a non-existing page.”
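
For instance, on an Apache server you can point a custom error page at missing URLs in .htaccess while still returning the proper 404 status code (a minimal sketch; the file name 404.html is just an example):

# .htaccess (Apache): serve a custom error page for missing URLs
# Apache keeps returning the 404 status code with this setup
ErrorDocument 404 /404.html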

Robots.txt file: Robots.txt file is automatically crawled by robots when they arrive at your website. This file should contain commands for robots, such as which pages should or should not be indexed. It must be well-formatted to ensure search engines can crawl and read it.

If you want to disallow indexing of some content (for example, pages with private or duplicate content), just use an appropriate rule in the robots.txt file.

For more information on such rules, check out robotstxt.org.
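
For example, a basic robots.txt placed in the site root that keeps crawlers out of a couple of private sections and points them to the sitemap might look like this (the paths are hypothetical):

# robots.txt
User-agent: *
Disallow: /private/
Disallow: /search-results/
Sitemap: http://www.site.com/sitemap.xml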

Please note that commands placed in the robots.txt file are more like directives rather than absolute rules for robots to follow. There’s no guarantee that some disobedient robot will not check the content that you have disallowed. Therefore, if you’ve got any secret or sensitive content on your site, robots.txt is not a way to lock it away from public.

XML sitemap: An XML sitemap should contain all of the website pages that you want to get indexed, and should be located in the website's root directory (e.g. http://www.site.com/sitemap.xml). In general, it serves to aid indexing and saturation. It should be updated when new pages are added to the website, and needs to be correctly coded.

Besides, in this sitemap you can set the <priority> of each page, telling search engines which pages they are supposed to crawl more often (i.e. they are more frequently updated).

Learn how to create an .xml sitemap at sitemaps.org.
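
A minimal sitemap.xml with the optional <priority> and <changefreq> hints might look like this (the URLs are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.site.com/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.site.com/blog/on-page-seo/</loc>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>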

Resources restricted from indexing: A resource can be restricted from indexing in several ways:

– in the robots.txt file
– by Noindex X-Robots tag
– by Noindex Meta tag.

Each of these is an instruction that tells crawlers how to handle specific pages or resources on the site: the robots.txt rule lives in a separate file, the noindex meta tag is a line of HTML in the page's <head>, and the X-Robots-Tag is sent in the HTTP response header. Specifically, they tell crawlers whether they are allowed to index the page or resource, follow its links, and/or archive its contents.

So make sure that your unique and useful content is available for indexing.
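
For reference, the noindex meta tag goes into the page's <head>, while the X-Robots-Tag is sent as an HTTP response header (the header example below assumes an Apache server with mod_headers enabled):

<!-- Noindex meta tag, placed in the page's <head> -->
<meta name="robots" content="noindex, follow">

# X-Robots-Tag sent via .htaccess, e.g. to keep PDFs out of the index
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>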

Redirects

Fixed www and non-www versions: Usually, websites are available with and without “www” in the domain name. This issue is quite common, and people link to both www and non-www versions. Fixing this will help you prevent search engines from indexing two versions of a website.

Although such indexation won't cause a penalty, setting one version as the priority is best practice, especially because it consolidates link juice from links with and without www into one common version.
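
On an Apache server, one common way to enforce a single version is a 301 rule in .htaccess (a sketch assuming mod_rewrite is enabled and the non-www version is preferred; site.com is a placeholder):

# Redirect the www version to the non-www version with a permanent 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.site\.com$ [NC]
RewriteRule ^(.*)$ http://site.com/$1 [R=301,L]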

Issues with HTTP/HTTPS site versions: Using secure encryption is highly recommended for many websites (for instance, those taking transactions and collecting sensitive user information.) However, in many cases, webmasters face technical issues when installing SSL certificates and setting up the HTTP/HTTPS versions of the website.

In case you’re using an invalid SSL certificate (ex. untrusted or expired one), most Web browsers will prevent users from visiting your site by showing them an “insecure connection” notification.

If the HTTP and HTTPS versions of your website are not set properly, both of them can get indexed by search engines and cause duplicate content issues that may undermine your website rankings.
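
Once a valid SSL certificate is in place, the usual fix is to 301-redirect all HTTP requests to the HTTPS version, for example in .htaccess on Apache (a sketch assuming mod_rewrite):

# Force HTTPS for every request
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]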

Pages with 302 redirect: 302 redirects are temporary so they don’t pass any link juice. If you use them instead of 301s, search engines might continue to index the old URL, and disregard the new one as a duplicate, or they might divide the link popularity between the two versions, thus hurting search rankings.

That’s why it is not recommended to use 302 redirects if you are permanently moving a page or a website. Instead, stick to a 301 redirect to preserve link juice and avoid duplicate content issues.

Pages with 301 redirect: 301 redirects are permanent and are usually used to solve problems with duplicate content, or if URLs are no longer necessary. The use of 301 redirects is absolutely legitimate, and it’s good for SEO because 301 redirect will funnel link juice from the old page to the new one. Just make sure you redirect old URLs to the most relevant pages.

Pages with long redirect chains: In certain cases, either due to a bad .htaccess file setup or due to deliberately taken measures, a page may end up with two or more redirects. It is strongly recommended to avoid redirect chains longer than 2 redirects, since they may cause multiple issues:

  • There is a high risk that a page will not be indexed as Google bots do not follow more than 5 redirects.
  • Too many redirects will slow down your page speed. Every new redirect may add up to several seconds to the page load time.
  • High bounce rate: users are not willing to stay on a page that takes more than 3 seconds to load.

Pages with meta refresh: Basically, a meta refresh may be seen as a violation of Google's Quality Guidelines and is therefore not recommended from an SEO point of view.

As one of Google’s representatives points out: “In general, we recommend not using meta-refresh type redirects, as this can cause confusion with users (and search engine crawlers, who might mistake that for an attempted redirect)… This is currently not causing any problems with regards to crawling, indexing, or ranking, but it would still be a good idea to remove that.”

So stick to the permanent 301 redirect instead.

Pages with rel=”canonical”: In most cases duplicate URLs are handled via 301 redirects. However sometimes, for example when the same product appears in two categories with two different URLs and both need to be live, you can specify which page should be considered a priority with the help of rel=”canonical” tags. It should be correctly implemented within the <head> tag of the page and point to the main page version that you want to rank in search engines. Alternatively, if you can configure your server, you can indicate the canonical URL using rel=”canonical” HTTP headers.
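
The two options look roughly like this; the first goes into the <head> of the duplicate page, the second is the equivalent response header, which is handy for non-HTML resources such as PDFs (the URL is a placeholder):

<!-- In the <head> of the duplicate page -->
<link rel="canonical" href="http://www.site.com/category-1/product-a/">

Link: <http://www.site.com/category-1/product-a/>; rel="canonical"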

Encoding and technical factors

Pages with multiple canonical URLs: Having multiple canonical URLs specified for a page frequently happens in conjunction with SEO plugins, which often insert a default rel="canonical" link, possibly unknown to the webmaster who installed the plugin. Double-checking the page's source code and your server's rel="canonical" HTTP header configuration will help correct the issue.

Conflicting character encoding: Character encoding is an important line of code that tells the browser which character set (charset type) a website uses. This helps ensure the browser displays the page correctly across all media, which also helps reduce the page’s load time among other things.

If the character set in your HTTP header and the one used on the page do not coincide, the webpage may be rendered incorrectly, not to mention that this will look as sloppy coding to search engines and may affect your rankings.
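
For example, in HTML5 the in-page declaration is a single meta tag near the top of the <head>:

<meta charset="UTF-8">

It should match the charset your server sends in the HTTP Content-Type header (e.g. Content-Type: text/html; charset=UTF-8).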

Pages with Frames: Frames allow displaying more than one HTML document in the same browser window. As a result, text and hyperlinks (the most important signals for search engines) seem missing from such documents.

If you use Frames, search engines will fail to properly index your valuable content, and won’t rank your website high.

Pages with W3C HTML errors and warnings: The validation is usually performed via the W3C Markup Validation Service. And although it’s not obligatory and will not have direct SEO effect, bad code may be the cause of Google not indexing your important content properly.

I recommend checking your website pages for broken code to avoid issues with search engine spiders.

Pages with W3C CSS errors and warnings: The validation is usually performed via the W3C CSS Validation Service (W3C stands for World Wide Web Consortium).

CSS styles are used to control the design and formatting of the page, and to separate styles from the structure, which ultimately makes the page load faster.

Errors in CSS may be not that important to search engines, but they can lead to your page being incorrectly displayed to visitors, which, in turn, may affect your conversion and bounce rates. So, make sure the page is displayed as intended across all browsers (including mobile ones) important to you.

Too big pages: Naturally, there’s a direct correlation between the size of the page and its loading speed, which in its turn is one of the numerous ranking factors.

Basically, heavy pages take longer to load. That's why the general rule of thumb is to keep your page size below 3MB.

Of course, it’s not always possible. For example, if you have an e-commerce website with a large number of images, you can push this up to more megabytes, but this can significantly impact page loading time for users with a slow connection speed.

Dynamic URLs: URLs that contain dynamic characters like "?" and "_" and parameters are not user-friendly, because they are not descriptive and are harder to memorize. To increase your pages' chances to rank, it's best to set up dynamic URLs so that they are descriptive and include keywords rather than numbers in parameters.

As Google Webmaster Guidelines state, “URLs should be clean coded for best practice, and not contain dynamic characters.”

Too long URLs: URLs shorter than 115 characters are easier for end users and search engines to read, and will help keep the website user-friendly.

Links

Broken links: Broken outgoing links can be a quality signal to search engines and users. If a site has many broken links it is logical to conclude that it has not been updated for some time. As a result, the site’s rankings may be downgraded.

Although 1-2 broken links won't cause a Google penalty, try to regularly check your website, fix broken links if there are any, and make sure their number doesn't grow. Besides, your users will like you more if you don't show them broken links pointing to non-existent pages.

Pages with excessive number of links: According to Matt Cutts (head of Google’s Webspam team), “…there’s still a good reason to recommend keeping to under a hundred links or so: the user experience. If you’re showing well over 100 links per page, you could be overwhelming your users and giving them a bad experience. A page might look good to you until you put on your “user hat” and see what it looks like to a new visitor.”

Although Google frames this as a user-experience issue, what too many links on a page can really hurt is its rankings. So the rule is simple: the fewer links on a page, the fewer problems with its rankings.

In fact, there’s nothing to add here. Just try to stick to the best practices and keep the number of outgoing links (internal and external) up to 100.

Dofollow external links: Simply speaking, dofollow links are links missing the rel=”nofollow” attribute. Such links are followed by search engines and pass PageRank (please note that links can also be restricted from following in bulk via the nofollow <meta> tag).

While there is nothing wrong with linking to other sites via dofollow links, if you link extensively to irrelevant or low-quality sites, search engines may conclude your site sells links or participates in other link schemes, and it can get penalized.
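
For comparison, here is what a default dofollow link and a nofollowed link look like in the code (the URL is a placeholder):

<!-- Dofollow (default): followed by search engines, passes PageRank -->
<a href="http://www.site.com/some-page/">Anchor text</a>

<!-- Nofollow: asks search engines not to pass PageRank through this link -->
<a href="http://www.site.com/some-page/" rel="nofollow">Anchor text</a>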

Images

Broken images: While broken images on the website don’t influence its search engine rankings directly, they definitely deserve being fixed for two reasons.

First and foremost, broken images are a crucial factor for user experience and may result in visitors bouncing away from the site without completing their goals.

And second, missing images may impede the site’s crawling and indexation, wasting its crawl budget and making it hard for search engine bots to crawl some of the site’s important content.

Empty ALT texts: While search engines can't read text off images, alt attributes (also known as "alternative attributes") help them understand what your images portray.

The best practice is to create an alt text for each image, using your keywords in it when possible to help search engines better understand your pages’ content and hopefully rank your site higher in search results.
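
A minimal example with a descriptive, keyword-bearing alt attribute (the file name and text are hypothetical):

<img src="/images/on-page-seo-checklist.png" alt="On-page SEO audit checklist infographic">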

Title tags

Empty title tags: If a page doesn't have a title, or the title tag is empty (i.e. it just looks like this in the code: <title></title>), Google and other search engines will decide on their own what content to show on the results page. Thus, if the page ranks on Google for a keyword and someone sees it in the results for their search, they may not want to click on it simply because the title says something completely unappealing.

No webmaster would want this, because in this case you cannot control what people see on Google when they find your page. Therefore, every time you are creating a webpage, don’t forget to add a meaningful title that would attract people.

Duplicate titles: A page title is often treated as the most important on-page element. It is a strong relevancy signal for search engines because it tells them what the page is really about. It is, of course, important that the title includes your most important keyword. But beyond that, every page should have a unique title to ensure that search engines have no trouble determining which of the website's pages is relevant for a given query. Pages with duplicate titles have fewer chances to rank high; even more, if your site has pages with duplicate titles, other pages may become harder to rank as well.

Too long titles: Every page should have a unique, keyword rich title. At the same time, you should try to keep title tags not too long. Titles that are longer than 55 characters get truncated by search engines and will look unappealing in search results. You’re trying to get your pages ranked on page 1 in search engines, but if the title is shortened and incomplete, it won’t attract as many clicks as you deserve.

Keywords in title: The title tag is considered one of the most important on-page elements, so remember to use keywords in it. For higher rankings, place important keywords towards the beginning of your title.

Because the title text is usually the most prominent part of your page displayed in search results, make it click-enticing and appealing to human searchers.
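
In the code, the title is a single tag inside the page's <head>; a hypothetical keyword-first example:

<head>
  <title>On-Page SEO Checklist: 55+ Techniques for a Flawless Audit</title>
</head>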

Meta tags

Empty meta description: Although meta descriptions don't have a direct influence on rankings, they are still important because they form the snippet people see in search results. Therefore, the description should "sell" the webpage to searchers and encourage them to click through.

If the meta description is empty, search engines will themselves decide what to include in a snippet. Most often it’ll be the first sentence on the page. As a result, such snippets may be unappealing and irrelevant.

That’s why you should write meta descriptions for each of your website pages (at least for the landing pages) and include marketing text that can lure a user to click.

Duplicate meta descriptions: According to Matt Cutts, it is better to have unique meta descriptions, or even no meta descriptions at all, than to show duplicate meta descriptions across pages. So make sure that your most important pages have unique and optimized descriptions.

Too long meta descriptions: Although meta descriptions don't have a direct influence on rankings, they are still important because they form the snippet people see in search results. The description should "sell" the webpage to searchers and encourage them to click through. If the meta description is too long, it'll get cut off by the search engine and may look unappealing to users.

Keywords in meta description tag: The meta description is an important on-page element, so pay particular attention to it. Not only do search engines consider it to understand what your page is about, but they also display your meta description in search results.

Hence, make your meta description both keyword-rich and appealing to human searchers. For higher rankings, place important keywords towards the beginning of the meta description.

Note: Meta descriptions that are not in line with the page’s content (or are overly long/short) may be ignored by search engines, and the latter may pick a random text snippet to show in search results instead of your description.
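
A hypothetical example that keeps the main keyword near the beginning and stays within roughly 150-160 characters:

<meta name="description" content="On-page SEO checklist: 55+ optimization techniques covering indexing, redirects, titles, meta tags, content and page speed for a flawless audit.">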

Keywords in meta keywords tag: The meta keywords tag used to play a big role at the dawn of the Internet, but is largely ignored by search engines today. Besides, if it has your keywords, you run a risk of handing your keyword strategy over to the competitors.

However, if you still choose to use the meta keywords tag (e.g., if you target search engines like Baidu that are believed to still rely on meta keywords), make sure it's relevant to the page's content and contains a moderate number of keywords. Also, use commas to separate keywords within the tag.

Body content

Keywords in body: If your target keywords are present in the webpage’s body, the page stands a better chance of high rankings. For greater impact, place keywords towards the beginning of your page.

You can use keywords without any additional tags on the page, or put your target search terms in tags like H1, H2-H6, Bold, Italic or Link anchor. Such tags are also part of your page’s body and should be attended to during its optimization.

Also try to make the page’s content not only keyword-rich, but also appealing to human visitors: maintain a logical structure, use headings and bullet points, add visuals and other multi-media content.

Be sure to refrain from black-hat page optimization practices (such as keyword stuffing or making your text the same color as the background), as these could incur a search engine penalty.

Word count in body: Webpages with a lot of textual content tend to rank higher in search results, as they’re considered more useful to users. It’s also a good practice to place textual content above the fold, as search engines may frown at pages top-heavy with ads and other non-textual content.

However, specific recommendations on how much text to put on a page vary from niche to niche, and it’s better to look at competitor averages to get the optimum word count for your particular keyword niche.

Keywords in H1: The H1 tag serves to mark the main heading on a page. The best practice is to use just one H1 tag per page, while all subheadings should be marked up with less important heading tags, from H2 down to H6.

To achieve higher rankings, place your keyword(s) towards the beginning of your heading.
If you’re optimizing your page for multiple keywords, it’s best to define your top-important keyword and use it in your H1 tag.

Keywords in H2-H6: H2-H6 tags serve to mark subheadings on a page (unlike the H1 tag, which is used to mark the main heading). It is important to observe the hierarchy of H2-H6 tags on a page to avoid confusing search engines with your coding. For example, it's illogical to use an H3 tag ahead of an H2 one, and so on.

At the same time, the use of keywords in H2-H6 tags is optional and would only give your page a slight ranking boost for such keywords.
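
A simple outline that respects the heading hierarchy might look like this (the headings are hypothetical; the indentation is only for readability):

<h1>On-Page SEO Techniques</h1>
  <h2>Indexing and Crawlability</h2>
    <h3>Robots.txt</h3>
    <h3>XML Sitemap</h3>
  <h2>Title Tags and Meta Tags</h2>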

Keywords in bold: Bold text is used on a page to give certain words additional emphasis (e.g., if you write <b>bananas</b> or <strong>bananas</strong> in the code – the visitors will see bananas). Search engines usually consider words in bold text slightly more important than words with no additional mark-up.

Hence, wherever it is natural, use your important keywords in bold text, but don't overdo it to avoid over-optimization.

Keywords in italic: Italics (or italicized text) are used on a page to give certain words additional emphasis (e.g., if you write <i>bananas</i> or <em>bananas</em> in the code – the visitors will see bananas). Search engines usually consider words in italics slightly more important than words with no additional mark-up.

Hence, wherever it is natural, use your important keywords in italics, but don't overdo it to avoid over-optimization.

Keywords in link anchors: Link anchor texts are visible parts of links present on your page. Although the use of keywords in anchor texts helps strengthen your page’s semantic whole, search engines are more likely to attribute the meaning of a link’s anchor text to the page the link is pointing to.

Hence, you could also check the internal links pointing to your current page and make sure their anchor texts are in line with your page’s semantics.

Markup

Open graph markup: Open Graph is a special page markup protocol that lets webmasters influence the social snippets of their shared URLs. It is the Open Graph tags’ data that determines the title, description and thumbnail image that accompany a URL when someone posts it to social networks.

Right now Open Graph markup is supported by all the major social networks, including Facebook, Twitter, Google+, Pinterest, LinkedIn, Instagram and others.

The Open Graph tags need to be placed as meta tags within the <head> of a webpage. They should be present on every page that may be shared on social networks to make sure it gets the most relevant and compelling snippet.
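
A typical set of Open Graph tags in the <head> looks like this (all values are placeholders):

<meta property="og:type" content="article">
<meta property="og:title" content="On-Page SEO: 55+ Optimization Techniques">
<meta property="og:description" content="A complete on-page SEO audit checklist.">
<meta property="og:image" content="http://www.site.com/images/on-page-seo-audit.png">
<meta property="og:url" content="http://www.site.com/on-page-seo/">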

Structured data markup: Structured data markup serves to better structure the data that’s being sent to search engines and to label certain types of content: reviews, events, product information, the information about your organization, etc.

Such markup can greatly improve the presentation of your page in search results. For example, with its help you can create rich snippets that look extremely attractive and can increase click-through rates to up to 300%.

Consider using Google's Structured Data Testing Tool to check the validity of your structured markup (please note that intentionally misleading semantic markup could incur a search engine penalty).
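
One common way to add structured data is a JSON-LD block in the page's <head>, using schema.org vocabulary; a minimal, hypothetical Article example:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Article",
  "headline": "On-Page SEO: 55+ Optimization Techniques",
  "datePublished": "2016-12-10",
  "author": {
    "@type": "Person",
    "name": "Manas Chowdhury"
  }
}
</script>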

Page speed (Desktop)

Page size: This factor measures how big your page is, including all elements like JavaScript, CSS, images, etc. In terms of SEO it is recommended to keep your page size below 3MB. Apart from directly affecting page speed and user experience, page size is an important ranking factor. Make sure your page loads quickly and doesn’t include too many components that can increase its load time.

Server response time: Server response time is an important user experience factor, as it contributes to the page’s overall load speed.

Too many redirects: Too many redirects can significantly increase your page’s load time, thus negatively impacting user experience.

Uncompressed resources: GZIP compression of text resources (HTML, JavaScript, CSS) significantly reduces page size, which leads to improved page speed. GZIP compression is configured on the site's server and applied to all of the site's text resources.
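
On Apache, for instance, GZIP compression is typically enabled with mod_deflate in .htaccess or the server config (a sketch; the exact setup depends on your server):

# Compress text-based resources before sending them to the browser
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/plain application/javascript application/json
</IfModule>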

Uncompressed images: Using uncompressed images or images with unnecessarily high resolution will have a negative impact on page speed. Optimizing images can often yield some of the largest byte savings and performance improvements. Using PNG and JPEG formats for photos and GIF for smaller images is recommended. Using CSS sprites also helps improve page speed, as it reduces the number of images the browser has to load.

Uncached resources: Browser caching is an important page speed factor. Caching makes the same page load faster when the user revisits it, and speeds up the loading of the site’s other pages that use the same resources.
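
On Apache, browser caching is usually configured with mod_expires, for example in .htaccess (a sketch; tune the lifetimes to how often your resources actually change):

# Tell browsers how long they may cache static resources
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>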

Unminified resources: Minifying resources means getting rid of unnecessary spaces, empty lines, and line breaks in the page's HTML, JavaScript, and CSS. Pages with minified resources load faster. For more info, see Google's PageSpeed Insights.

Render-blocking JavaScript/CSS: This factor shows whether there are JavaScript or CSS issues on your page. The JavaScript code used for loading the above-the-fold part of your page should be inlined, while scripts that are not critical to the initial render should be deferred until after the first render.

If external CSS resources are small, you can inline those as well (insert those resources directly into the HTML document). This will allow the browser to proceed with rendering the page. If you have a large CSS file, try to identify and inline the CSS necessary for rendering the above-the-fold content, and defer loading the remaining styles until after that. See Google's recommendations on optimizing CSS delivery.
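
For scripts that are not critical to the first render, the usual fix is to load them with the defer (or async) attribute so they stop blocking rendering (the file name is a placeholder):

<!-- Render-blocking: parsing stops until the script is downloaded and executed -->
<script src="/js/widgets.js"></script>

<!-- Deferred: downloaded in parallel and executed after the document is parsed -->
<script src="/js/widgets.js" defer></script>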

Above-the-fold content prioritized: Prioritizing the above-the-fold part of your page ensures that the page’s visible portion loads quickly. If large amounts of data are required to load the page’s above-the-fold portion, additional round trips between your server and the user’s browser will need to be made.

Page usability (Mobile)

Mobile friendly: According to Google, the mobile-friendly algorithm affects mobile searches in all languages worldwide and has a significant impact on Google rankings. This algorithm works on a page-by-page basis: it is not a matter of degree, each page is simply either mobile-friendly or not. The algorithm is based on criteria such as font sizes, tap targets/links, readable content, the viewport configuration, etc.

Viewport configured properly: A viewport controls how a webpage is displayed on mobile devices. Without a viewport, mobile browsers will render the page at a typical desktop screen width, scaled to fit the device’s screen size. This will likely lead to readability and navigation issues on mobile devices.

On pages optimized for mobile devices, the <head> block contains the viewport tag with the following contents:
<meta name="viewport" content="width=device-width, initial-scale=1">

Too small text: This factor shows whether there is text on your page that can be hard to read due to its small size. This can occur if the page hasn’t been optimized for mobile devices (e.g. the viewport is not set up, or it is configured to use a fixed width). As a result, mobile browsers will automatically scale the page to fit the screen, making text smaller and harder to read.

Too small tap targets: Tap targets are elements on your page that you expect users to click on, such as links and buttons. Small or tightly packed tap targets are difficult to press on touchscreen. This can occur if the page hasn’t been adjusted for mobile devices (e.g. the viewport meta tag is missing or not configured properly). As a result, mobile browsers will automatically scale the page to fit the screen, making page elements and the space between them smaller.

Content outside viewport: Page content that falls outside the viewport is content that does not fit the user’s screen width, thus forcing the user to scroll horizontally to view it.

Page uses plugins: Plugins help browsers process certain types of web content, such as Flash, Silverlight, and Java. Most mobile devices do not support plugins; in those that do, plugins are a leading cause of hangs, crashes, and security incidents. Many desktop browsers also restrict plugins:

Internet Explorer runs without plugins in Windows UI mode.
Chrome intends to remove support for most plugins.
Firefox will prompt users before running most plugins.

Up to you …

Have I missed any important on-page SEO checklist items? Please comment below and share your own on-page optimization points for performing a better SEO audit.

Manas Chowdhury

Digital Marketing aficionado holding PG in Economics. Championing online marketing, branding and advertising strategy for the digital commerce, online and social media spaces driving YOY increased growth, ROI and revenue. Up-to-date and in tune with Google philosophy.
