How To Do An SEO and Inbound Marketing Audit: Do You Have a Healthy Site?

Here Shweiki teams up once again with expert Alicia Lawrence of WebpageFX to present a must-watch webinar on performing an SEO and inbound marketing audit to determine if one has a healthy site.

Internet marketing has been listed as one of the top 8 fun careers, so although this webinar focuses on how to do an SEO and inbound marketing audit, one shouldn't think of it as a monotonous task, but as an annual checkup to see how healthy one's site is.

In previous webinars, there's been a lot of talk about content marketing and various other SEO tactics to help boost rankings and conversions, but those tactics won't be very effective if the site has a few obvious flaws. A site check-up lets one find opportunities that might otherwise have been missed and shines a spotlight on where one should focus their online efforts for the next year.

Tools To Use

As in a real-life check-up, the first step toward a cure is to go to the doctor. In this analogy, the doctors are tools like Screaming Frog, SEMrush, Google Webmaster Tools and Search Metrics, which show how well the site has been performing:

  • Screaming Frog: This tool scans a website (errors, metadata, rel canonicals) so one can fix any page errors, most of which will be things like open tags, stray elements and attribute errors.
  • SEMrush: This enables one to get an overview of site traffic (whether it's steady or seasonal) and discover how the site ranks compared to competitors.
  • Google Webmaster Tools: This should be set up (using the Gmail account connected to one's site) to track impressions and technical data, using these steps: Add a Site > type the URL into the popup > log in to FileZilla > open the public_html folder > drag the HTML verification file Google gave you into that folder > Verify. (An example of the verification file appears below.)
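The verification file itself is just a small file whose name and contents carry a unique token Google generates for the site. As a hypothetical example (the token below is made up), a file named google1234567890abcdef.html would contain the single line:

    google-site-verification: google1234567890abcdef.html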

 

Things To Look At

Traffic: One should check to see if there are sudden drops, a steady incline, etc. Webmaster Tools shows the impressions one's site had in the search results: not necessarily visits that made it onto the actual site, just searches in which the site appeared and could have been clicked.

Penalty: One would find this under "Manual Actions" in Google Webmaster Tools and will need to figure out which page, link or keyword the penalty applies to. (A description of various penalties and how one can get a site back into Google's good graces will be covered in a future webinar.)

Site Architecture: One should try to keep all pages on a single subdomain, as that will result in higher rankings for those pages. A separate subdomain for something like a blog is fine, though it's up for debate whether subdomains pass value to the rest of the site.
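To make the distinction concrete (the domain below is hypothetical), a blog kept in a subfolder stays on the main domain, while a blog on its own subdomain is treated more like a separate site:

    www.example.com/blog/    (subfolder: stays on the main domain)
    blog.example.com         (subdomain: may not pass full value to the main site)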

Navigation: Sites should be less than three layers deep to make them easy to navigate and to make articles accessible to searchers.

Keywords and Links: One can try this tool from WebpageFX or use the "Links to Your Site" section of Webmaster Tools. Google doesn't like site-wide links.

Anchor Text: One's anchor text should be broken up roughly in the following manner (an illustration follows the list):

  • 70% branded, URL, brand + keyword and white noise (non-targeted anchor text)
  • 20% partial, phrase and broad match
  • 10% exact anchor text match
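As a rough illustration for a hypothetical brand called Example Co (all of the anchors below are made up), the mix might look like this:

    Branded / URL / white noise:  "Example Co", "exampleco.com", "click here", "this website"
    Partial, phrase and broad:    "printing tips from Example Co", "advice on magazine printing"
    Exact match:                  "magazine printing"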

Nofollow Links: These should be considered for links under DA 30 or any links to off-topic sites (which one really shouldn't be linking to at all).
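For reference, a nofollow is applied by adding the rel attribute directly to the link's HTML; the URL below is a placeholder:

    <a href="http://www.example.com/off-topic-page/" rel="nofollow">off-topic resource</a>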

Where Links Are Coming From: The link profile needs to be well-rounded, with links coming from a variety of sites rather than a handful of domains.

URLs: These should avoid certain characters (?, &, $, =, +) and should be static, less than 100 characters long and user-friendly.
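As a quick before-and-after (both URLs are hypothetical), the first is the kind of dynamic URL to avoid, while the second is static, short and user-friendly:

    http://www.example.com/index.php?id=123&cat=7&sort=asc
    http://www.example.com/seo-audit-checklist/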

Robots.txt: This file restricts crawler access to pages on a site, but one should make sure it isn't inadvertently blocking pages that one wants Google to index. The file is uploaded (via FileZilla, for example) to the site's root folder and tells Google which pages or sections to crawl. A good candidate to block is "wp-content/plugins", since many plugins include links back to their creators' sites.
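A minimal robots.txt sketch that blocks the plugins directory mentioned above (and nothing else) would look like this:

    User-agent: *
    Disallow: /wp-content/plugins/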

NOINDEX tags: NOINDEX is another way to tell Google not to index a page in the search engines. One will want to use a NOINDEX, FOLLOW tag on pages like wp-includes on a page-by-page basis (such as for /tag, /category or /author pages) to make sure they aren't indexed when they duplicate other content. By allowing them to be FOLLOW, Google will still let the links on those pages pass value. One should also keep in mind that when NOINDEXing pages, it's important to remove them from one's sitemap as well. One should place the following code in the header: <META NAME="ROBOTS" CONTENT="NOINDEX, FOLLOW">

Then one should use the Moz SEO Toolbar or look at the source code to see if the pages use canonical tags.

URL Redirects: In almost all situations, a 301 redirect should be used. One can, however, use a 302 redirect for pages that are only temporary, such as those created for a promotional campaign. Any page whose URL is changed after publishing, or that is deleted, should be redirected to the correct or most similar page.
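Assuming an Apache server where redirects are managed in an .htaccess file (the paths below are placeholders), the two kinds of redirect could be set up like this:

    Redirect 301 /old-page/ http://www.example.com/new-page/
    Redirect 302 /spring-promo/ http://www.example.com/current-promotion/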

Duplicate Content: It’s crucial to ensure that there is no duplicate content. If one is using https and http and the same page appears on both, that’s duplicate content. There are a few ways to handle duplicate content:

  • Use a rel canonical tag in the header pointing to the URL one would like to be recognized as the one to index (see the example after this list); one can also change the content to make each version unique.
  • Fix all broken links with a 301 redirect
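A rel canonical tag sits in the page's <head>; with a placeholder URL standing in for the version one wants indexed, it looks like this:

    <link rel="canonical" href="http://www.example.com/preferred-page/" />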

Page Load Speed: To deal with this, one can use this tool or this plugin for WordPress. One should also do the following:

  • Get rid of unnecessary plugins or pages
  • Reduce the number of JavaScript files and have them load asynchronously (see the example after this list)
  • Compress images using JPEGmini
  • Combine CSS files
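For the JavaScript point above, loading a file asynchronously is a matter of adding the async attribute to its script tag; the file name here is a placeholder:

    <script src="/js/example-script.js" async></script>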

For a list of things to fix and how to fix them on both a website and mobile site, click here.


Alicia Lawrence

Alicia Lawrence is a content coordinator for WebpageFX. Her work has been published by the Association for Business Communication, Yahoo! Small Business, and Advanced Web Rankings.
