- Monday, May 11, 2015

Perform Your Own SEO Audit

Recently I had to write about how to perform an SEO audit within a word count smaller than I wanted. So now I'm going to perform an SEO audit on my own sites, as a nonexpert, and write about it. As boring as this article is to read, it was even more boring to write. I really took one for the team on this one.

SEO Audit | Blogging | Internet Marketing

The focus of this article is on blogging. If you have a regular website like Doc Johnson Photography (because that would be an interesting name), this information will still be helpful, and all of it still pertains to you.

I also have a free SEO Checklist printable for those who sign up to receive our newsletter. You should totally check it out because this is a lot of information, and the free printable chunks it, making it less overwhelming. You may decide to print the printable and read that first, and then come to the blog for just the things you're questioning.

I want to point out that I have several WordPress blogs and several Blogger blogs. The WordPress blogs also have all the plugins I thought I needed for awesome SEO, including Yoast's WordPress SEO (for keywords), Wordfence (for faster page load times), and a few social media share plugins (for sharing). After performing my own SEO audit, I can testify that this isn't enough by itself.

The point of an SEO audit is to make sure your website is as search engine friendly as possible. It's almost a form of networking with robots. A lot of the info in here will give you tips for some blogging lifestyle changes that will help your blog attract robot love with its SEO Juice.

An SEO audit has 3 main steps.

1. Gather Information
2. Analyze Information
3. Make Some Changes

1. Gather Your SEO Information

There are various places we'll use to gather information. A few of them offer the same things (with different results), and that's ok. Combining them makes for a more thorough check than relying on just one.


The first step is to crawl. Not on your hands and knees. Get up. I mean crawl like Google. Install a crawler to search your sites the way Google does. Screaming Frog SEO Spider is highly recommended by every site discussing this topic, and it will forever be known in this article as "The Frog." Once you download and open it, enter your website to crawl. You should get a bunch of lists about your website. Yes. That's all it does. It just looks through your site the way Google and Bing do. If the results look like your computer threw up on screen, that's ok. Grab your geek glasses and a cup of hipster mocha latte, and we'll talk about it in a minute. You can also check out The Frog's User Guide.

My First Error

I downloaded it and tried to open it. It kept saying I needed to be running a more up-to-date Java, with a message like "java runtime corrupted." I have Java. It's the most up to date. And like anything else that's free, in order to get any support for it, you have to pay money, defeating the awesome purpose of it being free. In googling "java runtime corrupted," I found people attempting to play Minecraft had the same issue (and this is why my nephew couldn't get Minecraft on my computer last month).

The forum listed many different ways to remedy the situation, but I went to the following Oracle Java link and opened it in a new browser (I use Chrome, and the Java checker won't work in Chrome). It too couldn't find Java. I first checked that my Java was enabled, then clicked as if I wanted to install Java, and it let me install a new copy despite the fact that Java was already running somewhere else on my computer. I ran the test again after restarting my browser, and it still couldn't find Java, but The Frog was able to open at that point.

Webmaster Tools

You'll want to sign up for Google Webmaster Tools and Bing Webmaster Tools if you haven't already.

Haven't signed up? Feel lost? Check out these videos.

How To Sign Up for Google Webmaster Tools
How To Sign Up for Bing Webmaster Tools 

When you've signed up your websites, you'll need this data as well for an SEO audit. Unfortunately, these places don't always crawl and obtain data immediately after signing up. That's ok. Do what you can without it, and check back at a later date to use the Webmaster Tools data.

Also, if you are new to Bing, I totally suggest playing around on their Help site, which is set up like the webmaster tools with explanations. For instance, they have a "How to Disavow Links" page designed for listing spammy pages linking to you.

MOZ Bar: Domain Authority

This tells you how awesome your site ranks for quality, trust, and good SEO behavior. It's like your website's credit score, or better said, your SEO grade had this been a test.

40-70 is good. 70+ is great.

To find out your domain authority, you can install MOZ Bar into your browser, which I highly suggest as it gives ranks for any site on the web (including your own and sites you're searching). You can hide it and show it, and it's about as impeding on my life as Pinterest's button.

The higher your score, the less you'll need to do for an SEO Audit. The lower the score, the more work that's ahead of you. Don't fret about a low score. My score for this blog is very low, mainly because I don't blog enough to share enough.

To improve domain authority, you can do a number of things. Check out these articles on the subject.

5 Best Techniques to Increase Domain Authority
5 Practical Steps to Improve your Website's Domain Authority
SEO Infographic


As we sort through the things we're analyzing, I'll list other places to gather some information.

2. Analyze Information

This might get long and boring, but we are strong, determined people. We CAN do this. We are doing it for the people. The people need to be able to find you. This will help people find you.

You really can do these in any order. Some are more boring than others. There's a lot to look at, but for the most part, reading about it will take longer than doing it.

I'm also not sure which information is NECESSARY versus FLUFF from the sites I researched. But I'm going to attempt all of this and see if there's an SEO improvement.

I've broken this down by:

Indexing: robots.txt, sitemaps, and site search
Code and speed: HTML compliance and page load speed
Structure: site architecture, URLs, and status codes
Links: outlinks and backlinks
Content: words, duplicates, titles, meta descriptions, keywords, headings, and images
Social media

What pages are NOT being indexed? 

We want to make sure your pages are accessible to the crawlers. This isn't about making sure people can see them; it's about making sure robots can. The main place that tells robots not to look is the robots.txt file on your website. It's the place robots pay to see their own sinful peep shows.

In Google Webmaster Tools, you can access the robots.txt file. Click on the site; on the left, click the Crawl option, and from there, robots.txt Tester. It should say stuff like User-agent: * followed by some Disallow lines. Make sure no pages are listed there that you want the search robots to find. You do want your "unfriendly" pages listed here to avoid getting penalized (like if you're breaking a big SEO rule somewhere). And if you see the word Sitemap, that's telling Google where to find your sitemap, so that's totally allowed to be there.
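If you'd rather check a rule outside the Tester, Python's standard library can parse robots.txt rules for you. A quick sketch, using made-up rules for a hypothetical example.com:

```python
from urllib import robotparser

# Hypothetical robots.txt rules, the kind the Tester displays.
rules = """\
User-agent: *
Disallow: /wp-admin/
Sitemap: http://example.com/sitemap.xml
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# Can any crawler ("*") fetch these pages?
print(parser.can_fetch("*", "http://example.com/wp-admin/settings"))  # False
print(parser.can_fetch("*", "http://example.com/a-great-post/"))      # True
```

Same idea as the Tester: paste in your rules, then ask whether a given page is blocked.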

On Bing Webmaster Tools, click Configure My Site, and then Block URLs. Make sure you don't have any URLs on there unless you don't want search engines finding them. In Bing, I don't think this is your robots.txt file; I think it lists pages that have a NOINDEX meta tag in the HTML head of the page. That lets the robot see the page but won't let it index it. Whether I'm right or wrong, the important thing is to make sure there isn't a web page listed there that you want robots to show people.

The Frog will also help you double-check accessibility. Go to the Directives tab and filter by your choice, like Noindex.

New Geek Vocabulary Terms:

robots.txt: it hides the skeletons from search engines
noindex: it gives the search engines permission to see a page, but not to index and share it
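If you ever need to add that noindex permission by hand, it's a single line in the page's head. A generic example (most platforms have a setting for this, so hand-editing templates is a last resort):

```html
<head>
  <!-- Tells crawlers: you may read this page, but don't index it -->
  <meta name="robots" content="noindex">
</head>
```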


Both search engines should have your sitemap, because the internet says so. A sitemap lists pages for crawlers, and it's usually an XML file. Not all crawlers use them all the time, but you are not penalized for submitting them, and in some cases, they are used.

A sitemap is a file you upload to your server. You submit the web address of the file to the search engines, so you only need to submit the file once to each robot unless you change its name, but you do want to update the file itself with new pages as you create them.
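To see how simple the file format is, here's a sketch that builds a two-page sitemap with Python's standard library (the page addresses are made up):

```python
import xml.etree.ElementTree as ET

# Hypothetical pages to list in the sitemap.
pages = ["http://example.com/", "http://example.com/about/"]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

# This string is what you'd save as sitemap.xml and upload to your server.
sitemap = ET.tostring(urlset, encoding="unicode")
print(sitemap)
```

In practice a generator tool does this for you; the point is that the file is just a list of addresses wrapped in tags.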

Most of the people reading this blog are, like me, WordPress users, and we're in luck: our sitemaps get created and updated automatically. Just type in yourblogaddress.com/sitemap.xml. On my main blog, numerous sitemaps showed up in a list. Those are really child sitemaps. All you need to submit to the robots is the main one, sitemap.xml. Now if you self-host and aren't seeing any, try the following plugin: Yoast's WordPress SEO. You only need the free version.

For Blogger blogs, if you type in your blog's address with /sitemap.xml, you'll only get the last 25 or so posts. That's not what you want. I don't understand the hack, but if you submit it this way, you'll get the whole blog. Type in whatever address you have with the search engine and add


Make sure you don't accidentally get any spaces in that if copying and pasting.

My Second Error

The first time I attempted it, it didn't work. But when I entered the information into this sitemap generator, it provided a sitemap address in the robots.txt file it created, so I copied that sitemap's address and pasted it, and it worked. I used it on all my Blogger blogs without regenerating.

If you need to make your own sitemap, Screaming Frog will create one for you, or you can check out Google's list of third parties for sitemaps, or this Sitemap Generator. Once the sitemap is generated, you'll need to download the file to your computer and then upload it to your server. Once it's on the server, collect the web address of its location and submit that to the search engines. Just keep in mind, as you add pages, you will want to update the sitemap FILE (not the search engine submission). You can regenerate a new one every time you add pages, or you can learn how to manually add a page yourself in code. You can also consider switching to WordPress. Many WordPress blogs make beautiful static websites, like the one I made for myself as a writer.

Once the sitemap is created, submit it to the robots. In Google, click Crawl, and then Sitemaps. There's a button on the top right to add a sitemap. In Bing, click Configure My Site, and then Sitemaps.

For an in-depth SEO analysis, you can compare the pages in the crawler with the pages in the sitemap. If pages are listed in the sitemap that the crawler can't reach, find a place in your architecture to link to them (like by adding categories to a WordPress blog). If the crawler finds pages that aren't in the sitemap, update the sitemap.

Site Search your Site

Go to Google's website and search the following (using your site's address instead of yourwebsite.com):

site:yourwebsite.com
At the top, Google tells you how many results it found. Compare that to how many pages you actually have. In the Frog, go to the Internal tab, filter by HTML, and at the bottom of the spreadsheet, it will give you a total.

If they are about the same, then you're good to go.

If Google has a lot more pages than the Frog, check to see if you have duplicate content by going to this address


If you have duplicate content, there will be some sort of warning saying that it omitted similar results. If that's the case, find that content, combine it into one page, and redirect the old pages to it. (See Redirect Pages.) If there's no warning, I can't tell you what it is, but I can tell you I had this issue with my WordPress blogs, and I noticed Google was indexing archive pages and old homepage pages (the ones that show recent posts on top) that the Frog was not. Also check out the subheading Duplicate Content in this post.

If Google has fewer pages than the Frog, you want to see why Google isn't indexing those pages. It could be that you are penalized. I wouldn't worry about it unless it's obvious Google is ignoring your site. In many cases of an actual penalty, they will send a message to your Webmaster Tools. It's also possible some of your pages aren't getting indexed due to algorithm changes, and that may or may not fix itself in time, especially after you perform an SEO audit. You can Google to see if other sites were affected by the algorithm changes.

Search Yourself

Google and Bing your name, whether that's your own name, your blog's name, or a business name. If your site is toward the top, that's good. If not, it's something to work on. Start by making sure you aren't being penalized, and then work from there.


HTML Code Compliance

Make sure your html is too legit to quit. You can use the W3C Validator. You'll probably have to google some of the results.

Page Load Speed

You can test your page load speed with Google's Page Speed Tool, and Pingdom will break it down by each process so you can see what exactly is slowing you down. Google recommends that a page load in 1.4 seconds or less.

Site Architecture

In the Frog, go to the right-hand side and click the tab "Site Structure." You want your site to be fairly flat, utilizing both vertical and horizontal linking. Reduce the number of clicks needed to reach important pages, and increase links between pages.

HTML navigation can be read by robots. Many robots struggle with Flash and JavaScript, so you want to avoid those for a navigation system.

You might as well also take the time to think about how user friendly your site navigation is.

Site URL's

Take a look at the list of URLs in the Frog (Internal tab on the top left). Your URLs should be static (as opposed to dynamic), where most of the address is letters and phrases rather than strings of numbers. The length should be under 115 characters. A URL should be easy to remember, contain relevant keywords, and use phrases or words over numbers. Hyphens are better than underscores, but use them sparingly. Avoid subdomains like this.that.website.com, and avoid unnecessary folders like website.com/folder/folder/post.html. Try to keep it all lowercase if possible.
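That checklist is easy to script against the Frog's URL export. A rough sketch using the thresholds above (the function name and rules-as-code are my own interpretation):

```python
import re

def audit_url(url):
    """Flag a URL against the rules of thumb above. Returns a list of issues."""
    issues = []
    if len(url) > 115:
        issues.append("over 115 characters")
    if url != url.lower():
        issues.append("contains uppercase letters")
    if "_" in url:
        issues.append("uses underscores (prefer hyphens)")
    if re.search(r"[?&][^=]+=", url):
        issues.append("looks dynamic (has query parameters)")
    return issues

print(audit_url("http://example.com/My_Post?id=7"))
```

Run it over each address from the Internal tab and note which ones come back with complaints.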

If a lot of your URLs don't adhere to this, just make a mental note to change that in the future. Switching URLs would require a redirect from every OLD URL. You can't get rid of a URL once you have it (like herpes), so your best bet is to take a mental note and move forward.

URL Codes

Go into the Frog, and check the code and status on the Internal tab (you might have to scroll to the right).

First, let's check redirects. Try to use 301 redirects instead of 302s, so look for 302s in the codes.

Second, let's check errors. You don't want any 4xx codes because those are errors. If you see any, find the page and correct it or redirect to a good page.
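The triage those two steps describe can be sketched in a few lines (a helper of my own, not a Frog feature):

```python
def triage(status):
    """Sort a crawled status code into an audit to-do bucket."""
    if status == 301:
        return "permanent redirect: fine, but try linking straight to the target"
    if status == 302:
        return "temporary redirect: consider making it a 301"
    if 400 <= status < 500:
        return "error: fix the page or redirect it somewhere good"
    if status >= 500:
        return "server error: talk to your host"
    return "ok"

print(triage(302))
print(triage(404))
```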

Here's a quick list of common codes and what they mean:

200: everything's ok
301: permanent redirect
302: temporary redirect
404: page not found
500: server error


Internal Links - Outlinks

These are links you place on your site that point out to other places. You can analyze them from the webmaster tools and the Frog.

The more you link, the easier it is for search engines, but you want fewer than 100 links per page. Links should be relevant and natural, and should point to trustworthy sites (high domain authority). Mix up the anchor text: some "Click Here" and "Learn More," others a keyword, company name, or article title. Every page should have at least 2-3 links, with keyword-rich anchor text used 10-30% of the time. Don't link to redirects if you can help it; aim for linking directly to the final page.
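If you want to spot-check the under-100 rule on a single page's HTML, Python's standard library can count the links for you. A toy sketch on a made-up snippet:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Collect every <a href> on a page so you can count and eyeball them."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# A made-up fragment of post HTML.
page = '<p><a href="http://example.com/">Learn More</a> and <a href="/other-post/">this post</a>.</p>'
counter = LinkCounter()
counter.feed(page)
print(len(counter.links))  # 2
```

Feed it a saved copy of a page and check that the total stays under 100.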

You don't want a lot of broken links in your linkage. A lot of the time, a link breaks over a period of time, so it was good the last time you checked your blog post, but not anymore. If you go to the Frog and check the External tab, you can peruse the codes in the list, looking for 4xx codes to repair broken links and 3xx codes to change the link to hit the actual page instead of a redirect. You can also click on the Response Codes tab for a list, and you can filter by code. You can also try Link Checker to see what you've got.

Backlinks - Inbound Links

These are links other people make to your page or site. The more others link to your site, the better, unless they are spammy. The sites linking to you should be relevant and use relevant anchor text. The more distinct root domains these links come from, the better, especially if they are popular sites and trusted domains. It's better if they link to an internal page (like a specific post) than to the home page.

In addition to your webmaster tools, Open Site Explorer helps you see who is linking to you. If you see a bad site linking to you, go to your webmaster tools, and disavow the links.

If any of the links to you are broken, like someone made a typo but the root domain is correct, you'll want them to correct it. If they can't, you can create a page at the bad address and have it redirect to the good page.

Did you switch Blogging Platforms?

If you switched from Blogger to WordPress, you want to clean up your old Blogger addresses, especially if you had them pointing to a domain you purchased.

The easiest way I found to redirect old posts to new posts is the Blogger to WordPress plugin. The instructions don't mention it, but when you copy the code it generates for you, paste it at the very top of your Blogger template, before anything else. I used this plugin over a year after switching, and it didn't interfere with the plugin I had used to switch, Blogger Importer.


Content

Content is King. Everyone else says that at this point, and I didn't want to be left out.

Check out your pages. Google supposedly ranks pages with over 2,400 words the highest, and they like at least 300 words. Most people aim for 400-600 words. These are words in text, not JavaScript or images. You want the content to be unique and quality. Use your keyword in the first paragraph, and then 1-2 more times in the text and conclusion. No spammy keyword stuffing, and use good grammar. You don't want multiple pages with the same keywords. Merge them if you've got them and redirect the old page.
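If you want a rough word and keyword count for a draft, something like this sketch works (a crude split-on-spaces count, nothing fancy):

```python
def keyword_stats(text, keyword):
    """Return total word count and how often the keyword appears."""
    words = text.lower().split()
    return {"words": len(words), "keyword_uses": words.count(keyword.lower())}

post = "seo audits are fun because seo robots love a tidy site"
print(keyword_stats(post, "SEO"))  # {'words': 11, 'keyword_uses': 2}
```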

Duplicate Content

Two different addresses going to the same page without a redirect is bad SEO. A printer-friendly version of an article is a classic example. You don't want two pages that look so alike that Google sees double like it drank too much booze.

Look for plagiarized content with Copyscape. And look for duplicate content on your site with Siteliner.

WordPress creates duplicate content on its own: the same content appears under posts, tags, and categories. I've read quite a few articles on the subject, and they seem to contradict each other, but I'm planning to remove all my tags and just let the categories be duplicates.

Some will say to noindex the duplicates or block them in robots.txt, but Google says not to go that route. In many cases, you can do a 301 redirect. In other cases, an option is to give the page a rel=canonical label. That labels the page as a duplicate, but an approved one. It's kind of like introducing a cloned robot friend to Google's robot, saying, "This is not me! This is my clone!" To canonicalize, add the following to your clone's head (in code, head, get it?):

<link rel="canonical" href="http://www.theWebAddressOfOriginalMaterial.com" />

For syndicated content, try to get the other site to link to your post (not just your site's home page) at the bottom. That lets Google know where the original content is. If you release anything under Creative Commons, make sure to require attribution to the post.

Switching from Blogger to WordPress can create duplicate content. Check out "Did You Switch Blogging Platforms?" above.

Title Tags

From the Frog, you can see title tag length if you scroll to the right from the Internal tab. You can also click on the Page Titles tab.

Titles should be no more than 70 characters. A title should describe the content on the page well and relate to the topic. It should contain a keyword, preferably toward the front if possible. Avoid duplicate titles (check for this in Google Webmaster Tools).

Meta Descriptions

From the Frog, click the Meta Description tab.

Meta descriptions should contain keywords, but they should also read naturally. Make sure you have meta description tags. Help for Blogger Users. As a Blogger and WordPress user, I'm not going to worry about meta descriptions on my Blogger blogs.

Make sure they aren't duplicated on other pages. They should be unique, relevant, descriptive, and contain a call to action. They should be between 51 and 160 characters, and they should use correct grammar with no more than 5 commas.
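Those character limits are easy to check in bulk once you export titles and descriptions from the Frog. A sketch using the limits quoted above (the function name is my own):

```python
def audit_snippet(title, description):
    """Flag titles and meta descriptions outside the limits above."""
    issues = []
    if len(title) > 70:
        issues.append("title over 70 characters")
    if not 51 <= len(description) <= 160:
        issues.append("description outside 51-160 characters")
    return issues

print(audit_snippet(
    "Perform Your Own SEO Audit",
    "A nonexpert's walkthrough of auditing a blog for search engines.",
))  # []
```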


Keywords

If you really want to analyze your keyword usage, take a look at the Frog's title tags and meta descriptions. Those tell you what you have been aiming for with keywords. Then check Google Analytics to see if that's what you're getting (via the search terms that brought people to you). You can do some keyword planning using Google's Keyword Planner.

In addition, for WordPress users, WordPress SEO is a great plugin for entering and planning keywords for each post as you write.


Heading Tags

These are the choices you have for making your font bigger for headings and subheadings within a post. When it comes to these, size matters. Search engines tend to think the biggest font sizes contain the most important keywords, so make sure headings are a larger font than paragraphs. Each page should have headings, and if possible, headings should contain keywords. While I've read places that recommend never using H1 tags, most SEO sites do recommend using H1 tags, but sparingly: you want at least one per page, but don't overuse them.


Images

Every image should have an ALT tag. Ideally, you want the image file name and the alt description to accurately describe the image with relevant keywords. File names are what you name the file, so try to get into the habit of naming your images with keywords and phrases as opposed to numbers. All images should also specify height and width in the image tag.
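You can sweep a chunk of HTML for images missing those attributes with a small parser. A sketch (the class name is mine, and the file name is made up):

```python
from html.parser import HTMLParser

class ImageAudit(HTMLParser):
    """Record which <img> tags are missing alt, width, or height."""
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        present = dict(attrs)
        for needed in ("alt", "width", "height"):
            if needed not in present:
                self.problems.append((present.get("src", "?"), needed))

    # Treat self-closing <img ... /> tags the same way.
    handle_startendtag = handle_starttag

audit = ImageAudit()
audit.feed('<img src="IMG_0042.jpg" width="300">')
print(audit.problems)  # [('IMG_0042.jpg', 'alt'), ('IMG_0042.jpg', 'height')]
```

An empty problems list means every image in the snippet has alt text and dimensions.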

Social Media

Believe it or not, your social media ranking and popularity affect your SEO. Make sure it's easy to share content from your site, for each post as well as the site as a whole. Plugins like SumoMe can help make social media sharing even easier (as well as provide a nice pop-up box for email collection), and Jetpack, which comes highly recommended, also makes sharing easier. Shared Count will tell you how well you are doing on the social media sites, and if you keep a portfolio on Contently, they generally track how many likes and shares you are getting per article.

3. Make Some Changes

This is the part where you write a long list of things to do. For some of these things, you can't change the past. Never attempt changing your old URLs; once you create a URL, it's there forever. But even for the things you can't change, you can always change how you approach them in the future. So this is also the time to plan some changes in the way you blog.

But like anything in business, not only do you want to implement your plan, but you want to come back later and monitor results.

I hope this SEO Audit is helpful for your site. I'm so sorry it's long and boring, but I tried to make it more interesting than reading the IRS Pub 17.

P.S. To get code highlightable, I used this website: http://codeformatter.blogspot.com/

Go Ahead and Pin This
