1) Open Site Explorer
When analyzing your website’s SEO performance, one of the most important places to start is your overall backlink profile. As your website grows, you want to make sure you have a good mixture of link types (follow and nofollow), anchor texts (branded and keyword-rich), and a diverse set of referring domains.
Our favorite tool for this is Moz’s Open Site Explorer. It’s easy to use, provides tons of information about any given website’s domain authority, and lends a quick understanding of how many links a website has amassed and where they came from. If you want a more detailed report, you can easily export all of the data in Excel format to see where every link is coming from, which pages are receiving the most links, and which domains are linking to your website. You’d be hard-pressed to find a tool that can teach you more about external links to your website than Open Site Explorer.
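To make the "good mixture" idea concrete, here’s a rough Python sketch of the kind of tally you might run over an exported link report. The column layout and the sample rows are made up for illustration; this is not how Open Site Explorer itself computes anything.

```python
from collections import Counter

def summarize_backlinks(links):
    """Tally a backlink export: follow vs. nofollow links, unique
    referring domains, and the most common anchor texts.

    `links` is a list of (anchor_text, rel, referring_domain) tuples,
    roughly what a CSV export from a link-analysis tool might contain.
    """
    follow = sum(1 for _, rel, _ in links if "nofollow" not in rel)
    domains = {domain for _, _, domain in links}
    anchors = Counter(text.lower() for text, _, _ in links)
    return {
        "follow": follow,
        "nofollow": len(links) - follow,
        "referring_domains": len(domains),
        "top_anchors": anchors.most_common(3),
    }

# Hypothetical rows: a healthy profile mixes branded and keyword anchors.
sample = [
    ("Acme Widgets", "", "blog-a.example"),
    ("acme widgets", "nofollow", "forum-b.example"),
    ("buy widgets online", "", "blog-c.example"),
    ("Acme Widgets", "", "blog-a.example"),
]
print(summarize_backlinks(sample))
```

If nearly every anchor is the same keyword-rich phrase, or every link comes from a handful of domains, that’s the kind of imbalance this breakdown is meant to surface.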
2) SEMRush
Most SEOs wish their jobs weren’t thought of as increasing rankings for individual keywords (and at forward-thinking companies that want to build their online presence the right way, they aren’t). The stark reality, though, is that most businesses engage in SEO because they want to boost rankings for certain keywords that will grow their business.
As a result, one of the most important things to monitor constantly is a company’s individual keyword rankings. This can be a ridiculously difficult task without the right tools. Particularly since Google’s “Pigeon” local updates, doing a quick Google search and jotting down where you show up is extremely inaccurate and provides very little insight into how your efforts are doing on a broader, national scale. The best solution to this problem is SEMRush, a service that keeps track not only of what keywords your domain ranks for as a whole, but of what individual pages are ranking for. This is particularly useful if you have a massive site and notice that a few pages are losing organic search engine traffic for whatever reason. Using SEMRush, you can quickly identify which keywords a page has lost, start investigating why, and plan fixes accordingly.
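The "which keywords did this page lose?" question boils down to comparing two ranking snapshots. Here’s a minimal sketch of that diff, assuming you’ve exported keyword-to-rank data (the keywords and numbers below are invented; SEMRush’s own reports do this for you):

```python
def compare_rankings(before, after, drop_threshold=5):
    """Compare two keyword-rank snapshots for a page and flag losses.

    `before` and `after` map keyword -> rank (1 = top result).
    A keyword missing from `after` is treated as lost entirely;
    a rank that worsened by `drop_threshold` or more is flagged.
    """
    lost = sorted(k for k in before if k not in after)
    dropped = sorted(
        (k, before[k], after[k])
        for k in before
        if k in after and after[k] - before[k] >= drop_threshold
    )
    return {"lost": lost, "dropped": dropped}

# Hypothetical monthly snapshots for a single page.
march = {"garden tools": 4, "compost bins": 7, "raised beds": 12}
april = {"garden tools": 5, "compost bins": 18}
print(compare_rankings(march, april))
```

Running this monthly against exported data gives you a short list of keywords to investigate, rather than eyeballing hundreds of rows.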
3) GTMetrix
If you’ve read any of my 1 Hour SEO Audits, you know I’m a huge fan of website speed as an SEO ranking factor. It’s not a massive one, but it does play a role in both desktop and mobile rankings, which is why businesses need to keep a close eye on how they’re doing; otherwise, they leave themselves open to dips in rankings due to a slow site. Google Analytics is a great way to figure out which pages are slow, but figuring out what specific issues are causing your website to take ages to load is a more difficult proposition. Sure, you might just need a new hosting provider, but often a website’s problems are not so easily resolved.
Enter GTMetrix, a free service that runs your pages through both Google’s PageSpeed and YSlow and gives each a grade. It also aggregates the problems that both services identify to give you a broader list of actionable fixes each page needs. Maybe your images need to be optimized for file size, or maybe you’ve forgotten to specify image dimensions. Whatever the problem may be, GTMetrix will provide you with all the information you need to go in, make the required fixes to get your website loading quicker than your competition, and possibly improve your search engine rankings as a nice little bonus!
4) Moz’s On-Page Grader
We’ve already given kudos to one Moz product, but their On-Page Grader deserves some recognition as well, particularly in the aftermath of the Panda 4.2 update: online businesses need to constantly monitor their on-page elements to make sure nothing needs fixing in order to improve search rankings.
The On-Page Grader takes any single webpage and runs it through an algorithm to see how well optimized it is for a specified keyword. It takes into account how many times the keyword is used on the page and whether it appears in the title tag, meta description, and H1 and H2 tags, then gives you a simple letter grade. It also explains why each element is important to include and gives you an idea of how difficult each fix will be. Obviously this is something you could do on your own, but it’s nice to have a handy checklist ready for review without having to manually go through and make each change.
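If you did want to do a basic version of these checks yourself, here’s a rough sketch using only Python’s standard library. This is not Moz’s algorithm, just the simplest possible version of the checks described above (keyword in title, meta description, and headings, plus a raw occurrence count):

```python
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    """Collect the on-page elements a simple keyword check cares about."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.headings = []        # text inside H1/H2 tags
        self.text = []            # all visible text on the page
        self._current = None      # "title", "h1", or "h2" while inside one

    def handle_starttag(self, tag, attrs):
        if tag == "meta" and dict(attrs).get("name") == "description":
            self.meta_description = dict(attrs).get("content", "")
        if tag in ("title", "h1", "h2"):
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current in ("h1", "h2"):
            self.headings.append(data)
        self.text.append(data)

def keyword_report(html, keyword):
    audit = PageAudit()
    audit.feed(html)
    kw = keyword.lower()
    return {
        "in_title": kw in audit.title.lower(),
        "in_meta_description": kw in audit.meta_description.lower(),
        "in_h1_or_h2": any(kw in h.lower() for h in audit.headings),
        "occurrences": " ".join(audit.text).lower().count(kw),
    }

# A made-up page for illustration.
page = """<html><head><title>Organic Compost Guide</title>
<meta name="description" content="How to make organic compost at home.">
</head><body><h1>Organic Compost</h1>
<p>Making organic compost is easy. Organic compost enriches soil.</p>
</body></html>"""
print(keyword_report(page, "organic compost"))
```

A tool like the On-Page Grader layers a scoring model and advice on top of checks like these; the value is in the packaged checklist, not the parsing.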
5) Crazyegg
It’s well established that user behavior on your page directly affects your search engine rankings. As a result, it’s important to target keywords that are accurately reflected by your content, so users get the best possible experience once they land on your page. The last thing you want is for users to immediately bounce away and click on a different search result because you didn’t accurately address the keyword they searched for in Google.
As a result, it’s important to monitor user behavior on your pages so you can identify which specific pieces of content are causing people to leave for other options in the search results. One of the best, and most cost-effective, solutions for this task is Crazyegg. The heat-mapping tool lets you see how far people scroll down your page, where they click, and what gets clicked on the most. More importantly, it allows you to segment your traffic so you can see how specific referral sources change user behavior once visitors reach your website. You can segment out your organic traffic and address exactly what those visitors find uninteresting, so that your bounce rate falls, your time on site increases, and users have a better experience on your website. The better the user experience, the higher your search rankings are likely to be as a result.
6) ScreamingFrog
It’s clearly important to know not only where you’re going wrong, but to what extent you need to clean up individual SEO problems on your website, whether they’re title tag problems or pages that are 404ing. We’ve already explored tools that can help you address site speed, your overall backlink profile, and individual page content issues, but how do you run a broader audit of technical elements like response codes, title tag character limits, and missing image alt text?
My favorite tool for this is ScreamingFrog. It will crawl your entire website and generate a myriad of reports about how it’s performing from a technical standpoint. Missing H1 tags? Duplicate H2 tags? Pages with very little written content? ScreamingFrog can point you in the right direction to fix these issues. This is a key tool for any business that’s serious about SEO analysis, and it should be used frequently to make sure all of your on-site technical elements are up to SEO standards.
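To show what checks like these look like under the hood, here’s a minimal single-page audit in Python. It flags the exact issues mentioned above (missing H1s, images without alt text, thin content); the 300-word "thin content" threshold is an arbitrary assumption, and ScreamingFrog’s real crawler does far more than this:

```python
from collections import Counter
from html.parser import HTMLParser

class TechnicalAudit(HTMLParser):
    """Flag a few common on-site technical issues on a single page."""
    def __init__(self):
        super().__init__()
        self.tag_counts = Counter()
        self.images_missing_alt = 0
        self.words = 0
        self._skip = False   # True while inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        self.tag_counts[tag] += 1
        if tag == "img" and not dict(attrs).get("alt"):
            self.images_missing_alt += 1
        if tag in ("script", "style"):
            self._skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip:
            self.words += len(data.split())

def audit_page(html, thin_content_words=300):
    audit = TechnicalAudit()
    audit.feed(html)
    issues = []
    if audit.tag_counts["h1"] == 0:
        issues.append("missing H1")
    if audit.tag_counts["h1"] > 1:
        issues.append("multiple H1 tags")
    if audit.images_missing_alt:
        issues.append(f"{audit.images_missing_alt} image(s) missing alt text")
    if audit.words < thin_content_words:
        issues.append(f"thin content ({audit.words} words)")
    return issues

# A made-up stub page with several problems at once.
sample = '<html><body><p>Just a stub page.</p><img src="logo.png"></body></html>'
print(audit_page(sample))
```

A real crawler wraps logic like this around a URL queue and HTTP client so the same checks run across every page of the site, which is exactly the grunt work these tools save you.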
7) Copyscape
Duplicate content is the worst. While in most cases it won’t affect your SEO in any significant way, it’s strange to know that someone thought your content was good enough to rip off word for word on their own website. Duplicate content isn’t always taken with the intention of repurposing it as original work, though. Sometimes other websites will copy your content and note that they “syndicated” it from you, pointing back to you as the original source of the article. This is fine in most cases, but a lot of the time these sites won’t do you the courtesy of linking back.
How do you identify this kind of content? Copyscape is a good start. All you need to do is enter the URL of a piece of content that you think might have been ripped off, and Copyscape will show you all the other pieces of content that contain certain percentages of your original article. This is particularly helpful because a lot of the time you’ll come across articles that have only 5-10% of words in common with your content, because they quoted something you said in your original article. Each of these is another opportunity to get the domain to link to you when they reference you as a source. The whole process of identifying and reaching out to websites can be frustrating, but finding a few good sources that have used your content without crediting you, and getting links from them, can make it all worth it.
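For a sense of what a "percentage of words in common" means, here’s a toy sketch. Copyscape almost certainly matches phrases rather than individual words, so treat this only as an illustration of the metric, not its actual algorithm:

```python
import re

def percent_words_in_common(original, other):
    """Rough share of the original article's unique words that also
    appear in another document. Illustrative only; real duplicate
    detection compares phrases and passages, not bags of words.
    """
    def tokenize(text):
        return set(re.findall(r"[a-z']+", text.lower()))
    orig_words = tokenize(original)
    if not orig_words:
        return 0.0
    return 100.0 * len(orig_words & tokenize(other)) / len(orig_words)

# Invented snippets: the second post quotes part of the first.
article = "our ten favorite tools for auditing a website step by step"
quoting_post = "as one guide put it, these are tools for auditing a website"
print(round(percent_words_in_common(article, quoting_post), 1))
```

A low overlap like this usually means a quote or citation rather than a full rip-off, which is exactly the link-reclamation opportunity described above.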
8) MajesticSEO
While it isn’t the most important ranking factor (yet!) in the world of SEO, it’s worth keeping track of what Google sees as your topical relevancy. If you’re a gardening business and you’ve amassed links and created content that has led Google to believe you sell website design software, there’s clearly a problem that needs to be fixed. As Google gets better at understanding what each domain is topically relevant for, it will become more and more essential to send all the correct signals about the products and services you concentrate on.
The best tool to check this is MajesticSEO. While they provide a range of SEO services, this one in particular really sets them apart from the competition. Not only can you see what your website is topically relevant for, you can also see what subjects most of your inbound links revolve around. By analyzing what kinds of links you’re amassing, you can scale your SEO efforts up (or down) to earn links whose topical relevancy closely matches what your business does, thereby improving your website’s domain authority and helping it rank more efficiently for specific target keywords.
9) Fresh Web Explorer
Google has repeatedly said that a backlink profile should appear “natural”. While there has been a lot of debate about what exactly that means, at its core a “natural” backlink profile contains lots of links that other domains have given you because of the great content on your website, without you having to do much (if any) work to convince them. The big problem is that most online businesses are unaware of who is mentioning their brand or products in natural content like blog posts.
The best way to solve this problem is with Moz’s Fresh Web Explorer. This convenient tool scours the internet for references to an individual phrase or term and shows you which sites have used it recently, so you can keep up to date on natural mentions of your brand as they happen. It’s worth noting that there are lots of tools that get this done; as long as yours consistently finds new mentions of your brand, it doesn’t matter which product you use. The most important thing is that you regularly analyze who is mentioning your business and take the time to reach out to those sources to see if you can get their mentions to link back to your business.
10) Wordcounter.net
How big is your website? How old is it? If you have an old domain with tons of pages, you can stumble across a huge number of old pages that need optimization, specifically when it comes to title tag and meta description character counts. Writing copy that fits within these two elements’ character limits is a difficult proposition, and if you don’t keep track of how many characters you’ve used in each, you leave yourself open to Google rewriting your title tags or meta descriptions into versions that are less than optimal.
While it isn’t exactly an “analysis” tool, Wordcounter.net is a great website to use as you try to come up with new title tags and meta descriptions that are of interest to users, and are keyword-rich when possible. Not only does the tool show you your word and character count, it shows you other metrics like keyword density and reading level to help you come up with the best possible title tags and meta descriptions for old (and new!) content.
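The character-count check itself is trivial to script if you’d rather batch it. The limits below are commonly cited rules of thumb, not official numbers; Google actually truncates snippets by pixel width, so treat them as rough guidance:

```python
# Commonly cited rules of thumb, not official Google limits;
# snippets are actually truncated by pixel width, not characters.
TITLE_LIMIT = 60
META_DESCRIPTION_LIMIT = 155

def check_snippet_lengths(title, meta_description):
    """Report how much room is left in each element.
    A negative "remaining" value means likely truncation."""
    return {
        "title_length": len(title),
        "title_remaining": TITLE_LIMIT - len(title),
        "meta_length": len(meta_description),
        "meta_remaining": META_DESCRIPTION_LIMIT - len(meta_description),
    }

# Made-up draft copy for illustration.
report = check_snippet_lengths(
    "10 Free SEO Analysis Tools Worth Bookmarking",
    "A quick tour of free tools for checking backlinks, page speed, "
    "on-page elements, and duplicate content.",
)
print(report)
```

Run this over an exported list of title/description pairs and you get an instant shortlist of pages whose snippets are at risk of being rewritten.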
What are your favorite SEO analysis tools? Let us know in the comments section below.