You probably did not build your website yourself. You trusted a webmaster to do it, or you may be using a Content Management System (CMS) or an online instant website-building tool with many pre-built elements over which you have no control. If this is the case, you have no idea whether your website is optimized so that search engines like Google and Bing can crawl and index it properly. You are also unsure whether your website gives your audience a good user experience.
Below are questions you should ask about your website to see if it is optimized not only for search engines but also for its users. To help you answer the questions, use the suggested tools.
What do you do if you determine that many elements of your website are not optimized? Ask your webmaster to fix them. In the worst-case scenario, where a fix is difficult or impossible, rebuild your website with search engine optimization in mind. This is worth the effort, since you will get a substantial increase in qualified traffic, and for online marketers, this is why you built your site in the first place.
How many pages on my site are indexed by search engines?
Do you know how many pages are on your site? Are they all indexed by Google and other search engines?
To find out:
Go to Google, enter in the search box: “site:yourwebsite.com” (replace “yourwebsite.com” with the name of your website).
Google will tell you how many results (pages) it finds.
If the number you expected is greater than what Google reports, you may have a crawler problem. If it is less, Google might be indexing unimportant content that hinders it from crawling your site effectively.
With this site search, you will also find out whether you are using a subdomain. Remember that Google treats subdomains as separate websites. If you have a powerful site and you need the content in your subdomain to be visible in the search engines, use a subfolder instead of a subdomain so your important pages can inherit the power of your main domain.
There is only one or very few pages being indexed on my site. What’s wrong?
After making a site search in Google or Bing you might find that the number of pages being indexed is strikingly low compared to the actual number of pages on your site.
Look at your robots.txt to see if any of the pages you want indexed are “disallowed”.
Your robots.txt is usually found at “http://www.yourwebsite.com/robots.txt” (replace “www.yourwebsite.com” with the URL of your site).
Search Status: http://www.quirk.biz/searchstatus/
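If you prefer to check programmatically, Python's standard library can parse a robots.txt file and tell you whether a given path is blocked. A minimal sketch, where the robots.txt content and the example URLs are made up for illustration (in practice, fetch your own http://www.yourwebsite.com/robots.txt):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- paste your own here.
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether pages you want indexed are accidentally disallowed.
for path in ["/private/page.html", "/products/widget.html"]:
    allowed = parser.can_fetch("Googlebot", "http://www.example.com" + path)
    print(path, "crawlable" if allowed else "DISALLOWED")
```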
What does the search engine spider see on my site?
Search engine bots or spiders crawl millions of web pages per day. They “read” the content of web pages so the search engines can index them and rank them in search results pages. Search engine bots are machines, and how they read a web page is not necessarily how humans see it.
Also importantly, some hidden, spammy content that humans cannot see can get your site penalized or banned.
Below are tools you can use to see what the search engines see in your website:
Search Engine Spider Simulator http://www.webconfs.com/search-engine-spider-simulator.php
Internet Marketing Ninjas On-page Optimization Tool http://www.internetmarketingninjas.com/seo-tools/free-optimization/
Go to Google, search for your page, click on the preview arrow > cached > text only version
What HTML code or CSS property is causing a certain appearance on my page? How do sections of my page look in code?
When looking at your website, there might be certain elements in its design that you would like to change, such as font sizes, color and background, alignment, border color, etc.
How does my site look in Safari on the Mac, on mobile phones, or in other browsers?
The browser you are using may not be the browser your audience is using, and different browsers may display your content differently. For example, if your audience primarily accesses your site via smartphones, you should know how it looks on those devices by using the User Agent Switcher addon.
Am I targeting the right keyword?
Keyword research and keyword targeting is the foundation of search engine optimization. You have to optimize every page of your website to target keywords that are popular or are actually being used by users.
Here are some of the best keyword tools that will help you determine whether you are targeting the keywords people actually use to find information on your site:
With Google Insights for Search, you can compare search volume patterns across specific regions, categories, time frames and properties:
Woorank Website Review:
Wordstream Keyword Tool:
Google Suggest – Type beginning of keyword in the Google search box and see the popular related keyword searches in its drop down autocomplete feature
Competition Keyword Research:
SEMrush – Input your competitor’s web address and see the top 10 keywords that they are ranking for:
SpyFu – For researching keywords being targeted by your competition
Google Trends – Useful for keyword seasonality research, changing trends, among others
Rank.nl – Tool to create permutations and combination of keywords:
Is my site relevant to a keyword?
Find out how strong a keyword message your content is sending to the search engines. Using your target keyword in your content increases its relevancy signal for that keyword.
Note, though, that the search engines are sophisticated enough to know if you are intentionally stuffing your keyword into your content, and they will see such pages as spammy attempts to game their index. This will almost certainly lower your page's ranking, if your page is shown at all.
There is no magic percentage that you should aspire for in terms of keyword density (Google even emphasizes that you should not be concerned about “keyword density”). Write for people, not search engines, by making sure that your keywords are found naturally in your content, and are not being “stuffed”.
Use the following tools to determine your keyword density:
Dave Naylor’s Keyword Density Tool http://tools.davidnaylor.co.uk/keyworddensity/
Ranks.nl Keyword Density & Prominence Analyzer www.ranks.nl/tools/spider.html
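If you just want a rough number without a web tool, keyword density is simply occurrences of the keyword divided by total words. A minimal sketch in Python, using a made-up sample sentence (and remembering that Google itself says not to chase a target percentage):

```python
import re
from collections import Counter

def keyword_density(text, keyword):
    """Percentage of words in `text` that are exactly `keyword` (single words only)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words) * 100

sample = "Our keyword tool helps you research every keyword that matters"
print(round(keyword_density(sample, "keyword"), 1))
```

Use this only to spot obvious stuffing, not to hit a "magic" number.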
How can I easily combine keywords to come up with phrases relevant to my site?
You might want to broaden your keyword horizons by generating more keyword targets for your site. That way, you get more keywords that drive traffic to your site. One way to do this is to cross-combine single keywords and search phrases to come up with new keyword phrases.
Search Combination http://www.internetmarketingninjas.com/search/
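The same cross-combination can be done in a few lines of Python with `itertools.product`. The seed lists below are placeholders for your own modifiers and head terms:

```python
from itertools import product

modifiers = ["best", "cheap", "free"]        # placeholder modifiers
heads = ["seo tools", "keyword research"]    # placeholder head terms

# Every modifier paired with every head term.
phrases = [" ".join(combo) for combo in product(modifiers, heads)]
for phrase in phrases:
    print(phrase)
```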
Do my pages have Page Title and Description Tags? Do I have duplicate tags on my pages?
The Page Title is the most important tag from which search engines get a signal about what your web pages are about (what keywords are relevant to your site). The Description Tag helps make your pages enticing to click when they appear in search engine results pages. It is important for each page of your website to have a different Page Title and Description.
With Screaming Frog, you can have every page of your website crawled and see if any of your pages have Titles and Meta Descriptions that are missing, are duplicated, or have more than the optimal number of characters.
Screaming Frog http://www.screamingfrog.co.uk/seo-spider/
To see how your Page Titles and Meta Descriptions might look on the search engine results, you can use the Google SERP Optimization tool: http://www.seomofo.com/snippet-optimizer.html
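A duplicate-and-length check like Screaming Frog's can be approximated in Python once you have your titles and descriptions collected. A sketch, assuming commonly cited (not official) limits of roughly 70 characters for titles and 160 for descriptions, over hypothetical crawl output:

```python
from collections import Counter

TITLE_MAX, DESC_MAX = 70, 160  # assumed rules of thumb, not official limits

pages = {  # hypothetical crawl output: url -> (title, description)
    "/": ("Acme Widgets", "Quality widgets since 1990."),
    "/about": ("Acme Widgets", "Learn about the Acme team."),
    "/blog": ("", "Acme news and updates."),
}

title_counts = Counter(title for title, _ in pages.values())
duplicates = [t for t, n in title_counts.items() if t and n > 1]
missing = [url for url, (title, _) in pages.items() if not title]
too_long = [url for url, (t, d) in pages.items()
            if len(t) > TITLE_MAX or len(d) > DESC_MAX]

print("duplicate titles:", duplicates)
print("missing titles:", missing)
print("over length:", too_long)
```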
Do I have a lot of links to my website?
Having external links, or links from other websites, is a major ranking factor for search engines. For search engines, links are votes. The more links you have to your page, especially if they come from quality sites, the higher it will rank in relevant keyword searches.
Open Site Explorer: http://www.opensiteexplorer.org/
Majestic SEO: http://www.majesticseo.com/
Link Diagnosis: http://www.linkdiagnosis.com/
Blekko: http://www.blekko.com (click on the SEO link)
If you’ve signed up, Google Webmaster Tools: https://www.google.com/webmasters/tools/home?hl=en
SEO for Firefox : http://tools.seobook.com/firefox/seo-for-firefox.html
mozBar for Firefox and Chrome: http://www.seomoz.org/seo-toolbar
Are there broken links on my website?
Screaming Frog crawls your site and reveals any links that are broken: http://www.screamingfrog.co.uk/seo-spider/
Xenu’s Link Sleuth: http://home.snafu.de/tilman/xenulink.html
What text link do other websites use to link to my site?
Text links give the search engines a signal about what your website is about (what keywords it should rank for). Do not overdo it, though: having the same anchor text on all your links signals to the search engines that your site is over-optimized, and it may suffer suppressed rankings.
Open site explorer: http://www.opensiteexplorer.org/
How do I rank, and how do I track the changes in my ranking?
SEOBook rank checker http://www.seobook.com/download-your-free-seo-tools/
Also, for checking bulk keywords: http://searchenginereports.net/
Can my web content be reached with only one URL (Canonicalization)?
Ideally, every web page should be reachable from only one URL. Having multiple URLs for the same web content causes problems, because search engines treat the same content under different URLs as duplicate pages. External links may also be spread out across different URLs for the same page, diluting the page's value or equity.
The following URLs, for example, should all redirect to a single canonical URL:
http://example.com/
http://www.example.com/
http://www.example.com/index.html
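The normalization itself is usually done with server-side redirects (or a rel="canonical" tag), but the logic can be sketched in Python. This sketch assumes your canonical form is the lowercase www host with no index.html, which may not match your site's convention:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Map URL variants to one canonical form (assumed: lowercase, www, no index.html)."""
    scheme, netloc, path, query, _ = urlsplit(url)
    netloc = netloc.lower()
    if not netloc.startswith("www."):
        netloc = "www." + netloc
    if path.endswith("/index.html"):
        path = path[: -len("index.html")]
    if not path:
        path = "/"
    return urlunsplit((scheme, netloc, path, query, ""))

variants = ["http://example.com", "http://WWW.Example.com/",
            "http://www.example.com/index.html"]
print({canonicalize(v) for v in variants})  # all variants collapse to one URL
```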
Does my page load fast?
Page load time is now a minor ranking factor for (at least) Google.
PageSpeed Online analyzes the content of a web page, then generates suggestions to make that page faster. Aside from a minor ranking boost, reducing page load times can reduce bounce rates and increase conversion rates.
YSlow addon: http://developer.yahoo.com/yslow/
Load Time Speed Test Tool http://www.internetmarketingninjas.com/pagespeed/
Why is it taking so long for my page to load?
An HTTP header analyzer shows the number of files being downloaded, the size of those files, the location of the server, and more:
Also see recommendations for your site at https://developers.google.com/pagespeed/
Is my website penalized?
Having a PageRank of 0 even though your website is relatively old and has external links pointing to it indicates that your website may be penalized.
Among other tools you can see your PageRank by using Search Status: http://www.quirk.biz/searchstatus/
Is my redirect SEO friendly?
When you have web pages taken down and being replaced by a new page containing the same information on a new URL, you can tell the search engines that the old page is permanently redirected by giving it a 301 redirect to the new page. This way, most of the page equity of the old page is transferred to the new page.
A 302 (temporary) redirect is not search engine optimization friendly because the page equity of the old page is not transferred to the new page.
Use the Live HTTP header analyzer to see whether a redirect is a 301 or a 302. You will also see all the nodes in the redirect chain.
Live HTTP header Firefox Addon: https://addons.mozilla.org/en-US/firefox/addon/live-http-headers/
HTTP Response code checker http://www.internetmarketingninjas.com/header-checker/
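To see the idea in code: a crawler walks the chain by following each Location header until it gets a non-redirect status. The sketch below fakes the HTTP lookups with a dictionary so the chain-walking logic is visible; in real use you would issue HEAD or GET requests instead, and the example URLs are hypothetical:

```python
def follow_redirects(url, lookup, max_hops=10):
    """Walk a redirect chain. `lookup` maps url -> (status, location) and
    stands in for real HTTP requests."""
    chain = [url]
    for _ in range(max_hops):
        status, location = lookup.get(chain[-1], (200, None))
        if status in (301, 302, 307, 308) and location:
            chain.append(location)
        else:
            return chain, status
    raise RuntimeError("Too many hops; possible redirect loop")

# Hypothetical responses: the old page 301s to an interim URL,
# which 301s again to the final page.
responses = {
    "http://www.example.com/old": (301, "http://www.example.com/interim"),
    "http://www.example.com/interim": (301, "http://www.example.com/new"),
}
chain, final_status = follow_redirects("http://www.example.com/old", responses)
print(" -> ".join(chain), "(final status %d)" % final_status)
```

Chains of 301s still pass equity, but each extra hop is wasted; redirect the old URL straight to the final one where you can.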
Is my website cloaking any content?
Use the Firefox User Agent Switcher, switch to a search engine user agent, then refresh the page. If the results are different, you have cloaked content.
Do I have duplicate pages on my site?
You can check manually: go to Google and search for: site:yourwebsite.com “exact quote from one of your pages”
Are other sites duplicating my content (stealing my content)?
Other sites stealing your content may result in your site being identified by the search engines as the duplicate, and thus dropped from the index. And of course, you would not want content thieves to benefit from your hard work.
Copyscape checks if another website is plagiarizing your content.
If you catch a site plagiarizing or stealing your content or images, you can file a DMCA takedown notice. Details here: http://www.ipwatchdog.com/2009/07/06/sample-dmca-take-down-letter/id=4501/
Are my images optimized?
Search engines currently cannot read what the images on web pages are about. One way to indicate what an image is about is to use the alt attribute (“alt tag”) in the HTML image code.
These tools reveal the alt tags of your website's images, if there are any:
Web developer toolbar:
You can also inspect the code with Firebug.
http://www.internetmarketingninjas.com/broken-links-tool/ tells you the file name, alt text, dimensions, and file size of each image. This tool also reveals broken links.
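You can also scan for missing alt text yourself with Python's built-in HTML parser. A sketch over a made-up HTML snippet:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collects the src of every <img> that has no (or empty) alt text."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))

checker = AltChecker()
# Hypothetical page fragment: one image with alt text, one without.
checker.feed('<img src="logo.png" alt="Company logo"><img src="banner.jpg">')
print("images missing alt text:", checker.missing)
```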
Do I use stylesheets properly?
A minor issue, but it is advisable to use an external style sheet (style.css) on your website as it removes code from the page that may hamper the search engine spider’s ability to find the “content” of a page.
Are all my HTML codes valid?
A lot of errors in your pages' HTML code may signal to the search engines that your site is low-quality and less than trustworthy. The errors may also hinder the search engines from properly finding your web pages' content.
Are my CSS codes valid?
Is my URL using parameters?
This applies to sites that generate pages dynamically. Having more than two parameters in a URL may make it hard for search engines to crawl the page correctly and may limit the amount of content they crawl.
Check your URLs manually.
Example of a URL with multiple parameters (hypothetical): http://www.example.com/products?id=123&cat=5&sort=asc
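If you have a list of URLs, counting parameters is easy to script with Python's standard `urllib.parse`. The URL below is made up for illustration:

```python
from urllib.parse import urlsplit, parse_qsl

def parameter_count(url):
    """Number of query-string parameters in a URL."""
    return len(parse_qsl(urlsplit(url).query))

url = "http://www.example.com/products?id=123&cat=5&sort=asc"  # hypothetical URL
if parameter_count(url) > 2:
    print(url, "has", parameter_count(url), "parameters -- consider rewriting")
```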
Is my site sending data to Google analytics correctly?
Google Analytics Debugger: its messages include errors and warnings that can tell you when your analytics tracking code is set up incorrectly. In addition, it provides a detailed breakdown of each tracking beacon sent to Google Analytics.
How do I compare with competition in terms of PageRank, links, etc?
SEO Toolbar for Firefox http://tools.seobook.com/seo-toolbar/ makes it easy to get a holistic view of the competitive landscape of a market directly in the search results.
How old is my domain name, and how long before it expires?
With DomainTools.com (http://www.domaintools.com), you can also see what people see as your contact info, as well as the other top-level domains of your name (.com, .org, .net), which, if you do not own them yet, you should probably claim to protect your brand.
Is it good practice to use subdomain on my site?
Use of subdomains is acceptable if the content located on the subdomain is substantial enough to exist as a completely separate website. Subdomains used for administrative or technical purposes should not be indexed by the search engines.
To look for subdomains, go to Google and do a site search. In the search box, type: site:yoursite.com -inurl:www
Do I have an xml sitemap?
XML sitemaps are used to pass the URLs of your site’s pages directly to the search engines. Especially useful for large sites with complex navigation, they are a best practice for all web sites from an SEO perspective.
Check manually; it's usually at www.yourwebsite.com/sitemap.xml
Among other functions, revealing your xml sitemap is one of the features of the Search Status extension for Firefox: http://www.quirk.biz/searchstatus/
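A sitemap is plain XML, so you can also list its URLs with Python's standard library. A sketch over an inline example document (in practice you would first fetch your own www.yourwebsite.com/sitemap.xml):

```python
import xml.etree.ElementTree as ET

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/about/</loc></url>
</urlset>"""

# The sitemaps.org namespace must be given explicitly when querying.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(len(urls), "URLs listed")
```

Comparing this count against your site: search result is a quick consistency check.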
Do I have an html sitemap?
An html sitemap is a page (or pages) on your website that lists links to all the important pages on the site. This is very helpful for users and search engines, as it gives them a structured overview of your website. It also puts your important pages no more than two clicks away from the homepage, and provides anchor text that signals to the search engines what the linked pages are about.
Example of a good html sitemap:
Am I using Iframes or Framesets?
iFrames load content into a web page from an external source. When content is loaded into a web page this way, it is not visible to search engine spiders and will not be indexed or associated with the page it is loaded into. Do not use iFrames to load relevant content into a web page.
Firebug: Firefox: https://addons.mozilla.org/en-US/firefox/addon/firebug/
Is my 404 (Page not Found) error configured properly?
When a page is not found on your site, the server must respond with a 404 error in the header response code. Do not use dynamic or custom 404 pages that do not return a 404 code, as this might cause crawling issues for search engine spiders. (Note: custom 404 pages that do return a 404 code are not only okay but advisable.)
http://www.Rexswain.com/httpview shows you exactly what an HTTP request returns to your browser
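A quick scripted check is to request a URL that cannot exist on your site and look at the status code. Since the real request needs a live site, the sketch below shows only the decision logic, flagging "soft 404s" (pages that say "not found" in the body but return a 200 code):

```python
def classify_not_found(status, body):
    """Classify a server's response to a nonexistent URL.
    A page that *says* not-found but returns 200 is a "soft 404"."""
    if status == 404:
        return "proper 404"
    if status == 200 and "not found" in body.lower():
        return "soft 404 -- fix the server to return a 404 code"
    return "unexpected: HTTP %d" % status

# Hypothetical response from a misconfigured custom error page:
print(classify_not_found(200, "<h1>Page Not Found</h1>"))
```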
Is my URL human-readable?
A human-readable URL is search engine friendly, and is easier to remember than a bunch of characters.
Dynamic web sites often have page URLs containing multiple parameters and values. These should be modified to be human-readable when possible. The most common method to rewrite these URLs is through the server's mod_rewrite function.
Inspect visually. Example of a human-readable URL (hypothetical): http://www.example.com/seo-tools/keyword-research/
Has my server enabled compression?
Compressing resources with gzip or deflate can reduce the number of bytes sent over the network, resulting in faster page loads.
See Google recommendation for more details: https://developers.google.com/speed/docs/best-practices/payload#GzipCompression
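You can get a feel for the savings with Python's built-in gzip module. The sample HTML below is synthetic; repetitive markup like this compresses especially well:

```python
import gzip

# Synthetic HTML payload standing in for a real page.
html = (b"<html><body>"
        + b'<div class="product"><p>Hello, world!</p></div>' * 200
        + b"</body></html>")

compressed = gzip.compress(html)
print("original:", len(html), "bytes; gzipped:", len(compressed), "bytes")
```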
Does my server have Conditional Get enabled?
Like HTTP compression, conditional GET reduces bandwidth usage with browsers that support it. It allows search engine spiders to check whether a page has changed since the last time they requested it. When spiders can skip unchanged pages, they can crawl deeper into the site and index more pages more quickly.
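The mechanics in miniature: the server hands out a validator (an ETag or Last-Modified date), the client echoes it back on the next request, and the server answers 304 Not Modified when nothing has changed. A minimal server-side sketch; the ETag value and body are made up:

```python
def respond(request_headers, current_etag, body):
    """Minimal sketch of server-side conditional GET handling."""
    if request_headers.get("If-None-Match") == current_etag:
        return 304, b""  # Not Modified: the body is not resent
    return 200, body

body = b"<html>...</html>"
etag = '"abc123"'  # hypothetical validator

first = respond({}, etag, body)                        # first visit: full download
second = respond({"If-None-Match": etag}, etag, body)  # revalidation: 304, empty body
print(first[0], second[0])
```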
Do I have social buttons so it is easy for my audience to share my website's content on social sites such as Facebook and Twitter?
Contact your webmaster to add the social buttons, or get code from social sharing sites such as AddThis, ShareThis or Gigya. You can also use various plug-ins compatible with your CMS. Ask your webmaster.
Are there all-in-one tools I can use to see if my website is optimized?
Google Webmaster Tools – shows what Google knows about your top search queries, crawl error types and their counts, links to your site, keywords, sitemaps, malware, and more.
Web developer toolbar
Here are more articles to help you find out what's going on with your website: