Sample of our SEO Checker Tool.
This unique FREE SEO TOOL allows you to easily analyze your website's SEO, with no obligation or requirement to use it.
This free SEO tool is provided to you by SEO ONE, the #1 SEO firm in America. To learn more, visit us at www.seoone.com. In a future article we will explore the free tool in more depth.
To take our free SEO Checker for a spin, visit it at
Summary for http://www.investorsipo.com
|Overview| • 2 Notable Issues • 11 Worth Reviewing • 11 Correct (or are they?)
|Title Tag|
In the past, the title tag was considered the single most important “on page” ranking factor there was.
Since 2012, Google has often shown a different title than the one you might prefer, but the general concepts remain the same. Read this for some interesting concepts about the title tag.
It usually is shown as the “headline” in your search rankings, and will help determine whether a viewer will click on your result or not. As of spring 2014, Google changed the font size, so now you should keep the title to about 55 characters or so to prevent it from getting cut off.
The title tag also shows in the blue bar at the very top of your browser window, but it does not actually show “on page”.
Okay – You have a title tag, but what does it say?
Is this something that people are searching for? Is it your business name? Is your city and state included? Is it – gasp – “Home”?
Most search industry professionals agree that the title tag is still important for rankings, although it’s not as important as it once was.
Don’t forget that the title tag ALSO entices users to click on your result.
If you don’t fully understand the Title Tag, including what it does, where it goes in your code, where it shows to users, etc. then you should give my original Title Tag Tips a quick read.
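As a quick illustration, the title tag lives in the <head> of your HTML. The example below reuses this page's own headline phrase; the exact wording is just a sketch:

```html
<head>
  <!-- Keep it to roughly 55 characters so Google doesn't truncate it -->
  <title>Find Qualified Investors Fast | InvestorsIPO</title>
</head>
```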
|Description Meta Tag|
The description tag is what the visitor will read in the search results when deciding whether or not to click on your listing.
While not necessarily a “ranking factor”, it should be unique for each page, and written as a “classified ad” of up to 160 characters to compel users to click that result.
The description tag should be relevant to that specific page, which is why Google Webmaster Tools points it out when you have duplicates.
Your page has no meta description tag at all.
This is very important, since it tells the end user seeing your listing in the search results exactly what your web page is about. Read this description tag article right away to understand more clearly what I’m referring to!
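For illustration only (the wording here is invented, loosely based on this page's own headlines), a description tag sits in your <head> and looks like this:

```html
<!-- A unique, "classified ad" style description of up to ~160 characters -->
<meta name="description" content="Connect with angel investors and venture capital firms. Search our database of qualified investors and fund your startup fast.">
```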
|Keyword Meta Tag|
The keyword meta tag is NO LONGER a critical ranking factor. HOWEVER, it’s been left in this report because I prefer it to stand out, and I do continue to use it.
Your page has no keyword meta tag at all.
If it was left out intentionally, I guess that’s your call, but if I were you, I’d add some.
Keyword meta tags were all the rage many years ago, but as people began stuffing and abusing them, the search engines started to devalue them, and today, they’re almost entirely ignored.
However, I believe that could turn around again, and it only takes a minute to add a couple of relevant phrases to your keyword tag.
Make the batch of keywords completely unique for each URL, and try not to duplicate more than 50% of them with any other page. If you can’t come up with unique keyword tags for a page, then why do you have a separate URL?
Admittedly, I’m in the minority of SEOs who think it’s still worth the effort.
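If you do add one, it's a single line in your <head>. The phrases below are examples drawn from this page's subject matter, not a recommendation:

```html
<!-- A couple of relevant phrases, unique to this page -->
<meta name="keywords" content="angel investors, venture capital, qualified investors">
```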
|H1 Header Tag|
The H1 tag should often closely resemble your title tag, since it’s the headline for the page.
If your headline isn’t visible and relevant, then how is anybody supposed to know what your page is about?
This is the headline you’re using to tell the world what your page is about.
It should be exclusive to just this page, entirely subject relevant, and to add a little kick throughout your site, you could also make it a text link to the next most highly relevant page on your site.
Is what you see below really the most relevant and descriptive headline for your page? Does it closely relate to (or even match) a phrase that you’re also using in your title tag?
Read these Header Tag Tips for more info.
Find Qualified Investors Fast
|H2 Header Tag|
If you have a page long enough for multiple headlines, then using them can add some more clearly defined relevance to your page.
You do want to be sure you’re only using ONE H1 tag, but you can use multiple sub-headlines and even bullet points all you like.
For their secondary headlines, some people prefer to use “bolded” text instead of actual H2, H3, H4, etc. tags, and that’s fine. There’s no known advantage to using one over the other.
This is what you are using for your H2 tags on your page.
Are they subject relevant, or are they meaningless headlines that are replicated thousands of times across the web?
Your H2 headlines should be specific to a subject that you would like this page to be ranked for, and making them link to other relevant areas of your site helps too.
Angel Investors and Venture Capital
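Pulling the two headline tags together with this page's own wording, the markup might look like this (the link target is a hypothetical page on the same site, following the linking tip above):

```html
<h1>Find Qualified Investors Fast</h1>
<p>Intro copy for the page…</p>

<!-- Sub-headline linked to a related internal page -->
<h2><a href="/angel-investors/">Angel Investors and Venture Capital</a></h2>
<p>Supporting detail under the sub-headline…</p>
```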
|Image Height & Width Tags|
Specifying the height and width attributes for images helps pages render faster, since the browser can reserve space for each image to fill in as it downloads.
Leaving them out forces the browser to reflow the layout as each image loads, slowing the overall page load, which can decrease the frequency and depth of your crawls by the search spiders.
Some images are missing height and width attributes.
Is it a big deal? Well, that’s up to you. I don’t think so, unless you have multiple infractions, or large individual file sizes too. However, like meta keyword tags, it only takes a few seconds to put ’em in, so don’t be lazy!
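A minimal sketch (the filename and dimensions are made up) – the attributes simply declare the image's display size up front:

```html
<!-- Dimensions let the browser reserve the space before the file downloads -->
<img src="logo.png" width="200" height="100" alt="Company logo">
```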
|Image Expires Headers|
If you use “image expires” in your header, then browsers will cache the images until a specified date, speeding your load times for returning visitors.
While this may not be directly relevant to search ranking, it does wonders for the sanity of your return visitors.
Your site is not using “expires” in the headers for your images.
An expires tag can help speed up the serving of your webpages for users that regularly visit your site.
There are likely more important things you can do, but if your site really is almost perfect, then go learn more by reading section 14.9.3 of the W3C HTTP/1.1 specification, and implement it.
I’d also like to point out that this is typically one GIANT pain in the neck to deal with, due to so many different server configurations and restricted access to your config files, so it might be worth ignoring entirely.
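If you do decide to tackle it on an Apache server, one common .htaccess sketch (assuming mod_expires is available; the lifetimes are examples, not recommendations) looks like:

```apache
# Cache images for a month after first access – adjust lifetimes to taste
ExpiresActive On
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType image/gif "access plus 1 month"
```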
|Robots Meta Tag|
The Robots Meta Tag is a command issued to visiting search spiders.
If you have one, it will be displayed on the right.
Your page has no robots meta tags.
The robots tag is effective for giving directions to the search spiders, and I do recommend using at least these two –
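As a sketch, a robots meta tag goes in your <head> and looks like the line below. The `index, follow` pair is a common permissive default, though not necessarily the exact two directives being recommended here:

```html
<meta name="robots" content="index, follow">
```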
|Robots.txt File|
The robots.txt file MUST be present on your domain, or you’re going to generate 404 errors when the search engines look for it.
The robots.txt file also helps you steer the search engines away from areas you don’t want indexed.
Also, the robots.txt file is where you need to reference the location of your sitemap.xml file, and you can see Robots.org for more information.
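A minimal robots.txt tying these pieces together might look like this (the /private/ path is a made-up example of a directory you'd steer spiders away from):

```text
# Allow everything except /private/, and point spiders at the sitemap
User-agent: *
Disallow: /private/
Sitemap: http://www.investorsipo.com/sitemap.xml
```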
Good Job –
Your site uses a robots.txt file.
|XML sitemap file|
XML sitemaps tell the search engines exactly where every page on your site is located, and the importance you place on each.
Here’s a link to the best XML sitemap tool that I’ve ever used.
A1 Sitemap Generator is completely free for 30 days, which is more than ample time to get yours made, and there’s no reason to buy it unless you need it regularly.
Your site lacks a file named sitemap.xml, and in most cases, this means you don’t have an XML sitemap that you’ve submitted to Google, although in some cases, your sitemap may be named something else.
I think this is critical, because you can’t take advantage of all the really great information you can get from Google Webmaster Tools.
Ahhh, maybe you actually ARE trying to hide something from them?
If that’s the case, then mission accomplished I guess, but having an XML sitemap AND referencing it in your robots.txt file will tell all of the search spiders exactly where each of your pages are.
For WordPress sites you’d want to use a plugin, and for non-WordPress sites, the best sitemap creator I’ve ever used is A1 Sitemap Generator.
You can get a 30 day free trial here, which is plenty of time to create your sitemap, so there’s no reason to buy it unless you routinely work on multiple websites.
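For reference, a bare-bones sitemap.xml (one URL shown, with a priority value) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.investorsipo.com/</loc>
    <priority>1.0</priority>
  </url>
</urlset>
```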
|Local – Google Earth KML file|
A business with a physical location needs one of these Keyhole Markup Language files, which can help Google determine your precise location.
Name your file locations.kml and upload it to the root of your domain.
Uh oh, you don’t have one of these. If you think you do, then either it’s not named locations.kml or it’s not in the root directory. If you’re a local business, this is a very important step that should not be overlooked.
On the other hand, if you’re “online only” and have no address, then you can disregard it entirely. Here’s what Google has to say about the KML sitemap. Now use the KML File creator, and upload it to your root.
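A skeletal locations.kml (every value below is a placeholder) looks like this – note that KML lists coordinates longitude first:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Your Business Name</name>
    <address>123 Main St, Your City, ST 00000</address>
    <Point><coordinates>-73.9857,40.7484,0</coordinates></Point>
  </Placemark>
</kml>
```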
|Rel=Canonical Tag|
The rel=canonical tag is used to tell the search engines what you want to be the “real” URL for a page, keeping them from indexing multiple versions of the same page. For example, if a blog post is in multiple categories, it may be accessible at multiple URLs. Using a single canonical tag keeps just one version in the SERPs.
Read more from Google.
Too bad – You’re not using the rel=canonical tag. However, it may or may not be a big deal in your case. It all depends on how many potential versions of your page might be on your site, so it’s hard to say in this analysis.
Are you an ecommerce site? Do you have a blog? How about a forum? Running a content management system? If so, then you should likely investigate further, and implement this tag. Get with your web developer.
On the other hand, if you’re a small static site, manually edited the old-fashioned way in an .html editor like Dreamweaver, CoffeeCup, FrontPage, etc., then you’re probably fine.
Duplicate content, even from your own site is a killer, so any and all versions of a page should have the exact same rel=”canonical” link. This tells the world that it’s all the same page.
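The tag itself is one line in the <head> of every version of the page (the URL below is a hypothetical post on this site):

```html
<link rel="canonical" href="http://www.investorsipo.com/blog/example-post/">
```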
|WWW Redirect|
The search engines DO see subdomains (including www.) as entirely separate domains, and it’s up to you to ensure that all incoming requests get 301 redirected to the www version to prevent possible duplicate content issues.
Furthermore, the search engines see varying versions of pages with different URL’s as duplicates too.
Keep in mind that this tool ONLY checks a single URL for the www issue, so if your home page loads both with and without /index.html then you still may have a problem.
Good Job –
Your URL resolves to a single version via a 301 redirect, whether entered with or without the www.
Whoever set up your site paid attention to details; however, you STILL may have a problem that this tool cannot detect.
Why? Because, although your domain has a 301 redirect in place for the www version of a page, we’re unable to tell if you might have other URL’s that show the exact same content. This usually takes place on the home page, but has been seen elsewhere too.
The worst offenders, in my experience, are sites hosted on a Windows server, where adding /Default.aspx (or similar) after the domain.com brings up multiple versions of the home page.
This one is definitely worth learning more about, and a single fix of your home page can make a world of difference! Go watch this 2009 Matt Cutts video on canonicalization to see the changes that came along back in 2009.
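On an Apache server, a common .htaccess sketch for the non-www to www redirect (assuming mod_rewrite is enabled) is:

```apache
# Send any request for investorsipo.com to the www version with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^investorsipo\.com$ [NC]
RewriteRule ^(.*)$ http://www.investorsipo.com/$1 [R=301,L]
```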
|Nested Tables|
Pages that contain tables inside other tables make the page load slower, since the web browser is forced to find the end of the table before it can display the whole page.
Good Job –
Your page does not use nested tables, which speeds up your load times and optimizes the user experience. That happy visitor might even be a search engine!
|Inline Styles|
Inline styles are CSS rules applied directly to one single element.
Using an external CSS style sheet will lead to overall smaller code, which means faster page loading, happier visitors, and likely, higher search rankings.
You do appear to be using inline styles. Is this a WordPress site? If so, it’s likely just the sidebars, and not an issue.
There appears to be a bug here, and it’s incorrectly reporting the existence of tables. Sorry – we’re on it.
Having too much code in a page makes it load more slowly. Instead of placing code in-line, move it to an external file that you can then call as an include to speed it up.
It might be something minor, like a category drop down menu, or it could be a larger issue.
Your developer should also examine how many scripts there are and how large they are, to determine whether there might be any value in remedying them.
It may even just be something small like Google Analytics, which you wouldn’t even worry about unless you’re an obsessive perfectionist. I only mention it here because it’s worth examining.
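To make the inline-versus-external distinction concrete (the class and file names are made up):

```html
<!-- Inline: the style is repeated on every element, bloating the page -->
<p style="color:#333;font-size:14px;">Some text…</p>

<!-- External: one cached stylesheet, smaller HTML on every page -->
<link rel="stylesheet" href="styles.css">
<p class="body-text">Some text…</p>
```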
|Favicon|
Favicons are the branded logos/icons that appear in the address bar next to your URL.
This helps brand your company, and it makes it easier for users to return to your site from among a list of bookmarks, or from the browser tab. Using one is a good idea.
Good Job –
Your site appears to be calling for a favicon, but have you added code in your header to tell the world that you’ve got one?
Even more important though, can you actually see your favicon showing here? If not, then something’s wrong, and your file is probably missing.
To show a favicon, some sites just drop in an .ico file and let web browsers find it on their own. Instead, there’s code that can, and should, be put in your header.
The correct code for a favicon isn’t in your header. Not only should you have a favicon, but you should tell ALL spiders and browsers that it’s there. Go read this for more info.
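The header line in question is typically the one below (the /favicon.ico path assumes the file sits in your root):

```html
<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon">
```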
|ALT Tags for Images|
The ALT tag was originally designed to help sight-impaired visitors “see” what an image was.
With the addition of Google’s Universal search, ALT tags for key images will help your site appear more often in the search results.
Good job –
At least some of your photos have ALT tags. Now, do they accurately describe each photo, or are they spammy sounding?
For more information about this, give a quick read to my article on ALT tag Tips and you’ll be an expert in no time.
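A quick sketch of a descriptive, non-spammy ALT tag (the filename and wording are invented), with the height and width attributes from earlier included:

```html
<!-- Describe what the image actually shows, not a keyword list -->
<img src="team-meeting.jpg" alt="Investors reviewing a startup pitch deck"
     width="400" height="300">
```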
|Anchor Text|
These are the words that you’re using on this page to link to other pages.
Even though Google changed the rules a bit in early 2012, for ranking purposes, they should still be directly relevant to your landing page.
Wherever possible, link with relevant phrases throughout your website, but don’t make it sound ridiculous and unnatural.
Even after Google’s Penguin update in April of 2012, nothing is more important than the anchor text for telling the search engines what a webpage is about.
Just like an inbound link from an external domain, the links within your own domain also add relevance to each target landing page, so try to link with relevant text when possible, as long as it sounds natural.
When it’s not possible to link with good text, or it sounds too artificial, use relevant surrounding text, both in front and behind your “click here” link. This too, does help establish relevance in Google’s eyes.
Another thing to note is that you only get “credit” for the FIRST link on a page to any given URL, so there’s no point in trying to game the system with extra anchor text links pointing to the same URL.
|Internal Links|
These are links that point to other pages within your own domain.
Whether internal or external, it’s typically a good idea to keep the numbers down; I usually prefer to keep them under 30.
Internal Links: 77
Besides diluting your internal Page Rank, don’t forget that this many choices can be overwhelming for visitors. Why not cut them back a bit?
|External Links|
External links are those that point to other domains.
Referencing other websites is great, and can help establish your own authority, but having too many links of any type can mean bad news, because you’ll leak away too much of your link juice (i.e. PageRank).
Unless this is a resource page, having any more than 20 external links is just ridiculous, and you do run the risk of being labeled as a “link farm” too.
Is there a logical reason to list so many links here other than your own? If so, then okay – great – I mean, what can you expect from an automated tool, anyway?
External Links: 0
Authoritative webpages which rank well often do link out to other sites, so there’s no reason not to consider it, if appropriate.
However, it is CRITICAL that you look at these links and make sure you recognize them all, and verify that they’re good links you want to offer your visitors.
Websites often get subtly hacked, and you’ll unknowingly have links to sites siphoning off your link juice.
|Total Page Size|
Google has a great online tool showing developers how to speed up websites, called Page Speed Online and you can expect things like page speed to matter more and more in the coming years.
Maintaining a total page size of under 400k is a good idea for speed of loading, and the individual object sizes can be seen below.
1.63 MB – Your page is (in my opinion) TOO LARGE.
You should use lower resolution photos, or consider various methods of compression, or just cut out some of the unneeded fat.
If your page is over 1mb, you should make it a higher priority, and this tool from Google can help.
|HTML Size (uncompressed)|
This is the size of all the code on your web page EXCEPT your images and scripts.
70.97 KB – Even without your images and scripts, your uncompressed web page is leaning large at over 50k by itself. Faster load times are better for your user experience; can’t this get cleaned up? This issue goes hand in hand with HTML compression too.
Secondary steps to reduce HTML size include reducing HTML white space and shortening HTML comments too.
|HTML Compression Status|
Compressed pages load faster, and your users (and the search engines) will thank you for compressing them before delivery.
If you’re managing your own Apache server, have the admin install the Apache compression module, mod_gzip (or, on Apache 2.x, mod_deflate).
You’re not serving compressed web pages, and below this stat should be another one, showing exactly what percentage you COULD be compressing.
Since April 2010, Google has been counting page load time as a ranking factor, and your pages would load faster if they were compressed.
Here’s a quick video demo of the performance test at Google.
If those things do apply to you, then yes, in my experience, it can actually give you a ranking boost. Google even says so here at Page Speed Online .
Some servers will compress the files before serving them to your visitors, making them load faster. This certainly isn’t critical, but the more competitive your environment, the more these little things can add up, so you may want to compress.
13% – is the percentage that your page COULD potentially be reduced if you were compressing your HTML.
Please note that this is not a separate issue, and is directly related to the (probably minor) issue of Gzip compression.
If the percentage shown is significant, then you could further reduce your page load time, and may improve your rankings.
This could simply be like buying light aluminum rims for your car to go faster. Does it really matter in your case?
You need to decide for yourself, but if your programmer has nothing better to do than play video games, they may as well attend to this, in my opinion.
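If you and your programmer do take it on, on Apache 2.x the whole thing can be a one-line .htaccess sketch (assuming mod_deflate is loaded):

```apache
# Gzip-compress text-based responses before serving them
AddOutputFilterByType DEFLATE text/html text/css application/javascript
```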
|HTML Size (if compressed)|
Compressing just your .html files alone likely isn’t necessary, but it might be something to consider if your pages are way too large.
9.47 KB – Since I’m nitpicking here, I suppose you could be compressing your HTML size which would further reduce page loading time. If you’re struggling to reduce overall file size, I think you should consider using this Google Insights tool.
|Page Caching|
This shows whether or not your site is serving cached pages. Caching saves bandwidth and speeds up load times by serving static versions of your dynamic pages.
It doesn’t appear that you’re caching pages, but just like compression, it may not be necessary in your case.
When a site is driven by a database, a cached page can be served to your visitors as static html, and will eliminate time spent querying the database.
The down side of caching is that frequent visitors may not see your newest content, so you should use caching cautiously.
By itself, this is not a huge issue, and unless you have thousands of daily visitors it’s likely not worth figuring out.
On the other hand, combined with other load time issues, it can affect your user experience and even your Google AdWords Quality Score.
Every little bit helps though, and if you’re in a competitive environment, shaving hundredths of seconds off your load time can only help.
Total Page Size: 1.63 MB