Attorneys often get a bad reputation as ambulance chasers or for using their knowledge of the law to exploit people and businesses. Unfortunately, a few bad apples spoil the whole bunch, and nowhere is that narrative personified better than in the character of Saul Goodman on the hit series Breaking Bad. If you haven’t had time to catch the show, Goodman (played by Bob Odenkirk) is a self-centered and unethical attorney who does anything he can to get his low-life clients out of trouble. He even has a rather hilarious website, which we will put through an on-page analysis in this post.
Access
Robots
Let’s first take a look at the robots file for the site. If you are ever having trouble with pages being indexed in search, or with pages just not showing up, this is the first thing you should look at. The robots file is typically located at the root of the domain, i.e. www.example.com/robots.txt.
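A robots file that blocks a single directory for every crawler looks something like the sketch below (the directory name comes from the behavior described next, not from a verbatim copy of his file):

User-agent: *
Disallow: /services/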
As configured for Saul’s site, all robots are disallowed from crawling the services directory. The rest of the site, however, is fair game. In general, if there are no directories you want to keep search engines out of, you can allow them to crawl everything. Note that bots are under no obligation to obey the robots file, but the crawlers from the major search companies do.
You should also look for robots meta tags in the header of web pages, which tell bots whether the page may be indexed and whether its links should be followed. There don’t appear to be any of those meta tags on Saul’s site. Below is an example of what the tag would look like if it were present.
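<meta name="robots" content="noindex, nofollow">

This particular combination tells compliant bots not to index the page and not to follow its links. Since indexing and following are the default behaviors, the tag is only needed when you want to restrict something.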
Sitemap
Sitemaps give search engines a road map of how a site is laid out. Submitting a sitemap to search engines helps them identify pages more quickly and easily. An XML sitemap should be well formed and should contain all the URLs (pages) of your website.
Make sure there are no pages floating outside of the site architecture and that you submit a sitemap to a webmaster tools account you have configured for your domain.
You can learn best practices and see what your sitemap should look like at sitemaps.org. At first glance, it looks like Saul’s webmaster forgot to put a sitemap on the domain.
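For reference, a minimal well-formed sitemap following the sitemaps.org protocol looks something like this (the domain and date are placeholders, and the lastmod, changefreq and priority fields are optional):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
  </url>
</urlset>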
Flash and JavaScript
If you didn’t already know, search engines have a hard time seeing content rendered by Flash files and JavaScript. My approach to these is that a little goes a long way: if you absolutely have to use Flash or JavaScript to render content that could otherwise be text, use it sparingly.
There are a lot of free and paid tools online that let you see a page exactly as a search engine spider sees it when it is crawled. For example, you can check out seo-browser.com to see what content is actually showing up and being indexed. To give you some perspective, here is what Saul’s website looks like with JavaScript, Flash and images disabled in the browser.
Saul’s site is not unique in this regard. New design trends have developers making very beautiful websites that are rendered using code, images and Flash. Unfortunately for search marketing purposes, this isn’t ideal. If you see that there is content you want indexed but it isn’t being picked up because it’s rendered by Flash or JavaScript, you might want to change it to plain text, as in the sketch below.
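To illustrate with a simplified, hypothetical snippet (not taken from Saul’s actual markup): text injected by a script is invisible to a crawler that doesn’t execute JavaScript, while the plain HTML version is not.

<!-- Injected with JavaScript: a non-executing crawler sees an empty div -->
<div id="tagline"></div>
<script>
  document.getElementById('tagline').innerHTML = 'Better Call Saul!';
</script>

<!-- The same content as plain HTML text: visible to any crawler -->
<h2>Better Call Saul!</h2>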
In Saul’s case, there is a lot of content being overlooked by search engines. All the video, the Flash banners at the top of the screen, the images and other elements are completely invisible to search. To his credit, he does have alt and title attributes filled out.
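For reference, filled-out alt and title attributes on an image look something like this (the file name and wording here are hypothetical, not copied from his source):

<img src="saul-goodman.jpg" alt="Saul Goodman, Albuquerque attorney" title="Better Call Saul" />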
Here is the site in all its glory.
Performance
Have you ever timed yourself sitting at a light when the person in front of you doesn’t go when the light turns green? Even just a few seconds can seem like an eternity. In fact, if a person doesn’t notice the light in just a couple seconds, we are already calling them a moron.
The same is true for websites. When the pages we want to access don’t load fast, we hit the back button and go elsewhere. Yes, visitors might return later, but you miss the chance to make a good first impression if your pages load slowly.
You can use free tools like Google’s PageSpeed Insights to find out how fast your pages are and to get tips on how to improve them.
Saul’s site did OK, but there is definitely room for improvement. Note the very first thing Google suggests we fix: it’s the JavaScript rendering that builds most of Saul’s pages. If he could find a way to deliver more of his content as plain text, with the critical CSS inlined, the pages would load a lot faster.
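As a rough sketch of the kind of fix this implies (the file name is hypothetical, and this is one common approach rather than Google’s exact recommendation for this page), render-blocking scripts can be deferred and the small amount of CSS needed for the visible page can be inlined:

<!-- Inline the critical CSS so the page can render without waiting on a stylesheet -->
<style>
  body { font-family: Georgia, serif; margin: 0; }
</style>

<!-- Defer non-essential JavaScript so it doesn't block rendering -->
<script src="site-animations.js" defer></script>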
Indexed Pages
Even if your robots file is in order, you should check how many pages of the site are indexed in search. If there are none and your robots file is fine, that could be an indication of a deeper or more serious problem.
You can search for indexed pages by using the site: operator in the Google search engine. Doing this, we can see that Saul has nine indexed pages (which is pretty much all the site has), so it looks good.
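For example, a query along the lines of the one below (with a placeholder domain standing in for Saul’s actual URL) lists every page Google has indexed for the domain:

site:www.example.com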
You want to see whether the number of indexed pages matches the number you would expect to be there. You also want the content of those pages to match what is currently on your website (i.e. the pages are not outdated in Google’s index).
Meta Description
While search engines don’t have to use the meta description you provide in their SERPs, they often do. You should make sure that you have a relevant meta description for each page of your site. Put your target keywords for the page in there as well if applicable.
Saul has a little issue with his meta descriptions: they are all the same. Each meta description should be unique to the page it is associated with.
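For example, a page-specific description tag in the head might look like this (the wording is hypothetical, not pulled from his markup):

<meta name="description" content="Contact the law offices of Saul Goodman & Associates for a free consultation." />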
Title Tags
There is really no magic to title tags on your website. If you aren’t using keywords in them, make them descriptive of the page. For instance, if the page is a privacy policy, you would write Privacy Policy | Brand Name.
If you are using keywords, you should front-load your title tags with your primary target keyword phrase. This helps users see it first in search. Saul has his title tags squared away with good, descriptive phrases. It doesn’t appear that he is trying to rank for any keyword terms.
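In markup, those two approaches look something like this (the second title is a hypothetical keyword-targeted example, not one of Saul’s actual titles):

<!-- Descriptive, no keyword targeting -->
<title>Privacy Policy | Brand Name</title>

<!-- Primary keyword phrase front-loaded -->
<title>Albuquerque Criminal Defense Attorney | Saul Goodman & Associates</title>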
Duplicate Content
Duplicate content is always something you should be thinking about, but maybe not for the reason you think. There are a lot of misconceptions about duplicate content, chief among them that there is some sort of special penalty reserved for those foolish enough to have two pages with the same stuff on them.
In reality, the negative impact is indirect: a search engine merely doesn’t know which version is the correct one, and in those cases it may leave a page you want indexed out of its index.
Canonicalization helps solve that issue. If you have a custom-built solution, you can canonicalize your pages with the canonical tag. If you are using a CMS, you should look for a plugin to do the job for you.
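The tag itself goes in the head of each duplicate version and points at the URL you want indexed, along these lines (placeholder URL):

<link rel="canonical" href="http://www.example.com/preferred-page/" />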
Saul is not canonicalizing his URLs. Granted, he only has a nine-page site, but what if a hardened murderer is looking for a page that exists but has not been indexed? How is he supposed to get the information he needs?
Content
This is a tricky one because having thin content doesn’t automatically mean your site isn’t going to do well. In general, you want a good amount of content on your pages, but if you have a popular site without much content (i.e. a lot of other sites link to it), thin content may not be as big an issue for you.
As evidenced by our crawl of Saul’s site, there is virtually no text-based content on his pages. Most of it is images, video and Flash. Despite that, the site has a PageRank of 6 and ranks very well for terms like “Saul Goodman”, “attorney from Breaking Bad”, and other similar phrases.
On-site optimization for lawyers can get a lot more in-depth than the steps you have seen here. This list is by no means complete, and an analysis may also be tailored to your specific goals for optimizing your site. As for Saul, I’m sure he has the connections to get his site up to par. Then again, I doubt he’ll need it with his referral network.