SEO, GoDaddy and me

10 Aug 2011

The bliss website has always been a static one. That is, the bliss website is simply a bunch of HTML files on the Web host. The host was, until the events described in this post, GoDaddy, because they were reasonably cheap and I had my domains registered through them. The pages are not written by hand; the site is generated by Jekyll. Any dynamic content on the website is provided by JavaScript. For instance, the correct download button is displayed with a bit of jQuery magic.

Search engine optimisation (SEO) is a cost-effective way of driving traffic to the website, resulting in more downloads and thus (hopefully) more sales. So long as you aim to rank highly for targeted keywords, you should get targeted prospect customers. This tends to mean a shorter pipeline from visit to sale because your solution is easier to sell and the prospect has a direct need for the solution. Furthermore, you don't have to pay Google when people search and click on one of your pages in the search results. That sounds like free advertising; of course there's nothing free about it, because search engines require content, and content creation costs either time or money or both.

So, for Micro-ISVs, SEO is both cost-effective and scalable.

The position that your pages take in the search engine rankings is determined by a huge number of factors. Some are much more important than others. Since I launched the bliss website in 2009 I have been working my way through these factors, for instance building links where possible and optimising my HTML. This has been slowly successful (indeed, if there's one thing I've learnt, it's that sustainable SEO requires calendar time and plenty of it).

Google Webmaster Tools and the curious case of the HTTP redirect

Back in April though, I began to notice a number of error messages being recorded by Google Webmaster Tools.

HTTP redirects for many files

These redirect errors in the Not followed section confused me at first. These were just static HTML files on a static Web host. Why on Earth would redirects be occurring?

Then, it started happening for my sitemap.xml:

HTTP redirects for my sitemap.xml

I realised that these redirects might affect how Google was crawling my site, and thus might be limiting the rankings I was able to achieve. An HTTP redirect is a way for a web server to tell your browser to request a different page. The browser does not have to comply, but most do. More importantly for SEO purposes, the Googlebot can be considered another type of browser, one that happens to be automated and stores the pages it browses in Google's search indexes. So, just because Chrome, Firefox and the rest follow the redirect, it doesn't mean Google will, and in the worst case Google may begin to lose trust in the pages that are being redirected.
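To make that concrete, a redirect announces itself in the first line of the response headers. Here's a minimal sketch that classifies a captured status line; the header text is illustrative, not a real response from the bliss site:

```shell
# Classify a captured HTTP status line. The headers below are
# illustrative, not a real response from the bliss site.
headers='HTTP/1.1 302 Moved Temporarily
Location: /some-page?acd5d7e3'

# Pull the numeric status code out of the first line.
status=$(printf '%s\n' "$headers" | head -n 1 | awk '{print $2}')

if [ "$status" -ge 300 ] && [ "$status" -lt 400 ]; then
  echo "redirect (status $status)"
else
  echo "no redirect (status $status)"
fi
```

Any 3xx code is a redirect of some kind; it's the 302s in particular that caused the trouble described below.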

So I resolved to fix the problem.

Why the redirect?

So: a static HTML website, a simple, static Web host (GoDaddy), straight, direct requests for HTML files... and still Google was getting HTTP redirects. I tried it myself, and it seemed fine! So I did what anyone would do. I asked the Internets.

Pretty quickly, responses came back saying that requests for my pages were redirecting. So I went back to the trusty command line to double-check.

Using wget I simulated requests for my pages. The first time, it worked fine. Then I tried again, for exactly the same file. This time I was sent two redirects: first to a page with exactly the same prefix but what looked like an auto-generated suffix, then, once wget followed that redirect, back to the original page it requested. Madness! Here's how it was reported:


HTTP/1.1 302 Moved Temporarily
Location: sitemap.xml?acd5d7e3

HTTP/1.1 302 Moved Temporarily
Location: sitemap.xml
HTTP/1.1 200 OK

So, the conclusion was: there was a problem, and it was intermittent.
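The repeated checks boiled down to something like the loop below. fetch_status is a stand-in for the real wget/curl request against the live site, stubbed here so the sketch runs self-contained; the URL is a placeholder too:

```shell
# fetch_status stands in for a real request, e.g.:
#   curl -s -o /dev/null -w '%{http_code}' "$1"
# It is stubbed to always return 302 so this sketch runs offline.
fetch_status() {
  echo 302
}

redirects=0
for i in 1 2 3 4 5; do
  code=$(fetch_status "http://example.com/sitemap.xml")
  # Count any 3xx response as a redirect.
  if [ "$code" -ge 300 ] && [ "$code" -lt 400 ]; then
    redirects=$((redirects + 1))
  fi
done
echo "$redirects of 5 requests redirected"
```

Against the live site the count varied from run to run, which is exactly what made the problem so hard to pin down.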

Frankly, we'd rather be shooting elephants

(If you don't know what that heading refers to, see this)

Looking around the Web I found more examples of the same thing. It appeared that the 302 redirects were imposed by GoDaddy under certain circumstances. Here's their reply to another commenter's request for help:

“The 302 redirects are filters setup to maintain the integrity of the hosting server while we investigate and resolve an issue. These filters are temporary and will be removed as soon as possible. We apologize for any inconvenience this may have caused and appreciate your patience and understanding in this matter.”

(Incidentally, "Simon", for it was you who first posted that quote, if you want a link back to your website just ask).

I resolved that I had little recourse other than to ask GoDaddy not to let it happen again. I assembled all of the tests I'd run so far and wrote a support ticket to GoDaddy, explaining that the issue seemed sporadic, so they might need not only to run the same tests but also to delve into past support tickets. If they refused to fix the redirects, I decided, I'd move Web host. If they agreed, I'd have to keep an eye on it forever more.

I should've predicted the actual reply: "works for me". So, despite a few months remaining on my hosting plan, that day I decided to move host.

Into the clouds

I had a look around on Hacker News and the Business of Software forums for recommendations. There weren't many hosts that didn't have a mix of good and bad reviews. Unsurprisingly the better hosts seemed to be more expensive, certainly more expensive than my existing plan with GoDaddy. And some of them were full VPSs which would mean administering my own server.

All I wanted to do was host HTML. Why should I have to know how to write an httpd.conf file for that? My thoughts began to turn to the cloud. Google App Engine is a form of Platform as a Service, where Google has responsibility for the entire technology stack below the application. Couldn't I have something similar for my HTML files? Then I realised the answer: Amazon S3 supports Web hosting.

Other than potential SEO benefits and lower maintenance costs, Amazon S3 is cheeeeeeeeap. In fact, after moving, my hosting bill went from the already-low £3.50 a month to £2 a month. Not enormous savings in absolute terms, but it's the sort of thing that puts a smile on the face of a Micro-ISV owner! Further, I was hoping to benefit from automatic scaling in the event of a traffic spike. Finally, it would be easy to enable CloudFront to improve the speed at which files are downloaded from the site (I haven't done this yet... it may be useful for larger downloads like the bliss installation file).

So I set up an Amazon S3 account, and began the process of moving my hosting. I'd need a way of uploading HTML to Amazon, both for the original upload and on an ongoing basis, and a way of redirecting the domain to Amazon S3... then I'd be pretty much done.

Uploading to S3

Performing an initial upload of HTML files to S3 is trivial. There are apps like CloudBerry that do the job easily.

What I needed was a way to automate the process. The bliss website is not stored as HTML files at source. Instead, content is written individually, separated from templates, and Jekyll is run to combine the content with the templates to produce the HTML. This massively reduces redundant elements of a page, and the effort needed to do something as simple as updating the navigation bar along the top of every page: I need only edit the template.
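As a sketch of the idea (the file name is Jekyll's default layout location; the markup itself is illustrative), a shared template might look like:

```html
<!-- _layouts/default.html: the one place the navigation lives -->
<html>
  <body>
    <nav>...site navigation, shared by every page...</nav>
    {{ content }}  <!-- Jekyll injects each page's content here -->
  </body>
</html>
```

Every page that declares this layout picks up the navigation automatically, so a change to the template propagates across the whole site on the next build.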

So, uploading changes to the website is a matter of running the Jekyll engine, then uploading the files. I already used Ant to run Jekyll, so I did the same to upload the files. I installed s3cmd, a command-line tool for uploading content to S3 (frankly, I'd much prefer Amazon to support rsync or even just scp, but it's easily surmountable by using s3cmd). Then, I wrote the upload script:

<target name="upload" depends="generate">
	<apply executable="s3cmd" relative="true" dir="${site}" dest="${site}">
		<arg value="--acl-public"/>
		<arg value="put"/>
		<srcfile/>
		<targetfile/>
		<fileset dir="${site}"/>
		<globmapper from="*" to="s3://*"/>
	</apply>
</target>


This Ant target runs s3cmd, uploading all modified files in the Jekyll-generated site (${site}). The globmapper is responsible for turning each generated file name into a full target URL on S3; for example, it changes sitemap.xml into an s3:// URL within the site's bucket.

The result is that a command line like the following gets built for each file (the example is for sitemap.xml):

s3cmd --acl-public put sitemap.xml s3://

When run, this uploads all of the files.
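The mapping the globmapper performs can be sketched in shell; the bucket name here is a placeholder, not the real one:

```shell
# Build the per-file s3cmd command line, as the Ant globmapper does.
# "my-bucket" is a placeholder bucket name.
BUCKET="my-bucket"
for f in sitemap.xml index.html; do
  echo "s3cmd --acl-public put $f s3://$BUCKET/$f"
done
```

Each relative path in the generated site becomes both the local source argument and the tail of the s3:// target URL.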

DNS changes - 'A' records and more

Changing the DNS entry for the www subdomain was easy. It's a CNAME record, so I simply repointed it from the old GoDaddy location to the Amazon S3 one. This new, canonical location is provided by the Amazon S3 console and is a concatenation of the S3 bucket name in which your website is stored with the physical location of the Amazon S3 data center.

The root domain, which I wanted to redirect to the www subdomain, was trickier. Root domains are A records in DNS and cannot point to another symbolic name the way a CNAME record can. An A record has to be an IP address, but because Amazon does not give out IP addresses, and they could change at any time, I had none to provide.
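In zone-file terms, the constraint looks roughly like this (example.com, the IP address and the endpoint are placeholders; S3 website endpoints follow the bucket.s3-website-region.amazonaws.com pattern):

```
www.example.com.  IN  CNAME  example.com.s3-website-us-east-1.amazonaws.com.
example.com.      IN  A      192.0.2.1   ; must be a literal IP, which Amazon won't give you
```

The www record can follow wherever Amazon's endpoint resolves to; the root record is stuck needing a fixed IP.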

My solution was to find a Web host that allows 301 redirects at a one-off, pre-pay cost. 301 redirects are not as questionable from an SEO perspective as 302 redirects and, besides, most direct links to the website used the www. prefix.

I found this really neat host called NearlyFreeSpeech.NET. I set up an account for the root domain, and configuring the redirect was simple through their web front-end.

To configure the redirect, first I added two aliases for the site: the root domain and its www form. Then I set the Canonical Name to the www form, and the Canonical Type to Hard. This forces the redirect whenever anyone requests the root domain.
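NearlyFreeSpeech handles all of this through its front-end, but in Apache terms a "hard" canonical redirect amounts to roughly the following (example.com is a placeholder for the real domain):

```
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Any request to the bare domain gets a permanent (301) redirect to the same path under the www form.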

Thus far it costs me about 40p a month to maintain this redirect. I'm very happy with NearlyFreeSpeech.NET, their pricing is fair and their configuration straightforward.

And the results are...

... good!

The Google Webmaster Tools site no longer gives me any redirect errors.

The website has got faster (the change was made around the 12th April):

Response times after the change

And most importantly, organic traffic has continued to grow:

Organic traffic after the change

Of course, this continued improvement in SEO traffic is not necessarily down to the move from GoDaddy. I've also been building links, A/B testing the site and adding content. However, with Google confirming that page speed is now a factor in ranking pages, I figure every little bit helps.

I was pretty dissatisfied with GoDaddy's response to the 302 redirects problem, but other than that I don't have an enormous beef with them. They were cheap and generally got the job done. However, I'd advise any SEO-sensitive business to keep an eye out for 302 redirects.

Thanks to Igor ™ for the image above.