Site Down: The Explanation

So. The site went down over the weekend and through most of Monday, with only intermittent periods up. For that we apologise to you, our customers.

It took us a while to figure out what was happening – at first it seemed the site just wasn’t loading properly, so we had the hosting company increase the bandwidth and memory allocated to the site.

Later, we hit a second error – too many open connections at the same time. At that point we upped the number of allowable connections – though by then we were beginning to truly wonder, since our visitor logs showed no change in traffic.
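For the curious: on a typical database-backed site, that “too many connections” error is the database hitting its connection cap. As a sketch only – we haven’t said what our stack is, so assume a MySQL-backed site – the fix looks something like this in the server config:

```
# my.cnf — hypothetical example, assuming a MySQL-backed site.
# Raises the cap that triggers "Too many connections" errors.
[mysqld]
max_connections = 500
```

Of course, raising the cap only treats the symptom – which is exactly what we found out.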

Then the hosting company decided to shut us down, since we were hogging bandwidth and resources. At that point a huge hunt began through the log files (individual log files for each visitor) to figure out the cause.
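If you ever need to do the same hunt, the quickest first pass is to tally requests per client IP and see who dominates. A minimal sketch, using a made-up sample log in place of our real per-visitor logs (the file path and IPs are placeholders):

```shell
# Build a small sample access log (stands in for the real thing)
cat > /tmp/access.log <<'EOF'
203.0.113.7 - - [01/May/2006:10:00:00] "GET / HTTP/1.1" 200
203.0.113.7 - - [01/May/2006:10:00:01] "GET /page HTTP/1.1" 200
198.51.100.4 - - [01/May/2006:10:00:02] "GET / HTTP/1.1" 200
EOF

# Count requests per client IP, busiest first:
# awk pulls the IP (first field), then sort | uniq -c tallies them
awk '{print $1}' /tmp/access.log | sort | uniq -c | sort -rn
```

A single IP (or a tight range of them) sitting far above everything else at the top of that list is the smoking gun.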

It was the self-styled ‘next Google’. No, they’re not getting a link. More like a kick in the ass. They were trying to index our site, which is cool and all. But they were ignoring all normal robots.txt instructions (robots.txt being the file that tells robots what they may and may not crawl) and basically mounting a denial-of-service (DoS) attack on us.
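For readers who haven’t met it: a well-behaved crawler fetches /robots.txt before anything else and obeys what it finds there. An illustrative file (these rules are an example, not our actual ones) looks like:

```
# robots.txt — illustrative example, not our actual file
User-agent: *       # applies to every robot
Disallow: /admin/   # don't crawl this path
Crawl-delay: 10     # nonstandard, but many crawlers honour it:
                    # wait 10 seconds between requests
```

A crawler that ignores this file entirely – as this one did – will happily hammer every URL on the site as fast as it can.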

Currently their entire IP range is banned from our site.
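Banning a whole range is straightforward at the web-server level. As a sketch only – assuming an Apache host, which is an assumption, and with a documentation-placeholder range rather than the crawler’s real one – it can be as simple as:

```
# .htaccess — hypothetical sketch, assuming Apache.
# 203.0.113.0/24 is a placeholder range, not the actual crawler's.
Order Allow,Deny
Allow from all
Deny from 203.0.113.0/24
```

Every request from that range then gets a 403 instead of chewing through bandwidth.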

So. On our most profitable days (i.e. the beginning of the month), our site went down due to the inability of the ‘next Google’ to actually manage their robots.