10-Day results on new product exploration

by alex 17. April 2013 17:14

About 10 days ago, I wrote about my plans to see whether the traffic to one of my old websites, http://www.nodicerequired.com/, came from actually interested people or confused robots. At the time of that post, I had 2 signups and 89 unique visitors. Ten days later, I have 2 signups and 95 unique visitors.

Le sigh. I guess you can't just put stuff online and expect people to flock to it, all on their own.

Metrics indicate that my most frequent visitors are bots

The spinning coins on the website are interactive - the user can click on them. Clicking on a coin will play a coin-pickup sound, and increase the coin counter at the top of the screen. The number of coins the user has is stored in a client-side cookie, so I can only grab the value of this cookie whenever the visitor interacts with my web server - by signing up or going to another page, for example.
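A minimal sketch of how a click handler like this might look. The cookie name, element id, and sound-file path are all assumptions for illustration, not the site's actual code:

```typescript
// Pure helper: pull the coin count out of a cookie string.
// The cookie name "coins" is an assumption for illustration.
function parseCoinCookie(cookie: string): number {
  const match = cookie.match(/(?:^|;\s*)coins=(\d+)/);
  return match ? parseInt(match[1], 10) : 0;
}

// Click handler: bump the count, persist it client-side, play the pickup
// sound, and update the on-screen counter. The server only sees the cookie
// on the visitor's next request (page load, navigation, or signup).
function onCoinClick(): void {
  const coins = parseCoinCookie(document.cookie) + 1;
  document.cookie = `coins=${coins}; path=/; max-age=${60 * 60 * 24 * 365}`;
  new Audio("/sounds/coin-pickup.mp3").play(); // assumed asset path
  const counter = document.getElementById("coin-counter"); // assumed element id
  if (counter) counter.textContent = String(coins);
}
```

The important design consequence is in the comments: because the count lives only in a client-side cookie, the server can't see clicks in real time, only the accumulated value whenever the browser makes its next request.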

There were only four visitors who 1) understood that the coins could be clicked AND 2) then reloaded the page, went to another page, or signed up for the beta.

This implies a few things:

  • The bounce rate on this page is high
  • Bots are a major source of traffic
  • Converting visitors (those who sign up) don't realize that the coins are interactive
  • Non-converting visitors click the coins but then don't interact with the site any further

Actually, all of these could be true.

My bounce rate IS high, because there aren't any other calls to action or content for the visitor to navigate to; the visitor is not pulled into the site.

Bots give me a lot of traffic - there are a large number of hits to pages that were part of the old site but not the new one. When a visitor requests a page from the old site, I show them a "hey, sorry, the site's gone, but here's some information, click here for more, click here to sign up" page, and I'm not seeing any follow-up traffic from these pages. There are also a few query strings that obviously belong to neither the new nor the old site; this indicates bots trying to sniff out my server.
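A sketch of the kind of triage I'm describing, with made-up path lists (the real old and new site maps are larger and different):

```typescript
// Toy classification of incoming request paths. The path lists are
// invented for illustration; they are not the site's actual routes.
const NEW_SITE_PATHS = new Set(["/", "/signup", "/signup/complete"]);
const OLD_SITE_PATHS = new Set(["/forums", "/games", "/profile"]);

type Verdict = "new-site" | "old-site-redirect" | "likely-bot";

function classifyRequest(url: string): Verdict {
  const path = url.split("?")[0];
  if (NEW_SITE_PATHS.has(path)) return "new-site";
  // Old-site URLs get the "sorry, the site's gone" page.
  if (OLD_SITE_PATHS.has(path)) return "old-site-redirect";
  // Everything else (e.g. /wp-admin, /phpmyadmin, odd query strings)
  // is probably a bot sniffing the server for vulnerabilities.
  return "likely-bot";
}
```

Running a filter like this over the raw request log is what separates "hits" from "visitors": requests for paths that neither site ever served are almost certainly not people.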

No converting visitor clicked on a coin before signing up. This implies that it might not be obvious that the coins can be clicked, so perhaps non-converting visitors are also unaware of it.

However, the coins themselves mean nothing. They were an experiment to see whether making the site more interactive would increase visitors' interest in the product. Kind of like an A/B test from the start.

I'm going to keep it up and running, but for now, not much is coming from (or going into) No Dice Required.

Early results on exploratory signup website No Dice Required

by alex 8. April 2013 22:31

It's day 67 of the experiment. I mentioned last week that one of my old projects, No Dice Required, recently had an unexpected surge in traffic and signups. I spent a few days last week redesigning the website in ASP.NET MVC 4 and writing new copy and user tracking code.

The new website is a coming-soon page with a beta signup. The entire site consists of maybe a dozen pages: home page (with sign up control), sign up page, sign up complete page, HTTP error pages, and a this-site-has-been-redesigned page for the pages of the old site. Additionally, there are many dynamically generated landing pages.

Home page

The old home page was poor for a few reasons. There were fewer than 20 words visible before login and there wasn't any metadata in the headers: together, these made it hard for search engines to learn anything about the site.

The old page also didn't sell the site in any way. Why does it exist? Why should the visitor care? I'm still not particularly good at writing copy that converts, but I feel like I'm improving.

The new home page has much more text, more images, and more variation in formatting: these should make a visitor more interested in reading the content of the page.

The coins at the bottom spin, and if the visitor clicks on them then a coin-pickup sound is played and the coin counter in the top menu increases. The number of collected coins is persisted in a cookie.

I pushed this to the internet a little less than a week ago. So far there are 2 signups for the beta and it has been seen by 89 unique visitors. Most visitors haven't realized that the coins are clickable. This might be because they lack imagination, but more likely it's because I only imply that the coins are clickable. Implying on the internet is bad for conversion! I should have come right out and said that they were clickable, or collectible, or something a bit less subtle.

There's quite a bit of custom user tracking going on; I'll write about the details in a future post. The landing pages are done, except that I still need to work out the generation of dynamic sitemaps. The trouble is that I can't easily route a single file to a controller action without routing all file requests through the ASP.NET pipeline; doing that would be a performance nightmare.

Early internet traffic for a software startup

by alex 25. March 2013 02:37

This post reviews the internet traffic patterns to my product site and blog for the first three months of their existence.

My traffic from this year, January 1st through March 23rd, is discussed below. My gut says that my current traffic rates are certainly not enough to sustain the business, but I don't have enough experience to form a baseline against which to compare my progress. Maybe this information will be useful to others.

I have two web applications - one is a homebrew ASP.NET MVC project hosted at EisenhartSoftware.com and the other is a BlogEngine.NET installation hosted at EisenhartSoftware.com/blog.

Since they're separate applications, I configured them in Google Analytics to track the root site and the blog independently. After creating the reports below, I think this was an error.

Initially I thought that separating the sites would be useful, as I could more easily isolate traffic patterns generated by my blog and by the root product page. However, using two tracking numbers doubles the amount of research I have to perform to draw actionable conclusions from the data. Also, it makes it much harder to see how visitors travel between the root site and the blog. AND it probably makes more sense to use one tracking number because Google and other search engines view eisenhartsoftware.com and eisenhartsoftware.com/blog as the same site, regardless of how the server is configured to serve the content.

My pattern in the future will be to use one tracking number per domain. I would only give a subdomain its own tracking number if the traffic patterns I expected between subdomains justified it.

Product Site eisenhartsoftware.com

The traffic is illustrated by the following graph:

The first spike, on February 1st, was when I sent my farewell email to the employees of my day job; the email had a link to my product site. The largest spike, four days later, was when I announced to my friends and family on Facebook that I had quit my job and started a software company. The most recent spike was another Facebook announcement, this time that my site had been redesigned and updated.

Audience summary

  • 145 unique visitors
  • 241 visits
  • 726 page views (3.01 pages per visit)
  • 2 minutes and 38 seconds for the average visit duration
  • 52.70% bounce rate

Traffic source information

  • 4.15% search traffic
  • 34.44% direct traffic
  • 61.41% referral traffic

My top referrers, in order: Facebook, this site, and simpleprogrammer.com (I made a comment or two on that blog and got some traffic from trackbacks).

SEO results

  • 320 impressions
  • 5 clicks
  • 280 average position
  • 1.56% click through rate

I'm trending on a few queries, including "release management", "code review process" and "software build process". Google Analytics reports a click-through rate of 1.56%, all of which is for the term "website build process". I'm inclined to think that this is noise and my CTR is actually 0%.
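As a quick sanity check on the arithmetic (using only the numbers reported above):

```typescript
// CTR = clicks / impressions, expressed as a percentage
// rounded to two decimal places.
function ctrPercent(clicks: number, impressions: number): number {
  return Math.round((clicks / impressions) * 10000) / 100;
}

// 5 clicks over 320 impressions works out to 1.56%, which matches
// the figure Google Analytics reports.
const productSiteCtr = ctrPercent(5, 320); // 1.56
```

So the reported 1.56% is internally consistent with the 5 clicks and 320 impressions; the question is only whether those 5 clicks are real.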

My product site needs to trend higher for my keywords. Since my site has been recently redesigned, tomorrow I will take a close look to make sure I hit all the basics of SEO, and work on some useful content (blog posts).

Blog eisenhartsoftware.com/blog

I created my blog about a month after setting up the root product site. The traffic to the blog is illustrated by the following graph:

Some immediate thoughts -

The blog's peak day is not as high as the website root's. I attribute this to the abnormal traffic spikes generated by Facebook. When saying farewell to my old coworkers, and when announcing the formation of the software business and the second site redesign, I linked to the root website. Those three days had higher than normal traffic due to direct links shared with my non-target market; each was also marked by a high bounce rate.

However, the blog has a better daily traffic rate. The following graph shows both traffic patterns together, with the blog normalized to the same scale as the root site (the blog is red):

I attribute the blog's generally higher traffic rate to the quality of its content. The blog has a number of SEO keyword-rich posts and is perhaps mildly helpful. The blog has had its share of idle days, but fewer zero-traffic days than the root site.

Audience summary


  • 43 unique visitors
  • 131 visits
  • 345 page views (2.63 pages per visit)
  • 6 minutes and 56 seconds for the average visit duration
  • 67.18% bounce rate

The average visit duration for the blog is almost triple that of the root site. I think this is again due to the content. The bounce rate, however, is higher. I suspect this might be because the blog is young, doesn't have much content yet, and I link to external resources in many of my posts.

To decrease my bounce rate, I could do a few things:

  1. Add a page break to every post - this requires the visitor to click on the "read more" button to get to the meat and potatoes of the post. I don't like this solution because it hurts the user experience (more clicks). The copy before the break has to convince the visitor that the post is worth the click; if there is no break, the visitor can easily mouse-wheel down to see images or headers that might catch their eye. If this content were hidden behind a break, then the bounce rate might actually increase for posts with weak leads.
  2. Stop linking to external websites - this is an interesting idea. Ultimately the posts are designed to be useful to the visitor. If I have my own content to link to, then I should link to it rather than to another website. The trouble is that I don't have much content yet and it's easier to link to a Stack Overflow article. I should make an effort to reduce external links by increasing my own content, but not to the point that it harms the visitor's experience. For example, if the post is about a new Javascript library, it would be appropriate to link to the external site. But if the post is about password hashing, I should make an effort to document password hashing best practices and pitfalls rather than linking to someone else who already has.

I'm not going to add page breaks, but I will consider creating more content rather than linking to other resources, as appropriate.

Traffic source information


  • 4.58% search traffic
  • 74.81% direct traffic
  • 20.61% referral traffic

My top referrers, in order: Facebook, this site, and simpleprogrammer.com. Same as the root site.

SEO results


  • 3000 impressions
  • 5 clicks
  • 1.0 average position
  • 0.17% click through rate

The blog has been active for about half the time the root site has been, but it has 10x the impressions. An impression happens when one of my pages is visible to a potential visitor performing a Google search. So, this means that a page from the blog was actually visible in Google search results 3000 times.

There's something wrong with the configuration of the blog's analytics - there isn't any query data, but the supporting data (such as landing pages) is populated. As such, the 1.0 average position is probably just the default value. I have 5 clicks recorded for each of two posts and my author page. It's suspicious that all of my click activity is in 5's; also, the summary reports 5 clicks, but the landing page report shows 15 clicks... all this leads me to believe that something isn't configured correctly. Anyway, the two posts which have reported clicks are ones that I would expect - they are both how-to articles.

Conclusion and what happens next

  • I need to get the blog and root site onto the same tracker.
  • I'm going to move the blog since that is the one that currently has suspicious issues with the reported data.
  • I will make an effort to reduce my external links in blog posts.
  • I will continue efforts to create unique content for the blog - that seems to be getting impressions.
  • I should try to improve the leading copy for my blog posts.
