From conception to customers in 24 hours – how we built a successful startup overnight

Update, 2 Feb 2016: The site was taken down after a year, so these links no longer work.  However, the article is still interesting reading, if I may say so myself.

The app was conceived, developed and marketed in just 22.5 non-stop hours by myself and my business partner James.  As you may have heard, the app has taken off at a rate far faster than we were prepared for and it’s been a busy couple of weeks.  Now that we have it under control, I’ve been asked to give a technical summary of how we built the site in such a short time frame.

So, here’s how we did it…

The idea

This was James’ idea, pure and simple.  The story goes that he was out fishing and remarking on how much easier it was to catch a fish if you knew where they were – i.e. if you had a fish finder.  You still had to reel that bad boy in, you still had to coax him out of the water, but if you knew there were 100 fish under you, your odds went up considerably.

Wouldn’t it be good, he thought, if dating was like that?

…whoops, a little background might be in order…

James and I co-founded Knowhere back in 2006.  We know location-based services inside-out.  We were doing real-time user tracking (using Windows Mobile, back when it was awesome) before Google Latitude was dreamt of and before Facebook and Foursquare even existed.  We’ve dealt with ‘urban jungles’ and mass caching.  We developed our own SQL routines to query location data before SQL Server 2008 served them up out of the box.  We also know the frustration of building an awesome product only to fall flat trying to explain it to people, or worse yet – even letting them know it exists in the first place.

So, when James thought “wouldn’t it be good if…”, he wasn’t just coming into it cold.

…anyway, as I was saying…

Wouldn’t it be good, he thought, if dating was like that?  That’s when he called me up.  James is the ideas man, but I’m the handsome developer who has to actually make them work.  We were both flat-out in March and didn’t have much time to spare, but the idea was good, it was fun to build, and we knew that our previous experience could make it work, where others couldn’t.

So we decided to give ourselves 24 hours to build and market as much as we could, and then see what happened.

Friday, 9am – cutting to the core of the problem

The biggest lesson I learnt in this project was that age-old maxim of development and design circles – “less is more”.  We had a lot of big ideas for this project, cool fun stuff which would make the app really great, keep it “feature rich” and deter others from trying to copy us.

But forcing ourselves to ship within 24 hours made us really break down the problem into its simplest form, and then come up with the simplest solution to answer it.  So, out went:

  • customer logins
  • user tracking
  • graphics
  • caching (although I hoped I’d come to regret that, and I did)
  • friend invites
  • demographic capture
  • a contact page and supporting website
  • did I mention graphics?

Which left us with:

  • tell us if you’re searching for girls or boys
  • tell us where you’d like to search
  • view the map

That meant users could get the information they wanted in just two mouse clicks and often within about 10 seconds of typing the URL into their browser.

Friday, 10:30am – the audience

We’ve been burnt before by focusing too much on the technology and features of our systems, and not enough on how we’re going to promote the damn thing.  This was a particularly frustrating lesson for me to learn because I’m a software developer.  I take a lot of pride in things like reducing page load times, or setting focus to a form field when you enter a page.  But these things are quite frankly useless if nobody is using them.

With this app, we knew that our audience would be technology focused and youngish (actually, we were wrong about that one, as we found out later).  We also knew that our application would probably be used ‘on-demand’ i.e. as people were actually out and about with their friends.  With this in mind, we targeted the following platforms, in this order of priority:

  • iPhone (heard of it?)
  • Facebook App
  • Website

If you’re not a developer, building three applications may seem like a big commitment in our 24 hour time period.  But if you are, you’ll have heard of HTML5 and jQuery.

Friday, 11:00am – the architecture

Two hours into our mission, we had a workflow and a target platform.  Now I had to build it.  Luckily, I do a lot of work as a software architect and consultant and I’ve tried and tested a fair few development and design methodologies over the years.  Because of this, I was able to get us off the ground with a really good solid foundation, consisting of:

  • SQL Server 2005 backend.  I don’t care how much flat-file caching you do, I still like to know that if my indexes fall over I can rebuild everything from scratch, and that ‘scratch’ is my good old relational database.  Note that I didn’t use 2008 – that’s because I knew our production server didn’t have a license for it.
  • Microsoft .Net 4.0.  I’m a .Net guy and I make no excuses for it – I reckon it is the strongest development platform available (feel free to post your hateful comments below, but I’m over those types of arguments so you probably won’t get a reply)
  • Unity Dependency Injection for IoC.  Unity was overkill in such a small project, but I wanted a foundation for unit testing later (should the app ever take off).  Unity also lets you do ridiculous tricks like (indirectly) calling the web layer from your data layer (for example, to record the current user in the Http Session) or vertically integrating your logging classes.  It is cool.
  • Entity Framework 4.0.  LINQ is the technology I love to hate.  I have gone into so many Brownfield applications where LINQ has been ignorantly spread throughout the site and dragged down the performance, and the project separation considerably.  I saw a client once where one page was generating 50,000 database calls due to LINQ traversal within a repeater.  However, using EF4 and T4 templates, I am able to control exactly how I want my objects represented and avoid these pitfalls.  In particular, I do not represent any foreign key relationships in LINQ and I bind them to POCO in a separate web project.  This allows my web layer to have no project reference to my data layer, and forces my developers to explicitly call the logic/service layers every time they want some data.  It also allows me to cache the objects later, as they are not data-bound.
  • Memcached.  Actually, I didn’t put this in (only 24 hours remember), I just left a gap for it.  But I couldn’t have done it without Unity and the EF4 POCO model.
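
That ‘gap’, for the record, is the classic cache-aside pattern: the service layer asks an injected cache first and only falls through to the database on a miss.  Here’s a quick sketch in JavaScript (the names and the Map-backed cache are illustrative, not code from the site):

```javascript
// Cache-aside sketch: the repository takes a cache as a dependency,
// so a real store (e.g. memcached) can be injected later without
// touching any calling code.
function Repository(cache, loadFromDb) {
    this.cache = cache;           // anything with has/get/set, e.g. a Map
    this.loadFromDb = loadFromDb; // fallback data source
}

Repository.prototype.get = function (key) {
    if (this.cache.has(key)) return this.cache.get(key); // cache hit
    var value = this.loadFromDb(key);                    // cache miss
    this.cache.set(key, value);                          // populate for next time
    return value;
};

// Usage: the 'database' is just a counting stub here
var calls = 0;
var repo = new Repository(new Map(), function (key) {
    calls++;
    return key.toUpperCase();
});
repo.get('fish'); // loads from the stub
repo.get('fish'); // served from cache; calls stays at 1
```

Because the cache is a constructor argument, swapping the Map for a memcached client later is a one-line change at the composition root – which is exactly what the Unity/POCO setup buys you on the .Net side.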

So, using these, I was able to give myself a 200 hour head start on research and implementation for a top-notch and highly scalable architecture.  I also knew it was bug-free (or close to it) and worked in real-world conditions, due to its recent implementation on another project.

The final thing I had to design was the front-end.  In particular, I wanted to use the same code for all three platforms – iPhone, Facebook and the regular website.

Friday, 2pm – generalizing the front-end for multiple platforms

Thank you so much iPhone for supporting HTML5 and proper web standards.  Because of this, I was able to build my HTML in exactly the same way as I would for a normal website.  And because Facebook Apps are in fact regular websites in iframes, I was able to use HTML for that too.

Building an iPhone application in HTML is not as good as building a native embedded app in iOS.  I know that.  I also knew I had 24 hours and it simply wasn’t feasible (also, I can’t code for iOS and I don’t own a Mac to develop on).  We figured that if the application proved popular, we could build a native app later – hell, it might even be a way to monetize it.

As it turns out though, this design decision had a number of other benefits:

  • people could access it immediately when they were out and about (they didn’t have to go to the AppStore first)
  • we could focus our marketing solely on the website URL
  • there was a possibility that other mobile platforms like Android could use it (as it turns out, it doesn’t work on Android, but I’ve been too busy scrambling with other things to work out why.  I’ll bet it’s simple though)
  • I’d already used this HTML-based approach for another client so I figured this might be a good stepping stone to a full-blown app one day.

Of course, the three platforms have different form factors and so the final thing I had to do was switch in CSS styles to adjust widths, graphics etc depending on whether an iPhone was viewing the site or Facebook was.

Friday, 2:01pm – hacking in the style sheets

I’m pretty sure that HTML5 lets you switch in style sheets based on meta tags and media flags, but I didn’t have time to work all that out when I knew I could just do this:

/// <summary>
/// Detects the current device (eg iPhone) and overrides styles
/// </summary>
private void InjectStylesForCurrentDevice()
{
    var html = "";
    if (Request.Browser.MobileDeviceModel == "IPhone" || FindFish.Common.Configuration.Current.IsDeveloperMode)
    {
        html = @"
<link rel='stylesheet' type='text/css' href='" + Library.GetFullUrl("~/include/css/theme/iphone.css") + @"'/>
<meta name='viewport' content='initial-scale=1.0, user-scalable=no' />
<meta name='apple-mobile-web-app-status-bar-style' content='black' />
<link rel='apple-touch-icon' href='" + Library.GetFullUrl("~/include/img/iphone_touch_icon.jpg") + @"'/>
<link rel='apple-touch-startup-image' href='" + Library.GetFullUrl("~/include/img/iphone_startup.jpg") + @"' />";
    }
    if (html != "") this.PHExtraHeaderContent.Controls.Add(new LiteralControl(html));
}

It’s crude, it’s not testable, but it worked in about 10 minutes.
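
For the record, the standards-based alternative I skipped would have looked something like this – a sketch only, using the classic small-screen media query rather than anything I actually verified at the time:

```html
<!-- Serve iphone.css only to small-screen devices via a media query,
     instead of injecting it server-side -->
<link rel="stylesheet" type="text/css" href="include/css/theme/default.css" />
<link rel="stylesheet" type="text/css" href="include/css/theme/iphone.css"
      media="only screen and (max-device-width: 480px)" />
<meta name="viewport" content="initial-scale=1.0, user-scalable=no" />
```

No server-side sniffing, no user-agent string matching – but also no 10-minute turnaround when you already have the code-behind open.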

Friday 3:00pm – meanwhile on the other side of the office…

James isn’t a developer, which is great, because for every hour I was developing, James was marketing.  He started with half a dozen press releases, each subtly worded to target its audience, and moved on to what is probably the crustiest promotional video ever made.

I can’t code for hours on end like I used to do when I was young.  So every hour or so I’d walk over and annoy James.  This kept us talking and enthusiastic and helped us keep up to date on where we were going.

I say it again – what good is a cool application (including auto-focusing form fields!) if nobody is there to see it?  Having such a dedicated marketing effort was (and continues to be) the most important contributor to our success.  (James, do you have anything nice to say about all my development?)

Friday, 4:00pm – the map

Having solved the platform targeting, there was one other ‘unknown’ that I needed to solve before I could be 100% sure that our application would work – how to present the data.  I’ve used Google Maps extensively for Knowhere so I knew I could get it working okay, but I wanted a bit more:

  • slick when used on an iPhone, just like the built-in Google Maps application
  • upgraded to the v3 API.  I had only used v2 before.
  • animations and graphics.  If not now, then at least the possibility later.

As an aside, I didn’t even consider Microsoft Maps or anybody else.  Google is awesome and I have used it before so it was a no brainer.

Fortunately for me, the Google Maps API v3 has been extensively rebuilt with a particular focus on mobile devices.  And to my delight, it supported the ‘pinch’ feature on the iPhone!

As far as the graphics went, James and I had envisaged something cool like a radar scanning over the top of the map.  I still reckon I can do this – either with a floating DIV or using their custom overlays, but I didn’t want to get bogged down in the UI for too long – I’ve fallen into this quagmire one time too often – so I settled for a simple ‘bounce’ animation.  I’m actually thinking about taking this off now, because it is jerky on slower computers when there are 100+ items animating at once.
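
If I do keep the bounce, the fix I have in mind is to only animate the markers that matter – those inside the current viewport, capped at a sensible number.  A sketch of the selection logic (plain lat/lng objects here, not the real Maps API types, and the cap is arbitrary):

```javascript
// Cap the bounce animation: only markers inside the current viewport,
// and at most `max` of them, get animated - so slower machines aren't
// bouncing 100+ icons at once.
function markersToAnimate(markers, bounds, max) {
    var visible = markers.filter(function (m) {
        return m.lat >= bounds.south && m.lat <= bounds.north &&
               m.lng >= bounds.west  && m.lng <= bounds.east;
    });
    return visible.slice(0, max);
}

// Example: 3 markers, viewport covers only the first two, cap of 1
var markers = [
    { lat: 0.1, lng: 0.1 },
    { lat: 0.2, lng: 0.2 },
    { lat: 5.0, lng: 5.0 }
];
var bounds = { south: 0, north: 1, west: 0, east: 1 };
var chosen = markersToAnimate(markers, bounds, 1); // only one marker animates
```

In the real site the chosen markers would then get `setAnimation(google.maps.Animation.BOUNCE)` and the rest would sit still.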

I drew a couple of cute avatars – one for boys and one for girls – and used them for my marker icons.  Easy.  Here’s the resulting initialization:

this.EnsureMapInitialized = function(callback){
    if (this.HasInitialized) { callback(); return; }
    var that = this;
    var useragent = navigator.userAgent.toLowerCase();
    var isIPhone = useragent.indexOf('iphone') != -1;
    var mapdiv = document.getElementById(that.MapID);
    // Load map into container
    var latlng = new google.maps.LatLng(0, 0);
    var myOptions = {
        zoom: 15,
        center: latlng,
        // iPhone zoom is done with pinch, so hide the control there
        zoomControl: !isIPhone,
        zoomControlOptions: { style: google.maps.ZoomControlStyle.SMALL },
        mapTypeId: google.maps.MapTypeId.ROADMAP
    };
    that.Map = new google.maps.Map(mapdiv, myOptions);
    // Load to state for retrieval later
    myMaps[that.MapID] = this;
    // Events - fire the callback once, after the map first loads its bounds
    google.maps.event.addListener(that.Map, 'bounds_changed', function(){
        google.maps.event.clearListeners(that.Map, 'bounds_changed');
        if (callback != null) callback();
    });
    google.maps.event.addListener(that.Map, 'dragend', function(){ that.OnLocationChanged(); });
    google.maps.event.addListener(that.Map, 'zoom_changed', function(){ that.OnLocationChanged(); });
    this.HasInitialized = true;
};

I took off all the overlays like pan and street view because I wanted an uncluttered interface and I also wanted to reduce the ‘googleness’ of the site as much as possible.  Note in particular that I switched the zoomControl off for iPhone devices which support the pinch.

Friday, 8:00pm – getting the current location using Facebook Places

Almost halfway through the day and I finally began the work which I knew would be the biggest and most frustrating – integrating the Facebook Graph API to allow users to share their location with us using Facebook Places.

With our early emphasis on social media for marketing, a Facebook login seemed like a pretty good way to get people’s location.  In particular, if somebody agreed to sign in using Places, it would give us the ability to periodically update their location (anonymously of course) on our map, providing much more accurate results for our users.

James and I both had reservations about how many people would actually use this feature.  We’re a couple of old-fashioned guys and we’ve been fighting the ‘big brother’ issue for years at Knowhere.  As it turns out, an incredible number of people have used this feature – about a quarter of our visitors.  I guess people are realizing more and more that privacy is a two-way street – the more information you give about yourself, the better the service you will get.

Facebook also gave us other information such as gender and age (although we don’t use or record the latter at the moment, perhaps one day).

I’ve already blogged about the Facebook Graph API during some recent client work, so I’m not going to repeat it here.  Suffice to say, I more or less lifted the code and dropped it into our site (it helped that both projects used the same overall architecture).

The call to get Places data was pretty simple once I had the architecture in place:

/// <summary>
/// Class representing the structure of the JSON-formatted response returned by the Facebook Places API.  This allows us to 
/// use the Newtonsoft serializer to deconstruct it into a compile-time-checked class.
/// </summary>
private class FacebookCheckinData
{
    public class FacebookCheckinInfo
    {
        public string id { get; set; }
        public DateTime created_time { get; set; }
        public PlaceInfo place { get; set; }
    }

    public class PlaceInfo
    {
        public string id { get; set; }
        public LocationInfo location { get; set; }
    }

    public class LocationInfo
    {
        public double latitude { get; set; }
        public double longitude { get; set; }
    }

    public List<FacebookCheckinInfo> data { get; set; }
}

/// <summary>
/// Loads the check-in information for the given user
/// </summary>
/// <param name="accessToken"></param>
/// <param name="userID"></param>
public void LoadUser(string accessToken, string userID)
{
    var url = ("https://graph.facebook.com/" + userID + "/checkins/")
        .AppendQueryString(QueryKeys.OAUTH_CODE, accessToken, true);
    var webRequest = System.Net.WebRequest.Create(url);
    var webResponse = webRequest.GetResponse();
    string responseText;
    using (var sr = new StreamReader(webResponse.GetResponseStream()))
    {
        responseText = sr.ReadToEnd();
    }

    // Parse the text from JSON
    var checkins = Newtonsoft.Json.JsonConvert.DeserializeObject<FacebookCheckinData>(responseText);
    foreach (var checkin in checkins.data)
    {
        this.Latitude = checkin.place.location.latitude;
        this.Longitude = checkin.place.location.longitude;
        this.WhenRecorded = checkin.created_time;
    }
}

I suppose the main thing of interest here is the class FacebookCheckinInfo which I was able to build after viewing Facebook’s JSON-formatted response.  This let me parse their response into a “C# friendly” object for me to use later.
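
To make that shape concrete, here’s the same check-in structure pulled apart in JavaScript, with made-up sample values (the field names match the class above):

```javascript
// A made-up response in the same shape as FacebookCheckinData above
var responseText = JSON.stringify({
    data: [{
        id: '12345',
        created_time: '2011-03-26T07:30:00+0000',
        place: {
            id: '678',
            location: { latitude: -33.87, longitude: 151.21 }
        }
    }]
});

// JSON.parse plays the role the Newtonsoft serializer plays on the server
var checkins = JSON.parse(responseText);
var latest = checkins.data[0];
var lat = latest.place.location.latitude;  // -33.87
var lng = latest.place.location.longitude; // 151.21
```

Same drill-down as the C# version – data, then place, then location – just without the compile-time checking.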

Friday, 11:30pm – time for a break

We were both pretty ****ed by this point and starting to get a little grumpy.  James had brought around a few cigars for us to celebrate with in the morning, but we thought a half-way celebration was justified.  Unfortunately, as soon as one gets that taste of cigar in their mouth, one needs a drink so James had a beer and I had a whisky.  Just one mind – I knew I was vulnerable to a snooze.

Remarkably, this combination served to wake us both up and we got back into it with a vengeance at midnight.

Saturday, 1:00am – don’t force people to use Facebook

Restricting people to using Facebook for our application would have been a pretty daft move – alienating both those people that didn’t want to give away their personal data and those without a Facebook account at all (yes, they exist, I was one of them six months ago).

HTML5 to the rescue again with its new geolocation feature.  Now on the app, our visitors can choose to just ‘find people around me now’ and I run the following code:

SignInUsingCurrentLocation : function(){
    var that = this;
    if (!navigator.geolocation) {
        alert('Sorry, your browser does not support this feature - try signing in using Facebook above.');
        return;
    }
    navigator.geolocation.getCurrentPosition(function(position){
        that.Map.ShowLocation(position.coords.latitude, position.coords.longitude, that.FindMales);
    });
},

When I think of all the hassles we had writing Windows Mobile apps back in 2006, my heart bleeds.

Saturday, 3:00am – the UI

Because of the time-sink that it is, I purposefully left the UI to last.  This is despite my feeling that the UI is actually the most important part of a site (see what a progressive and magnanimous developer I am?).  Because iPhone users were our primary market, we knew it had to mimic an application as much as possible – and not look like a regular website.

Although the application had three ‘parts’, I decided to keep it all on a single page and switch screens in and out using jQuery.  Although this increased the initial load time, I thought it was worthwhile to have a faster and more responsive app.

I drew the three panels out using DIVs, each one the width of the screen, meaning that only one would show at a time.  I then used jQuery to animate the screens left and right – kind of stepping the user through the form.


Of note:

  • there is a single back button allowing the user to return.  Initially I had a button allowing you to skip straight to any of the three screens, but this was confusing.  Less is more.
  • each screen pushes the user through the path that we want them to take.  The ‘calls to action’ are in big button-like links, and there are no distracting work-flows to draw them away from their task
  • instead of just hiding/showing screens, I animated them left and right to give the user a sense of flow and process
  • I had to remove all background images and some CSS radius/shadow to improve the performance of the animations on the iPhone.  As it turns out, it’s better this way – they were just clutter.
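
The arithmetic behind the sliding is trivial but worth spelling out: the panels sit side by side, each one screen-width wide, so showing panel n means animating the container’s left offset to -n × width.  A sketch (the container selector and function names are illustrative, not the site’s actual code):

```javascript
// Each of the three panels is one screen-width wide and sits side by side.
// To bring panel `index` into view, the container slides so that panel
// lines up with the viewport: offset = -index * screenWidth.
function panelOffset(index, screenWidth) {
    return -index * screenWidth;
}

// e.g. on a 320px-wide iPhone screen:
var first  = panelOffset(0, 320); // 0    - the girls/boys screen
var second = panelOffset(1, 320); // -320 - the location screen
var third  = panelOffset(2, 320); // -640 - the map

// In the real app, something like:
// $('#panels').animate({ left: panelOffset(index, screenWidth) });
```

jQuery’s animate on `left` then gives the slide effect for free, and the back button is just a step down in `index`.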

Saturday, 6:00am – the finishing touches

That’s it!  End-to-end functionality achieved and tested (a little).  I was exhausted but needed to do a few more things for professional pride:

    • added Google Analytics
    • cached the application page using ASP.Net OutputCache – it’s wicked fast
    • deferred the Google Maps loading until the user actually requested a location (as opposed to page load)
    • built us a logo – it’s a pixelated cityscape with fish floating above it.  Just like a real fish finder but in the city!  I’m quite chuffed with that one.

Saturday, 7:30am – the release!

At 7:30am we hit deploy, sent off our press releases, told our Facebook friends and then went and had another cigar and beer.  It is quite an anti-climax releasing a website like that – you send it out into the great wide world but you have no idea if anybody is using it (even Google Analytics takes 24 hours to turn around).  The phone doesn’t ring.  Your bank balance remains fairly average.  Your new neighbour who you haven’t introduced yourself to yet drives by on her way to work and sees you having a whisky and cigar on the deck at 7:30am.

Summary & Lessons Learnt

I have built applications in the past that were much, much more sophisticated than this one.  I have had logging, offline email notifications, asynchronous task managers, caching, Cassandra, unit testing and lots of other buzz words.  I have built sites with a $50,000 design and UI budget.

But in terms of bang for your buck, this one has been the best by a mile.  I think there are a number of reasons for this:

  • the limited time frame forced us to concentrate on the core product – no fluff
  • no graphic designer meant that we were forced to keep the site minimal – again emphasizing the purpose of the site without distracting the user.  It also improved page load times.
  • we put just as much effort into marketing as we did development
  • we built an application that addressed a genuine need in the psyche of most human beings, and we modernized it for a technical audience
  • the 24 hour factor gave us a unique marketing angle – something for non-technical journalists to grasp and run with
  • there are no barriers to using the site – not even a login
  • the ‘fish finder for singles’ branding is easy for new visitors to understand and immediately know what to expect from our application

Given a little more time, the only thing I’d have done different is to add a loading screen so people could get immediate feedback when initially viewing the application – first impressions count (haha I actually wrote ‘fist impressions count’ then – that would be the subject of another blog).

If you haven’t checked out the app yet, go to it now, then ‘like’ us on Facebook.  Oh, and tell your friends.
