A few Google Geo Presentations

It’s been a busy year so far and therefore my posting has been woefully slow. Much of my time has been spent working with Google Earth Builder. This provides a new way of distributing geospatial information from Google into all sorts of clients, providing a true geospatial platform as a service without the need to worry about how many servers or virtual servers you need to support your clients, and allowing GIS experts to worry about geospatial information science rather than geospatial information systems. There will be more information coming out about this platform over the next few months, and I hope I can talk to as many people as possible about how they might use it to share their data and collaborate with their clients, removing the complexity, headaches and expense that this causes with current geospatial systems. If you want to know more, watch the Where 2.0 video below.

More Google Geo Goodness

Where 2.0 wasn’t the only conference recently; last week at Google IO 2011 there was a veritable plethora of geospatial talks on all aspects of Google Geospatial. Two talks stood out for people who might be using and sharing geospatial data: the first about Google Fusion Tables and the second on the surprises of the Maps API. The latter presentation gives a good overview of some of the new geospatial functions within the whole Maps API that you might have missed over the last year. Geospatial at Google is an ever-increasing area which touches lots and lots of products that are easy to use and implement for non-geo experts, which increases the reach of geospatial data to more people; this talk is a great intro to all of these.

Fusion Tables, GIS for normal people.

Fusion Tables continues to add more and more functionality that will allow the geo-prosumer to create and share spatial data in record time. One of the main additions has been extra styling functions, allowing people to create more engaging maps which in turn convey the message more easily to non-geo people; you can get some more details here. If you want to know more about Fusion Tables then this talk is a must watch, showing how you can host and map geospatial information from the Google cloud without a single line of code, something I would never have imagined possible when I started doing this GIS malarkey in the early 90s.

Speed, Speed, Speed

A final geo talk at Google IO that caught my attention was one about how to improve the performance of any mapping application using the Google Maps API. I’ve spent a good portion of my career trying to improve the speed of web applications, and especially geospatial web applications. It’s good to be armed with knowledge before you even start any development; this presentation will hopefully help you avoid the pitfalls that people fall into when starting to develop in this area.

If you combine this with articles from the Google Maps website, such as Too Many Markers, then you can hopefully create speedy maps that are a joy and not a curse to use.

You can find more information and sessions from Google IO 2011 at the website here. If you set the filter to Geo you can see all of the presentations that had a geo flavour. Hopefully I’ll be presenting at the Google Enterprise Geospatial day at the end of August. If you’re into geospatial and Google it will be like a mini Geo-IO. Hope to see you there.

Normal service will soon be restored.

I’ve been increasingly remiss in updating this blog, which I’ll hopefully remedy over the next few weeks. I don’t have any more excuses apart from the fact that I’ve been on a never-ending whirl of meetings, conferences (as delegate and booth babe!), training courses (run by me) and the never-ending search for people to help me out. I’ve also been coming to grips with the sheer volume of innovation and products coming on stream at Google, absorbing their nature and understanding their impact on enterprise geospatial.

Google London

What no PC?

The other fact is that prior to joining Google my main blog editor used to be Live Writer. I was one of the most PC people you might have known; that’s PC as in Personal Computer, not the other one, although I’m not too bad in that department as well (I try to minimise the HR violations). It seems that over the last year I have become mostly Mac, despite the fact that before my first day at Google I had never even owned a Mac, or touched one in anger, for over 15 years (since I waged a one-man war against them back in my days at Kingston). It turns out they are not half bad.

Mars Edit

Maybe it’s because I secretly like having the option of a command window that doesn’t run a horrible version of DOS, and actually have to use vi and ssh in my job! Maybe it’s also because I do most of my job in a browser these days and no longer have to have a copy of ArcGIS on my machine. Finally it’s probably the fact that the machine boots up fast, suspends without trouble and doesn’t seem to lag at weird moments. Sure, it’s probably down to Apple’s bespoke hardware / software combination, but whatever it is it’s been a breath of fresh air, and whilst it’s made me tear what’s left of my hair out sometimes, once I started to think like a Mac I’m unsure that I can start to think like a PC again (I’m sure I’m going to grow a goatee and get some thick glasses next).

So here I am having found a new easy way to write blog posts on my Mac using MarsEdit. It seems to do everything LiveWriter did, without Microsoft always wanting to update the whole Live suite all the time, including the now horrible new version of Live Messenger (the Mac version from Microsoft is better).

Fast Boot, Quick Suspend

Why don't you!

Just as I find one way to edit posts, another challenge falls into my lap: the CR48. Now my next challenge is to find a good blog editor that runs just in the browser so I can conquer a new frontier. So far I’ve had good experiences with the machine. As much of what I do on the couch (as distinct from at my desk, although it’s not far off there also) is within a browser, I want instant on to check emails and websites and instant off when I need to go do ‘something less boring instead’. The CR48 really does those things well; for editing text and using the ‘normal web’ it’s a lot better than an iPad, although for media consumption the iPad still has it beat, but then again that’s not what it’s primarily for.


What I like most about the CR48 is that it is trying to upset the status quo: something that looks like a ‘normal computer’ acting like a ‘smart phone’. Doing away with storage on the device makes it instantly more controversial, as people will start to get all nostalgic over defragging hard disks, performing virus scans, updating applications or managing services. It seems like many of our computers have turned into actual servers over time, running background processes that seem to make the machine randomly page to disk or kick a background process to max warp (subsequently making your 2010 computer run like something purchased before modems went out of fashion).

Are we living in the future yet?

I don’t know when this happened, but it seems that between my Mac and my ChromeOS device I’m slowly moving into a brave new future where I can start to worry less about what’s on my machine and more about actually getting things done. When most things you do are being stored in the cloud, you become much less attached to the actual machine you’re using; it becomes much more of a tool than a priceless piece of technology to be protected with your life lest you lose your photos. Hopefully this ChromeOS video serves to highlight this brave new world in a robust manner; hope to see you there in the near future!

When web hosts go bad.

There are a number of reasons why this blog has taken a bit of a hiatus in terms of posts in the last few months. Probably the most important of which is the fact that for once in the UK we seem to actually be having a summer; coupling this with my daughter forcing me to watch all of the Doctor Whos since the reboot (honestly, she’s only 4 but she’s quite persuasive) means I spend a lot of time outside on the bike or explaining the fighting differences between Daleks, Cybermen and everything in between.

A slow spiral down the web plug

Secondly has been the slow demise of an excellent webhost, Webhost4Life. Whilst never the fastest host, it has always been reliable for everything I’ve needed it for, especially for the price and features I paid for. Recently though they decided to move to a new platform, which I think was due to the company being sold and a change of management going for a cheaper option. Initially it seemed good; the admin site was much improved and it seemed that things were going tickety-boo. Unfortunately, as others have also found, the support and stability of the site has gone way down. This blog was up and down more times than the Grand Old Duke of York, and even when it was up it was slower than an England footballer in front of goal. Finally, having had enough after 3 days of downtime, I decided to pull the plug and move this blog over to a new host, Arvixe.

Not only has this host been a lot quicker for me (it obviously doesn’t seem to be as contested), it also supports a variety of development platforms, from PHP to ASP.NET. Whilst the admin pages aren’t as nice, and I had to install WordPress 3 the manual way (actually it wouldn’t even upgrade on WebHost4Life, so it was very broken), the whole move took all of one hour including getting the data loaded from the old blog.

I wonder if it’s like changing banks; hosting is one of the most painful things to change once you have it all set up, but once you’ve done it, you wonder why you never did it before.

Moving On

Finally, I had my final, final, final leaving do this week from the hallowed halls of ESRI(UK) esri(UK) ESRI UK. After 7 weeks of being at Google and one month of garden leave (people kept telling me there was no ‘–ing’) it was about time too. Moving on from a job where I talked to people about GIS technology to a company where I talk to people about GEO technology has been less of a shift than you might think. The complexity of solutions might be less, and I haven’t yet touched SAP since I left (phew), but it’s still based around understanding how people’s workflows might fit and integrate with the respective technologies and APIs. It’s a lot more consumer-focused though, a bit like web mapping was with ESRI UK 8 years ago; I like it.

There are so many geo-technologies to learn at Google, from the server-based Google Earth Enterprise to the cloud-based Fusion Tables, as well as the well-known Google Earth and Maps. For my first seven weeks I have been like a kid in a candy store, both metaphorically with the learning of new technology and physically in the micro kitchens (note to self, must do more exercise).

Expect to see many more Google Geo related tips and tricks around here as I work through the stack of technology in my sweetie bag. I won’t give up the esri thing just yet though; I hope to do some integration work between the various systems, and I’m giving a talk on integration next week.

Now with added location and mobile

On a side note, whilst updating the site I implemented two new features. Firstly, the move had broken my previous ‘where are you from’ section, so I thought I’d update it with some Google Maps code. Unlike the previous attempt, I decided to take the easy geolocation route and have the browser (or Gears) do it for me. I’ll discuss how this works in a subsequent post; it’s not tricky and takes about 10 lines of code, all of which you can get from the Google Maps v3 API page here.

The second feature was the addition of the WPTouch plug-in, which formats WordPress sites to work nicely on mobile devices like the iPhone and Android devices. The free version seems to work nicely for me; the pro version seems to have some nice features that one day I might find I need. Give it a go, it looks nice.

Freemercialism with a Spudger?

Let’s start this post off with a simple question: you do know what a spudger is, don’t you? What, you don’t? And more importantly you don’t know how you could use one to take apart an iPad? Well you’ve come to the wrong place to find out what and how. You can find the right place to do that at iFixit, who are the self-named ‘free repair manual that you can edit’ but are actually more like a parts catalogue that shows you what you can do with technology and then gives you a link to buy the widgets that you need to do said task from their online shop. In this particular case, pun intended, they showed how to take apart an iPad using the implement in question, exhibit A: one spudger.

This is a classic case of cross-subsidy of one product with another: the giving of a free guide encourages you to buy a new (you don’t already have one do you?) tool in order to perform the task. It’s one of many ways that companies are able to provide free software that for many users has absolutely no cost. Slightly different from completely free, where there is no commercial gain by the supplier at all, these economic models have been growing on the web more and more since it started, though they did not begin there. The 2008 Wired article “Free! Why $0.00 is the future of business” gives an excellent overview of where free comes from and why it’s becoming increasingly important. In an almost prophetic manner, that article and the accompanying book “Free: The Future at a Radical Price” expound on how you can seemingly get something for nothing, how people now expect it, especially with software and services on the web, and how, as a commercial company, you have to develop business models to embrace and/or compete with it. In the post bank bust world, with central and local government agencies, especially in the UK, having limited capital to blow on massive IT projects, free might not just become part of the solution, it might become the only solution.

The book is itself free online and can be obtained here. It’s interesting to see that the unabridged audio book is actually free also, but the abridged version costs money. Obviously anyone can read the book verbatim onto tape (old school), but the work it takes to create a meaningful abridged version contains value and therefore costs money; value in terms of the time it has taken to edit the book down, and also the value it has to the attention-challenged, time-poor, iPod-carrying commuter who can’t concentrate for six hours on anything. This is the same for the dead tree version also; the payback for being able to fall asleep with your new paper book in the bath is the cost of the pulping, printing and possibly delivery.

Free OS Data, finding a freemercial model.

In the GIS industry in the UK, whilst there is a plethora of good software available for nothing (see here), it has always been the cost of data that has been one of the main talking points around the industry. The recent freeing of some of the data from the Ordnance Survey (I know this makes it sound like some sort of mystical quest; that’s because in some ways it has been) has now provided a good deal of authoritative spatial data for base maps and base geographies as well as gazetteers. Now that this data is ‘free’ it’s going to be interesting to see how people will try to add value to the datasets to realise value in the marketplace.

One of the main reasons behind freeing the data up was to encourage the use of geographic information within the general commercial landscape, outside the universities, utilities and governmental organisations that have been its natural home. How people will get access to this information is the next challenge; many of the business models outlined in Chris Anderson’s book will be applied to the delivery and usage of this data. Some will succeed and some will fail, but it will be interesting to see how many people outside the traditional geo-markets will be able to get access to this data and how they will interact with it for nothing using widely available tools, which themselves are free.

Who Pays?

It will also be interesting to see how businesses can afford to cross-subsidise this access, and how they will be able to create money out of such offerings. In the past just the value-add of supplying the data used to be enough to justify a fee; in today’s market that might no longer be enough for many people, or at least some access should be available for nothing. One thing is for certain: in the current cash-strapped world, there might not be many alternatives that people will want, or can afford, to start with.

The web world is (mostly) flat.

It might come as a shock to many of you, but often on the internet the world is flat. Yes, I know you thought this whole debate had gone out with the ark (or actually a little later), but after years of coming to terms with the world being a sphere, cartographers everywhere needed a method of putting that world down onto paper.

Now this was fine for many years, until on 2nd May 2000 the US decided to turn off Selective Availability and the whole world was seemingly brought into WGS84. The difficulty here though is that WGS84 longitude and latitude coordinates are an approximate representation of a position in the real world (geographic coordinates), whilst many maps, paper and web based, are representations of a flat world using Cartesian coordinates.

A world of BNG

Obviously as a GIS professional you know all this, although come to think about it, a lot of people in the UK probably don’t. Why, I hear you cry? Because the Ordnance Survey, in its role of national owner of all things spatial in the UK, decided that we needed a more accurate representation of the surface of the United Kingdom (fortunately we are not a large country, well unless you measure it in ego terms, so it can work quite well for us). In doing this they created the British National Grid (BNG), which allows them to produce all of their excellent paper maps (and not paper globes, which would be inconvenient for packing up in your rucksack).

Now why do I mention this? Well, for the first three years of my work at ESRI(UK) I touched nothing but data in the BNG projection; it was only when I needed to implement a solution using GPS data for tracking refuse trucks that I came up against the need to re-project data between two coordinate systems, WGS84 (for GPS) and BNG (to plot on the web map).

Whilst BNG is the preferred coordinate system for many users of OS data, amongst many web developers who have come across mapping via the various web offerings of Microsoft, Yahoo and Google (or indeed in the past with ArcWeb Services) the coordinate system they use will be Web Mercator. In fact in the UK there is a chance now that more people use the Web Mercator projection than the BNG projection, a fact that shouldn’t be lost on people. So where was I? Ah yes setting the stage for my problem.

What was I doing?

The reason I came to this post was that I was implementing a little demo for a colleague to show a GeoRSS feed of earthquakes (what I term a ‘classic feed’ as it appears in every demo) with the Bing map service, both within the ArcGIS Silverlight API. I thought that this would be no problem as I knew of two existing ESRI demos with the functionality I required: firstly the GeoRSS layer sample from Morten, and secondly the Bing sample from the ArcGIS Silverlight concepts section.

Now, using my advanced skills in copy / paste, I managed to get a map that looks like this (image right). The classic, “everything is off the equator near Africa” map. Hmm, I think, when I was using the old ArcGIS Services map (as per the sample) I could get an image like this (see below).

Ah-ha, I think, something is up in the state of projections. How could I request the GeoRSS feed in another projection, in this case Web Mercator, rather than the long/lat WGS84 projection that was given as standard?

The answer to my ponderings was that GeoRSS defines the returned format as having to be WGS84 and in order to place it on top of my Bing map I would have to re-project the data myself. Fine I thought, I know how to do that, ArcGIS Server doesn’t have a Geometry Service for no reason you know.

Chatting with the Geometry Service

With the implementation of the Geometry Service within ArcGIS it’s been very easy to re-project data between coordinate systems; you can find the documentation to do this here. This is good as it allows you to project between many different coordinate systems and, as it’s all server bound, you don’t need to pollute your client with any algorithms.
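For the curious, the project operation is just a REST call. Here’s a rough Python sketch of building such a request; the service URL is a placeholder, and 102100 is the Web Mercator well-known ID ArcGIS used at the time:

```python
import json
from urllib.parse import urlencode

def build_project_request(service_url, points, in_sr=4326, out_sr=102100):
    """Build the GET URL for a Geometry Service 'project' call.

    points is a list of (x, y) tuples in the input spatial reference.
    """
    geometries = {
        "geometryType": "esriGeometryPoint",
        "geometries": [{"x": x, "y": y} for x, y in points],
    }
    params = {
        "inSR": in_sr,      # WGS84
        "outSR": out_sr,    # Web Mercator
        "geometries": json.dumps(geometries),
        "f": "json",
    }
    return service_url + "/project?" + urlencode(params)

# Issuing a GET on the returned URL gives back a JSON document with the
# projected points under a "geometries" key.
```

You pay for the flexibility with a round trip to the server per batch of geometries, which is exactly the architectural taste problem described below.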

It should be noted here, that over time there have been a number of SRS’s used to define Web Mercator. An email that I’ve seen on the boards which explains the differences between the numbers used can be found here (see the post by Melita Kennedy).

The problem you have is that it can be rather inefficient to get points into the client and then send them to another service to project, get the data back and then have to display that in the map. This solution might be the most flexible (as it could possibly handle any projection required) but it leaves a bad architectural taste in the mouth.

I put it to the back of my mind in the folder entitled ‘architectures to use when other methods fail’ and went on thinking about how it might be done; back to math (+s for UK readers).

An Algorithm

I’ve always wondered what the actual algorithm used for the projection between WGS84 and Web Mercator was; this was a demo, so I didn’t have to be too careful about the accuracy. I once again used my Google brain to come up with the following link, which contained a Python script with the algorithm I required; converting it to C# was not too hard.
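The spherical Web Mercator maths itself is tiny. Here is a sketch of the kind of thing that script does, in Python, using the WGS84 semi-major axis as the sphere radius (the usual Web Mercator shortcut; fine for demos, not for survey work):

```python
import math

R_MAJOR = 6378137.0  # WGS84 semi-major axis in metres

def geographic_to_web_mercator(lon, lat):
    """Project WGS84 degrees to (spherical) Web Mercator metres."""
    x = math.radians(lon) * R_MAJOR
    y = math.log(math.tan(math.pi / 4 + math.radians(lat) / 2)) * R_MAJOR
    return x, y

def web_mercator_to_geographic(x, y):
    """Inverse projection, back to WGS84 degrees."""
    lon = math.degrees(x / R_MAJOR)
    lat = math.degrees(2 * math.atan(math.exp(y / R_MAJOR)) - math.pi / 2)
    return lon, lat
```

The two functions are exact inverses of each other, so a round trip gets you back where you started (to floating-point precision).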


When I placed this into my GeoRssLoader.cs class file (see the GeoRSS Silverlight sample again) I managed to get the points placed into the correct position; see the map below.

Sorted, I thought, but then I got thinking. Surely this is an amazingly common task that everyone is doing; GeoRSS is very popular and there are one or two ESRI developers out there (I know, I have 250+ of them coming to see me present at our Developer Hub Conference next week).

If you’re interested in converting between WGS84 and OSGB36 then this link should be handy; if I get time I’ll knock up a C# class doing it and post it on this site somewhere.
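To give a flavour of what such a class would involve, here’s a rough Python sketch of the standard approach: geodetic to Cartesian on the WGS84 ellipsoid, a small-angle seven-parameter Helmert shift, then back to geodetic on the Airy 1830 ellipsoid. The parameters are the commonly published approximate ones, good to a few metres at best, so treat this as illustrative rather than survey grade:

```python
import math

# Ellipsoids as (semi-major, semi-minor) axes in metres.
WGS84 = (6378137.000, 6356752.3141)
AIRY1830 = (6377563.396, 6356256.909)

# Approximate Helmert parameters WGS84 -> OSGB36
# (tx, ty, tz in metres; scale in ppm; rx, ry, rz in arc-seconds).
HELMERT = (-446.448, 125.157, -542.060, 20.4894, -0.1502, -0.2470, -0.8421)

def _to_cartesian(lat, lon, a, b, h=0.0):
    phi, lam = math.radians(lat), math.radians(lon)
    e2 = 1 - (b * b) / (a * a)
    nu = a / math.sqrt(1 - e2 * math.sin(phi) ** 2)
    return ((nu + h) * math.cos(phi) * math.cos(lam),
            (nu + h) * math.cos(phi) * math.sin(lam),
            ((1 - e2) * nu + h) * math.sin(phi))

def _to_geodetic(x, y, z, a, b):
    e2 = 1 - (b * b) / (a * a)
    p = math.hypot(x, y)
    phi = math.atan2(z, p * (1 - e2))
    for _ in range(10):  # the iteration converges very quickly
        nu = a / math.sqrt(1 - e2 * math.sin(phi) ** 2)
        phi = math.atan2(z + e2 * nu * math.sin(phi), p)
    return math.degrees(phi), math.degrees(math.atan2(y, x))

def wgs84_to_osgb36(lat, lon):
    """Approximate datum shift from WGS84 to OSGB36 latitude/longitude."""
    x, y, z = _to_cartesian(lat, lon, *WGS84)
    tx, ty, tz, s_ppm, rx, ry, rz = HELMERT
    s = 1 + s_ppm * 1e-6
    rx, ry, rz = (math.radians(v / 3600.0) for v in (rx, ry, rz))
    # Small-angle Helmert transformation.
    x2 = tx + s * x - rz * y + ry * z
    y2 = ty + rz * x + s * y - rx * z
    z2 = tz - ry * x + rx * y + s * z
    return _to_geodetic(x2, y2, z2, *AIRY1830)
```

The shift is on the order of 100 metres over Great Britain, which is roughly a thousandth of a degree, enough to put your points in the wrong field if you ignore it.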

The Easy Way

So I was showing my solution to another colleague of mine, all impressed that I could do it using the power of math, when he said that the JavaScript API had a function for it built in called esri.geometry.geographicToWebMercator(). Gah! I thought, all these JavaScript dudes have it so easy; no one ever creates one of these for us poor Silverlight chumps. Well actually they do.

Hidden away in the ESRI.ArcGIS.Client.Bing.Transform class are two methods, GeographicToWebMercator and WebMercatorToGeographic.

Both of which are much simpler than including your own projection algorithm within your code; why reinvent the wheel after all? As we can see, there are always many ways to do things with technology. It’s often the simplest one that avoids notice when you’re thinking through a problem, although it should be said that there are merits to all approaches due to flexibility (REST service), transparency (algorithm) or simplicity (existing class); the choice, as they say, is yours.

Adventures through the Silverlight

Over the last few weeks I’ve been having a few adventures in the world of Silverlight. A bit like Alice, I’ve been following white rabbits down holes and through looking glasses. What I’ve discovered is that having an IDE doesn’t always make things easier, especially when the error is occurring somewhere between the chair and the keyboard, a place which is notoriously hard to debug.

Brain don’t fail me now

One issue I have is with my brain. If you start thinking as if the development environment is going to help you, then when it doesn’t it can completely throw the processes you use to figure stuff out, if indeed you can figure it out. Strangely enough (well for me anyway), if I’m in an environment where there is little help, read ‘no intellisense’, then my brain rewires itself for self help. This can often make development easier, as I tend to check the code more and be more robust in my development methodologies (i.e. checking my environment is set up correctly for one thing). Usually I find the differences in coding for Silverlight or using Dojo follow these patterns (I often still use VS for Dojo, but obviously get precious little help!).

With Silverlight my development is all done in Visual Studio 2008 (with some design done in Expression Blend of course). Now 2008 is quite helpful when checking the syntax of C# code, but it can come off the rails with the XAML syntax, so once you’re done in Blend and are hacking around with the mark-up you can often come unstuck. Many a time I’ve spent at the top of the page.xaml tinkering with the namespaces, wondering why the code won’t work when I’ve copied it straight from the ArcGIS Silverlight API samples (note: always check the breaking changes in any of the API release notes, like here, as the samples sometimes lag behind the releases and don’t always correspond).

Other times the IDE just goes a bit spooky on you, such as when I added a new class to the top of my page.cs file (don’t ask me why I wasn’t refactoring things into different places; I was prototyping, it’s allowed). Now I figured that this wouldn’t have been a problem, I’ve often slung a class at the top of a file and had no problem (or none I can remember!), but whilst the compilation and run of the application had no problem, the actual linking up of the page.xaml and the page.cs seems to have been b0rked.

Every time I needed to add a new event handler, or to navigate to an existing handler from the XAML to the code I would get the following error:

To generate an event handler the class ‘page’ must be the first class in the file.

Now I didn’t believe what my eyes were reading at first; of course I was using Visual Studio so my brain had partially shut down, therefore using my outsourced brain (read Google) I spent a few minutes trawling the interweb in the hope of finding a solution. I did find it, buried deep down in the following thread, where it spelt out the reason for my ‘code fail’ to be the fact that my new class was first in the code-behind file; move the class to the end and hey-presto, everything was tickety-boo.

Now this serves to highlight both my initial problem of my brain expecting simple issues like this to be sorted out by the wonder that is Visual Studio 2008, and secondly the fact that, whilst an order of magnitude better than Visual Studio 2005 for its integration with Silverlight, it won’t be until 2010 that Microsoft will have a true development environment for it. Note to self: better get installing the 2010 RC when it’s released next month to check life will be peachy.

A Dash or Two

Where Silverlight (or any RIA environment, Flash, HTML 5) really excels is in the delivery of dashboards that allow for the easy cognitive processing of information without the clutter of hardcore GIS tools that are often prevalent in some internet mapping applications.

Indeed, in my opinion (not necessarily anyone else’s though), if you’re using Silverlight to just deliver Y.A.M.A. (Yet Another Mapping Application), then unless you really need it to be rotating on a flying cube surrounded by dancing leprechauns (however tempting it might be) you probably need to be using a more standard HTML/JavaScript based client, such as Dojo, which doesn’t have the plug-in overhead.

The ability to present and link multiple maps together, all updating in real time, with graphs and reports, can really help sell the benefit of GIS to upper management, who often don’t get excited about data formats, tile caching and the different APIs. Show them the ability to visualise all their assets and modify their assignment in real time, allowing for visual modelling of costs, and you might be on to a winner; show them the common operating picture of an unfolding disaster and you almost certainly are, especially if it can save money in the long run. Sure, it might use all of the cool technologies under the hood, but most people who make decisions don’t care; they want simple tools that can leverage powerful geoprocessing tasks without even noticing. With good design and the interactivity given by Silverlight (and other RIAs), the move of GIS from allowing people to make niche decisions to impacting throughout the business should become a whole lot easier.

Let me at it – hold the white rabbit

Hopefully you can see not only the wonderland that can be offered by Silverlight but also the frustration that can occur if you blindly follow white rabbits around whilst developing. If you don’t have time to build your own Silverlight client from scratch, you can always get a head start from one of the example applications from the community section of the Silverlight resource center here, or see some nice examples of dashboards and UI on the ESRI North East Africa site here.

Which date is it?


I have to admit it, I have a problem with dates. I always seem to draw a blank when trying to remember them; I’ve outsourced much of my anniversary notification to Outlook and my iPhone, or work with them in applications. It has been ironic this week, as the last decade ends and the next decade begins, that the first thing to stump me is a little date problem.

A choice of dates

I’ve been doing some development using the Silverlight API and I came to the part of the project where I needed to filter data using a date range and display it on the map and in graph (or chart) form. Now querying with any system is always tricky, especially with dates. Getting the right format can be the difference between date success and date fail! The right positioning of brackets or hash (US pound) symbols has often been the bane of my life.

As this is the Silverlight API, I’m backing onto the ESRI REST API for my heavy lifting. So I need to know the right REST syntax in order to create a correct query for use both as the ‘where filter’ for the feature layer and as the ‘where filter’ for the query task. Fortunately the REST API provides a nice query form with which you can enter these parameters to your heart’s content (sample server example link).

Now, at this point you might think my work here is done, that this form and the collective knowledge of the internet (by that I mean my Google search box, to which I have outsourced my ability to remember syntax) would be able to get me to my goal of a working filter, and in some ways it did, but then I found that there was a choice.

Choices Choices

Looking at the help page for the REST API here, we can see the query layer section, which gives the possible filters for the where parameter as ‘any legal SQL where clause operating on the fields in the layer is allowed’. Now in the project that I was working on, the dates being queried were in two different layers: one being filtered and one providing the data for plotting on the graph. Both layers contained one or more fields of the type “esriFieldTypeDate”, so I thought the same query would work on both, wouldn’t it?

Well it turns out that the answer is no. I initially started with one layer and was provided with the date format of #2009/01/01 00:00:00#, which worked fine for querying a single field.

In usage this would give a query in the format of

"FIELDNAME > #2009/01/01 00:00:00#";

But when I applied this to another layer where I was filtering data by two date fields I got the strange error of:

“Unable to perform query. Please check your parameters.”

Hmm, I thought, but it worked on the other layer (the query equivalent of the cry ‘but it worked on my machine’). A bit of googling turned up a link on the forums where someone had had a similar problem, and a suggestion to use the following syntax in the where clause:

"FIELDNAME > DATE ‘2009/01/01 00:00:00’";

This worked for both sets of queries. Whether it will work for all databases or datasets will require more testing, but if you’re using dates in your application then this way of formatting a date query should hopefully work. If I come up with any more definitive answers on which format to use where, or a list of other date formats that might not work with all datasets, then I will endeavour to update this post!
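To make this concrete, here’s a small sketch (not from the original project; the service URL and field name are made-up stand-ins) of dropping that working DATE syntax into the where parameter of an ArcGIS Server REST layer query:

```python
# Sketch of building a layer query against the ArcGIS Server REST API.
# The service URL and field name below are hypothetical stand-ins.
from urllib.parse import urlencode

def build_query_url(layer_url, where):
    """Append the /query operation and its parameters to a layer URL."""
    params = {"where": where, "outFields": "*", "f": "json"}
    return layer_url + "/query?" + urlencode(params)

# The DATE 'yyyy/mm/dd hh:mm:ss' form is the one that worked on both layers:
where_clause = "LASTUPDATE > DATE '2009/01/01 00:00:00'"
url = build_query_url(
    "http://myserver/ArcGIS/rest/services/Demo/MapServer/0", where_clause)
```

Running this just yields the URL you would otherwise paste into the REST query form, which makes it easy to try the two date formats side by side against each layer.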

Pointless Predictions

As the first post of the year you might be expecting a string of pointless predictions about cloud computing, three screens and maybe a slate. Unfortunately I’ve yet to obtain the job title of ‘Futurologist’ so I’ll leave that to those sorts of people, who probably should really be out there doing a real job, like a policeman, soldier or plumber. Hmm, I think I might need to shut up now and slink off with my Solution Architect business cards before someone outs me as a fraud.

Anyway, enough of my blathering. I hope you have had a happy new year so far and that your date searches will forever be successful.

Why Fiddler is like a Sonic Screwdriver

This post was initially going to be titled ‘Why Visual Studio 2008 is like a VW Bora’, in reference to the current fun I’ve been having with my Silverlight adventures. Unfortunately that particular episode will have to wait, although I might give it as my Ignite presentation at our company’s away day; watch this space, as I’ll only give it if we don’t get enough other presenters. Anyway I digress, or should that be ramble. The current post has been sparked by two things: watching Doctor Who episodes with my daughter, and fixing a simple connectivity problem in Silverlight.

Sonic What?

Just in case you’re not a Doctor Who fan, and I know it’s hard to believe that there are a number of them out there, or maybe you’re not British and haven’t been brought up with the good Time Lord for the last 40 years (give or take a few gaps), then a little explanation is in order about the sonic screwdriver. The sonic screwdriver is a plot device which allows the good Doctor to interrogate any system or machine: to find out information about it, fix it or damage it. Usually it’s just a flick of the wrist, a press of a button and a flash of light, and it’s job done, procedure complete, information acquired. Now unfortunately in the real world we don’t have a sonic screwdriver that can do this, but we do have Fiddler.

So what has this got to do with Fiddler?

Fiddler is a free tool that allows you to view the requests made from a browser (usually IE). It can be used to interrogate the requests and responses, and to give you a visual indication of what might be wrong with a particular application. Like a sonic screwdriver it can gather information just running in the background without any interaction, and it can tell you how web applications work at the browser level. This is useful for seeing why something might be failing in an application even when you don’t have access to the code, or for seeing what a system is requesting even though the code is not your own.

It lets you see exactly what information a website is sending to and receiving from the browser, which is very useful in an era where the logic lives in the browser as JavaScript applications or Silverlight/Flash programs. Here it is often useful to see the responses back from the server, or more importantly why a request is failing; often this comes down to security issues or cross-site access rights.

It’s screwdriver time!

Often the time you break Fiddler out to diagnose a problem is the same time that Doctor Who breaks out the screwdriver: system calls failing, applications not working properly once installed on a remote site, or map services not responding. Panic starts to set in, hours go by as you vainly try to debug the application, then someone (usually me) pipes up ‘have you seen what you get in Fiddler?’


Now looking into Fiddler is like staring into the Matrix: once you have done it enough you start being able to see the woman in the red dress even though the screen just shows the weird green dots. You can start to pick out anomalies in the calls, and Fiddler helps you: it highlights ‘file not found’ and other errors in bright red and shows cached items in grey, both of which can give an indication of possible problems.

Authentication issues or caching problems can both cause errors that can be hard to track down, can manifest themselves in esoteric ways and can end up burning a lot of time to diagnose. Fiddler can be your sonic screwdriver in those moments.

Divining a problem.

Now this post could equally be titled ‘How many times does a man need to forget to place a crossdomain.xml file on a web server before it stops being funny’, but it does serve as a good example of where Fiddler could have shown a problem well before the final issues were resolved. Crossdomain.xml and its Microsoft equivalent, clientaccesspolicy.xml, are used by Flash and Silverlight (which can use both) to tell their respective runtime environments that they can access services and data on a particular server. This is particularly important when a rich internet application needs to access services or files from a server other than the one it was initially served (downloaded) from. Without these files present at the root of the server, the request will fail, often with unforeseen consequences.
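As a quick pre-flight check (my own sketch, not something from the original post; the server name is hypothetical), you can look for those policy files at the server root before pointing a Silverlight or Flash app at a new server:

```python
# Sketch: derive the root policy-file URLs that Flash/Silverlight look for,
# then (optionally) probe them. The probe needs the server to be reachable.
from urllib.parse import urlsplit, urlunsplit
from urllib.request import urlopen
from urllib.error import URLError

def policy_urls(service_url):
    """Root URLs where crossdomain.xml / clientaccesspolicy.xml must live."""
    parts = urlsplit(service_url)
    root = urlunsplit((parts.scheme, parts.netloc, "", "", ""))
    return [root + "/crossdomain.xml", root + "/clientaccesspolicy.xml"]

def has_policy_file(service_url):
    """Return True if the server answers 200 for either policy file."""
    for url in policy_urls(service_url):
        try:
            if urlopen(url, timeout=5).status == 200:
                return True
        except (URLError, OSError):
            continue
    return False
```

Note that the files must sit at the root of the server, not alongside the service, which is exactly what `policy_urls` works out from the service URL.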

I must admit that, having gone through the story, the reason for the problems becomes obvious, but the feature layer was coming from another server which I had not used before and I had no idea that there might be an issue with it. It serves as a reasonably simple example nonetheless.

In my particular case I was using the ESRI Silverlight API in conjunction with the most excellent (and free!) Blacklight control library. My application was working without issue when I decided to add another layer into my map using a feature layer. I merrily put the lines of XAML into my page to add the feature class (note that server names have been changed to spare the innocent; any maps generated are done by actors).
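The original screenshot aside, the markup was along these lines (a sketch from memory; the server name, layer index and namespace mapping are stand-ins for the real ones):

```xml
<!-- Sketch only: a base map plus a FeatureLayer in the ESRI Silverlight API.
     "anonserver" and the layer index are stand-ins. -->
<esri:Map x:Name="MyMap">
    <esri:ArcGISTiledMapServiceLayer ID="BaseMap"
        Url="http://anonserver/ArcGIS/rest/services/Base/MapServer" />
    <esri:FeatureLayer ID="MyFeatures"
        Url="http://anonserver/ArcGIS/rest/services/Data/MapServer/0" />
</esri:Map>
```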

Having set up the right renderers and such like, I proceeded to run the application. Nothing came back; I checked the code, double checked, triple checked, but nothing. The base maps came up but no features on top. I then proceeded to add the new map service as a normal dynamic layer, to check there wasn’t something wrong with the FeatureLayer class.

Now this gave me an error in Visual Studio.

Doh, I thought. I had made the classic schoolboy error of not adding the cross-domain file onto the new server. Just waving Fiddler at the requests allowed me to divine the problem, much like (in my head anyway) the sonic screwdriver.

Now this is a good example of being able to see the problem in Fiddler even though the application wasn’t reporting any issues. Diagnosing problems in this way should often be the first port of call for any issue that doesn’t get picked up by Visual Studio, especially when calling remote services.

Adding the following cross-domain file to the server (it’s internal so it doesn’t require too much security, but don’t use this in the wild verbatim) allowed the application to work fine with both the dynamic layer and the feature layer. Problem solved, Daleks defeated (again, in my head).
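For reference, a minimal wide-open crossdomain.xml along those lines (again: fine internally, far too permissive for a public server) looks like:

```xml
<?xml version="1.0"?>
<!DOCTYPE cross-domain-policy SYSTEM
    "http://www.adobe.com/xml/dtds/cross-domain-policy.dtd">
<cross-domain-policy>
    <!-- Allows requests from any domain; acceptable only internally. -->
    <allow-access-from domain="*" />
</cross-domain-policy>
```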


Every time I use Fiddler it saves me time; the key is remembering to use it earlier in the problem solving! That way you can avoid burning time on things that have simple solutions and concentrate on the other time killers, like getting agreement on the UI design. As we know, everyone’s a critic!

So where do I get my screwdriver?

Whilst Fiddler is written by a Microsoft employee (Eric Lawrence) it’s not actually a Microsoft product per se, but there is some good information about how to use it here (msdn) and here (msdn).

If Fiddler isn’t your thing, or you just think that pointing a glowing blue light at a computer might be able to fix any communication problem even in the real world, then you can get your own from Amazon here. In fact, if I could get a Wi-Fi one, maybe I could hook them both up &lt;goes off to tinker in the shed&gt;.

The banana of doom or 404 with style.

I’ve been hyper busy in the bat cave (aka the garden office) with end of year projects, Christmas parties and general shenanigans (who sounds like an Irish military commander). It’s times like this when it’s good to see another site straining under the weight of usage, and an imaginative page showing that times are good yet the server load isn’t. Many sites (the tr.im one is to the left) now put up fun pictures when their servers are taking the strain. These pictures let you down gently, but thoroughly (no slow response, just no response), with a promise that coming back later will make everything alright.

We begin to rely on many web sites to provide instant responses to our every whim, and when they don’t it’s somewhat of a shock. We never expect a Google application to break, so when Gmail is occasionally down everyone gets all of a twitter about how terrible it is, especially given that for most people it’s a free service. The same goes when services such as Bing go down; TechCrunch said it well: ‘It’s one thing when startups, like Twitter, go down, which happens all the time. It’s another when a major search portal does it’.

In fact Twitter has one of the most recognisable site-unavailable images on the interweb: the fail whale. This charming graphic, often seen by people twittering via the website as the popularity of the site has grown, even has its own fan club (‘the fail whale fan club’) where you can buy mugs and t-shirts with the little whale plastered all over them. It’s amazing what a bit of imagination can do to endear people to what is actually a site failure (for whatever reason).

Under the hood

All these nice pages are doing is hiding the HTTP error codes that we have all seen emitted by our favourite applications, which, when not altered from their natural state, tend to make their way to our web browsers to be rendered with the minimum amount of eye candy possible. Delivering raw error messages back to the browser is a sign of a poorly developed application, and it’s just as bad to allow raw HTTP errors to fly around without even a little bit of window dressing.

This is a two-stage process:

Firstly, any web application or end point should emit the right codes when returning information, such as 200 for success and 404 if something isn’t found. REST services need to use HTTP codes appropriately to allow for the proper caching of information (through the use of conditional GET) and the identification of bad requests for information. The RESTful Web Services bible devotes an appendix to understanding which of the 41 HTTP codes are needed (go on, it’s an interesting read, honest guv!).

Secondly, the web server needs to be configured to return a nicely formatted and informative page, so that the user is calmed and reassured as their application goes down in flames. In IIS you can configure the error pages sent by a whole server or an individual site through the admin pages, and then craft the appropriate response that you wish your users to see.
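As a toy illustration of the two stages together (my own sketch, using framework-free WSGI rather than IIS configuration; the page text is invented), the trick is to keep the real status code while swapping in the friendly body:

```python
# Sketch: return a proper 404 status while serving a calming, styled body.
FRIENDLY_404 = b"<html><body><h1>Lost?</h1><p>Back soon, promise.</p></body></html>"

def app(environ, start_response):
    """Minimal WSGI app: real HTTP codes, friendlier error pages."""
    if environ.get("PATH_INFO") == "/":
        start_response("200 OK", [("Content-Type", "text/html")])
        return [b"<html><body>All is well.</body></html>"]
    # Machines (and caches) still see a genuine 404; humans see the whale.
    start_response("404 Not Found", [("Content-Type", "text/html")])
    return [FRIENDLY_404]
```

The point is that the friendly page and the correct code are not alternatives: the browser user gets the whale, while crawlers and caches still get an honest 404.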

There but for the grace..

Whilst it’s easy to mock a site that’s having scalability woes, especially if it’s run by one of the major internet companies (the Bing error page wasn’t too comforting at first), it only takes a simple post on Slashdot to bring all but the best designed or resourced sites to their knees. Hopefully now, if it’s one of your sites, at least you’ll look good on your knees and your customers will remain calm. If you’re stuck for ideas, Smashing Magazine has a list of cool 404 error pages that might give you some, even some off-the-wall ones such as my favourite:


Staking a claim.

It seems strange in this day and age that there is still a need to ‘stake a claim’ for a piece of digital landscape. We have often been warned about identity theft, but it’s often just other people with the same identity (i.e. name) as you that you need to defend against: theft of your name, or in fact the name you might want with a new service. I have no interest in being mtoon768562@hotmail.com or www.facebook.com/mattatoon/, and fortunately I’ve been using the internet longer than most other ‘Matt Toons’, which means I’ve got many of the important sites in the bag, so to speak.

Mr Toon I presume?

But it was interesting the other day to get an email from another Mr Toon, from Canada, who had ordered some online goods but unfortunately seemed to have used my username as the registration email. Now this could have been more serious, but the guy was upfront in emailing me, so the transaction seemed to be above board. Luckily my name isn’t so common as to have a legion of people trying for the same ‘nick’, so to speak, but I can imagine that Mr David Jones (internet number 1) isn’t so lucky, and this cross-pollination of emails could become annoying.

What’s in a name?

Now it might not seem a big deal whether your name is exactly right, or that you have to add a few numbers after it if you want a particular tag (I could always be mtoon, or m.toon, or matt.toon, or another combination which I actually use). But when you spend a lot of time sharing information in a professional manner on sites such as LinkedIn, or even spending time in more social arenas such as Twitter, Facebook or online games such as World of Warcraft, then the amount of time you might invest in creating an online persona might be guided by the initial name that you are able to obtain.

The choice of name for a web site can be important too; the number of dead or dormant websites with really cool names that have holding pages or are owned by a ‘holding company’ means that the choice of a good site title (especially in the .com domain) can be increasingly difficult. The right name can associate people with the ‘brand’ of a site, or give people an instant understanding of the message that you’re trying to convey (I don’t think a site such as www.mymapspeedsucks.com is really setting up the right aspirations, although if you want it, you can buy it off me for a small fee).

Claim your site

Anyway, the main reason for this post was that in securing another part of the ‘digisphere’ for myself I was required to ‘claim’ my blog. Interesting, as I can’t imagine anyone else wanting to claim it, but still, part of the deal when signing up for many of these places is the addition and then the verification of ownership of a blog, in this case with Technorati. Whilst I don’t know the actual real-world benefit of adding your blog to sites such as this, as well as others such as BlogCatalog, the process is usually the inclusion of a token within a page (or some JavaScript, as is the case with sites such as Google Analytics). In the case of Technorati the code was 74EZG2J6XH2J. Now I’m sure this could have been a simple short post which I could have deleted (I’ll probably delete the code after a while anyway), but once you get writing then sometimes you can’t stop.

Back to the more technical stuff next time.