The Lure of Easy

The other day I built a computer almost from scratch. I can admit it, I can nerd it with the best of them when pressed; OK, I don’t even need to be pressed. I had a bunch of components lying around: a not-too-old processor, a bunch of fast RAM and a laptop hard drive. All I needed was a case, which was easy to rectify, as I’ve always fancied building a little PC and Shuttle do some excellent barebones machines. Now the premise of this post is not the coolness of my new computer (although it is quite nice) but how easy it was to build.

When I was Young.

When I was young and the ‘internet was all fields’ I remember building many a machine, both in and out of work. I remember saving my cash for the components, carefully making sure I didn’t bend anything when I slotted processors into motherboards and affixed strange-looking fans to the top. I remember screaming when one of the components didn’t work and the whole machine failed to boot. I remember returning complete orders and vowing never to build another computer again. But the lure is too much for some things, and time can heal all wounds, even those inflicted by bad memory modules.

Now, whilst I was away from the field of home-brew machines a number of things happened: component prices dropped, hardware became much more modular and available, I acquired an electric screwdriver (my only power tool, I might add) and small ready-made machines with integrated motherboards appeared at every online store. What does this add up to? An ability to assemble a machine in under 30 minutes, from start to finish. I was shocked; surely it must be harder than this? After a brief moment of screeching from the machine, as I had forgotten to plug in the graphics card power supply, I was up and running, installing Ubuntu (it’s free, damn you, and until I know the machine is stable I’m not putting Windows on it!) and hooking it up to the ‘interwebs’.

Now the question arises: if it’s so easy, why would I not recommend building all the machines I own or use at work? I’d be able to save money and tinker with hardware, what’s not to like?

So easy is good, right?

If pushed I could probably build a wall, but would I want it to support my house? Probably not until I’d had a lot of time building walls, maybe not even until the 10,000 hours needed to become an expert had passed. It’s the same with my new PC. Would I use it to store my family’s photos? No, I use a RAID disk set for that, and the cloud (hmm, I do trust them, right?), as I’m unsure that the machine I threw together would keep working for a long time. I find the same to be true in designing and developing applications.

Components, development tools and platforms have come a long way since the internet fields were paved over, and with that have come rapid prototyping, rapid development and easy deployment. It’s now possible, with the use of wizards and samples, to throw a demo together in a very short period of time, like the construction of one of these modern barebones PCs. Lots of development is easy, but because you can throw something together it does not mean it will be robust and stable. Because I was able to build one machine quickly it doesn’t mean I will have the same luck again, or that my machine, with its mismatched components, will not let me down when I need it most, like watching Snog, Marry, Avoid on iPlayer!

It’s the same with code developed quickly: technical debt will often lead to decisions being made that could impact the delivery of a system down the line, whether due to difficulties in refactoring or failure to run performance tests on software during development. For demo purposes technical debt might not matter, as the code might never need to see the light of day beyond the demo, although the consequences of showing functionality that might be hard to implement reliably can live on to haunt a project in the future. Lobbing technology bombs between pre-sales and professional services is always something that should be avoided, for good profitability reasons.

The Cloud Lure.

The cloud is another case of easy. It sells itself as a way to remove yourself from the burden of machines: your application can scale so long as you have the money to pay for it. Again, like the 30-minute machine build or the quick copy-and-paste development job, nothing is as easy as it seems, and even though the lure is there, careful planning still needs to go into architecting any system, especially for those cloud platforms that serve to emulate a real system. In a world where your application isn’t tied to a specific machine you need to be careful what you can trust: are you getting data from a machine that knows about your updates, or another machine that is just handling your request at that point in time? As your application scales to multiple worker or web processes in an environment like Azure or App Engine, how do you make sure everything is tied together?
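The stale-read trap is easy to demonstrate. Here is a tiny Python sketch (the Worker class and the dict acting as the store are invented purely for illustration; in Azure or App Engine the shared store would be table storage, a database or memcache): two simulated web workers share a backing store, but each also caches in its own process memory, and anything held in-process is invisible to the other instance.

```python
# Two simulated web workers that share one backing store. Each also
# keeps a per-instance in-memory cache: the thing that silently breaks
# once an app scales beyond a single instance.

class Worker:
    def __init__(self, shared_store):
        self.shared = shared_store  # visible to every instance
        self.local = {}             # visible only to this instance

    def write(self, key, value):
        self.local[key] = value   # fast, but only this worker sees it
        self.shared[key] = value  # the copy every worker can see

    def read_local(self, key):
        return self.local.get(key)

    def read_shared(self, key):
        return self.shared.get(key)

store = {}
a, b = Worker(store), Worker(store)
a.write("user:42", "updated profile")

print(b.read_local("user:42"))   # None: b's memory never saw a's update
print(b.read_shared("user:42"))  # the shared copy is the one to trust
```

The lesson is the one above: once requests can land on any instance, per-machine state is a cache at best and a lie at worst.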

Understanding how applications run in the cloud will still be needed, in order to utilise existing or still emerging patterns of development, such as those in the O’Reilly Cloud Application Architectures book or those being developed by Microsoft on their patterns and practices site for Azure. There is no magic going on here: fundamentally, threads must be mapped to processors somewhere, hardware has to do some work and then notify other machines about what has gone on. How you handle this in any deployment, and how efficiently, will impact the performance of any system and solution.

Deploying applications into the cloud will be as complex as deploying applications onto any set of machines; the complexity might be more software focussed, relying less on an understanding of processor specs and more on an understanding of the best practices for writing scalable applications, such as those provided by Google for App Engine.

Easy Come, Easy Go.

When I heard David Chappell (the IT speaker, not the comedian) say the phrase ‘there is no lock-in like cloud lock-in’ I realised that whilst there is much promise in cloud computing, it still needs to be treated like any other system. Badly written and badly architected solutions will not magically perform in the cloud, and will always cost you more in the end than those that are optimised for performance and tested for scalability.

The cloud allows us to abstract ourselves from some aspects of deployment, but at the cost of making the software we deploy possibly more complex. As tooling and patterns become established we will be able to benefit from the power offered to us by a service we can build and deploy within 30 minutes; just don’t bet your mortgage that it will be up in the morning simply because it’s in the cloud.

Install as I say not as I do.

As we all know, the pace of change in technology shows no sign of abating, for good or ill. In software terms it’s a continual moving walkway of new patches, versions and features, usually for the better, sometimes not so. I’m both lucky and cursed to be able to install a wide variety of new software where I work, and at this moment I’m installing a beta of ArcGIS 9.4 (or 10 as it will soon be) onto a new copy of Windows Server 2008 R2. I’ll soon be downloading and installing a copy of Visual Studio 2010 onto that virtual machine as well. Lucky, eh? Well, yes and no: lucky because I get to try out new technology as it comes out, unlucky as I’m sure there will be a whole host of frustrations about bugs and workflow changes that will eat time along the way.

This is good, right?

When you see a new technology being released, usually as part of an existing product you use, it can be tempting to upgrade as soon as possible. When you’ve been working on that technology for a while, at the cutting edge so to speak, you want to tell people how good it is. The problem comes when the technology you use is not actually supported for the applications running on it. Sure, it might work, even if you have to spend all night tinkering with the registry, but without support you’re on your own (or at the very best, it’s you and a forum of people!).

There is also a propagation of new and cool: as people install the newest and shiniest software, others do too, and as successes mount people come to believe that because it works it is also supported. This is definitely not the case, especially with server software.

Windows Server 2008 R2 and ArcGIS 9.3.1

I like Windows Server 2008 R2 in the same way as I like Windows 7; they have the same heritage, the main point being that they are not based upon the same core as Vista. Where possible I’ve upgraded all of my servers to this release; all of those servers, I mean, that do not run ArcGIS 9.3.1. Why, if it’s so good? Well, because it’s not on the magic list. “What magic list?” I hear you ask; this one. The image below shows the list of platforms supported by ArcGIS 9.3.1. Look through it; notice, no R2.

[Image: the supported platforms list for ArcGIS 9.3.1, with no mention of Windows Server 2008 R2]

Now, there are people who have no choice but to install on a new system such as R2, where purchasing or machine suppliers can’t provide a copy of non-R2 Windows 2008 or Windows 2003. In these cases, such as the one given here, installation can take a lot longer than it would have on a supported operating system, even if some people seem to have an easier time installing it than others.

Now, I get quite a lot of questions about which of the Microsoft operating systems is the best to install ArcGIS Server on. I used to say Windows 2003, as I felt at home in the IIS6 manager and used to get lost in the new IIS7 manager, but now that I have my head around it I stick with recommending Windows 2008. I never recommend the use of desktop systems for anything more than brief testing (I do development against ArcGIS Server from a desktop machine, in my case Windows 7, but I try never to install server software on my development machine if I can help it). Doing this gets you into good habits and doesn’t lead you into the problem of serving out large caches of data to an organisation using Windows XP’s crippled IIS 5.1 (yes, I have seen it happen, and no, I don’t encourage de-crippling through registry hacking).

Remember, by its very name ArcGIS Server is a server, not a desktop product, and friends don’t let friends install servers on desktops. Until I hear otherwise from places like here and here, I for one won’t be recommending R2 for ArcGIS 9.3.1 (and nor should other people be encouraging it!). If you have to, then good luck; I’ll try not to sail in your boat.

Install as I say not as I do

So, to sum up: when people blog about software working together, they are often only giving their opinion about how it has worked for them. They might be able to give you advice about how it might work for you, but when your production system goes down in the middle of the night because your versions were not certified, I can guarantee they probably won’t be coming round to explain the shortcomings to your boss.

When I say here that I’ve seen ArcGIS Server 9.3.1 running on Windows Server 2008 R2, don’t assume that it’s supported when the support site says that 2008 R2 isn’t supported for 9.3.1. If you want to go ahead and do it, it’s a free country, but don’t expect the support department to lose sleep over your downtime.

Sure, it’s nice to try new software out once in a while, and even to install beta products to work out how they tick, but when money is on the line, take some advice from someone who has been there before and be conservative with your software installs. If it’s a production system then play it safe, so you don’t need to employ a ‘cleaner’ to remove the mess.


Anyway, I’m off: my installs are done and I have beta software to make work.

The web world is (mostly) flat.

It might come as a shock to many of you, but often on the internet the world is flat. Yes, I know you thought this whole debate had gone out with the ark (or actually a little later), but after years of coming to terms with the world being a sphere, cartographers everywhere needed a method of putting that world down onto paper.

Now this was fine for many years, until on 2nd May 2000 the US decided to turn off Selective Availability and the whole world was seemingly brought into WGS84. The difficulty here, though, is that WGS84 longitude and latitude coordinates are geographic coordinates, an approximate representation of a position in the real world, whereas many maps, paper and web-based alike, are representations of a flat world using Cartesian coordinates.

A world of BNG

Obviously as a GIS professional you know all this, although, come to think of it, a lot of people in the UK probably don’t. Why, I hear you cry? Because the Ordnance Survey, in its role as national owner of all things spatial in the UK, decided that we needed a more accurate representation of the surface of the United Kingdom (fortunately we are not a large country, well, unless you measure it in ego terms, so this can work quite well for us). In doing so they created the British National Grid (BNG), which allows them to produce all of their excellent paper maps (and not paper globes, which would be inconvenient for packing in your rucksack).

Now, why do I mention this? Well, for the first three years of my work at ESRI(UK) I touched nothing but data in the BNG projection; it was only when I needed to implement a solution using GPS data for tracking refuse trucks that I came up against the need to re-project data between two coordinate systems: WGS84 (for GPS) and BNG (to plot on the web map).

Whilst BNG is the preferred coordinate system for many users of OS data, amongst many web developers who have come across mapping via the various web offerings of Microsoft, Yahoo and Google (or indeed in the past with ArcWeb Services) the coordinate system they use will be Web Mercator. In fact in the UK there is a chance now that more people use the Web Mercator projection than the BNG projection, a fact that shouldn’t be lost on people. So where was I? Ah yes setting the stage for my problem.

What was I doing?

The reason I came to this post was that I was implementing a little demo for a colleague to show a GeoRSS feed of earthquakes (what I term a ‘classic feed’, as it appears in every demo) on top of the Bing map service, both within the ArcGIS Silverlight API. I thought this would be no problem, as I knew of two existing ESRI demos with the functionality I required: the GeoRSS layer sample from Morten, and the Bing sample from the ArcGIS Silverlight concepts section.

Now, using my advanced skills in copy/paste I managed to get a map that looks like this (image right): the classic ‘everything is off the equator near Africa’ map. Hmm, I think; when I was using the old ArcGIS Services map (as per the sample) I could get an image like this (see below).

Ah-ha, I think, something is up in the state of projections. How can I request the GeoRSS feed in another projection, in this case Web Mercator, rather than the long/lat WGS84 projection that was given as standard?

The answer to my ponderings was that GeoRSS defines the returned format as having to be WGS84, and in order to place it on top of my Bing map I would have to re-project the data myself. Fine, I thought, I know how to do that; ArcGIS Server doesn’t have a Geometry Service for no reason, you know.

Chatting with the Geometry Service

With the implementation of the Geometry Service within ArcGIS Server it’s been very easy to re-project data between coordinate systems; you can find the documentation to do this here. This is good, as it allows you to project between many different coordinate systems, and since it’s all server bound you don’t need to pollute your client with any algorithms.
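For the curious, the call is a plain REST request to the Geometry Service’s project operation. A rough sketch of building one in Python follows; the server name is a made-up example, the URL path is the standard /arcgis/rest/services/Geometry/GeometryServer layout, and 102113 was the Web Mercator well-known ID of that era (see the note on the various Web Mercator SRSs a little further on).

```python
import json
from urllib.parse import urlencode

def build_project_request(base_url, points, in_sr=4326, out_sr=102113):
    """Build the GET URL asking a Geometry Service to re-project
    `points` (a list of (x, y) tuples) from in_sr to out_sr."""
    geometries = {
        "geometryType": "esriGeometryPoint",
        "geometries": [{"x": x, "y": y} for x, y in points],
    }
    params = urlencode({
        "geometries": json.dumps(geometries),
        "inSR": in_sr,    # WGS84 long/lat
        "outSR": out_sr,  # Web Mercator (one of its several WKIDs)
        "f": "json",
    })
    return base_url + "/project?" + params

# 'myserver' is a placeholder host, not a real machine.
url = build_project_request(
    "http://myserver/arcgis/rest/services/Geometry/GeometryServer",
    [(-0.1275, 51.5072)],  # London, in WGS84 long/lat
)
```

Issuing a GET on that URL returns the projected geometries as JSON, which is flexible but, as noted below, means a round trip to the server for every batch of points.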

It should be noted here that, over time, there have been a number of SRSs used to define Web Mercator. An email on the boards that explains the differences between the numbers used can be found here (see the post by Melita Kennedy).

The problem you have is that it can be rather inefficient to get points into the client and then send them to another service to project, get the data back and then have to display that in the map. This solution might be the most flexible (as it could possibly handle any projection required) but it leaves a bad architectural taste in the mouth.

I put it to the back of my mind, in the folder entitled ‘architectures to use when other methods fail’, and went on thinking about how else it might be done: back to math (add an ‘s’ for UK readers).

An Algorithm

I’ve always wondered what the actual algorithm for projecting between WGS84 and Web Mercator is, and as this was a demo I didn’t have to be too careful about accuracy. I once again used my Google brain to come up with the following link, which contained a Python script with the algorithm I required, and which, when converted to C# (not too hard), looks like so:

[Image: C# version of the WGS84 to Web Mercator conversion]
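The maths in question is the standard spherical Web Mercator forward projection; sketched here back in Python (the function name is mine, not from the original script):

```python
import math

# Half the equatorial circumference of a sphere of radius 6378137 m:
# Web Mercator maps the world onto a square 2 * ORIGIN_SHIFT on a side.
ORIGIN_SHIFT = math.pi * 6378137.0  # ~20037508.34 m

def wgs84_to_web_mercator(lon, lat):
    """Project WGS84 degrees to spherical Web Mercator metres."""
    x = lon * ORIGIN_SHIFT / 180.0
    # Mercator: y = R * ln(tan(pi/4 + lat/2)), scaled to metres
    y = math.log(math.tan((90.0 + lat) * math.pi / 360.0))
    y = y * ORIGIN_SHIFT / math.pi
    return x, y
```

Feeding in (0, 0) gives (0, 0), and longitude 180 maps to x of roughly 20037508.34, the edge of the Web Mercator square; latitudes beyond about 85.05 degrees fall off the map entirely, which is why Web Mercator maps never show the poles.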

When I placed this into my GeoRssLoader.cs class file (see the GeoRSS Silverlight sample again), the points were placed in the correct position; see the map below.

Sorted, I thought, but then I got thinking: surely this is an amazingly common task that everyone is doing. GeoRSS is very popular, and there are one or two ESRI developers out there (I know, I have 250+ of them coming to see me present at our Developer Hub Conference next week).

If you’re interested in converting between WGS84 and OSGB36 then this link should be handy; if I get time I’ll knock up a C# class doing it and post it on this site somewhere.
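Until that class appears, the gist of the conversion is a 7-parameter Helmert datum shift between the WGS84 and Airy 1830 ellipsoids. Here is a rough Python sketch, with parameters from the OS coordinate-systems guide; it is accurate to a few metres at best, nothing like survey grade (the OS's grid-based shift is the real thing), and it stops at OSGB36 latitude/longitude rather than going on to BNG eastings and northings.

```python
import math

def _to_cartesian(lat, lon, a, e2):
    """Geodetic degrees (height 0) -> earth-centred XYZ metres."""
    phi, lam = math.radians(lat), math.radians(lon)
    nu = a / math.sqrt(1 - e2 * math.sin(phi) ** 2)
    return (nu * math.cos(phi) * math.cos(lam),
            nu * math.cos(phi) * math.sin(lam),
            nu * (1 - e2) * math.sin(phi))

def _to_geodetic(x, y, z, a, e2):
    """Earth-centred XYZ metres -> geodetic degrees (iterative)."""
    p = math.hypot(x, y)
    phi = math.atan2(z, p * (1 - e2))
    for _ in range(10):  # converges in a handful of iterations
        nu = a / math.sqrt(1 - e2 * math.sin(phi) ** 2)
        phi = math.atan2(z + e2 * nu * math.sin(phi), p)
    return math.degrees(phi), math.degrees(math.atan2(y, x))

def wgs84_to_osgb36(lat, lon):
    """Approximate WGS84 -> OSGB36 datum shift (small-angle Helmert)."""
    a1, f1 = 6378137.0, 1 / 298.257223563      # WGS84 ellipsoid
    e2_1 = f1 * (2 - f1)
    a2, f2 = 6377563.396, 1 / 299.3249646      # Airy 1830 ellipsoid
    e2_2 = f2 * (2 - f2)
    # Helmert parameters, WGS84 -> OSGB36 (metres, ppm, arc-seconds)
    tx, ty, tz = -446.448, 125.157, -542.060
    s = 20.4894e-6
    rx, ry, rz = (math.radians(r / 3600.0)
                  for r in (-0.1502, -0.2470, -0.8421))
    x, y, z = _to_cartesian(lat, lon, a1, e2_1)
    x2 = tx + (1 + s) * x - rz * y + ry * z
    y2 = ty + rz * x + (1 + s) * y - rx * z
    z2 = tz - ry * x + rx * y + (1 + s) * z
    return _to_geodetic(x2, y2, z2, a2, e2_2)
```

For a point in London the shift works out at roughly a tenth of a kilometre, mostly in longitude, which is exactly why GPS points plotted raw onto a BNG map look subtly but consistently wrong.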

The Easy Way

So I was showing my solution to another colleague of mine, all impressed that I could do it using the power of math, when he said that the JavaScript API has a function for it built in, called esri.geometry.geographicToWebMercator(). Gah! I thought, all these JavaScript dudes have it so easy; no one ever creates one of these for us poor Silverlight chumps. Well, actually, they do.

Hidden away in the ESRI.ArcGIS.Client.Bing.Transform class are the following two methods:

[Image: the two transform methods]

Both of these are much simpler than including your own projection algorithm within your code; why reinvent the wheel, after all? As we can see, there are always many ways to do things with technology, and it’s often the simplest one that escapes notice when you’re thinking through a problem. That said, there are merits to all the approaches, be it flexibility (the REST service), transparency (the algorithm) or simplicity (the existing class); the choice, as they say, is yours.