Google Geospatial Infrastructure for Everyone

There was once a time when the organisations able to serve and process global maps, imagery and car-mounted camera pictures for the world could be counted on one hand. Before that, niche providers of complicated mapping solutions guarded those services behind an increasingly complex and expensive set of products that only government agencies could afford at scale. Things have changed rapidly over the last few years, powered by easy and inexpensive access to cloud-based server and storage solutions that put the creation and maintenance of a global map base within the reach of modestly sized companies. Rich base maps and image layers are now more of a commodity, and plotting masses of features on a map can be done with a little training and even fewer resources.

Whilst low-cost virtual machines and storage can reduce companies' costs when lifting and shifting computing from on-premises to cloud-based services, they still function in the same way, often need the same methods of scaling and require the same number of operations people to maintain. So whilst scalability might be within everyone's reach, the speed of operations can slow application development down. In a world where mobile applications and viral games require solutions to be developed and deployed within weeks if not days, using the spatial systems and solutions of the past would mean most applications shipping without any spatial functionality at all.

Fortunately, cloud-based solutions have developed beyond merely emulating hardware and storing bits into higher-order services that are easier for developers to access. Because they operate via web services they reduce the time cost of integration, and because they require little or no operational involvement they reduce the dependency on the manpower that previous systems demanded.

Using Google services such as Cloud Pub/Sub (scalable real-time messaging) and BigQuery (petabyte-scale storage and analytics), merged with data from the Maps APIs, it's possible to build a highly scalable telemetry solution that can store, process and analyse data from hundreds of thousands of vehicles or devices without the need for any servers (virtual or otherwise) or even any geospatial software. Whilst these products may look like they emerged fully formed, functionality such as this has been used internally within Google for many years. Google has published papers about how its data is processed (MapReduce), queried (Dremel) and managed on millions of servers (Borg), which has allowed other implementations to be created based on these architectures, giving more people access to solutions that were once only possible within Google (and even leading to improvements in Google's own infrastructure). External implementations such as Apache Hadoop (MapReduce), Apache Drill (Dremel) and Kubernetes (Borg) can now be accessed by everyone and managed at a developer level through a web console or API from both Google and other third-party cloud providers. The popularity of the hashtag #GIFEE (Google Infrastructure for Everyone Else) shows that customers large and small can now use many of the infrastructure services that Google has relied on to scale multiple services to billions of users as it has grown.
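As a concrete sketch of the telemetry idea, the snippet below shows what a single vehicle position report might look like on its way to a Pub/Sub topic. The field names and schema are purely illustrative assumptions, not a published format; a real system would publish these bytes with the `google-cloud-pubsub` client and stream them into BigQuery.

```python
import json
import time

def make_telemetry_message(device_id, lat, lng, speed_kmh):
    """Encode one vehicle position report as the JSON bytes that
    would be published to a Cloud Pub/Sub topic. The field names
    here are illustrative, not a fixed schema."""
    payload = {
        "device_id": device_id,
        "timestamp": int(time.time()),
        "lat": lat,
        "lng": lng,
        "speed_kmh": speed_kmh,
    }
    # Pub/Sub message bodies are raw bytes, so serialise to UTF-8 JSON.
    return json.dumps(payload).encode("utf-8")

def decode_telemetry_message(data):
    """A subscriber (or a pipeline writing rows into BigQuery)
    would decode the bytes back into a dict."""
    return json.loads(data.decode("utf-8"))

msg = make_telemetry_message("bus-42", 51.5074, -0.1278, 31.5)
print(decode_telemetry_message(msg)["device_id"])
```

Because both ends agree only on a small JSON contract, hundreds of thousands of devices can publish concurrently without the producer ever knowing how many consumers, or servers, sit behind the topic.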

Looking back at the article I wrote last year, and at the companies that have continued to make waves in the use of geospatial functions, I suppose you wouldn't have guessed that one of the biggest splashes would come from a game whose origins lie in the 1990s. Whilst people might focus on the visual elements of the game, the real power is that at its heart it has millions of people playing in real time against a geospatial backend, something that wouldn't have been possible or affordable for most companies even three years ago.

One of the most interesting by-products of this was the rise of helper apps built by people to share information about the game. Applications that can support 500,000 users, backed by thousands of geospatial queries per second, on a set of cloud services costing around $100, were the stuff of fantasy a few years ago. These solutions mean a company can build a game that goes viral and scales to millions of users, and a cottage industry of developers can ride the wave of that success with applications costing a few hundred dollars, apps which might not have a long shelf life but can capture 'the moment'. That is powerful, and affordable, for all companies in the future.

Google Geospatial Infrastructure for Everyone Else

In the same way that Google has released information about how it built a highly scalable infrastructure, it has also released information about how it processes and stores information about the real world. Implementations such as the S2 geometry library have been used by many companies (Foursquare, Yelp, Uber) to build and scale massive geospatial solutions, as well as by many parts of Google itself. Publicly available solutions such as Google Earth Engine have been used to produce high-resolution cloud-free images of the globe, and are available to many academics around the world. Over time we have also seen products like Street View used to identify house numbers, to better support geocoding and place searching within Google Search and the Maps APIs. Just as Google's infrastructure has become increasingly available to anyone, so has the ability to do the same spatial or imagery-related tasks that only Google or governments could have done in the past.
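The core trick S2 popularised is giving every patch of the planet a hierarchical cell id, so that nearby points share a common prefix and "find things near here" becomes a cheap prefix scan over an ordinary sorted index. The sketch below is not the real S2 algorithm (which projects points onto a cube and orders cells along a Hilbert curve); it is a deliberately simplified lat/lng quadtree, just to show the hierarchical-id idea.

```python
def cell_id(lat, lng, level):
    """Return a quadtree cell id string for a point at a given level.
    Each character refines the cell by a factor of four, so nearby
    points share a common prefix -- the property S2 exploits for
    indexing. (The real S2 projects onto a cube face and orders cells
    along a Hilbert curve; this lat/lng quadtree is only a sketch.)"""
    lat_lo, lat_hi = -90.0, 90.0
    lng_lo, lng_hi = -180.0, 180.0
    digits = []
    for _ in range(level):
        lat_mid = (lat_lo + lat_hi) / 2
        lng_mid = (lng_lo + lng_hi) / 2
        quad = 0
        if lat >= lat_mid:
            quad |= 1
            lat_lo = lat_mid
        else:
            lat_hi = lat_mid
        if lng >= lng_mid:
            quad |= 2
            lng_lo = lng_mid
        else:
            lng_hi = lng_mid
        digits.append(str(quad))
    return "".join(digits)

# Two points under a kilometre apart share a long prefix, so a prefix
# scan over stored ids finds nearby features without geometry maths.
a = cell_id(51.5074, -0.1278, 16)   # Trafalgar Square (approx.)
b = cell_id(51.5033, -0.1196, 16)   # London Eye (approx.)
```

For production use the open-source S2 library itself (or wrappers such as `s2sphere`) is the right tool; the point here is only why a flat key-value or columnar store can answer spatial queries at all.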

Getting meaning out of data has always been one of the key tasks of any analyst, be the data geospatial or imagery. As the size, complexity and temporality of data has increased, we can now store it but increasingly need help deriving the meaning. Once that is done, services can be provided to many users, and the faster and more relevant the information, the more useful it is. Take the identification of house numbers as an example. In the past, bespoke systems running on clusters of servers might have been needed; now you can get all of this behind a single API. The Cloud Vision API provides easy access to this sort of analysis: from a single call you can extract all sorts of information from an image, from Optical Character Recognition to Landmark Detection, with potential uses in a wide variety of applications, all of which can be tagged with location.
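To make the "single API" point concrete, here is a sketch of the JSON body for the Cloud Vision `images:annotate` REST endpoint, asking for OCR and landmark detection in one request. The bucket path is a hypothetical example, and the exact request shape should be checked against the current Vision API documentation before use.

```python
import json

def build_vision_request(image_uri, feature_types, max_results=10):
    """Build the JSON body for the Cloud Vision `images:annotate`
    REST endpoint. Feature type names like TEXT_DETECTION and
    LANDMARK_DETECTION come from the public API; verify the shape
    against the current docs before relying on it."""
    return {
        "requests": [{
            "image": {"source": {"imageUri": image_uri}},
            "features": [
                {"type": t, "maxResults": max_results} for t in feature_types
            ],
        }]
    }

body = build_vision_request(
    "gs://my-bucket/street-photo.jpg",   # hypothetical image location
    ["TEXT_DETECTION", "LANDMARK_DETECTION"],
)
# This body would be POSTed to
# https://vision.googleapis.com/v1/images:annotate with an API key.
print(json.dumps(body, indent=2))
```

Everything that used to need a cluster (training data, models, serving) is hidden behind that one POST; the caller supplies an image and gets structured annotations back.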

Building applications at the scale of Google is now within reach of many companies using cloud-based solutions, with the complexity of machine learning hidden behind simple APIs. The functionality once used to launch many of Google's geospatial products can be enabled from web-based consoles and integrated into applications through a single platform. Google's infrastructure is now available to everyone, and more and more of Google's geospatial analytics infrastructure can be accessed through a web service by any developer, for use in any application.

Who should I support?

There are many reasons why a person supports a football (read soccer, for our US friends) club. Often these reasons come from the heart, based upon where a person was born, or who had the nicest kit, or Manchester United. I support Crystal Palace, so I'm a sucker for punishment, but in an amazing turn of events they managed to get promoted this year. In honour of the single season they are probably going to have in the top flight of English football, I decided to take all of the emotion out of supporting a team and wrote a demo that uses pure distance to determine who to support.

There are many people around the globe who watch English football, and many of them wonder who they should support, often influenced by factors such as whether a team is successful or plays attractive football. Whilst these are valid considerations, one could argue that it is equally important to support your local team. For many of us this might be obvious, but for a person in Tokyo, Miami or Cairo the choice is less clear. Well, no longer: I give you the Who Should I Support site, a site that takes the emotion and chance out of supporting a club.
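The "pure distance" logic behind a demo like this is just a great-circle calculation against a table of grounds. This is a minimal sketch, not the site's actual code: the stadium coordinates are approximate, and a real version would cover every club in the league.

```python
from math import radians, sin, cos, asin, sqrt

# Approximate stadium coordinates (illustrative, not authoritative).
CLUBS = {
    "Arsenal":           (51.5549, -0.1084),
    "Liverpool":         (53.4308, -2.9608),
    "Manchester United": (53.4631, -2.2913),
    "Crystal Palace":    (51.3983, -0.0857),
}

def haversine_km(lat1, lng1, lat2, lng2):
    """Great-circle distance between two points in kilometres."""
    lat1, lng1, lat2, lng2 = map(radians, (lat1, lng1, lat2, lng2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lng2 - lng1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

def who_should_i_support(lat, lng):
    """Return the club whose ground is nearest to the given point."""
    return min(CLUBS, key=lambda club: haversine_km(lat, lng, *CLUBS[club]))

# A fan in Croydon, south London, gets the only sensible answer:
print(who_should_i_support(51.3762, -0.0982))  # → Crystal Palace
```

No emotion, no chance: the nearest ground wins, wherever in the world you happen to be standing.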

This week saw the release of the Google Maps Engine API, which adds some key new functionality to the Google Maps Engine platform for querying geospatial data. Maps Engine allows you to upload massive amounts of data into the Google cloud; the API provides key functions for querying and editing that data, for building applications like this demo, or for integrating into internal systems or mobile applications with the same security and scalability that comes with all of Google's products.

I also wanted to use this demo to look at how easy it is to integrate Google Maps with Twitter Bootstrap, which lets developers easily add responsive design to their web development. The answer is: it's easy, once you add a few tweaks from Stack Overflow.

Anyway I hope you enjoy the demo and have fun following your new club!


I wrote another article.

It’s been 11 years since I last dipped my toe into the magazine world, and the IT and GIS industry has changed beyond recognition in that time. It was before Google had even entered the mapping world, before Facebook, YouTube and OpenStreetMap. Just thinking about this makes me feel old; anyway, enough of my brooding.

Having moved from the Old World to the New World, it's been interesting to see how the geospatial world has evolved, and amazing how my JavaScript skills seem to keep me in employment. Whilst consumers have 'got' geospatial, and there are many, many applications showing this, businesses still consider geospatial to be a dark art. Yet many organisations already have an army of geo-literate people checking in and sharing location; they just need the right tools to unlock those skills.

So the article 'what I wrote' deals with the tools and technologies that Google has built, and continues to build, that allow organisations to unlock the inherent geospatial literacy many people already have from their personal use of technology: from familiar data sets and cartography, to solutions that work simply on consumer devices without the need for expensive legacy systems, to true cloud-based geospatial data stores that use the same sharing and access technologies people are used to in Gmail or Google Docs. I hope you enjoy it; maybe I'll write another one in 11 years' time.

Pinpoint 2012

We recently held our yearly Google business conference in London. It's where people, many of whom know geospatial only as something they use day to day on their mobile phone, can come and see what we are doing on the business side of geospatial, especially around some of the new products we are bringing to market. So without further ado, you can view the videos here, most importantly almost 30 minutes of me ;-). Anyway, there were lots of other people talking as well, enjoy!


A few Google Geo Presentations

It’s been a busy year so far and therefore my posting has been woefully slow. Much of my time has been spent working with Google Earth Builder. This provides a new way of distributing geospatial information from Google into all sorts of clients, providing a true geospatial platform as a service without the need to worry about how many servers or virtual servers you need to support your clients, allowing GIS experts to worry about geospatial information science rather than geospatial information systems. There will be more information coming out about this platform over the next few months, and I hope I can talk to as many people as possible about how they might use it to share their data and collaborate with their clients, removing the complexity, headaches and expense of current geospatial systems. If you want to know more, watch the Where 2.0 video below.

More Google Geo Goodness

Where 2.0 wasn’t the only recent conference; last week at Google IO 2011 there was a veritable plethora of geospatial talks on all aspects of Google Geospatial. Two talks stood out for people who might be using and sharing geospatial data: the first about Google Fusion Tables and the second on the surprises of the Maps API. The latter gives a good overview of some of the new geospatial functions across the whole Maps API that you might have missed over the last year. Geospatial at Google is an ever-increasing area that touches lots and lots of products which are easy for non-geo experts to use and implement, increasing the reach of geospatial data to more people, and this talk is a great intro to all of them.

Fusion tables, GIS for normal people.

Fusion Tables continues to add more and more functionality that allows the geo-prosumer to create and share spatial data in record time. One of the main additions has been extra styling functions, allowing people to create more engaging maps which in turn convey their message more easily to non-geo people; you can get more details here. If you want to know more about Fusion Tables then this talk is a must-watch, showing how you can host and map geospatial information from the Google cloud without a single line of code, something I would never have imagined possible when I started doing this GIS malarkey in the early 90s.

Speed, Speed, Speed

A final geo talk at Google IO that caught my attention was one about how to improve the performance of any mapping application using the Google Maps API. I’ve spent a good portion of my career trying to improve the speed of web applications, and especially geospatial web applications. It’s good to be armed with knowledge before you even start any development, and this presentation will hopefully help you avoid the pitfalls people encounter when starting development in this area.

If you combine this with articles from the Google Maps website, such as Too Many Markers, then you can hopefully create speedy maps that are a joy and not a curse to use.

You can find more information and sessions from Google IO 2011 at the website here. If you set the filter to Geo you can see all of the presentations with a geo flavour. Hopefully I’ll be presenting at the Google Enterprise Geospatial day at the end of August. If you're into geospatial and Google, it will be like a mini Geo-IO. Hope to see you there.

Normal service will soon be restored.

I’ve been increasingly remiss in updating this blog, which I’ll hopefully remedy over the next few weeks. I don’t have any excuses apart from the fact that I’ve been on a never-ending whirl of meetings, conferences (as delegate and booth babe!), training courses (run by me) and the never-ending search for people to help me out. I’ve also been coming to grips with the never-ending volume of innovation and products coming on stream at Google, absorbing their nature and understanding their impact on enterprise geospatial.

Google London

What no PC?

The other fact is that prior to joining Google my main blog editor used to be Live Writer. I was one of the most PC people you might have known (that's PC as in Personal Computer, not the other one, although I'm not too bad in that department either; I try to minimise the HR violations). It seems that over the last year I have become mostly Mac, despite the fact that until my first day at Google I had never even owned a Mac, or touched one in anger, for over 15 years (since I waged a one-man war against them back in my days at Kingston). It turns out they are not half bad.

Mars Edit

Maybe it's because I secretly like having the option of a command window that doesn't run a horrible version of DOS, and actually get to use vi and ssh in my job! Maybe it's also because I do most of my job in a browser these days and no longer have to have a copy of ArcGIS on my machine. Finally, it's probably the fact that the machine boots up fast, suspends without trouble and doesn't seem to lag at weird moments. Sure, it's probably down to Apple's bespoke hardware/software combination, but whatever it is it's been a breath of fresh air, and whilst it's made me tear out what's left of my hair sometimes, once I started to think like a Mac I'm unsure I can start to think like a PC again (I'm sure I'm going to grow a goatee and get some thick glasses next).

So here I am, having found a new, easy way to write blog posts on my Mac using MarsEdit. It seems to do everything Live Writer did, without Microsoft seeming to want to update the whole Live suite all the time, including the now horrible new version of Live Messenger (the Mac version from Microsoft is better).

Fast Boot, Quick Suspend

Why don't you!

Just as I find one way to edit posts, another challenge falls into my lap: the CR48. Now my next challenge is to find a good blog editor that runs just in the browser so I can conquer a new frontier. So far I’ve had good experiences with the machine. Much of what I do on the couch (as distinct from at my desk, although it’s not far off there either) is within a browser; I want instant on to check emails and websites, and instant off when I need to go do ‘something less boring instead’. The CR48 really does those things well. For editing text and using the ‘normal web’ it’s a lot better than an iPad, although for media consumption the iPad still has it beat, but then again that’s not what it’s primarily for.


What I like most about the CR48 is that it is trying to upset the status quo: something that looks like a ‘normal computer’ acting like a smartphone. Doing away with storage on the device makes it instantly more controversial, as people will start to get all nostalgic over defragging hard disks, performing virus scans, updating applications or managing services. It seems like many of our computers have turned into actual servers over time, running background processes that make the machine randomly page to disk or kick a background process to max warp (subsequently making your 2010 computer run like something purchased before modems went out of fashion).

Are we living in the future yet?

I don’t know when this happened, but it seems that between my Mac and my ChromeOS device I’m slowly moving into a brave new future where I can worry less about what’s on my machine and more about actually getting things done. When most things you do are stored in the cloud, you become much less attached to the actual machine you’re using; it becomes much more a tool than a priceless piece of technology to be protected with your life lest you lose your photos. Hopefully this ChromeOS video serves to highlight this brave new world in a robust manner. Hope to see you there in the near future!

Geo Semantics


The term GIS is one that tries to balance two very different disciplines: geographic knowledge and processes on one side, and information systems on the other, the computer systems and processing power that allow the provision of maps and spatial analysis that was never possible before. As the software and systems have become more and more sophisticated, GIS has often seemed to be more about the IS component and less about the G. This is especially true with the advent of server software providing map and analysis functionality that can be deployed on the intranet or internet.

The GIS department had to start hiring for, and understanding, the technical complexities of server installation and web development, whilst in the past it had mainly been concerned with fulfilling requests for maps and spatial analysis. Often this led to the G and IS components being split across a number of departments, and the complexity of any project rising as a consequence, especially if the IS was outsourced by the company. As the complexity of the systems grows ever greater, the pressure on an organization's IT and IS departments grows with it, whilst the pressure for GIS to be ubiquitous within an organization adds increasing cost pressure to provide maps at 'Google speed'.

The question is, how much G do people need and how complex do the IS need to be to support them?

G for all

I remember poring over esteemed journals and papers in my youth (last century!) whilst people regaled the readers with the exact types of functionality required for a system to be a GIS. That software doesn't do this, so it can't be a GIS; our software does it, it's got GIS on the box, so it must be a GIS. Often the premise was to get what I would call hard-core GIS onto the desktop for as many people as possible, be that as an install or through a browser, with complicated functionality such as editing or complex geographic processing. All of which came in a new interface that required a great deal of training to use, which obviously benefited the training departments of the organizations in question.

One of the benefits of a product like Google Maps or Google Earth is the number of people who have already used it. Reducing the time it takes someone to get up to speed with a product 'they already know how to use', to quote an iPad advert, matters to organizations rolling spatial functionality out to 50 or 100 people, possibly more. Arranging training courses on complex products can be both time consuming and expensive.

This is also the case for how people share information. Having to install and configure complex, expensive software (not to speak of any hardware) is a barrier to the take-up and use of geographic information for people who just want to share a map, whether within a small department or with a wider group. If this difficulty is taken away, then all sorts of people can take a spreadsheet of points, a set of addresses or even a KML file and upload it to a data store that does just that.

G, without the IS

Google Fusion Tables provides such an environment. It's not an overly complicated piece of software: it just allows someone to take some spatial data, such as a set of geographies, and upload it into a cloud-based data store where the information is rendered in a table-like environment. At that point there is the ability to filter, aggregate or link the data to any other, and then create a simple visualisation that can be placed on the Google base map or linked into Google Earth. That's it: no complex configuration of servers, and no need to handle security, as this can be provided using authentication for groups of Google account users, or, if you're sharing non-sensitive information, the data can just be made public. Everything can be handled within a browser, with no need to involve any IS group or outsourced department; sharing power can be given back to the actual providers of the data, and to the people who want to play with visualizing the data rather than configuring servers.

Sure, in the background there is a whole load of IS going on, but the knowledge of uploading items and managing security on them is now mainstream enough, through sites like Google and Facebook, that there is a ready army of new graduates who already know how this is done; indeed, this is the way they will expect spatial data to be shared!

To G and beyond!

GIS covers a wide variety of implementations, from viewing data in Google Earth (yes, it's a GIS to some degree) to manipulating features in ArcInfo (which is definitely hard-core GIS!). To say one thing is or isn't a GIS is a matter of geo semantics. The more the complexity of sharing and visualizing spatial information is reduced, the more it will be used within organizations: the easier it becomes, the more it will be used, not only by the users but also by the people sharing the data for them to see.

So in the future don’t get caught up in the confusing semantics about whether something is a ‘GIS’ or not; just work out whether it has enough ‘G’ for what you need to do. As the complexity of getting or accessing spatial data online is reduced, many more people will be ‘doing’ or ‘using’ GIS whether they realize it or not. That can only be a good thing.

Apps or Sites?


Part of me chuckled at the so-called hack that affected Twitter today. Not that something like this couldn’t affect any site (although given the simple and well-known nature of the attack, it really shouldn’t have hit a site like Twitter), but it did remind me of the days in the early 00s when this sort of thing was commonplace, and of the problems we all had to face when coding sites in that era.

What did strike me was that whilst I noticed it (random JavaScript tweets are always fun, though doing the same one over and over again rather labours the joke), I wasn’t affected by it. Why? Because I wasn’t accessing the Twitter website, only consuming the feed from within an app, in my case TweetDeck.

Saved by TweetDeck

I was somewhat surprised this week to hear that 70% of people still use the Twitter website to send and read tweets. I mean, Wired this month had a whole article bemoaning the death of the web; hasn’t Twitter read that and immediately shut down the home page? Hmm, no, in fact they just released a whole bunch of new functionality (which I can’t yet use, damn them) that can only be accessed via the website, just in time for the hack to emerge.

Wired do have a point though: more and more people are buying applications for their phones, and as smartphones become cheaper and cheaper, more and more people will buy apps, just as more and more people will get access to the internet for web browsing. Applications will use the old ‘internet’ for the services behind the applications on the phone, and the ‘web’ will go back to being one of the protocols used on it, that being HTML over HTTP.

The thing about apps is that they don’t suffer the same attack profile as a website. When information is mainly entered using an HTML form, that’s where people will look to attack. It’s harder to attack a series of apps that use a data feed, unless you can corrupt the feed in some way, as each app usually displays the data in its own way, usually not using direct HTML or even a browser.
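The class of attack behind the Twitter incident works precisely because user-supplied text gets rendered as raw HTML. The standard defence, whatever the site, is escaping on output, so the payload is displayed rather than executed. A minimal, illustrative sketch (the `render_tweet` helper and its markup are hypothetical, not Twitter's code):

```python
from html import escape

def render_tweet(text):
    """Render user-supplied tweet text into an HTML fragment.
    Escaping on output is the classic defence against the kind of
    XSS that hit Twitter: the payload is shown, never executed."""
    return '<p class="tweet">{}</p>'.format(escape(text, quote=True))

payload = '<script>alert("pwned")</script>'
print(render_tweet(payload))
# The <script> tag arrives in the page as inert text:
# <p class="tweet">&lt;script&gt;alert(&quot;pwned&quot;)&lt;/script&gt;</p>
```

An app consuming the same text from a feed sidesteps the issue differently: it renders the string into native UI widgets, so there is no HTML context for the payload to break into in the first place.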

Of course you could be using a compromised application, either downloaded onto a PC from an untrustworthy source or side-loaded onto an Android or jail-broken iPhone, in which case don’t say I haven’t warned you. In fact the careful cultivation of the App Store under iTunes, and to a slightly lesser degree the Android Marketplace, adds that little bit more protection for users than the wilful installation abandon people have on their home (and sometimes work) PCs (and Macs and Ubuntu boxes, but as I said, who’s bothering to write a virus for those relatively paltry numbers of users /jk!).

Patched Apps

The fact that Twitter patched the XSS issue in relatively short order is one of the main areas where the web works well. The ability to roll out a change to millions of users at once, be it a patch or new features (after thorough testing, of course), is only really possible with applications that leave no trace on the local machine. Cloud-based applications not only protect your data from hardware failures; they can also be patched or upgraded without you having to do anything. Now I know some people will not like this, the same sort of people who still use OS/2 because they don’t understand these new-fangled operating systems.

Desktop and mobile applications require an upgrade cycle because they rely on you installing something on a machine. For a mobile application this can be more arduous: first the developer has to get the new version checked by the store it’s delivered through, then you have to be notified by the store that a new version is available, and finally you have to actually install it.

On the web once it’s passed the requisite tests, it’s just there. Updated lazily in the background or when you next log into the website.

Apps or Sites, your call as long as it’s the cloud.

I read a comment the other day: every time I open an on-premises application or use an on-premises server to create data, I take a risk with my data. Every time I use cloud services for all sorts of tasks, I know it’s not quite as whizzy as on-premises applications or servers, but I know that if my machine or server dies, my data is still there. All I need is another browser and I’m up and running again. No need to install an app, and mostly these days no need to worry about the operating system either.

It works for me; I for one am happily replacing my offline apps with online ones when I can. Sure, I still use some installed applications: I still love Live Writer for blogging, Picasa for managing my photos, and Google Earth for, well, just looking at my house from space. But those are the last few on-premises applications I use at home (that aren’t games), and Google Earth and Live Writer are conduits for online services.

I could do the same with my photos if I ever got the time, and there is the crux: as it becomes easier and easier to move data and information to the cloud (or for many digital natives, whose data has only ever resided there), more people will do so, and hopefully they will be better off because of it.

Steve Jobs has my money, again.

Sometimes it doesn’t matter how much you tell yourself you don’t want something, how much you know an item isn’t really worth the money you’re shelling out for it, how much you think that getting a device mainly for browsing the web on the sofa is an unnecessary luxury; in the end you open the wallet and hand over money to someone who is being far too smiley in an Apple store. In fact you do this and wait in line, or in my case a shambolic huddle, for the privilege. Come on, I was only there to buy a grown-up iPhone, or was I?

Shouldn’t there be angels or something?

I think angels seem to turn up when Stephen Fry goes into an Apple store. I’m sure they specially clean up in order to add to the glorious nature of the purchasing experience (from what I’ve read, I don’t think mere mortals deal with him when he buys, only UberAppleSalesCherubs, either that or it’s something they give him to drink). In my case my buying experience was less straightforward. Walk into an Apple store, avoiding blue-shirted sales staff, looking for the till. Notice that the till is absent and most of the store is full of people checking their email. Find the concierge (in a black shirt), who is wandering around the store in a pattern created by a bad random number generator, and give him my name. Check email on a variety of Macs until I realize I’m standing in the Genius Bar queue; a Genius in the Apple store seemed to be a person who knows how to hard-reset an iPod or iPhone, because that’s all they seemed to be doing for people. Move to another, unmarked, area of the store to hang around with the other people who are also lost, but might be less lost than me. Check some more email on the nearest Mac. Finally my name is called and I have all of one minute to pay for my iPad. The guy did say he would usually spend three years taking me through how to set up and use the product, but unfortunately I’d used up all of his spare nanoseconds and I was on my own. Fine, I thought, the advert had said that ‘I already knew how it worked’.

Yeah, it’s an experience, not a particularly good one, but an experience nonetheless. Next time I’ll order by post: impersonal, but it saves on the aimless wandering.

But my other device isn’t an iPhone!

I used to have an iPhone; I used to think it was good. I didn’t pay for it: I managed to be in a part of my previous organisation where iPhones were becoming a right (the rest of the company was left to languish with nothing, or with old Nokias, and I’m unsure who was worse off). I used to use iTunes for organizing my music, movies and applications. I could even dock it in a few stereos I had at home. I have a Nexus One now; I used to think the iPhone was good, but I don’t miss it any more. Transitioning from the iPhone to the Nexus was a bit of a harrowing experience. I spent a month cold turkey using an old Windows Mobile handset, an HTC S710, which reminded me what a mobile operating system would look like if I ever ended up in purgatory. I used to look at my iTunes library and my new mobile device and think, hmm, what now?

This taught me a valuable lesson. Never back only one horse, because if a better horse comes along, let’s call it an ‘open horse’ for the sake of argument, you will never know what you’re missing. You will never know that life might be better backing the ‘open horse’ until you’ve tried it, or that there are companies out there that produce software that runs on any horse. Because if you’re not on the ‘open horse’ then you have to stick with the other horses, let’s say they are iHorses, and if you ever want to get onto the ‘open horse’, the other one has bolted with your music, movies and books and you’ve forgotten to shut the door.

Ok enough of the equine analogies already.

I <3 Kindle

Having lived with my Android phone for two months and enjoyed the wonders of an iTunes-free life (come on, you know you can do it), my first download from the Apple App Store was the Kindle application. Why Kindle and not the ‘magical’ iBooks application? Well, for one thing I could now read any of the trashy novels I download when I’m not tethered to my iPad, which might be quite a lot as both my wife and daughter seem to be eyeing it up. I might even be able to read them on my PC (you know I still have one or two), which still thinks an app store is somewhere you go and buy games.

Amazon know how to do book rentals: their apps run on most platforms (that anyone cares about) and you can download your titles again and again from the archive. This is a killer function for me, and its absence has always been an annoyance with online music purchases. The whole point of the cloud idea is that I can store data away from my device; if it dies, I can just re-sync and everything is there, over the air, without the need to dock into an even more expensive computer to do so.

It’s the reason why I now use Spotify for my music rather than having useless CDs cluttering my house that I no longer listen to. My boredom threshold is high and my music tastes horribly mainstream; Spotify checks the boxes for me and, like Amazon, allows me to listen to trashy pop without the evidence lying around embarrassing my family. Sure, there are times when having the physical product is excellent (usually when giving gifts 🙂 ), but like many other people I’m feeling that more and more physical products are moving to an online-only existence, and whilst part of me mourns the time when I had 15 floppy disks to install software onto an unconnected machine, it’s only a small part and I’m trying to rid myself of it.

When was the last time you loaded data?

Do you know, most of the people using spatial information in applications never even bother to load any spatial data, or use esoteric formats beyond CSV or KML. They never had to build a gazetteer, and they never had to choose the colours of the basemap. They never had to worry about scaling, upgrades, downtime, server management or patches. Most of them never even had to pay.

Google and Bing Maps (mmm, balance!) both provide the Kindle and Spotify equivalent of GIS (look, I used the G word!). There are applications and app stores being provisioned based upon both, where existing functionality can be easily combined into bespoke applications, again often without many installation worries. Just as most people don’t care how Spotify and Bing work, most people don’t care how Google Maps works, as long as the API is stable and the maps are fast. Sure, like physical books, there are times when you need to run on-premise applications and actually get your hands dirty loading spatial data, but those cases are becoming increasingly niche.

In an era when digital books will soon sell more than physical books, I wonder how long before access to larger and larger data sets will be online rather than on-premise; when the ‘change only update’ will be a thing of the past and you will simply see the changed data. Sooner rather than later, I hope. Until then I’ll be happily playing with my iPad, enjoying the excellent spatial applications that run on both it and my Nexus.

When web hosts go bad.

There are a number of reasons why this blog has taken a bit of a hiatus in terms of posts in the last few months. Probably the most important of which is the fact that for once in the UK we seem to actually be having a summer; coupling this with my daughter forcing me to watch all of the Doctor Whos since the reboot (honestly, she’s only 4 but she’s quite persuasive) means I spend a lot of time outside on the bike or inside explaining the fighting differences between Daleks, Cybermen and everything in between.

A slow spiral down the web plug

Secondly has been the slow demise of an excellent web host, Webhost4Life. Whilst never the fastest host, it has always been reliable for everything I’ve needed it for, especially for the price and features I paid for. Recently though they decided to move to a new platform, which I think was due to the company being sold to another and a change of management going for a cheaper option. Initially it seemed good: the admin site was much improved and it seemed that things were going tickety-boo. Unfortunately, as others have also found, the support and stability of the site have gone way down. This blog was up and down more times than the Grand Old Duke of York, and even when it was up it was slower than an England footballer in front of goal. Finally, having had enough after 3 days of downtime, I decided to pull the plug and move this blog over to a new host, Arvixe.

Not only has this host been a lot quicker for me (it obviously doesn’t seem to be as contended), it also supports a variety of development platforms, from PHP to ASP.NET. Whilst the admin pages aren’t as nice and I had to install WordPress 3 the manual way (it actually wouldn’t even upgrade on WebHost4Life, so it was very broken), the whole move took all of one hour including getting the data loaded from the old blog.

I wonder if it’s like changing banks: hosting is one of the most painful things to change once you have it all set up. Once you’ve done it, though, you wonder why you never did it before.

Moving On

Finally, I had my final, final, final leaving do this week from the hallowed halls of ESRI (UK). After 7 weeks of being at Google and one month of garden leave (people kept telling me there was no ‘–ing’), it was about time too. Moving on from a job where I talked to people about GIS technology to a company where I talk to people about GEO technology has been less of a shift than you might think. The complexity of solutions might be less, and I haven’t yet touched SAP since I left (phew), but it’s still based around understanding how people’s workflows might fit and integrate with the respective technologies and APIs. It’s a lot more consumer-focused though, a bit like web mapping was with ESRI UK 8 years ago; I like it.

There are so many geo-technologies to learn at Google, from the server-based Google Earth Enterprise to the cloud-based Fusion Tables, as well as the well-known Google Earth and Maps. For my first seven weeks I have been like a kid in a candy store, both metaphorically with the learning of new technology and physically in the micro kitchens (note to self, must do more exercise).

Expect to see many more Google Geo related tips and tricks around here as I work through the stack of technology in my sweetie bag. I won’t give up the esri thing just yet though, and I hope to do some integration work between the various systems; I’m giving a talk on integration next week.

Now with added location and mobile

On a side note, whilst updating the site I implemented two new features. Firstly, the move had broken my previous ‘where are you from’ section, so I thought I’d update it with some Google Maps code. Unlike the previous attempt I decided to take the easy geolocation route and have the browser (or Gears) do it for me. I’ll discuss how this works in a subsequent post; it’s not tricky and takes about 10 lines of code, all of which you can get from the Google Maps v3 API page here.
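For the curious, the browser-geolocation route looks roughly like this. This is a hypothetical sketch, not the code from the post: it assumes the Google Maps JavaScript API v3 is loaded on the page, and the `pickCenter` helper name and the London fallback coordinates are my own illustration.

```javascript
// Pure helper: given a Geolocation API position result (or null when the
// lookup fails or is denied), return the {lat, lng} to centre the map on.
function pickCenter(position) {
  if (position && position.coords) {
    return { lat: position.coords.latitude, lng: position.coords.longitude };
  }
  // Fallback centre when no location is available (central London here).
  return { lat: 51.5074, lng: -0.1278 };
}

// In the browser this would be wired up roughly like so:
//
//   var map = new google.maps.Map(document.getElementById('map'),
//                                 { zoom: 10, center: pickCenter(null) });
//   navigator.geolocation.getCurrentPosition(
//     function (pos) { map.setCenter(pickCenter(pos)); },  // success
//     function ()    { map.setCenter(pickCenter(null)); }  // denied/error
//   );
```

With the success and error callbacks both funnelling through one helper, the map always gets a sensible centre whether or not the visitor shares their location.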

The second feature was the addition of the WPTouch plug-in, which formats WordPress sites to work nicely on mobile devices like the iPhone and Android handsets. The free version seems to work nicely for me; the pro version has some nice features that one day I might find I need. Give it a go, it looks nice.