The Local JavaScript API for Local People

I was giving a presentation at an ESRI gathering hosted at Microsoft in Reading this week, talking about all things Silverlight and MapIt, and, unsurprisingly, very little about the Flex API. During a presentation about the Web APIs I mentioned JavaScript and Silverlight as two offerings that, whilst they can do similar things, need to be considered carefully in relation to the audience in question and the tasks they wish to perform. Now I like the Silverlight API and, as I mentioned in the talk, it can really allow you to avoid browser-support pain by abstracting away the layer between your application and the browser hosting it.

Often though you don’t want a plug-in to preclude people from accessing your site. This might be down to a number of reasons, but unless you’re going to start hand-cranking simple code for map access (people who disable JavaScript in their browser should stop reading now) then you have the JavaScript API. Usually this is delivered to you from somewhere in the cloud (http://serverapi.arcgisonline.com specifically), but there are a number of scenarios that will mean you have to install it locally.
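Switching between the hosted and a local copy is just a matter of which script tag your page references. A minimal sketch, where the version number and local install path are assumptions (use whatever your install actually created):

```html
<!-- Hosted: the API comes down from ESRI's servers in the cloud -->
<script type="text/javascript"
        src="http://serverapi.arcgisonline.com/jsapi/arcgis/?v=1.6"></script>

<!-- Local: the same API served from your own web server after installing
     the downloaded copy (path shown is illustrative) -->
<script type="text/javascript"
        src="/arcgis_js_api/library/jsapi/jsapi.js"></script>
```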

Why Local?

If you are an ArcGIS Server subscriber you are able to request the JavaScript API from ESRI or your local distributor, as explained here on the support site. There are a number of reasons why you might wish to do this:

You are super secure and think that access to the internet is a potential security risk, and therefore getting access to the online API is not possible. Many organisations have yet to embrace the concept of Content Delivery Networks (CDNs), such as the one used for hosting the ESRI JavaScript API. In the consumer space this is a common architectural practice, as it places the download on the CDN’s connection rather than the host’s, saving bandwidth that can instead be used for serving images and functionality. Within the firewall of many organisations, however, it is not possible to access external files because the network is locked down. The same applies where you have a slow external connection and wish to limit the amount of information downloaded from the ‘cloud’.

You want to create a custom build of the API to reduce the number of files downloaded, improving browser response. The classic number-one rule from High Performance Web Sites, make fewer HTTP requests, asks front-end developers to minimise the number of files they download from a site. When building on top of the JavaScript API it is common to split your application into many separate files, to separate functionality or Dijits and to keep the usual spaghetti of JavaScript development under control.
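A custom Dojo build rolls those many files back into a single layer for deployment. A minimal sketch of a legacy-style build profile, where the layer name, module ids and prefixes are all hypothetical placeholders for your own application’s modules:

```javascript
// myapp.profile.js -- illustrative only; your module ids will differ
dependencies = {
  layers: [
    {
      // One combined file containing the app's modules and everything
      // they dojo.require(), so the browser makes one request, not dozens
      name: "../myapp/layer.js",
      dependencies: [
        "myapp.MapPage",
        "myapp.widgets.LayerList"
      ]
    }
  ],
  prefixes: [
    // Tell the build where to find each namespace on disk
    ["myapp", "../myapp"]
  ]
};
```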

A concurrent dilemma

The problem with this is that each JavaScript library you add to the application increases the time it takes to download, due to the limited number of files a browser can download concurrently. The same applies if you include extra CSS files and images. This is down to the way HTTP 1.1 was implemented, which suggested a limit of no more than two concurrent connections per hostname. BrowserScope explains it nicely:

When HTTP/1.1 was introduced with persistent connections enabled by default, the suggestion was that browsers open only two connections per hostname. Pages that had 10 or 20 resources served from a single hostname loaded slowly because the resources were downloaded two-at-a-time. Browsers have been increasing the number of connections opened per hostname, for example, IE went from 2 in IE7 to 6 in IE8.

This often leads to developers hosting parts of their application on different hostnames, one starting with a URL such as images.site.com and another such as scripts.site.com, to get around such limitations. With large JavaScript libraries the time it takes to download these on slower connections can leave an application unresponsive and give a very bad user experience.

You can see from their tests (see image left) that modern browsers have upped this limit to 8 connections. This can have a dramatic effect on perceived speed, as more of the application can be downloaded at once; when the application is partitioned correctly (preferably using a lazy-loading pattern) users can see something happening in the browser much earlier than they could with a plug-in such as Silverlight or Flash (beware the loading circle of doom).
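The lazy-loading idea itself is framework-agnostic: defer fetching a chunk of code until the user first needs it, and cache the result so it is only ever downloaded once. A minimal sketch (the loader function and the "measure tool" are hypothetical stand-ins; in a real page the loader would inject a script tag or call dojo.require):

```javascript
// Cache of chunks already requested, keyed by name
const cache = {};

// Fetch a named chunk on first request only; later requests reuse the
// same in-flight or completed Promise, so nothing downloads twice
function lazyLoad(name, loader) {
  if (!(name in cache)) {
    cache[name] = loader(); // loader returns a Promise for the module
  }
  return cache[name];
}

// Example: only pull in the (hypothetical) measure-tool code when the
// user actually clicks the Measure button, not on initial page load.
// A resolved Promise stands in for the real download here.
function onMeasureClick() {
  return lazyLoad("measureTool", () =>
    Promise.resolve({ activate: () => "measuring" })
  ).then((tool) => tool.activate());
}
```

The map and core UI come down first and start rendering, while heavier tools arrive only on demand.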

Of course with external (public-facing) applications you will always have some users on IE6 and its limitation of 2 connections per hostname. This is where the ability to package your JavaScript files into as small and as few packages as possible becomes more important, and a local copy of the JavaScript API becomes imperative for use in a custom Dojo build.

One build for all?

Unfortunately every build is different for every project, so there is no magic bullet here. Fortunately there are a number of key pieces of information and discussions about how to perform this task: there is a thread on the ESRI support forums going through the process, an excellent Geocortex post on why they did a custom build here, and finally a link to the Dojo documentation outlining the process in full. Whilst it might be a complex procedure, for people who are interested in getting their application to load in the browser in the fastest time, it is a procedure worth looking at.
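For orientation, the build itself is kicked off from the buildscripts directory of a Dojo source distribution, pointing it at your profile. The flags below reflect the legacy build tool and the file names are illustrative, so check them against the Dojo documentation for your version:

```
cd util/buildscripts
./build.sh profileFile=../../myapp.profile.js action=release releaseName=myapp
```

The output lands in a release directory containing your combined, minified layer file ready to deploy alongside the locally installed API.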
