Using MD5 Digests to Make Long GET Requests

By Dustin on February 12, 2009 12:38 am

Have you ever needed to make a GET request containing a complex query, only to find that, once URL encoded, it exceeds the maximum length a URL can have? Usually URLs come nowhere near the roughly 2k limit in IE, let alone the 4k limit of other browsers. In most cases, switching to a POST request, which has no comparable payload limits, solves the problem. However, POST requests do not get cached. While this is appropriate in most cases where you are sending a large request, in my work I’ve found several cases where I want to make a complex (long) request but still retain the caching benefits of GET.

In the past I’ve always been able to resort to some alternative, such as splitting the request into multiple XHR GETs, but in a current project of mine the alternatives were ugly or had other drawbacks. Without getting sidetracked in those details, for the purposes of this article we’ll assume there are valid cases for long GET requests and that we’re in one of them.

The solution I’ve come up with is simple, but it lets us retain the benefits of browser caching for complex requests. Let’s assume we have a large JSON object containing the set of parameters we want to use, and that this JSON object, when URL encoded, is too long to include as part of the URL. The trick is to make an MD5 digest of the JSON string. This can be done using the utilities in dojox.encoding (thanks to Tom Trenka and Eugene Lazutkin). This MD5 digest is then included in the URL instead of the JSON string; the JSON string itself is placed in a cookie (restricted to the path for which the request is being made). The cookie is transported along with the request, and once the request has been made the cookie can be deleted. Subsequent requests for this same dataset will generate the same MD5 digest and will be served from the browser’s cache.
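The digest step on its own looks something like this (a minimal sketch assuming Dojo 1.x is loaded; the params object is just an illustration). Note that the digest’s default output type is Base64, which can include “/” and “+” characters, so the Hex output type is friendlier for embedding in a URL path:

dojo.require("dojox.encoding.digests.MD5");

// An arbitrary example parameter object.
var params = {start: 0, count: 100, sort: "name"};
var jsonString = dojo.toJson(params);
// Hex output keeps the digest URL-safe; the default is Base64.
var digest = dojox.encoding.digests.MD5(jsonString,
	dojox.encoding.digests.outputTypes.Hex);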

Here is some example code to implement the above technique. First we need a function that, given a string, will split it into a number of different cookies:

dojo.require("dojo.cookie");

function CreateCookies(JsonString, url){
	// Individual cookies are limited to roughly 4k, so split the string
	// into slices and store one slice per numbered cookie.
	var sliceLen = 4000;
	var txtLen = JsonString.length;
	var current = 0;
	var cookieNum = 0;
	while (current < txtLen){
		var end = current + sliceLen;
		if (end > txtLen){
			end = txtLen;
		}
		var slice = JsonString.slice(current, end);
		// Restrict the cookie to the request path to limit collisions.
		dojo.cookie("hashedRequest_" + cookieNum, slice, {path: url});
		current = end;
		cookieNum++;
	}
	return cookieNum;
}

And the request function…

dojo.require("dojox.encoding.digests.MD5");

function makeLargeRequest(JsonObject){
	var jsonString = dojo.toJson(JsonObject);
	// URL-encode the JSON so it is safe to use in the path.
	var url = "/getData/" + encodeURIComponent(jsonString);
	var escapedLength = url.length;
	var maxLength = (dojo.isIE) ? 2000 : 4000;
	var numCookies = 0;

	if (escapedLength > maxLength){
		// The default digest output is Base64, which can contain "/" and
		// "+"; the Hex output type is safe to embed in a URL path.
		var md5hash = dojox.encoding.digests.MD5(jsonString,
			dojox.encoding.digests.outputTypes.Hex);
		url = "/getData/hashedRequest/" + md5hash;
		numCookies = CreateCookies(jsonString, url);
	}

	dojo.xhrGet({
		url: url,
		handleAs: "json",
		sync: true,
		error: function(err){
			throw new Error("Unable to load resources from " + url);
		}
	});

	// The call above is synchronous, so the request is complete and the
	// payload cookies can be deleted to free up the browser's cookie slots.
	if (numCookies > 0){
		for (var i = 0; i < numCookies; i++){
			dojo.cookie("hashedRequest_" + i, null, {path: url, expires: -1});
		}
	}
}

That's all there is to it. The server will receive the simple GET request, but will get the data it needs from the cookie and return an appropriate response.
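As a rough sketch of what that server side might look like (here using Node.js with the express and cookie-parser packages; lookupData is a placeholder for whatever your application does to fetch the data, and any stack with cookie access works the same way):

var express = require("express");
var cookieParser = require("cookie-parser");

var app = express();
app.use(cookieParser());

app.get("/getData/hashedRequest/:digest", function(req, res){
	// Reassemble the JSON payload from the numbered cookies
	// set by CreateCookies on the client.
	var parts = [];
	for (var i = 0; req.cookies["hashedRequest_" + i] !== undefined; i++){
		parts.push(req.cookies["hashedRequest_" + i]);
	}
	var params = JSON.parse(parts.join(""));
	// lookupData is a hypothetical application function that fetches
	// the data described by the reassembled parameters.
	res.json(lookupData(params));
});

Serving the response with normal cache headers keyed on the digest URL is what preserves the GET caching benefit this technique is after.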

Comments

  • Nathan Toone

    Very nice approach. However, isn’t there a limit on the length of the cookies as well?

  • Thanks Nathan. Yes, but the limit is 4k per cookie, and I believe something like 20 cookies total are supported in the worst case (IE6). In my case I have yet to really need to exceed 4k, but 2k gets exceeded frequently. While this technique allows browsers to go beyond 4k, my primary intention was to get a full 4k out of IE.

  • Hi Dustin,

    As good as the trick is, you’re limited by the number of cookies allowed per domain (it can be as low as 20 with IE).

    In some high availability environments, where Infrastructure architects consolidate many services on clusters of servers, you can see some nasty side-effects on other applications:
    – Your application creates 5 cookies, for example;
    – The browser may then evict 5 cookies used by other applications;
    – Best case, they are not crucial (like the ones created by dijit.layout.BorderContainer to save its parts’ dimensions);
    – Worst case, one of the cookies used by the Single Sign-On (SSO) service gets evicted, breaking your application and possibly all the others.

    From experience, I try to avoid using cookies. When you develop an enterprise webapp, you don’t always know in which configurations it will be used.

    My 2 cents,
    A+, Dom

  • Dom, very true. However, in practice most requests will use one or two cookies (8k is pretty darn large for a request). Note that the cookies can also be scoped to the specific request path to help avoid some conflicts. For my most recent use, it was more a technique for bringing IE up to 4k than for letting other browsers exceed it.
