Using MD5 Digests to Make Long GET Requests

February 12, 2009 12:38 am

Have you ever needed to make a GET request with a complex set of parameters, only to find that, once URL encoded, it exceeds the maximum length a URL can be? URLs usually come nowhere near the roughly 2K limit in IE, and certainly not the 4K limit in other browsers. In most cases, switching to a POST request, which has no such payload limit, solves the problem. However, responses to POST requests do not get cached. While this is appropriate in most cases where you are sending a large request, in my work I’ve found several cases where I want to make a complex (long) request, but still retain the caching benefits of GET.

In the past I’ve always been able to resort to some alternative, such as splitting up the request into multiple XHR GETs, but in a current project of mine, the alternatives were ugly or had some other drawbacks. Without getting sidetracked in those details, for purposes of this article, we’ll assume there are valid cases for long GET requests and that is the situation we are in.

The solution I’ve come up with is simple, but allows us to retain the benefits of browser caching with complex requests. Let’s assume we have a large JSON object that contains the set of parameters we want to use, and this JSON object, when URL encoded, is too long to include as part of the URL. The trick is to make an MD5 digest of the JSON string. This can be done using the utilities in dojox.encoding (thanks to Tom Trenka and Eugene Lazutkin). This MD5 digest is then included in the URL instead of the JSON string; the JSON string itself is placed in a cookie (restricted to the path for which the request is being made), which is transported along with the request. Once the request has been made, the cookie can be deleted. Subsequent requests for this same dataset will generate the same MD5 digest, and the response will be served from the browser’s cache.

Here is some example code to implement the above technique. First we need a function that, given a string, will split it into a number of different cookies:

dojo.require("dojo.cookie");
dojo.require("dojox.encoding.digests.MD5");

function CreateCookies(JsonString, url){
	// Browsers cap individual cookies at roughly 4KB, so split the JSON
	// string into numbered slices and store each slice in its own cookie.
	var sliceLen = 4000;
	var txtLen = JsonString.length;
	var current = 0;
	var cookieNum = 0;
	while (current < txtLen){
		var end = current + sliceLen;
		if (end > txtLen){
			end = txtLen;
		}
		var slice = JsonString.slice(current, end);
		dojo.cookie("hashedRequest_" + cookieNum, slice, {path: url});
		current = end;
		cookieNum++;
	}
	return cookieNum;
}

And the request function…

function makeLargeRequest(JsonObject){
	var jsonString = dojo.toJson(JsonObject);
	var url = "/getData/" + encodeURIComponent(jsonString);
	var escapedLength = url.length;
	var maxLength = (dojo.isIE) ? 2000 : 4000;
	var numCookies = 0;

	if (escapedLength > maxLength){
		// Use hex output so the digest is URL-safe (the default Base64
		// output can contain characters like "/" and "+").
		var md5hash = dojox.encoding.digests.MD5(jsonString,
			dojox.encoding.digests.outputTypes.Hex);
		url = "/getData/hashedRequest/" + md5hash;
		numCookies = CreateCookies(jsonString, url);
	}

	dojo.xhrGet({
		url: url,
		handleAs: "json",
		sync: true,
		error: function(err){
			throw new Error("Unable to load resources from " + url);
		}
	});

	// The cookies are only needed for the duration of the request,
	// so expire them once it has completed.
	if (numCookies > 0){
		for (var i = 0; i < numCookies; i++){
			dojo.cookie("hashedRequest_" + i, null, {expires: -1, path: url});
		}
	}
}

That's all there is to it. The server will receive the simple GET request, but will get the data it needs from the cookies and return an appropriate response.
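On the server side, all that is needed is to concatenate the numbered cookie slices back into the original JSON string. The post doesn't show the server code, so here is one hedged sketch of how that reassembly might look in Node (the function name reassembleRequest is hypothetical; the hashedRequest_ cookie names mirror the client code above, and your framework's cookie parsing would normally replace the manual header parsing):

```javascript
// Sketch: rebuild the JSON request payload from the numbered cookies.
// Assumes the client wrote cookies named hashedRequest_0, hashedRequest_1, ...
function reassembleRequest(cookieHeader) {
	// Parse the raw Cookie header into a name -> value map.
	var cookies = {};
	cookieHeader.split(';').forEach(function (pair) {
		var idx = pair.indexOf('=');
		if (idx < 0) { return; }
		var name = pair.slice(0, idx).trim();
		cookies[name] = decodeURIComponent(pair.slice(idx + 1).trim());
	});

	// Concatenate the slices in numeric order until one is missing.
	var json = '';
	for (var i = 0; ('hashedRequest_' + i) in cookies; i++) {
		json += cookies['hashedRequest_' + i];
	}
	return JSON.parse(json);
}
```

With the parameters recovered, the server handles the request as usual; the MD5 digest in the URL exists purely so the browser (and any intermediate caches) can key the response.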