Yesterday at work, we ran into a very interesting problem involving a jQuery auto-suggest feature on one of our client sites. We had implemented auto-suggest on this particular site several times before and it had always proved to be very zippy and responsive. This time, however, the "suggest" page requests were taking 5, 6, sometimes 8 seconds to complete; this kind of delay clearly defeats the purpose of auto-suggest. But, more than that, I was getting extremely frustrated with this seemingly nonsensical delay that none of the other auto-suggest instances were exhibiting.
After checking the HTML, the jQuery, and the SQL query powering the auto-suggest, I was just about at my wit's end; then, blankly staring at FireBug's network activity in disbelief, it finally dawned on me! This particular auto-suggest feature's result set included thumbnail images of products (it was an e-commerce website). That was the problem: there are only so many parallel requests that the browser can make to the same domain at the same time. What was happening was that the product thumbnails were hogging all of the available HTTP connections; as such, when the auto-suggest went to make a subsequent request, it got queued behind the pending image requests.
As you can see in the video, when the auto-suggest feature launches a second request, it gets queued behind all of the pending image requests. My particular browser (all browsers are different and can be custom configured) can handle 6 parallel requests to the same domain. As such, you can see in FireBug that not even all of the image thumbnail requests can be made at the same time; of the 10, only the first 6 are actually processing, leaving 4 pending. Then, only once the first 6 thumbnails load can the last 4 be processed. At that point, since there are only 4 concurrent requests, the second auto-suggest request can finally be processed.
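The queueing behavior can be sketched with a toy model: give the browser a fixed number of connection slots per domain and start requests in order as slots free up. The durations below are illustrative, not measured values:

```javascript
// Toy model of the browser's per-domain connection limit: requests start
// in order, each one waiting until one of the `maxParallel` slots frees up.
function startOrder(requests, maxParallel) {
	const active = [];  // finish times of in-flight requests
	const starts = [];  // computed start time of each request
	let clock = 0;
	for (const duration of requests) {
		if (active.length === maxParallel) {
			// All slots are busy - wait for the earliest in-flight
			// request to finish before this one can start.
			active.sort((a, b) => a - b);
			clock = active.shift();
		}
		starts.push(clock);
		active.push(clock + duration);
	}
	return starts;
}

// 10 thumbnails taking 10 seconds each, then the 1-second suggest call,
// with 6 parallel connections: the suggest request can't even start
// until the first batch of thumbnails completes.
const starts = startOrder([10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 1], 6);
```

Under this model, the first 6 thumbnails start immediately, the remaining 4 (and the suggest request) all wait until the 10-second mark, which matches the waterfall seen in FireBug.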
The solution to this is not to remove images from the auto-suggest results - remember, we want to create a rich, interactive experience for our users; the solution is much more core to the problem. The thumbnails aren't the issue, it's the nature of the parallel requests. The browser has limits on how many parallel requests it can make to the same domain; so, to solve the problem, rather than removing the images, we "simply" have to pull them from a different domain. Once we do that, the browser will have no problems processing more than 6 parallel requests.
As you can see in this video, once the thumbnails are being pulled from a different domain than the auto-suggest results, the browser no longer needs to queue the auto-suggest requests. As such, they come back fast even when there are many pending images still loading.
To be honest, multi-domain strategies are something that I understand in theory, but have never really put into effect. Things like loading binary assets off of Amazon S3 or sub-domains make perfect sense - I just haven't implemented them all that much in production. After seeing this problem, however, it definitely feels like something that I need to start implementing as best practice. Whether or not I need it for a particular site, I certainly don't ever want to run into a situation where domain-use prevents me from delivering the most compelling user experience possible.
While the code I was using is not really the point of this post, I will list it below for reference. Here is the main HTML page. Please keep in mind that this was not supposed to be an accurate auto-suggest feature - I just needed something that "looked" like auto-suggest to test the feature:
Here is the code that gathers the auto-suggest results. Notice that I have the code for using two different domains for the IMG SRC attributes.
<!--- Param the query value. --->
<cfparam name="url.query" type="string" />

<!--- Make sure the query never goes more than 10 characters - this is ONLY for demo purposes. --->
<cfset url.query = left( url.query, 10 ) />

<!--- Store the content of the response. For demo purposes, we are simply going to make the results (10) minus the number of letters. --->
<cfsavecontent variable="results">
	<cfoutput>

		<!--- Local domain. --->
		<cfset domain = "./" />

		<!--- Different domain. --->
		<!--- <cfset domain = "http://localhost:8501/jquery/autosuggest/" /> --->

		<cfloop index="index" from="#len( url.query )#" to="10" step="1">

			<!---
				When we make the results line item, notice that we are
				setting the value of the IMG tag to be a ColdFusion page -
				this will come into play with the delayed time.

				NOTE: I am using the randRange() function to prevent any
				caching attempts by the browser.
			--->
			<a href="##">
				<img src="#domain#img.cfm?id=#randRange( 1, 99999 )#" />
				<strong>Tanya Hyde</strong>
				One of the hottest female bodybuilders.
			</a>

		</cfloop>

	</cfoutput>
</cfsavecontent>

<!--- Convert the response to a binary for streaming. --->
<cfset binaryResponse = toBinary( toBase64( trim( results ) ) ) />

<!--- Set the content length so the browser knows how much content to expect. --->
<cfheader name="content-length" value="#arrayLen( binaryResponse )#" />

<!--- Stream the content back. --->
<cfcontent type="text/html" variable="#binaryResponse#" />
And, here is the page that loaded the image thumbnail. The only reason this page had to be a ColdFusion page, rather than a binary URI, was to allow ColdFusion to simulate the network latency. I had to sleep the requests so that they didn't complete immediately.
<!--- Sleep the thread for a few seconds to simulate network delays and request latency. --->
<cfthread action="sleep" duration="#(10 * 1000)#" />

<!--- Simply stream the image back to the client. --->
<cfcontent type="image/*" file="#expandPath( './thumbnail.jpg' )#" />
Anyway, this was just an interesting problem that had me stumped for a good thirty minutes. It definitely gives me a lot of food for thought on what I want to consider "best practices" for asset delivery going forward.
Want to use code from this post? Check out the license.
Could you not have used a dynamic sprite image so you would only have had the one image request?
It's funny you mention that - I actually considered that idea. My concern with that was that I would have to create a new sprite for every search results set; I wasn't sure if ColdFusion could resize and paste so many images without seeing a speed issue.
But, it would be something cool to play with; perhaps I'll explore that in a follow-up post. Thanks!
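To sketch how that follow-up might work: if the server stitched a result set's thumbnails into one vertical sprite image, the client would need only a single image request, and each result row would show its own slice via `background-position`. The cell dimensions here are assumptions for illustration:

```javascript
// Build the inline style for the Nth cell of a hypothetical vertical
// sprite: fix the box to one thumbnail's size and shift the shared
// background image up by (index * cellHeight) pixels.
function spriteCellStyle(index, cellWidth, cellHeight) {
	return "width:" + cellWidth + "px;" +
		"height:" + cellHeight + "px;" +
		"background-position:0 -" + (index * cellHeight) + "px;";
}
```

Each result row would then get `style="#spriteCellStyle(...)#"` plus a shared `background-image` pointing at the one generated sprite; the open question from the comment (whether ColdFusion can paste the sprite together quickly enough per search) remains.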
Using data URLs for the images would eliminate the requests on non-IE browsers, as the image data would simply be encoded in base64 and returned in the same request as the data itself. You would still need to use a separate domain to serve the images for IE, though. I like being able to serve my static files from a separate domain anyway, for this very reason.
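The data URL idea can be sketched like this: base64-encode the image bytes on the server and inline them into the markup, so each thumbnail rides along in the suggest response instead of triggering its own request. The six-byte payload below is just the GIF file-header standing in for real thumbnail bytes:

```javascript
// Encode image bytes as base64 and build a data: URI that could be
// inlined directly into each result row's <img> tag.
const imageBytes = Buffer.from("GIF89a"); // placeholder for real thumbnail bytes
const dataUri = "data:image/gif;base64," + imageBytes.toString("base64");

// The server-side template would then emit:
//   <img src="data:image/gif;base64,..." />
// instead of a URL that costs a connection slot.
```

The trade-off is that inlined images can't be cached independently of the response that carries them, so this suits small, dynamic thumbnails better than large static assets.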
Following Ryan's suggestion and using base64 encoded images may also allow you to dequeue by aborting the XHR's.
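The abort idea is the standard pattern of keeping a handle on the in-flight XHR and cancelling it before firing the next one; once the images are base64-inlined, aborting that one request drops the stale thumbnails too. A minimal sketch, with the ajax function injected (and `suggest.cfm` as a hypothetical endpoint) purely so the sketch stays testable:

```javascript
// Track the in-flight suggest request so a newer keystroke can abort it
// before the stale response (and its inlined images) clogs the queue.
let pendingSuggest = null;

function suggest(ajax, term) {
	if (pendingSuggest) {
		pendingSuggest.abort(); // drop the stale request
	}
	pendingSuggest = ajax({ url: "suggest.cfm", data: { query: term } });
	return pendingSuggest;
}
```

With jQuery, the object returned by `$.ajax()` exposes exactly this `abort()` method, so `ajax` could simply be `$.ajax` in a browser.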
Thanks for explaining this so thoroughly, Ben. It seems like a very useful technique.
That's an interesting idea. I looked into base64 images a while back (unrelated) and found that it wasn't so cross-browser friendly. Not sure if it was just IE6 or all IE (as it seems @Ryan is implying). I'll do some more exploration into that.
data: URIs don't work in IE7 and prior, but are supported in IE8. I can't test it right now, so I'm not sure if earlier versions of IE degrade relatively gracefully (by showing nothing), or if they display a broken-image icon.
One of the things that can be done is to make use of the way browsers cache images. A lot of the time, people link directly to a script in a format like the one in your code: <img src="Domain/img.cfm?id=#id#" />. The problem with that is that the browser can't cache it, because it treats the GET params as unique data. The way around this is to use URL rewriting, for example: <img src="Domain/img/#id#/img.cfm" />. The browser then sees this as a unique, singular file and will cache it, decreasing the load time for other images that are the same.
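A sketch of that URL scheme: move the id out of the query string and into the path, and let a server-side rewrite rule (e.g. Apache's `RewriteRule ^img/([0-9]+)/img\.cfm$ img.cfm?id=$1`) map it back, so the browser sees a stable, cacheable path. The domain below is a placeholder:

```javascript
// Build the cache-friendly, path-based image URL the rewrite rule
// will translate back into img.cfm?id=... on the server.
function imageUrl(domain, id) {
	return domain + "img/" + id + "/img.cfm";
}
```

The same image id now always yields the same URL, so repeat views come out of the browser cache instead of consuming a connection slot.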
Ah, good to know. I can understand not supporting IE6; but, supporting IE7 is probably still necessary.
You make a really good point. In my example, I happened to be doing everything I could to prevent caching (since I wanted to be creating parallel request threads); but, if I wanted to leverage browser caching, building the unique URLs would definitely be a solid strategy.
I don't know about IE, but for Mozilla there are actually four settings buried in the config (my current values):
You're addressing the second one here, but not the first (comes out of the box as 2x the second).
And of course you never really know what Joe User's done to his browser ("Do you know what you're doing?" prompt aside), so the guy with those prefs set to 6/3 is gonna bog down pretty fast no matter what ;-).
I've only ever messed with the config in Firefox. Not sure what's available in the other browsers.
We do something similar for this when loading pages with a lot of images.
I noticed the browser would slow down on image heavy pages even though the bandwidth usage was minimal.
We have several sub-domains that are just for this purpose, e.g. imgs0.domain.com, imgs1.domain.com.
Then, for each image, a call is made to grab a different host.
<cfset thisImageHost = 0 />
<cfset imgHosts = [ "imgs0", "imgs1", "imgs2", "imgs3" ] />
<cfset thisImageUrl = ".domain.com/imagedb/products/180x210/" />

<cffunction name="getImageHost" returntype="string" output="false">
	<cfif thisImageHost GT 3>
		<cfset thisImageHost = 0 />
	</cfif>
	<cfset thisImageHost = thisImageHost + 1 />
	<cfreturn "http://" & imgHosts[ thisImageHost ] & thisImageUrl />
</cffunction>
Yeah, this looks like a pretty good practice. I have to look into setting up a "*" sub-domain in my registrar so I can start handling things like that in my server.
The only thing that would be frustrating about this approach is that you need to have a full URL, not just a relative url (meaning, you need the server name). This seems like just another thing to worry about when going from development to production servers.
This is the same method that Google uses on their Google Maps requests-- they use 4 sub-domains just for their image requests.
We had to follow suit because we show a lot of thumbs on each request-- especially on our wholesale site and our art archives (not public).
Managing the separate configurations is always an issue.
I have been using ANT to compile my builds now.
I can do a programmatic configuration using a local server to generate a ColdSpring XML list map.
In the XML, I include the list map bean to create the array of sub-domains, so my local build can have a local static relative URL, and my generated build can have the array of sub-domains to use when generating the HTML.
It could be done just as easily by providing a configuration variable with an array of sub-domains, or just a single relative path, depending on the environment.
The environmentConfig.cfc on RIA forge could handle this quite well.
We used the wildcard method at first, but we switched to the static URLs because we wanted a bit more control over it, and it used fewer resources overall (a wildcard lookup adds a bit more overhead).
This is a good article though-- I had situations like this using AJAX where I could not figure out why it was so slow, this is a common issue across the board.
I'm sure most people's solution to this would be to nuke the images. I like the point you made that you wanted to keep them to "create a rich, interactive experience," which is the whole point of AJAX. Persistence is key here. You didn't take a step backwards and remove the images; you found a solution. Good stuff man. I love it.
I think my desire to create a "*" sub-domain is simply a response to the "emotional" feeling that somehow I am gonna need to do a LOT of work. This, of course, is simply not true; it's really just a one-time setup if I want to create several sub-domains. So, as you are saying, going static could be good.
As far as the number of sub-domains used, any advice on how many to use? If Google uses 4, I have to assume that's a really good number :) I would also assume that at some point there is simply a diminishing return on the number of parallel threads (as there is probably a limit to how much the browser can take advantage of this, no matter how many sub-domains you use).
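One wrinkle worth noting with sub-domain rotation: if the host is picked round-robin per page render, the same thumbnail can come from a different host on each visit, splitting the browser cache. A common fix is to pick the host deterministically from the image id; a sketch, with hypothetical host names:

```javascript
// Deterministically map an image id to one of N hostnames, so the same
// image always comes from the same host and cached copies stay valid
// across page loads.
function hostForImage(id, hosts) {
	let hash = 0;
	for (const ch of String(id)) {
		hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple string hash
	}
	return hosts[hash % hosts.length];
}
```

Distinct ids still spread roughly evenly across the hosts, so the parallelism benefit is kept without sacrificing cacheability.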
Here is a great reference from Google about "Parallelize downloads across hostnames."
Thanks for the link. I keep wanting to try this out on something, but I don't know what. Maybe this blog! I kind of wish I had more stuff linked.
Would it be possible to cancel 'old' image requests using this snippet of code at Stack Overflow?
It essentially 'clicks' the stop button in your browser. So could you use it just before each ajax call, to halt the loading of previous images?
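For reference, the trick that kind of snippet relies on can be sketched like this: `window.stop()` is the standard call, and `document.execCommand("Stop")` was the legacy IE equivalent. The `win`/`doc` parameters are injected only to keep the sketch testable outside a browser:

```javascript
// Programmatically "press the stop button" to cancel pending loads
// (including queued image requests) before firing a new AJAX call.
function stopPendingLoads(win, doc) {
	if (typeof win.stop === "function") {
		win.stop();
		return "window.stop";
	}
	if (doc && typeof doc.execCommand === "function") {
		doc.execCommand("Stop"); // legacy IE fallback
		return "execCommand";
	}
	return "unsupported";
}
```

The blunt part is that it halts everything still loading on the page, not just the stale thumbnails, so it would need to be called carefully right before each suggest request.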