To test this, I created a demo page that contained a fixed-position image. Then, I created a function that did nothing more than nudge the image to the right a bunch of times and then nudge it back to the left a bunch of times. With each nudge, the function re-evaluated the position of the image in order to execute the nudge.
Once I had this in place, I then tried to execute this function several hundred times in "parallel" using a combination of setTimeout(), setInterval(), and AJAX success callbacks:
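The original demo code isn't reproduced here, but the idea can be sketched roughly as follows. This is a simplified stand-in: a plain numeric position replaces the DOM image, and mover() and mismatches are illustrative names, not the actual demo code:

```javascript
// Simplified stand-in for the demo: a numeric "position" replaces the
// fixed-position image. mover() nudges it right and then back left,
// verifying the expected position after every single nudge.
let position = 0;
let mismatches = 0;

function mover() {
  for (let i = 0; i < 100; i++) {
    const expected = position + 1; // re-evaluate current position...
    position += 1;                 // ...then nudge right
    if (position !== expected) mismatches++;
  }
  for (let i = 0; i < 100; i++) {
    const expected = position - 1;
    position -= 1;                 // nudge back left
    if (position !== expected) mismatches++;
  }
}

// Queue up many "parallel" executions; the event loop actually runs
// each one to completion before starting the next, so the position
// check never fails.
for (let i = 0; i < 50; i++) {
  setTimeout(mover, 0);
}
mover(); // one synchronous run as well
```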
Notice that after each nudge, I check to see if the current position of the image is at the expected value. This would always be true unless two parallel executions of the function were nudging the same image at the same time (thereby moving the image past its expected position).
This page takes about 5 minutes to run with all the position calculations and AJAX requests, so I won't bother trying to demo it in a video. What I can say, however, is that I never once logged a single unexpected image position. I don't know if this is conclusive in any way; but, it appears on the surface that setTimeout() and setInterval() do not create race conditions.
If anyone sees anything glaringly wrong with this logic, please let me know. Each execution of the mover() function seems to take long enough to ensure that timers of equal delay couldn't just happen to finish in serial by coincidence. As such, I can only conclude that the serial execution is enforced by the browser.
Well if you think about it, the JS VM in browsers is single-threaded (event loops ftw). Each call gets pushed onto the call stack. Even for non-blocking calls like setTimeout, setInterval, and XHR. They just get queued up on the event loop and will run after one another, which means they will never ever have a race condition. (As far as I know)
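That run-to-completion behavior is easy to observe: even a zero-delay timer callback can't jump in front of code that is already running. (The order array below is just for illustration.)

```javascript
// Even a zero-delay timer cannot interrupt currently-running code;
// its callback is queued and runs only after the current code finishes.
const order = [];

setTimeout(() => order.push("timer callback"), 0);

for (let i = 0; i < 1e6; i++) { /* busy work */ }
order.push("synchronous code");

// By the time the timer callback runs, "synchronous code" is already
// in the array -- queued callbacks never preempt running code.
```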
Ben et al,
It turns out that you can set up race conditions if you're using AJAX and one server response happens faster than the next. Instead of requesting a file that is plain text, request a ColdFusion file that has some sort of random delay built into its response. This may trigger a different result.
It's starting to shore up in my mind. @Justice, I believe we've actually talked about this briefly before in the comments of another post (though I was not able to find it). Good thoughts on the eventing.
I've played around a bit with Node.js and I get the event loop they talk about; but, as far as execution per-loop, I don't think I fully understood how things happened.
That's a different kind of race condition - that has to do with the order in which AJAX requests are returned.
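That AJAX-ordering hazard can be modeled without a server: responses come back in latency order, not send order, so a naive "last callback wins" handler can end up showing stale data. The request objects below are purely illustrative:

```javascript
// Two requests are sent in order, but the first one is slower, so its
// success callback fires last and overwrites the newer result.
const requests = [
  { sentAt: 1, payload: "stale result", latency: 30 },
  { sentAt: 2, payload: "fresh result", latency: 5 },
];

// The browser invokes success callbacks in arrival (latency) order:
const arrivalOrder = [...requests].sort((a, b) => a.latency - b.latency);

let displayed = null;
for (const response of arrivalOrder) {
  displayed = response.payload; // naive handler: last callback wins
}
// displayed ends up as the slow request's stale payload, even though
// the fresher request was sent later
```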
One way you can run into race conditions like this in JS is by using Web Workers:
With multiple web workers doing the same thing, you could certainly run into race conditions because the web workers work outside the single-threadedness of your normal JS (which is the whole point--to allow you to do expensive processing that doesn't interfere with the normal JS execution process.)
At this point, I've only read about Web Workers but have not used them. But, they work through messaging, right? So, at some point they have to dip back into the single-threaded main page; so, would they just fall back into the event loop like an AJAX callback?
Right, I didn't pick up from your post which kind of race condition you were trying to recreate. The event loop will otherwise handle things in a consistent order. I have run into crazy race conditions with AJAX returns so I just warned about it because it has bitten me more than once :)
No worries; I've definitely heard of people having AJAX order problems :) Someone hinted to me the other day that jQuery 1.5 may have actually added some serial-AJAX queuing situation; but, I've not actually seen or read anything about it.
But don't workers only have access to the worker global scope (not the window global scope)? They won't be able to affect that aspect. I would also assume the messaging API listening on the window for worker messages would be event-based as well and not have any race conditions.
Correct, web workers may run in their own fibers/threads/processes; but because they communicate with each other only via message-passing, and because web workers are not permitted to touch the DOM, you will not see odd race conditions with them.
Each callback being executed on the main thread in response to a message from a worker will execute *to completion* before another callback may be executed on the main thread.
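In other words, the main thread dispatches worker messages one at a time; modeling that dispatch makes the guarantee explicit. (The onMessage handler and log below are a hypothetical model, not the actual Worker API.)

```javascript
// Model of run-to-completion message dispatch: each queued message's
// handler runs from start to finish before the next one is delivered.
const log = [];

function onMessage(msg) {
  log.push(`start ${msg}`);
  // ...any amount of work here can never be interrupted by the
  // handler for another worker's message...
  log.push(`end ${msg}`);
}

// Two messages queued from workers are delivered strictly in turn:
for (const msg of ["worker-A", "worker-B"]) {
  onMessage(msg);
}
// log holds start/end pairs in order; the handlers never interleave
```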
Too bad we can't see into the browser engine, how it's handling each request, and whether it's taking too many CPU cycles, as we can with the Windows Task Manager.
But I think you're right about being concerned.
These days, everyone wants an AJAX-powered something or other, and have we really given as much concern to the eventual security and performance bottlenecks?
Are there better tools or ways or methods to debug jquery/ajax?
If you use google-chrome:
* you can browse to about:memory to see memory usage per-tab
* you can right-click the top and click Task Manager to see CPU usage per-tab
* and you can hit ctrl+shift+j to see network requests (including timing, caching, etc) for the current tab and you get a console and a debugger.
The security concerns arising from use of AJAX are already handled by good web-development frameworks.