Ben Nadel
On User Experience (UX) Design, JavaScript, ColdFusion, Node.js, Life, and Love.
I am the chief technical officer at InVision App, Inc - a prototyping and collaboration platform for designers, built by designers. I also rock out in JavaScript and ColdFusion 24x7.
Ben Nadel at Scotch On The Rocks (SOTR) 2011 (Edinburgh) with: Chris Laslett

Writing A RequireJS Plugin To Load Remote jQuery Templates

Posted by Ben Nadel

Over the past couple of weeks, I've started looking into RequireJS for asynchronous script loading and JavaScript dependency management. While I don't have a lot of experience with such things, there is definitely something about RequireJS that feels really good. All the stuff that I've tried so far just seems to work. In my last post, I used the Text plugin to load remote HTML templates. For this post, I wanted to take that concept one step further; this time, I want to try writing a RequireJS plugin that returns remote HTML templates in the form of detached jQuery DOM wrappers.


As I talked about in my last post, plugins in RequireJS are simply JavaScript modules that expose a given interface (i.e. the plugin API). So, when you see the Text plugin, for example, being used like this:

text!somefile.txt

... you can think of this as a call to the AMD (Asynchronous Module Definition) module:

./text.js

When a plugin is loaded, it must adhere to the RequireJS plugin interface, which exposes up to four methods, only one of which is required:

  • load (required)
  • normalize
  • write
  • pluginBuilder

Both write() and pluginBuilder() are used during the optimization and build process; and, since I haven't looked at that stuff yet, I didn't bother experimenting with these two functions. The normalize() method is used to help RequireJS properly cache resources; I played around briefly with this one, but couldn't get it to work as I had expected. The last method - load() - is the only required plugin method and takes care of actually loading and executing the plugin request.
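Before narrowing to load(), it may help to see the plugin shape in isolation. The following is a hypothetical "upper" plugin - not part of this demo - showing only how a load() implementation hands its result back to the loader via the callback:

```javascript
// A hypothetical "upper!" plugin - NOT part of this demo - showing
// the minimal plugin shape. Only load() is required by RequireJS.
var upperPlugin = {

	// I get called when a dependency like "upper!hello" is requested;
	// resourceName is everything after the "!".
	load: function( resourceName, parentRequire, onload, config ){

		// A real plugin would often delegate to parentRequire() to
		// fetch sub-resources; this sketch just transforms the name
		// and passes the result back to the loader.
		onload( resourceName.toUpperCase() );

	}

};

// Simulate what the loader would do for "upper!hello".
upperPlugin.load(
	"hello",
	null,
	function( result ){
		console.log( result ); // HELLO
	},
	{}
);
```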

To keep this experiment light, I am only going to look at the load() method. But, before we look at the plugin code, let's look at what kind of plugin we are going to make.

As I demonstrated in my previous RequireJS blog post, the Text plugin allows us to load remote HTML templates:

text!template.htm

Putting this value into a require() or define() statement will load the content of the "template.htm" file and pass the content as an argument to the respective callbacks. In my previous demo, I then took that content and transformed it into an actual detached DOM (Document Object Model) tree using the jQuery constructor:

	var template = $( templateContent );

As this is the way in which I will pretty much always be using the template content, I wanted to see if I could write a plugin that would encapsulate both the loading of the template content and the transformation of said content into detached DOM trees.

Let's assume, for this demo, that I define my template content using HTML Script tags. And, let's assume that I maintain a library of templates by placing multiple Script tags in a single remote file:

	<script type="text/x-application-template" class="templateOne">

		//
		// Some template content.
		//

	</script>

	<script type="text/x-application-template" class="templateTwo">

		//
		// Some template content.
		//

	</script>

	<script type="text/x-application-template" class="templateThree">

		//
		// Some template content.
		//

	</script>

With this kind of setup, it would be nice to have a RequireJS plugin that could, in a single step, load the given HTML content and then parse and return a given Script tag within said content. To do this, let's create a plugin - "template" - that uses both the template file path and the class name of the target script:

template!templates.htm:templateOne

In this case, the plugin is named "template" and is defined within the template.js file. The resource being passed to the template.js plugin is:

templates.htm:templateOne

Notice that we have a colon (:) separating the file path from the class name of the target Script tag (class = templateOne).
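In plain JavaScript, splitting that resource string is a one-liner. This sketch mirrors the parseResource() helper that appears in template.js below:

```javascript
// A sketch of the resource parsing: split on the colon to separate
// the template file path from the Script tag class name.
function parseResource( resourceName ){

	var resourceParts = resourceName.split( ":" );

	return({
		resourcePath: resourceParts[ 0 ],
		templateClass: resourceParts[ 1 ]
	});

}

var config = parseResource( "templates.htm:templateOne" );

console.log( config.resourcePath );  // templates.htm
console.log( config.templateClass ); // templateOne
```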

With this in mind, let's create our remote HTML template library:

templates.htm (Remote HTML Template Library)

	<!-- BEGIN: Friend Template. -->
	<script type="text/x-application-template" class="friend">

		<li>
			<span class="name">[NAME]</span>
			<span class="age">
				( <span class="value">[AGE]</span> )
			</span>
		</li>

	</script>
	<!-- END: Friend Template. -->

For this demo, our remote HTML template library contains only one template, which represents a Friend view.

Now, let's look at the code that will make use of this HTML template and our template plugin to synthesize a page using DOM manipulation.

	<!DOCTYPE html>
	<html>
	<head>
		<title>Writing A RequireJS Plugin</title>
	</head>
	<body>

		<h1>
			Writing A RequireJS Plugin
		</h1>

		<h2>
			Friends
		</h2>

		<ul class="friends">
			<!-- To be populated dynamically. -->
		</ul>


		<!-- Load the RequireJS + jQuery library. -->
		<script type="text/javascript" src="./require-jquery.js"></script>
		<script type="text/javascript">


			// Define some friend data.
			var friendsData = [
				{
					id: 1,
					name: "Tricia",
					age: 37
				},
				{
					id: 2,
					name: "Sarah",
					age: 29
				},
				{
					id: 3,
					name: "Libby",
					age: 40
				}
			];


			// Load our template and populate the page.
			require(
				[
					"template!./templates.htm:friend"
				],
				function( friendTemplate ){


					// Create our raw DOM node array to add to the page.
					var buffer = $.map(
						friendsData,
						function( friendData ){

							// Clone the friend template.
							var friend = friendTemplate.clone();

							// Set the name and age.
							friend
								.attr( "data-id", friendData.id )
								.find( "span.name" )
									.text( friendData.name )
								.end()
								.find( "span.age span.value" )
									.text( friendData.age )
								.end()
							;

							// Return the raw DOM node.
							return( friend.get() );

						}
					);

					// Add the nodes to the page.
					$( "ul.friends" ).append( buffer );


				}
			);


		</script>

	</body>
	</html>

As you can see, our RequireJS code is using one require() method call that defines the following dependency:

"template!./templates.htm:friend"

This is going to invoke our template.js plugin module to get the "friend" template from within our remote HTML library, templates.htm. And, once the template is loaded, we merge it with the data to create the following page output:


[Screenshot: Writing a RequireJS plugin to load remote HTML templates and convert them into a jQuery collection.]

As you can see, our Template plugin returned a detached DOM tree (within a jQuery wrapper) which we were then able to clone and manipulate. Now that we see what that produces, let's take a look at the actual RequireJS plugin definition. Please keep in mind that this is my first exploration of RequireJS plugin authoring and I did not attempt to make use of any caching or optimization.

template.js (Our RequireJS Template Plugin Module)

	// Define the TEMPLATE PLUG-IN.
	define(
		function(){


			// I load the given resource.
			var loadResource = function(
				resourceName,
				parentRequire,
				callback,
				config
				){

				// Parse the resource - extract the path to the template
				// file and the class name of the script.
				var resourceConfig = parseResource( resourceName );

				// Get the path to the template file.
				var resourcePath = resourceConfig.resourcePath;

				// Get the class name of the script tag (with our markup).
				var templateClass = resourceConfig.templateClass;


				// Load the template class.
				parentRequire(
					[
						("text!" + resourcePath)
					],
					function( templateContent ){

						// Wrap the template content in a DIV tag just
						// to make sure we have a parent element.
						templateContent = (
							"<div>" + templateContent + "</div>"
						);

						// Create the templates node.
						var templates = $( templateContent );

						// Get the template that was requested by the
						// class name.
						var targetTemplate = templates.find( "script." + templateClass );

						// Create a jQuery DOM element out of the
						// template markup and pass it back to the
						// loader.
						callback(
							$( targetTemplate.html() )
						);

					}
				);

			};


			// When the resource name is passed to this plugin, it is in
			// the form of:
			//
			// resourcePath:className
			//
			// ... where resourcePath is the path to the HTML file that
			// contains our templates and className is the class attribute
			// of the Script tag that contains our template markup.
			var parseResource = function( resourceName ){

				// Split the resource into parts.
				var resourceParts = resourceName.split( ":" );

				// Get the resource path to our HTML file.
				var resourcePath = resourceParts[ 0 ];

				// Get the class name of our template markup container.
				var templateClass = resourceParts[ 1 ];

				// Return the resource configuration.
				return({
					resourcePath: resourcePath,
					templateClass: templateClass
				});

			};


			// --------------------------------------------------- //
			// --------------------------------------------------- //


			// Return the public API for the plugin. The only required
			// function in the plugin API is the load() function.
			return({
				load: loadResource
			});


		}
	);

The meat of this plugin is the loadResource() function, which is exposed as the required load() API method. The loadResource() method parses the incoming resource, extracting the template path name and the class name of the target Script tag. It then uses the Text plugin to load the given template file. Once loaded, the plugin parses the file content using the jQuery constructor, locates the target script tag, and returns a new jQuery wrapper for the contents of said script tag.

Clearly, there is a huge opportunity for caching here, which I am not taking any advantage of for this demo. The good news is, since I am using the Text plugin internally, the content of the remote HTML file is cached automatically. So, while I have to keep parsing the HTML content with each plugin request, at least I only have to incur the HTTP request cost once.
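One way to layer caching on top (not implemented in the demo) would be to memoize on the full resource name, so that each Script tag is located and parsed at most once. This is a hypothetical sketch; loadAndParse stands in for the text-loading and jQuery-parsing work that loadResource() performs:

```javascript
// Hypothetical caching layer - NOT in the demo plugin. Parsed
// templates are memoized by resource name so the expensive parse
// work happens at most once per resource.
var templateCache = {};

function getTemplate( resourceName, loadAndParse, callback ){

	// Cache hit - hand back the previously parsed template.
	if ( templateCache.hasOwnProperty( resourceName ) ){

		callback( templateCache[ resourceName ] );
		return;

	}

	// Cache miss - do the expensive work once, then remember it.
	loadAndParse(
		resourceName,
		function( template ){

			templateCache[ resourceName ] = template;
			callback( template );

		}
	);

}
```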

As with everything I've tried so far involving RequireJS, this stuff just worked. I know there is a lot more to plugin authoring that has yet to be examined; but, for the stuff I did look into, everything felt very intuitive. The RequireJS library continues to impress me.




Reader Comments

...make my day! Cool image, must be first time I've seen it :)

Interesting post and experiment, especially the idea of "just" wrapping the existing module for your own special use case - thereby saving dev time and, like you said, network requests.

So, where do you go from here? Planning on expanding the plugin to "full blown" status?


@Atleb,

As far as next steps, in this plugin, there is a good amount of caching that could be done - parsing of the template file and parsing of the individual Script tags. Right now, every time a user requests a particular template, the Script tag content is relocated and re-parsed. Clearly, I could do some caching, but I wasn't sure how to best approach that.

Anyway, overall I'm just enjoying digging into RequireJS. They have some really cool features that seem to be quite easy to use :) So that's beasty.


I am working on exactly the same setup, except we have just dropped jQuery Templates because it is no longer being actively developed as jQuery and Microsoft have decided not to take it past beta. (see docs)

We have now converted to Handlebars.js for templates and have abstracted the creation of the compiled template to a factory that handles caching. We are also looking at compiling templates on the server in pure JS (Handlebars supports this on Node and Rhino).

We currently have a home-grown dependency management library for requiring JS, but I would LOVE to move over to require.js as I want standardization so 3rd party library requires play nice.


RequireJS is indeed very useful, and plugins make it even more awesome; I've been using it on most projects. The AMD format helps the development process a lot, since you can split your code into multiple files during development (good for organization) and just run the optimizer before deployment to combine all the files and reduce the number of HTTP requests. Another major improvement is that by using the AMD format you also avoid global variables and favor good practices (the page "Why AMD?" on the RequireJS website gets into more detail about why this format is good).

I've been following a slightly different approach: instead of combining all my templates into the same file, I create separate files for each template and simply load them using the "text!" plugin. If you set the RequireJS optimizer build setting "inlineText", it will merge all the text files loaded via the "text!" plugin into your JS file, avoiding additional file requests ( https://github.com/jrburke/r.js/blob/daafe42980/build/example.build.js#L100-102 ).
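For reference, that setting lives in the r.js build config. A minimal sketch - file names here are illustrative, and note that inlineText defaults to true in recent r.js versions - might look like:

```javascript
// Hypothetical r.js build config (e.g. build.js) - file names are
// illustrative. With inlineText enabled, files loaded via the "text!"
// plugin are inlined into the optimized output file.
({
	baseUrl: ".",
	name: "main",
	out: "main-built.js",
	inlineText: true
})
```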

There is a plugin "database" on RequireJS wiki with useful plugins ( https://github.com/jrburke/requirejs/wiki/Plugins/ ).

AMD FTW!


@Luke,

Yeah, my templating system is mostly home-grown. I've heard great things about Handlebars, though. I like the idea of compiling things on the server as well. I work mostly with ColdFusion; but, I could probably use ColdFusion to do all the string-manipulation that is typically involved in the compiling of the template rendering.

@Miller,

I'm loving the benefits of the modularity that you are talking about - organization and non-global variables. I haven't looked at the compiler / optimizer yet, though. I do have node.js running on my local machine, so I think I should be able to run it without a problem.

I like the loading of text! values inline to the file! That is badass. That should cut down on a lot of HTTP calls that don't need to be performed in production.

I don't yet have my head wrapped around the balance between file size and number of HTTP requests. In a presentation I attended by Kyle Simpson on LABjs, I think I remember him talking about a sort of optimal file size that wasn't too small or too large... but I don't want to put words in his mouth - it was several months ago.

What I do know is that I am really liking RequireJS. Perhaps the compiler is the next thing I will experiment with.


If you want you can run the optimizer on Rhino (Java), I've been favoring this approach on most of my projects since installing nodejs on windows is hard and most of my coworkers already have Ant installed on their machines (Ant comes pre-installed on Mac and some IDEs) - here is an Ant task for it: https://gist.github.com/825117

About multiple parallel downloads I haven't seen conclusive data about it, there was a huge thread at HTML5-Boilerplate repository but no real conclusion...

I've been breaking my apps into multiple deploy files only when it makes sense - if some parts of the code may not be needed on the home page or initial load, for instance - but if the whole app has less than 80KB of JS (and is loading fast), I don't care about splitting it that much. For me, the greatest benefit is code organization; better performance and lazy-loading is just an added bonus.

PS: some of my projects have 100+ small JS/template files (with ~100 LOC each) which gets compiled into 1-10 JS files for production.


@Ben,

As long as you have sites / solutions that you might want to be snappy on mobile as well, chances are pretty good that one JS file will be better. As long as you minify and gzip, the actual download size should counter the overhead of an extra network connection :)

IMHO, the main reason for having split-up files is when there is a large difference in the update pace, so you could get more caching of parts of the "bundle" as opposed to others. Or, on the same thread, using a CDN or common URL across sites for libraries like jQuery.

Chances are you could save on the KB with a bit of extra compression on the images used instead (I know we could, and can, for our site - especially running a news site, as the images are mostly new ones each time the user visits).

- your mileage may vary :)

Looking forward to the v2 version!


@Miller,

I can imagine that loading 100 small files individually, even if TINY, would incur a significant HTTP overhead; if nothing else, it would certainly delay the loading of other assets like images.

I'm on a Mac and have Node installed, so I should be able to run it. However, when I tried running it the other day, it didn't seem to do anything. I used the intro example they have on the site, hit enter, and the console just sits there with a ">" on the next line. Clearly, something is not working properly :( Or, more likely, I am not understanding how to get this thing running. I'll have to give it another look.

@Atleb,

Good point on situations where partial caching would give you a good benefit. Right now, my JavaScript loading is not well optimized, so, ironically, it does get to take advantage of the browser-based caching.


Another benefit I would point out with optimization + AMD templates, at least concerning my micro-templates implementation, is pre-compilation. Instead of asking the client to compile them (doing all the string.replace() things), and instead of being eval'd functions, they can just be functions doing one thing: some array.push()...join(). So there's less work on the client side in production, which should theoretically mean better perf, and, my favorite, no eval. Almost every template engine uses eval, and... well... everybody knows it's evil, right? :)
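As a rough illustration of what such a pre-compiled template might look like (the template shape here is hypothetical, not the output of any particular engine), the build step could emit a plain named function that assembles markup with push()/join() and no eval:

```javascript
// Hypothetical pre-compiled template - a plain function emitted by a
// build step, assembling output with push()/join() instead of eval.
function friendTemplate( data ){

	var buffer = [];

	buffer.push( "<li><span class=\"name\">" );
	buffer.push( data.name );
	buffer.push( "</span> ( " );
	buffer.push( data.age );
	buffer.push( " )</li>" );

	return( buffer.join( "" ) );

}

console.log( friendTemplate( { name: "Tricia", age: 37 } ) );
// <li><span class="name">Tricia</span> ( 37 )</li>
```

Because this is a named function in a real file, a debugger can also point at a meaningful file path and line number, which is the "real LOC effect" described below.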

Actually, I'm wrong: my favorite feature is the "real LOC effect". Debugging templates is sometimes pretty hard because they're anonymous functions in most template engines. Debuggers (at least the WebKit inspector) won't give you any file path and line number to search because there is none! In the best case, it can give you the LOC where the eval occurred, but it will be the same for every template, and tracing from this point is boring.

Optimization via r.js is easy and fast, and it can optionally give you a non-minified version of your code, in one file, including those compiled templates. Executing this one big file (I'm calling it the "build"), including those annoying bugs, will help you find where those bugs are, because templates are not anonymous eval'd functions anymore!

