Ben Nadel
On User Experience (UX) Design, JavaScript, ColdFusion, Node.js, Life, and Love.

Writing My First Node.js Module And Event Emitter

By Ben Nadel on

Last week, I started to experiment with Node.js, a server-side JavaScript runtime environment. In my example, I created a one-page web service that spun up an HTTP server and listened for CRUD-style requests on a cached data collection. The example was not huge; but, had it been any bigger, a single-page approach would have been far too unwieldy to maintain. As such, I wanted to take the existing code and refactor it into Modules. Modules are cohesive bits of code, somewhat akin to Classes, that can be loaded into a Node.js application.
In traditional JavaScript, you might use a self-executing function to define a Class and then return either a singleton or a class constructor:

	(function(){

		// Define your class.

		return( Singleton || Constructor );

	})();

This allows you to create a local, isolated memory space in which your class can be defined without the risk of global-namespace pollution. From what I've read, Node.js uses this same approach; but, some of the underlying mechanics are implemented behind the scenes. Specifically, Node.js creates an implicit local memory space for your module; and, rather than using a return() statement, Node.js provides an "exports" collection for publishing your module API.

So, whereas a function-based module definition might look like this:

	// Define our module.
	(function(){

		// Counter variable.
		var counter = 0;

		// Return public API.
		return({
			countUp: function(){
				return( ++counter );
			}
		});

	})();

... a Node.js module might look like this:

	// Define our module.

	// Counter variable.
	var counter = 0;

	// Export our public API.
	exports.countUp = function(){
		return( ++counter );
	};

As you can see, the local scope is implied; and, we're using an exports-based exposure rather than a return() statement.

Furthermore, modules are automatically cached by the Node.js application upon first load. As such, repeated calls to require() - the global method that loads modules - will all result in a reference to the same cached object.

Now, getting back to my web services example, I wanted to refactor my code into modules. In my first blog post, I already had a GirlService object; so, clearly that would move into its own module. But, I also wanted to extract some of the "routing" logic into its own module as well. What I ended up with was a girl-service.js Domain module and a controller.js Controller module.

In addition to exploring modules, I also wanted to use this as an opportunity to delve deeper into an asynchronous mindset. One of the revolutionary things about server-side JavaScript is that it is very heavily evented and non-blocking. When it comes to data I/O (input/output), this means that I can't simply return data; rather, I have to request data and then "notify" observers when that data is "ready."

Notification can happen in two different ways: either through callbacks or through true publication-and-subscription-based event management (i.e. Pub/Sub). With two layers of separation between the HTTP server and the cached data - Controller and Domain Model - I decided to make use of both of these approaches.

For event subscription, Node.js provides an EventEmitter module. This module provides an on() method for event subscription and an emit() method for event publication. It should be noted that I don't truly feel that the EventEmitter is necessary for this particular application; I am using it more as a vehicle for learning and less as a best-of-breed solution. Honestly, I am quite sure that callback-based notification would actually have been a far better solution.

That said, let's look at some code. Here is the root JavaScript file that sets up the HTTP server to listen for web service requests. Notice that I am using the require() method to include the controller module.

Server.js

	// Include the necessary modules.
	var sys = require( "sys" );
	var http = require( "http" );

	// Include the Controller layer.
	var controller = require( "./controller.js" );


	// ---------------------------------------------------------- //
	// ---------------------------------------------------------- //


	// Create an instance of the HTTP server.
	var server = http.createServer(
		function( request, response ){

			// Our controller is an instance of the Event Emitter. We
			// will be using this to listen to controller events. We
			// could have handled this with Success and Fail callbacks;
			// but, I wanted an excuse to experiment with the Event
			// stuff as it appears to be quite central to Node.js.

			// Bind to the "Data" event. This is the event that gets
			// emitted when the controller has data to return.
			controller.on(
				"data",
				function( data ){

					// Set the 200-OK header.
					response.writeHead(
						200,
						{ "Content-Type": "text/plain" }
					);

					// Return the response from the API request.
					response.write( JSON.stringify( data ) );

					// Close the request.
					response.end();

				}
			);

			// Bind to the "Error" event. This is the event that gets
			// triggered when a success response cannot be returned. This
			// will either be because an error occurred; or, because the
			// record requested by the API cannot be found.
			//
			// NOTE: For our simple purposes, we are going to assume that
			// the errorType is simply an HTTP Status Code. This will
			// keep our logic very simple for this exploration.
			controller.on(
				"error",
				function( errorType ){

					// Set the error header (really just an HTTP
					// status code for our purposes).
					response.writeHead(
						errorType,
						{ "Content-Type": "text/plain" }
					);

					// Close the request.
					response.end();

				}
			);

			// Pass the request off to the controller.
			controller.handle( request, response );

		}
	);

	// Point the server to listen to the given port for incoming
	// requests.
	server.listen( 8080 );


	// ---------------------------------------------------------- //
	// ---------------------------------------------------------- //


	// Write debugging information to the console to indicate that
	// the server has been configured and is up and running.
	sys.puts( "Server is running on 8080" );

As you can see, for each request, I am binding to the "data" and "error" events of the controller. Then, I pass the request off to the controller for routing and simply let the "eventing" happen.

WARNING: I am re-binding the event listeners for every single request. This is a really horrible thing to do because it means that the list of subscribers increases by 2 for every single request! Like I said above, this is not a good solution - I am simply using the EventEmitter class here so that I can explore the way in which it works.

Now that you've seen the HTTP server configuration, let's take a look at the controller layer:

Controller.js

	// Include the event emitter class - the controller is a specialized
	// instance of the event emitter and will emit the events:
	//
	// - data / response
	// - error / errorType (HTTP Status code)
	var EventEmitter = require( "events" ).EventEmitter;

	// Include the Girl Service layer.
	var girlService = require( "./girl-service.js" );


	// ---------------------------------------------------------- //
	// ---------------------------------------------------------- //


	// Create an instance of our event emitter.
	var controller = new EventEmitter();


	// Add a handle method to the event emitter instance (controller)
	// so that the HTTP server will be able to pass the request off for
	// proper routing.
	controller.handle = function( request, response ){

		// We are going to be looking at URLs for RESTful commands.
		// These will be in the form of:
		//
		// NOTE: I am using the term RESTful in the loosest sense.
		// Really, this is just easy for illustration purposes.
		//
		// girls/get
		// girls/{id}/get
		// girls/{id}/delete
		// girls/add/{name}

		// Define our patterns.
		var patterns = {
			getAll: new RegExp( "girls/get", "i" ),
			getGirl: new RegExp( "girls/(\\d+)/get", "i" ),
			deleteGirl: new RegExp( "girls/(\\d+)/delete", "i" ),
			addGirl: new RegExp( "girls/add/([^/]+)", "i" )
		};

		// Strip off the leading and trailing slashes.
		var restUri = request.url.replace(
			new RegExp( "^/|/$", "g" ),
			""
		);

		// Holder for the pattern match (declared locally so we
		// don't leak an implicit global).
		var match = null;

		// Loop over the patterns to see if any match.
		for (var patternKey in patterns){

			// Try to match the pattern against the URL.
			if ( (match = restUri.match( patterns[ patternKey ] )) ){

				// Pass the request off to the service layer. Since
				// the service layer is performing asynchronous I/O
				// (theoretically), we need to pass it a callback so
				// that the service layer can alert us to data events
				// when they are available.

				// Build the arguments. Our last argument will always
				// be a callback for our asynchronous API. In this case,
				// the callback will be expecting an API response for a
				// successful call; OR, a null response for a record that
				// could not be found.
				var apiArguments = [function( apiResponse ){

					// Check to see if we have a valid API response.
					if (apiResponse){

						// The API request was successful - announce
						// the data event.
						controller.emit( "data", apiResponse );

					} else {

						// The API request was not successful - announce
						// the error event.
						controller.emit( "error", 404 );

					}

				}];

				// If there is a captured group in the regex pattern
				// that we used above, add it as the first argument to
				// our collection of service-layer invocation arguments.
				if (match.length > 1){

					// Prepend the captured group (an ID) to the list
					// of arguments used to invoke the service layer.
					apiArguments.unshift( match[ 1 ] );

				}

				// Invoke the service layer (remember, the last argument
				// of our invocation array is always the callback for
				// asynchronous I/O).
				girlService[ patternKey ].apply(
					girlService,
					apiArguments
				);

				// The RESTful URL can only match one pattern.
				// Since we found a match, return out of the request
				// handler as there is nothing more we can do here
				// until the data-callback is triggered.
				return;

			}

		}

		// If we have made it this far, then the incoming request did
		// not match up with any known API signature. As such, we will
		// announce (emit) a server error.
		controller.emit( "error", 500 );

	};


	// ---------------------------------------------------------- //
	// ---------------------------------------------------------- //


	// Expose the controller / event emitter. Since we are exposing
	// the whole object, rather than just an API interface, we are
	// redefining the entire exports value.
	module.exports = controller;

As you can see, the controller object is an instance of the EventEmitter class. Once it is instantiated, I then augment it with a handle() method which provides for request routing. I stated above that Node.js modules provide an "exports" collection for publishing the module API. In this particular module, however, notice that rather than curating an API, I'm actually reassigning the entire exports reference. In this way, I can allow the controller instance to define the API and I don't have to worry about creating method proxies between the calling context and the controller instance.

The controller itself works off of the EventEmitter class. But, internally, the controller interacts with the GirlService using callbacks. All of the data I/O methods in the service layer might be asynchronous; as such, when the controller makes a request to any of the CRUD methods, it passes in a callback function which, when invoked, uses the provided data to trigger (emit) an event for all of its subscribers (i.e. the HTTP server).

The GirlService object is pretty much the same as it was in my first blog post. The only critical difference is that it uses callbacks rather than return() statements to deliver data:

Girl-Service.js

	// Create our collection of girls. This collection is keyed by the
	// ID of the girl.
	var girls = {};

	// Keep a running auto-incrementer.
	var primaryKey = 0;


	// ---------------------------------------------------------- //
	// ---------------------------------------------------------- //


	// Now, we have to define the external, public API of the module.
	// Each of the methods in this API gets "exported" as part of the
	// exposed object.


	// All of these methods deal with data READS and WRITES. In Node,
	// this kind of I/O is supposed to be asynchronous; that is, a
	// method that reads or writes to storage can't return a value
	// directly (as it is non-blocking... for the most part). Therefore,
	// we have to pass callbacks to all of our I/O methods so that our
	// calling context can be *alerted* to data events.


	// I am creating a noop function so I can simplify the logic
	// surrounding my callbacks. With a noop() (read as No-Op), I can
	// always have a reference to *a* callback.
	var noop = function(){};


	// I add a girl to the collection.
	exports.addGirl = function( name, callback ){

		// Make sure a callback is defined.
		callback = (callback || noop);

		// Create the new girl instance.
		var girl = {
			id: ++primaryKey,
			name: name
		};

		// Add it to the collection.
		girls[ girl.id ] = girl;

		// Pass the girl to the callback (the calling context).
		callback( girl );

		// Return this object reference to allow for method chaining.
		return( this );
	};


	// I delete the girl with the given ID.
	exports.deleteGirl = function( id, callback ){

		// Make sure a callback is defined.
		callback = (callback || noop);

		// Get the girl.
		var girl = (girls[ id ] || null);

		// If the girl exists, delete her.
		if (girl){
			delete girls[ girl.id ];
		}

		// Pass the girl to the callback (the calling context).
		callback( girl );

		// Return this object reference to allow for method chaining.
		return( this );
	};


	// I return the girl with the given ID.
	exports.getGirl = function( id, callback ){

		// Make sure a callback is defined.
		callback = (callback || noop);

		// Pass the girl to the callback (the calling context).
		callback( girls[ id ] || null );

		// Return this object reference to allow for method chaining.
		return( this );
	};


	// I get all the girls.
	exports.getAll = function( callback ){

		// Make sure a callback is defined.
		callback = (callback || noop);

		// Create a holder for our ordered collection.
		var orderedGirls = [];

		// Loop over the primary keys to build up the collection
		// of ordered girls.
		for ( var i = 1 ; i <= primaryKey ; i++ ){

			// Check to see if a girl exists at this key.
			if (girls[ i ]){

				// Add this girl to the result in order.
				orderedGirls.push( girls[ i ] );

			}

		}

		// Pass the collection to the callback (the calling context).
		callback( orderedGirls );

		// Return this object reference to allow for method chaining.
		return( this );
	};

As part of the local memory space of the GirlService module, I have defined a no-operation function, noop(). This is an empty function that does nothing more than simplify my callback logic. It's always possible that the calling context won't pass in a callback to my CRUD methods. As such, I can use the noop() function to default the value of the callback. This way, I don't have to put logic around my callback invocation (to test for callback existence); I can just assume that it is always available, even if it doesn't do anything.
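The pattern in isolation (the getAnswer() method here is just for demonstration):

```javascript
// A no-operation function: safe to invoke, does nothing.
var noop = function(){};

// Hypothetical async-style method with an optional callback.
var getAnswer = function( callback ){

	// Default the callback so the invocation below never needs
	// an existence check.
	callback = (callback || noop);

	// Always safe to invoke.
	callback( 42 );

};

getAnswer(); // no callback passed - the noop absorbs the call
getAnswer(
	function( value ){
		console.log( value ); // prints 42
	}
);
```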

This example is quite sloppy; the use of the EventEmitter actually causes more problems than it solves (albeit ones that we don't necessarily feel at this scale). But, my understanding of Node.js is so minimal at this point that I don't mind making horrible decisions if it means that I get to learn more about the core libraries. Once I get a better handle on what Node.js provides, I am sure that I can start looking at more purposefully structured solutions.




Reader Comments

Hi Ben,

This is a great post. I like the way you learn new things ;) We have all written some bad, or at least not really optimized, code when experimenting with new features. (Confirm that I'm not alone! :P)

To return to Node.js, I'm very excited to see your code! I never had the opportunity to experiment with server-side JavaScript, and I confess that it troubles my brain. Working with events like this is not familiar to me. My server-side skills are limited to ColdFusion (and others like PHP), and this radically different method is very exciting to me!

Thanks for your post. I hope you continue to publish your experimentation!

@Loic,

I am glad you are liking this kind of stuff. I too am having some difficulty wrapping my brain around this radical shift from ColdFusion-style application design. That's why I'm trying to just take it one step at a time, one feature at a time. I'll definitely keep posting anything I learn :)

@Ben
Nice posts, yet again you post on what I am doing myself at the moment, AWESOME!

I found that it made it a lot easier to get used to once I started using Express & Connect.

For those who haven't looked into Node yet, the documentation appendix shows a list of the 'standard' modules, which helps with sorting out which modules to use as core. I found the documentation and examples very useful for getting used to Node.
http://nodejs.org/docs/latest/api/appendix_1.html

Great to see you getting people interested in NodeJS Ben, great work, best server to come along since... well... ColdFusion! If only Adobe would produce an integration layer and allow control of Node from the CF Admin. Then we could have the best of both worlds.

@Marcel,

Ha ha, awesome timing then - great minds clearly think alike :D

That's a good list of resources in the appendix. I think tomorrow, I'm gonna break down and install NPM. I've been trying not to, so that I can get a feel of how the modules get installed. But, some of them have just too many dependencies (some of which need to be compiled and involve make files and stuff like that... which is just a bit out of my league).

Someone just told me on Twitter today, however, that NPM can install things in a local directory (ie. per-app directory rather than in a global Node directory), which is what I want to stick to while I'm experimenting. No use in installing a bunch of junk for playing and not needing it... maybe? I guess it really doesn't do much harm.

Certainly, this stuff is a lot of fun! I hope we can get some good idea-exchange going.

Hey, I am a student... I just started working on Node.js, but this platform makes my mind spin... probably because I am working with this style for the first time. I want to know how I can call a search function (controller) without giving it the ID to search - by giving the name or any other attribute from the db in Node. Like how we search at sign-in: just the user name, then compare the password... to log in.

I know that this post is rather old, but it does rank high for relevant searches.

It seems to me that `controller` is in the global state and thus you're binding to the emitted `data` event whenever controller emits it. Doesn't the code above leak data across requests?