I am creating a small application with a JavaScript client (run in the browser) and a Node.js server, communicating using WebSocket.

I would like to share code between the client and the server. I have only just started with Node.js and my knowledge of modern JavaScript is a little rusty, to say the least. So I am still getting my head around the CommonJS require() function. If I create my modules using the 'exports' object, then I cannot see how I could use the same JS files in the browser.

I want to create a set of methods and classes that are used on both ends to facilitate encoding and decoding messages, and other mirrored tasks. However, the Node.js/CommonJS packaging system seems to preclude me from creating JS files that can be used on both sides.

I also tried using JS.Class to get a tighter OO model, but I gave up because I couldn't figure out how to make the provided JS files work with require(). Is there something I am missing here?

+3  A: 

Don't forget that the string representation of a JavaScript function is the source code for that function. You could simply write your functions and constructors in an encapsulated way so they can be toString()'d and sent to the client.
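A minimal sketch of that idea (the function name is just an example). Note this only works when the function is truly encapsulated, i.e. closes over nothing outside its own body:

```javascript
// Server side: a self-contained function can be serialised as source text.
function add(a, b) { return a + b; }
var source = add.toString();

// Client side: wrap the source in parentheses so eval treats it
// as a function expression and returns the rebuilt function.
var fn = eval('(' + source + ')');
console.log(fn(2, 3)); // 5
```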

Another way to do it is to use a build system: put the common code in separate files, then include them in both the server and client scripts. I'm using that approach for a simple client/server game via WebSockets, where the server and client both run essentially the same game loop and the client syncs up with the server every tick to make sure nobody's cheating.

My build system for the game is a simple bash script that runs the files through the C preprocessor and then through sed to clean up some junk cpp leaves behind, so I can use all the normal preprocessor stuff like #include, #define, #ifdef, etc.

Serialising JavaScript functions as strings never occurred to me. Thanks for the tip.
Simon Cave
+1  A: 

The server can simply send JavaScript source files to the client (browser), but the trick is that the client has to provide a mini "exports" environment before it can evaluate the code and store it as a module.

A simple way to make such an environment is to use a closure. For example, say your server provides source files via HTTP like http://example.com/js/foo.js. The browser can load the required files via an XMLHttpRequest and load the code like so:

var xhr = new XMLHttpRequest();
xhr.open('GET', 'http://example.com/js/foo.js');
xhr.onload = function() {
  var pre = '(function(){var exports={};'
    , post = ';return exports;})()';
  window.fooModule = eval(pre + xhr.responseText + post);
};
xhr.send();

The key is that the client wraps the foreign code in an anonymous function that is run immediately (a closure), which creates the "exports" object and returns it so you can assign it wherever you like, rather than polluting the global namespace. In this example it is assigned to the window property fooModule, which will contain the code exported by the file foo.js.

maerics
+4  A: 

If you want to write a module that can be used both client side and server side, I have a short blog post on a quick and easy method: http://caolanmcmahon.com/writing_for_node_and_the_browser.html

Alternatively there are some projects aiming to implement the node.js API on the client side, such as Marak's gemini.

You might also be interested in DNode, which lets you expose a JavaScript function so that it can be called from another machine using a simple JSON-based network protocol.

I'd post a link to gemini and dnode but stackoverflow won't let me post more than one link because I'm a new user!

Caolan
Gemini: http://github.com/marak/gemini.js
Caolan
DNode: http://github.com/substack/dnode
Caolan
Excellent. Thanks for the info, Caolan.
Simon Cave