Hi,

I'm developing an HTML5 browser multiplayer RPG with node.js running in the backend and a WebSocket plug-in for client data transfer. The problem I'm facing is accessing and updating user data; as you can imagine, this will be happening many times a second even with only a few users connected.

I've done some searching and found only two MySQL plug-ins for node.js, both in early development, and I've figured that querying the database for every little action a user makes isn't efficient.

My idea is to have node.js access the database through PHP when a user connects and retrieve all the information related to that user. That information is then stored in a JavaScript object in node.js, and the same happens for every user playing. Updates are applied to the object while the user plays. When a user logs off, the data held in the object is written back to the database and removed from the object.
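Roughly, the idea looks like this (just a sketch; the PHP endpoints get_user.php / save_user.php and the function names are placeholders I made up for illustration):

    const http = require('http');

    const onlineUsers = {}; // userId -> cached user record

    // On connect: ask the (hypothetical) PHP script for this user's data
    // and cache it in memory.
    function loadUser(userId, callback) {
      http.get('http://localhost/get_user.php?id=' + userId, function (res) {
        let body = '';
        res.on('data', function (chunk) { body += chunk; });
        res.on('end', function () {
          onlineUsers[userId] = JSON.parse(body);
          callback(onlineUsers[userId]);
        });
      });
    }

    // On logoff: push the cached state back through the (hypothetical)
    // PHP script, then drop it from memory.
    function saveAndEvictUser(userId, callback) {
      const payload = JSON.stringify(onlineUsers[userId]);
      const req = http.request({
        host: 'localhost',
        path: '/save_user.php?id=' + userId,
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          'Content-Length': Buffer.byteLength(payload)
        }
      }, function (res) {
        res.resume(); // only care that the save finished
        delete onlineUsers[userId];
        if (callback) callback();
      });
      req.end(payload);
    }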

A few things to note: I will separate different types of data into different objects so that frequently accessed data isn't mixed in with data that would slow down lookups. And if this project ever gained a lot of users, I would cap how many can be logged onto a single server at a time, for obvious reasons.
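For example, the frequently touched state could live in a separate object from the rarely read data (the field names here are made up just to illustrate the split):

    // hot data: touched many times a second per player
    const liveState = {};  // userId -> { x: 0, y: 0, hp: 100 }

    // cold data: read on login, written rarely
    const profiles = {};   // userId -> { name: '', inventory: [], questLog: [] }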

I would like to know whether this is a good idea. Would holding large objects in memory considerably slow down the node.js server? If you have any other possible solutions for my situation, I welcome them.

Thanks

+3  A: 

As far as your strategy goes, keeping the data in intermediate objects in PHP, you are adding a very high level of complexity to your application.

The communication between node.js and PHP alone seems complex, and there is no guarantee this will be any faster than just writing things straight to MySQL. Putting an unneeded barrier between you and your data is going to make things more difficult to manage.

It seems like you need a faster data solution. You could consider an asynchronous store like MongoDB or Redis that reads and writes quickly (Redis writes to memory, which should be incredibly fast).

Both are commonly used with node.js precisely because they can handle a real-time data load.

Actually, Redis is what you're really asking for: it stores everything in memory and persists it to disk periodically. You can't get much faster than that, but you will need enough RAM. If RAM looks like an issue, go with MongoDB, which is still really fast.
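For instance, with the node_redis client (just a sketch against its v4-style promise API; the key and field names are examples I made up), reading and writing a player record looks something like this:

    const { createClient } = require('redis');

    async function demo() {
      const client = createClient(); // defaults to localhost:6379
      await client.connect();

      // write the player's state as a hash -- it lives in Redis's memory
      await client.hSet('player:42', { x: '10', y: '20', hp: '100' });

      // read it back
      const player = await client.hGetAll('player:42');
      console.log(player); // { x: '10', y: '20', hp: '100' }

      await client.quit();
    }

    demo().catch(console.error);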

The disadvantage is that you will need to rethink your ideas about data persistence, and that is hard. I'm in the process of doing that myself!

orangutancloud
Looks like I didn't make myself clear about my initial idea: I was planning on storing the data in a JavaScript object on the node.js server and only using PHP to query the database when the user connected. Nonetheless, it sounds like Redis is what I'm looking for. Thank you for your input. :)
Ryan