On a quest to learn a bit about Python and sockets, I'm writing a little 2D game server.

Although I don't expect more than a few people on this server at any given time, I want to write it as efficiently as I can.

I have a global dictionary called "globuser"; each entry in it is another dictionary containing that user's stats (like the X and Y coordinates).

Is that the best way to store them? How big can a dictionary get? I guess I could also use a regular database, but that would be insanely slow. Or should I use an entirely different scheme?
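For reference, the structure looks roughly like this (everything except globuser itself is a made-up example):

```python
# Global store of per-user stats; each value is itself a dict.
globuser = {}

def add_user(name, x, y):
    # Hypothetical helper: register a new user with starting coordinates.
    globuser[name] = {'x': x, 'y': y, 'hp': 100}

def move_user(name, dx, dy):
    # Hypothetical helper: apply a relative move.
    stats = globuser[name]
    stats['x'] += dx
    stats['y'] += dy

add_user('alice', 10, 20)
move_user('alice', 3, -5)
print(globuser['alice'])
```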

Can multiple threads access the same variable at the same time, or are they put on hold? I guess that when lots of users are online, every move will require an update. If updates can happen at the same time, great! But if each one requires a "lock" on the variable, that would be less great.
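If I stay with threads, I assume every update would need a lock around it, something like this sketch (all names here are invented):

```python
import threading

# Shared state plus a lock guarding it.
globuser = {'alice': {'x': 0, 'y': 0}}
globuser_lock = threading.Lock()

def move_user(name, dx, dy):
    # Guard the read-modify-write so two threads can't interleave.
    with globuser_lock:
        stats = globuser[name]
        stats['x'] += dx
        stats['y'] += dy

threads = [threading.Thread(target=move_user, args=('alice', 1, 1))
           for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(globuser['alice'])  # every one of the 100 moves is applied
```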

+3  A: 

One thing I might look at is storing the users as a list of Player objects. Look into __slots__, as that will save you memory when creating many instances.
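A minimal sketch of such a Player class (the attribute names are just examples):

```python
class Player:
    # __slots__ replaces the per-instance __dict__ with fixed storage,
    # which saves memory when you create many instances.
    __slots__ = ('name', 'x', 'y')

    def __init__(self, name, x=0, y=0):
        self.name = name
        self.x = x
        self.y = y

players = {p.name: p for p in (Player('alice'), Player('bob', 5, 7))}
players['bob'].x += 1
print(players['bob'].x)  # → 6
```

Instances of a `__slots__` class have no per-instance `__dict__`, so attribute typos also fail fast with an AttributeError instead of silently creating a new attribute.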

I also would not worry much about performance at this stage. Write the code first and then run it through a profiler to find out where it is slowest -- making blind changes in the name of optimization is bad juju.
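For example, the standard library's cProfile module can show where the time actually goes; game_loop below is just a stand-in for your own server code:

```python
import cProfile
import io
import pstats

def game_loop():
    # Stand-in workload; profile your real server loop instead.
    total = 0
    for i in range(100000):
        total += i * i
    return total

profiler = cProfile.Profile()
profiler.enable()
game_loop()
profiler.disable()

# Print the ten most expensive calls, sorted by cumulative time.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats('cumulative').print_stats(10)
print(stream.getvalue())
```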

Regarding thread safety and sharing of data, I found this, which seems to give some info on the subject.

Daenyth
+2  A: 

Use multiprocessing instead of threading. It has many advantages, and one of them is built-in handling of storage shared between all processes: the module can create a shared dictionary through a Manager.

Here is a sample example taken from PyMOTW.

The Manager is responsible for coordinating shared information state between all of its users. By creating the list through the manager, the list is updated in all processes when anyone modifies it. In addition to lists, dictionaries are also supported.

import multiprocessing

def worker(d, key, value):
    d[key] = value

if __name__ == '__main__':
    mgr = multiprocessing.Manager()
    d = mgr.dict()
    jobs = [multiprocessing.Process(target=worker, args=(d, i, i * 2))
            for i in range(10)]
    for j in jobs:
        j.start()
    for j in jobs:
        j.join()
    print('Results:', d)
$ python multiprocessing_manager_dict.py
Results: {0: 0, 1: 2, 2: 4, 3: 6, 4: 8, 5: 10, 6: 12, 7: 14, 8: 16, 9: 18}
Namespaces

In addition to dictionaries and lists, a Manager can create a shared Namespace. Any named value added to the Namespace is visible across all of the clients.

import multiprocessing

def producer(ns, event):
    ns.value = 'This is the value'
    event.set()

def consumer(ns, event):
    try:
        value = ns.value
    except Exception as err:
        print('Before event, consumer got:', str(err))
    event.wait()
    print('After event, consumer got:', ns.value)

if __name__ == '__main__':
    mgr = multiprocessing.Manager()
    namespace = mgr.Namespace()
    event = multiprocessing.Event()
    p = multiprocessing.Process(target=producer, args=(namespace, event))
    c = multiprocessing.Process(target=consumer, args=(namespace, event))

    c.start()
    p.start()

    c.join()
    p.join()
$ python multiprocessing_namespaces.py
Before event, consumer got: 'Namespace' object has no attribute 'value'
After event, consumer got: This is the value
Onsy