I'm currently writing a client-server app as an exercise and I've gotten pretty much everything to work so far, but there's a mental hurdle that I haven't been able to successfully google myself over.

In the server application, am I correct in thinking that threading the packet handler and the database handler to work from a shared stack is the right thing to do? The idea is that one thread loops listening for packets and adds the data to a stack, and the other thread pops the data off the bottom of the stack and does some checks against an SQL db.

In this particular case, it's more important for the packet handler to keep working. I guess my question is: is this an appropriate use of threads, and where am I going to run into problems that require locking? For example, should I lock the db handler out while the packet thread adds to the stack, to avoid one thread trying to write while the other reads, say, the only value in the stack?

Thanks everyone!

Here is a snippet of the code. Mind you, it's in progress, so don't judge; it's also my first attempt at Python (which I am enjoying more than Perl or PHP at the moment!).

import threading

class socketListen(threading.Thread):
    def run(self):
        while True:
            datagram = s.recv(1024)          # s is the socket set up elsewhere
            if not datagram:
                break
            packetArray = datagram.split(',')
            if packetArray[0] == '31337':
                listHandle.put(packetArray)
        s.close()

class stackOperations(threading.Thread):
    def run(self):
        while True:
            # pull the last item off the stack and run ops on it
            # listHandle.getLast() returns the last item on the queue
            pass

class listHandle():
    mainStack = []

    @classmethod
    def put(cls, shiftData):
        cls.mainStack.insert(0, shiftData)

    @classmethod
    def getLast(cls):
        return cls.mainStack.pop()
+4  A: 

This is what queues are for. Replace stack with queue and no, you won't have to use any other synchronization methods. Incidentally, multiprocessing is better than threading, since it can take advantage of multicore/hyperthreaded processors. The interfaces are pretty similar, so it's worth looking into switching.
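
For example, here is a minimal sketch of the queue-based version. It assumes a UDP socket (matching the "datagram" naming in the question), and check_against_db() is just a hypothetical stand-in for the SQL checks; Python 3's queue module is called Queue in Python 2.

import queue
import socket
import threading

packet_queue = queue.Queue()          # thread-safe; no extra locking needed

# assumed setup: a bound UDP socket on a hypothetical port
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.bind(('0.0.0.0', 9999))

def check_against_db(fields):
    pass                              # placeholder for the SQL checks

def socket_listener():
    """Producer: parse datagrams and push the fields onto the queue."""
    while True:
        datagram = s.recv(1024)
        if not datagram:
            break
        fields = datagram.decode().split(',')
        if fields[0] == '31337':
            packet_queue.put(fields)
    s.close()

def db_worker():
    """Consumer: block until a packet arrives, then run the DB checks."""
    while True:
        fields = packet_queue.get()   # blocks while the queue is empty
        check_against_db(fields)
        packet_queue.task_done()

threading.Thread(target=socket_listener).start()
threading.Thread(target=db_worker).start()

Queue.put() and Queue.get() do their own locking, and get() blocks while the queue is empty, so the consumer thread doesn't spin and the packet listener never has to wait on the DB thread.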

Nathon
No, that's perfect! I must be all messed up though; isn't threading multiprocessing? And do you mean use a queue with the current setup I have here with the threads, or switch to a different module?
Melignus
Threading is running processes concurrently, usually on the same CPU. Multiprocessing is running processes concurrently distributing the work amongst multiple CPUs. Similar, yet distinct.
jathanism
Gotcha, yeah I found the multiprocessing module docs page just now. Good stuff and I completely understand. I guess I was figuring that there might have been something where the interpreter passes threads off to cores if they are present, but I guess that's just wishful thinking. A multiprocessing module makes sense. Thanks for the help guys and the quick responses, you're awesome!
Melignus
Ideally, your original view of threading would be accurate. However, in Python, there's this (much maligned) thing called the global interpreter lock (GIL) that basically ensures that Python threads really just share a single processor core. This is not true in most other languages.
Nathon
Also note that the GIL is CPython-specific. Neither Jython nor IronPython use a GIL.
Nathan Davis
@jathanism: Not really - threading is running separate *threads*, no guarantee of concurrency. Multiprocessing is running separate *processes*, also no guarantee of concurrency. Concurrency is determined by lower layers and the difference between threading and multiprocessing is orthogonal to whether there is concurrency.
Nick Bastin
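
For completeness, a rough sketch of the multiprocessing variant discussed in the comments, with the DB checks moved to a separate process; check_against_db is again just a placeholder.

import multiprocessing

def check_against_db(fields):
    pass                              # placeholder for the SQL checks

def db_worker(packet_queue):
    while True:
        fields = packet_queue.get()   # blocks until an item arrives
        if fields is None:            # sentinel telling the worker to exit
            break
        check_against_db(fields)

if __name__ == '__main__':
    packet_queue = multiprocessing.Queue()
    worker = multiprocessing.Process(target=db_worker, args=(packet_queue,))
    worker.start()

    # the socket-listening loop would run here, calling
    # packet_queue.put(fields) for each parsed datagram
    packet_queue.put(['31337', 'example', 'payload'])
    packet_queue.put(None)
    worker.join()

multiprocessing.Queue has the same put()/get() interface as queue.Queue, which is why the switch is fairly painless; the main difference is that items are pickled between processes, and the GIL no longer confines the DB checks to the same core as the packet listener.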