Hello SO, this question relates to performance penalties that may or may not arise from having a large number of sleeping Python threads on a webserver.
Background: I am implementing an online shop using django/satchmo. A requirement is for delayed payment. The customer can reserve a product and allow a third party to pay for it at a later date (via a random and unique URL).
To handle unreserving an item I am creating a thread which will sleep for the reservation time and then remove the reservation/mark the product as sold when it awakes. It looks like this:
    # Reserves a product when it is placed in the cart
    def reserve_cart_product(product):
        log.debug("Reserving %s" % product.name)
        product.active = False
        product.featured = False
        product.save()
        from threading import Timer
        Timer(CART_RESERVE_TIME, check_reservation, (product,)).start()
I am using the same technique when culling the unique URLs after they have expired, only the Timer sleeps for much longer (typically 5 days).
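For completeness, here is a minimal, self-contained sketch of what the check_reservation callback might look like when the Timer fires (the question doesn't show it, so the Product stand-in class and the `sold` flag are my assumptions, not the real Satchmo model):

```python
class Product:
    """Stand-in for the Django model, purely for illustration."""
    def __init__(self, name):
        self.name = name
        self.active = True
        self.featured = True
        self.sold = False

    def save(self):
        pass  # the real model would write to the database here


def check_reservation(product):
    # Hypothetical sketch: runs when the Timer expires.
    # If nobody has paid, relist the product for sale.
    if not product.sold:
        product.active = True
        product.featured = True
        product.save()
```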
So, my question to you SO is as follows:
Is having a large number of sleeping threads going to seriously affect performance? Are there better techniques for scheduling a one-off event sometime in the future? I would like to keep this in Python if possible; no calling at or cron via sys.
The site isn't exactly high traffic; a (generous) upper limit on products ordered per week would be around 100. Combined with cart reservation, this could mean there are 100+ sleeping threads at any one time. Will I regret scheduling tasks in this manner?
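To put a number on it, here is a quick standalone experiment (the names are mine, not from the shop code) that spawns 100 long-sleeping Timers, the way a week of reservations would, and counts the live threads:

```python
import threading


def expire(product_id):
    pass  # placeholder for the real unreserve logic


# Spawn 100 long-sleeping Timer threads, as a busy week might.
timers = [threading.Timer(3600, expire, (i,)) for i in range(100)]
for t in timers:
    t.start()

# Each Timer occupies a real OS thread for its whole lifetime.
count = threading.active_count()  # 100 timers + at least the main thread

for t in timers:
    t.cancel()  # wake the timers early and discard them
```

Each sleeping thread costs an OS thread and its stack allocation (tunable via threading.stack_size); note also that Timer threads do not survive a process restart, which is worth considering for five-day delays.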
Thanks