hm - what problem are you trying to solve? I suspect the answer depends on what you are trying to do with the data.
Since in general you don't want a whole 3GB file in memory, I'd not store the chunks in an array, but iterate over the http_response and write it straight to disk, to a temporary or persistent file, using the normal write() method on an appropriate file handle.
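for example, a minimal sketch of that loop (the URL, filename and chunk size here are placeholders):

    import urllib2

    CHUNK = 16 * 1024  # read 16KB at a time; tune to taste

    response = urllib2.urlopen('http://example.com/big.file')
    with open('/tmp/big.file', 'wb') as out:
        while True:
            chunk = response.read(CHUNK)
            if not chunk:  # empty string means EOF
                break
            out.write(chunk)

this way only one chunk is ever resident in memory at a time, no matter how big the file is.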
if you do want two copies of the data in memory, your method will require at least 6GB for your hypothetical 3GB file, which is presumably significant for most hardware. I know that array join methods are fast and all that, but since this is a really RAM-constrained process, maybe you want to find some way of doing it better? StringIO (http://docs.python.org/library/stringio.html) gives you file-like string buffers that can be appended to in memory; the pure-Python one, since it has to work with immutable strings, just uses your array-join trick internally, but the C-based cStringIO might actually append to a memory buffer internally. I don't have its source code to hand, so that would bear checking.
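as a rough sketch of the cStringIO approach (the literal strings below stand in for whatever chunks you read off the response):

    from cStringIO import StringIO

    buf = StringIO()
    for chunk in ('spam', 'eggs', 'ham'):  # stand-ins for response chunks
        buf.write(chunk)
    data = buf.getvalue()  # the whole payload as a single string

whether this actually beats the list-plus-join trick depends on how cStringIO grows its buffer, which is exactly the bit worth checking.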
if you do wish to do some kind of analysis on the data and really wish to keep it in memory with minimal overhead, you might want to consider some of the byte array objects from Numeric/NumPy as an alternative to StringIO. they are high-performance code optimised for large arrays and might be what you need.
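a minimal sketch, assuming NumPy is installed; frombuffer gives a zero-copy uint8 view over an existing byte string, so the analysis side costs you no second copy:

    import numpy as np

    payload = 'spam' * 1024  # stand-in for the downloaded bytes
    arr = np.frombuffer(payload, dtype=np.uint8)  # read-only view, no copy
    print arr.mean()  # vectorised analysis straight over the raw bytes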
as a useful example of a general-purpose file-handling object that takes a memory-efficient, iterator-friendly approach, you might want to check out the django File object's chunk-handling code:
http://code.djangoproject.com/browser/django/trunk/django/core/files/base.py.
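for a feel of how that reads in practice, a hedged sketch (handle_chunk is a hypothetical per-chunk callback; the chunk_size argument to File.chunks() is optional):

    from django.core.files.base import File

    def handle_chunk(chunk):
        pass  # hypothetical handler: hash, parse, forward, etc.

    with open('/tmp/big.file', 'rb') as fh:
        for chunk in File(fh).chunks(chunk_size=64 * 1024):
            handle_chunk(chunk)  # only one chunk in memory at a time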