Remember what your mom taught you: always flush() (in Python, sys.stdout.flush() followed by os.fsync(sys.stdout.fileno()), since os.fsync() takes a file descriptor).
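A minimal sketch of the two-step flush on a regular file (the filename is illustrative): flush() pushes Python's internal buffer to the OS, and os.fsync() asks the OS to push its buffer to disk.

```python
import os

f = open("out.txt", "w")
f.write("important data\n")
f.flush()              # flush Python's internal buffer to the OS
os.fsync(f.fileno())   # ask the OS to commit its buffer to disk
f.close()
```

Note that os.fsync() wants a file descriptor, which is why f.fileno() is passed rather than the file object itself.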
If you have opened the file with a file handle in Python, remember to close it when finished, e.g.
f = open("file")
...
f.write(...)
f.close()
It looks like you forgot to close the file when you finished working with it. If you want to check the contents of the file before closing it, call the flush()
method. Example:
file = open("hello.txt", "a")
file.write(...)
file.flush()  # force the write out to disk
file.close()  # finished using the file, close it
Check your code: not all of the files it opens are closed.
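A cleaner way to guarantee the close is a with statement, which calls close() for you when the block exits, even if an exception is raised inside it. A minimal sketch, reusing the hello.txt name from the example above:

```python
# The context manager closes the file automatically when the block ends.
with open("hello.txt", "a") as file:
    file.write("some text\n")
# file is closed here; no explicit close() needed
```

With this pattern there is no way to forget the close(), so it is usually preferred over a bare open()/close() pair.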
Regarding the code: it looks like the actual problem relates to threads, not to files:
Whilst you are executing this code:
for t in list:
    fileo.write(t + "\n")
list = []
fileo.close()
fileo = open(OutFile, 'a')
k = 0
list
is being modified by the threads you have spawned. I don't know the details of how 'for x in y' interacts with concurrent modification, but I imagine it misses elements that are appended to the list after the loop has started. (As an aside, naming the variable list shadows the built-in list type; a different name would be safer.)
To solve this you need a mutex for list
which you hold for the entirety of this for loop (until you have cleared the list), and which you also hold whenever you add an item to the list.
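The locking scheme above can be sketched with threading.Lock. This is a self-contained illustration, not the asker's code: the names items, items_lock, and worker are hypothetical stand-ins for the shared list, its mutex, and the spawned threads.

```python
import threading

items = []                      # shared list, appended to by worker threads
items_lock = threading.Lock()   # mutex guarding every access to items

def worker(n):
    for i in range(n):
        with items_lock:        # hold the lock for every append
            items.append(f"line {i}")

threads = [threading.Thread(target=worker, args=(100,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Drain the list under the same lock, so no thread can append
# while we iterate and then clear it.
with items_lock:
    drained = items[:]
    items.clear()
```

Holding the lock across the whole drain (copy plus clear) is what prevents the original bug: no element can be appended between iterating over the list and emptying it, so nothing is silently lost.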