Hi,
I have a CopyFile and Directory project, but when I start copying, the GUI freezes and I can't do anything while the copy runs. So I found my solution in the BackgroundWorker component, but I've got a problem with that component too. There are 3 radio buttons and a command button. When I click the command button, it checks whether radiobutton1 is checked or else r...
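A minimal sketch of the BackgroundWorker approach, assuming a WinForms/C# form (the post doesn't say which language or UI toolkit); control names like radioButton1 and progressBar1, and all paths, are placeholders:

// Fragment of a Form class; needs using System; System.ComponentModel; System.IO; System.Windows.Forms;
// The copy runs in DoWork on a worker thread, so the GUI thread never blocks.
private readonly BackgroundWorker copyWorker = new BackgroundWorker { WorkerReportsProgress = true };

private void Form1_Load(object sender, EventArgs e)
{
    copyWorker.DoWork += CopyWorker_DoWork;
    copyWorker.ProgressChanged += (s, args) => progressBar1.Value = args.ProgressPercentage;
    copyWorker.RunWorkerCompleted += (s, args) => MessageBox.Show("Copy finished");
}

private void copyButton_Click(object sender, EventArgs e)
{
    // Decide what to copy based on which radio button is checked, then hand it to the worker.
    string source = radioButton1.Checked ? @"C:\data\set1" : @"C:\data\set2"; // placeholder paths
    if (!copyWorker.IsBusy)
        copyWorker.RunWorkerAsync(source);
}

private void CopyWorker_DoWork(object sender, DoWorkEventArgs e)
{
    string sourceDir = (string)e.Argument;
    string[] files = Directory.GetFiles(sourceDir);
    for (int i = 0; i < files.Length; i++)
    {
        File.Copy(files[i], Path.Combine(@"C:\dest", Path.GetFileName(files[i])), true); // placeholder target
        copyWorker.ReportProgress((i + 1) * 100 / Math.Max(files.Length, 1));
    }
}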
Okay, mkstemp is the preferred way to create a temp file in POSIX.
But it opens the file and returns an int, which is a file descriptor. From that I can only create a FILE*, but not an std::ofstream, which I would prefer in C++. (Apparently, on AIX and some other systems, you can create an std::ofstream from a file descriptor, but my c...
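For reference, one commonly used workaround on GCC is the libstdc++-specific stdio_filebuf extension; this is only a sketch and is not portable C++ (the temp-file template is a placeholder):

#include <stdlib.h>              // mkstemp (POSIX)
#include <unistd.h>
#include <ostream>
#include <ext/stdio_filebuf.h>   // GCC/libstdc++ extension

int main()
{
    char tmpl[] = "/tmp/myapp-XXXXXX";   // placeholder template
    int fd = mkstemp(tmpl);              // creates and opens the temp file securely
    if (fd == -1) return 1;

    __gnu_cxx::stdio_filebuf<char> buf(fd, std::ios::out);
    std::ostream out(&buf);              // an ostream rather than std::ofstream, but used the same way
    out << "hello temp file\n";
    out.flush();

    // Whether buf closes fd on destruction is a libstdc++ detail; check before
    // adding your own close(fd). unlink(tmpl) removes the file when you're done.
    return 0;
}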
I need to be able to read a file format that mixes binary and non-binary data. Assuming I know the input is good, what's the best way to do this? As an example, let's take a file that has a double as the first line, a newline (0x0D 0x0A) and then ten bytes of binary data afterward. I could, of course, calculate the position of the new...
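A short sketch of that example layout, assuming the file really is a textual double, a CRLF, then ten raw bytes ("input.dat" is a placeholder name): open the stream once in binary mode and mix getline() with read():

#include <fstream>
#include <iostream>
#include <string>

int main()
{
    std::ifstream in("input.dat", std::ios::binary);
    if (!in) return 1;

    std::string line;
    std::getline(in, line);                 // reads up to '\n'
    if (!line.empty() && line.back() == '\r')
        line.pop_back();                    // strip the 0x0D left over from CRLF
    double value = std::stod(line);

    char raw[10];
    in.read(raw, sizeof raw);               // the ten binary bytes that follow

    std::cout << "double: " << value << ", bytes read: " << in.gcount() << '\n';
    return 0;
}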
Is there a built-in method for waiting for a file to be created in C#? How about waiting for a file to be completely written?
I've baked my own by repeatedly attempting File.OpenRead() on a file until it succeeds (and failing on a timeout), but spinning on a file doesn't seem like the right thing to do. I'm guessing there's a baked-in...
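One possible shape for this, sketched with FileSystemWatcher for the "created" part and an exclusive-open retry for the "completely written" part; the filter, timeout and sleep interval are placeholders:

using System;
using System.IO;
using System.Threading;

static class FileWaiter
{
    public static void WaitForFile(string directory, string fileName)
    {
        using var watcher = new FileSystemWatcher(directory, fileName);
        var created = new ManualResetEventSlim(false);
        watcher.Created += (s, e) => created.Set();
        watcher.EnableRaisingEvents = true;

        string fullPath = Path.Combine(directory, fileName);
        if (!File.Exists(fullPath) && !created.Wait(TimeSpan.FromMinutes(5)))
            throw new TimeoutException("file never appeared: " + fullPath);

        // "Completely written" has no built-in signal, so poll for exclusive access.
        while (true)
        {
            try
            {
                using var fs = File.Open(fullPath, FileMode.Open, FileAccess.Read, FileShare.None);
                return; // nobody else has it open any more
            }
            catch (IOException)
            {
                Thread.Sleep(500);
            }
        }
    }
}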
Suppose I have a filehandle $fh. I can check its existence with -e $fh or its file size with -s $fh or a slew of additional information about the file. How can I get its last modified time stamp?
...
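A tiny sketch, assuming $fh is an open handle: element 9 of stat() is the mtime in epoch seconds (Perl's -M operator gives the file's age in days instead):

my $mtime = (stat($fh))[9];              # seconds since the epoch
print scalar localtime($mtime), "\n";    # human-readable form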
Hi All,
I need to create a Java program that will create a thread to search for a file in a particular folder (the source folder) and pick the file up immediately for processing (converting it into CSV format) once it finds the file in the source folder. The problem I am facing now is that the file that arrives in the source folder is large (FTP tool is used t...
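A sketch of one way to do this with java.nio.file.WatchService, plus a crude "size has stopped changing" check for the FTP-still-writing problem; the paths, class name and 2-second settle interval are placeholders:

import java.nio.file.*;

public class SourceFolderWatcher {
    public static void main(String[] args) throws Exception {
        Path source = Paths.get("/data/source");
        try (WatchService watcher = FileSystems.getDefault().newWatchService()) {
            source.register(watcher, StandardWatchEventKinds.ENTRY_CREATE);
            while (true) {
                WatchKey key = watcher.take();              // blocks until something happens
                for (WatchEvent<?> event : key.pollEvents()) {
                    Path file = source.resolve((Path) event.context());
                    waitUntilStable(file);
                    System.out.println("ready to convert to CSV: " + file);
                }
                key.reset();
            }
        }
    }

    // Crude "upload finished" check: the size has not changed for one interval.
    private static void waitUntilStable(Path file) throws Exception {
        long last = -1;
        while (true) {
            long size = Files.size(file);
            if (size == last) return;
            last = size;
            Thread.sleep(2000);
        }
    }
}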
Much like a similar SO question, I am trying to monitor a directory on a Linux box for the addition of new files and would like to immediately process these new files when they arrive. Any ideas on the best way to implement this?
...
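The post doesn't name a language, so here is a plain-C sketch of the usual Linux answer, the inotify API; the watched path is a placeholder, and IN_CLOSE_WRITE is used so files are only reported once their writer closes them:

#include <stdio.h>
#include <unistd.h>
#include <sys/inotify.h>

int main(void)
{
    int fd = inotify_init();
    if (fd < 0) { perror("inotify_init"); return 1; }

    /* "/data/incoming" is a placeholder path */
    if (inotify_add_watch(fd, "/data/incoming", IN_CLOSE_WRITE | IN_MOVED_TO) < 0) {
        perror("inotify_add_watch");
        return 1;
    }

    char buf[4096] __attribute__((aligned(8)));
    for (;;) {
        ssize_t len = read(fd, buf, sizeof buf);
        if (len <= 0) break;
        for (char *p = buf; p < buf + len; ) {
            struct inotify_event *ev = (struct inotify_event *) p;
            if (ev->len > 0)
                printf("new file ready: %s\n", ev->name);   /* process it here */
            p += sizeof(struct inotify_event) + ev->len;
        }
    }
    return 0;
}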
I want to get the size of a file on disk in megabytes. Using the -s operator gives me the size in bytes, but I'm going to assume that dividing this by a magic number is a bad idea:
my $size_in_mb = (-s $fh) / (1024 * 1024);
Should I just use a read-only variable to define 1024 or is there a programmatic way to obtain the size of a...
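A small sketch of the named-constant route ($fh is the post's handle; as far as I know there is no built-in "size in megabytes" operator):

# Name the conversion factor instead of sprinkling 1024*1024 around.
use constant BYTES_PER_MB => 1024 * 1024;

my $size_in_mb = (-s $fh) / BYTES_PER_MB;
printf "%.2f MB\n", $size_in_mb;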
Is it possible to open a file in .NET with non-exclusive write access? If so, how? My hope is to have two or more processes write to the same file at the same time.
Edit: Here is the context of this question: I am writing a simple logging HTTPModule for IIS. Since applications running in different app pools run as distinct processes, I ...
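A sketch of the usual approach, assuming each process appends whole lines: open the FileStream with FileShare.ReadWrite so other writers aren't locked out. The path is a placeholder, and interleaving of concurrent writes is not coordinated, so keep each write small:

using System;
using System.IO;
using System.Text;

static class SharedLog
{
    public static void Append(string path, string line)
    {
        // FileShare.ReadWrite lets other processes open the same file for writing too.
        using var stream = new FileStream(path, FileMode.Append, FileAccess.Write, FileShare.ReadWrite);
        byte[] bytes = Encoding.UTF8.GetBytes(line + Environment.NewLine);
        stream.Write(bytes, 0, bytes.Length);
    }
}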
Hi,
I have a project that copies files to other clients' desktops in my domain. There are 300+ client machines, but there is a problem: when I run this project under a non-admin user account in my domain, it can't copy the files and fails with Access Denied / user-restriction errors. I want the program to work like this: in a non-admin user account...
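A hedged sketch of one common workaround, not necessarily what this project needs: launch the copy step under an account that does have rights on the client machines, via ProcessStartInfo credentials. All names, paths and the inline password are placeholders, and supplying a Password only works on Windows:

using System;
using System.Diagnostics;
using System.Security;

static class ElevatedCopy
{
    public static void RunCopyAsServiceAccount()
    {
        var password = new SecureString();
        foreach (char c in "s3cret")          // in practice, read this from a prompt or a vault
            password.AppendChar(c);

        var psi = new ProcessStartInfo(@"C:\tools\copytool.exe")   // placeholder copy tool
        {
            UseShellExecute = false,          // required when supplying credentials
            Domain = "MYDOMAIN",
            UserName = "COPYSVC",
            Password = password
        };
        Process.Start(psi);
    }
}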
We have a C# Windows service polling a folder waiting for an FTP’ed file to be posted in. To avoid using the file when it is still being written to we attempt to get a lock on the file first, however, there seems to be occasions where we are getting a lock on the file after the FTP’ed file is created but before the file is written to, so...
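A sketch of a stricter readiness check than a bare lock, assuming the FTP server creates the file before it starts writing: require a non-zero length that is stable across two polls, and then an exclusive open (the 2-second interval is a placeholder):

using System;
using System.IO;
using System.Threading;

static class FtpFileCheck
{
    public static bool IsReadyForProcessing(string path)
    {
        try
        {
            long first = new FileInfo(path).Length;
            Thread.Sleep(TimeSpan.FromSeconds(2));
            long second = new FileInfo(path).Length;
            if (first == 0 || first != second)
                return false;                  // still empty or still growing

            using var fs = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None);
            return true;                       // exclusive open succeeded
        }
        catch (IOException)
        {
            return false;                      // writer still has it open
        }
    }
}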
I have a very big file (4 GB), and when I try to read it my computer hangs.
So I want to read it piece by piece and after processing each piece store the processed piece into another file and read next piece.
Is there any method to yield these pieces?
I would love to have a lazy method.
...
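A sketch of the usual lazy pattern, a generator that yields fixed-size chunks so only one piece is ever in memory; the chunk size, file names and the .upper() stand-in for the real processing are placeholders:

def read_in_chunks(file_object, chunk_size=64 * 1024 * 1024):
    """Yield successive chunks until the file is exhausted."""
    while True:
        chunk = file_object.read(chunk_size)
        if not chunk:
            break
        yield chunk

with open('huge_input.bin', 'rb') as src, open('processed.bin', 'wb') as dst:
    for piece in read_in_chunks(src):
        dst.write(piece.upper())   # stand-in for the real per-piece processing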
Is there a way to check if a file is already open in Perl?
I only want read access to the file, so I don't require flock.
open(FH, "<$fileName") or die "$!\n" if (<FILE_IS_NOT_ALREADY_OPEN>);
# or something like
close(FH) if (<FILE_IS_OPEN>);
...
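A sketch for the case where "already open" means open inside your own program: Scalar::Util's openhandle reports whether a handle is currently open (whether another process has the file open is a different question and needs OS tools such as lsof). $file_name is a placeholder:

use Scalar::Util qw(openhandle);

my $fh;
unless (openhandle($fh)) {                     # undef or closed handle -> open it
    open($fh, '<', $file_name) or die "$!\n";
}
# ... read from $fh ...
close($fh) if openhandle($fh);                 # only close if it is actually open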
To put it simply: a Swing app that uses sqlitejdbc as its backend. Currently, there's no problem launching multiple instances that work with the same database file, and there should be.
The file is locked (I can't delete it while the app is running), so the check should be trivial. It turns out it isn't.
File f = new File("/path/to/file/db.sqlite"...
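A sketch of one workaround, since the SQLite file's own lock isn't something Java exposes directly: take a FileChannel.tryLock() on a separate sentinel file next to the database and refuse to start a second instance if it's already held (the lock-file name is a placeholder):

import java.io.File;
import java.io.RandomAccessFile;
import java.nio.channels.FileLock;

public class SingleInstanceGuard {
    public static boolean acquire(File dbFile) throws Exception {
        File lockFile = new File(dbFile.getParentFile(), "db.sqlite.lock"); // placeholder name
        RandomAccessFile raf = new RandomAccessFile(lockFile, "rw");
        FileLock lock = raf.getChannel().tryLock();   // null if another instance holds it
        return lock != null;                          // keep raf/lock referenced for the app's lifetime
    }
}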
I have an application that reads lines from a file and runs its magic on each line as it is read. Once the line is read and properly processed, I would like to delete the line from the file. A backup of the removed line is already being kept. I would like to do something like
file = open('myfile.txt', 'r+')
for line in file:
process...
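A sketch of the usual pattern, since a line can't be removed from the middle of a file in place: copy what should be kept to a temporary file and replace the original. process() and the keep/drop rule shown here are placeholders for the real logic:

import os
import tempfile

def process(line):
    """Stand-in for the real per-line work."""

with open('myfile.txt') as src, \
        tempfile.NamedTemporaryFile('w', dir='.', delete=False) as tmp:
    for line in src:
        try:
            process(line)          # processed successfully -> not written back, i.e. deleted
        except Exception:
            tmp.write(line)        # keep lines that still need handling
    tmp_name = tmp.name

os.replace(tmp_name, 'myfile.txt')  # swap the trimmed copy in for the original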
Hi guys! I'm practicing for a competition (that's where my previous question came from).
I got the algorithm sorted out for the question, but I'm having some problems with the actual programming. It's a solo competition, so I really need to get this sorted out before I go for it. This is the question.
TASK 3: GECKO During the rainy s...
Hi,
This should be a fairly trivial problem. I'm trying to open an ofstream using a std::string (or std::wstring) and having problems getting this to work without a messy conversion.
std::string path = ".../file.txt";
ofstream output;
output.open(path);
Ideally I don't want to have to convert this by hand or involve C-style char po...
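A minimal sketch of the two usual options, with a placeholder path since the original one is elided: path.c_str() works on any standard, and since C++11 ofstream also takes std::string directly:

#include <fstream>
#include <string>

int main()
{
    std::string path = "file.txt";     // placeholder path

    std::ofstream out1(path.c_str());  // pre-C++11 and later: no manual conversion beyond c_str()
    std::ofstream out2(path);          // C++11 and later: std::string overload
    out1 << "hello\n";
    return 0;
}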
I have a small application that I'm developing, that I may want to give/sell to others. I want to persist some settings, and create an admin interface to modify them. What would be the best way to store them away? A DB table seems like overkill for the 10-20 settings I'll have, and I want the retrieval of these settings to be as fast ...
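The post doesn't name a platform, so purely as one illustration of the flat-file route: a Java properties file read once at startup and rewritten by the admin screen ("settings.properties" and the keys are placeholders):

import java.io.*;
import java.util.Properties;

public class Settings {
    public static void main(String[] args) throws IOException {
        Properties props = new Properties();
        File file = new File("settings.properties");
        if (file.exists()) {
            try (FileInputStream in = new FileInputStream(file)) {
                props.load(in);                                  // fast: one small file read
            }
        }
        String theme = props.getProperty("ui.theme", "light");  // default if the key is missing

        props.setProperty("ui.theme", "dark");                  // what an admin screen would change
        try (FileOutputStream out = new FileOutputStream(file)) {
            props.store(out, "application settings");
        }
        System.out.println("theme was " + theme);
    }
}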
If I have two directories on an nfs server, between which I would like to copy a large amount of data (in several thousand files, rather than one large block), is there any way to optimize this to be a "local" copy on the server? Does NFS do this automatically, and if not, is there an option to enable it to do so, or is there some inevit...
I have recently come up with a situation where I need to trim some rather large log files once they grow beyond a certain size. Everything but the last 1000 lines in each file is disposed of, and the job is run every half hour by cron. My solution was to simply run through the list of files, check the size, and trim if necessary.
for $file (@fil...
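The loop above is cut off, so here is only a sketch of how the trim step for one file might look: keep the last 1000 lines in a sliding window, write them to a temp file, and move it over the original (the file list and temp directory are placeholders):

use strict;
use warnings;
use File::Temp qw(tempfile);
use File::Copy qw(move);

my @files = @ARGV;    # placeholder for however the real file list is built

sub trim_log {
    my ($file, $keep) = @_;
    open my $in, '<', $file or die "$file: $!";
    my @tail;
    while (my $line = <$in>) {
        push @tail, $line;
        shift @tail if @tail > $keep;   # sliding window: only the last $keep lines stay in memory
    }
    close $in;

    my ($out, $tmp) = tempfile(DIR => '.');
    print {$out} @tail;
    close $out or die "close: $!";
    move($tmp, $file) or die "move: $!";
}

trim_log($_, 1000) for @files;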