I have an SQLite database containing a huge number of URLs, and it takes a huge amount of disk space; accessing it causes many disk seeks and is slow. The average URL path length is 97 bytes (host names repeat a lot, so I moved them to a foreign-keyed table). Is there any good way of compressing them? Most compression algorithms work well with ...
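One direction worth sketching, assuming the paths can be kept in sorted order: front coding, i.e. storing only the suffix each path adds to its predecessor, since paths under the same host tend to share long prefixes. A minimal C# sketch (the class and method names are mine, not from any library):
using System;
using System.Collections.Generic;

static class UrlFrontCoding
{
    // Store each path as (length of prefix shared with the previous path, remaining suffix).
    public static IEnumerable<(int Shared, string Suffix)> Encode(IEnumerable<string> sortedPaths)
    {
        string previous = "";
        foreach (string path in sortedPaths)
        {
            int shared = 0;
            int limit = Math.Min(previous.Length, path.Length);
            while (shared < limit && previous[shared] == path[shared]) shared++;
            yield return (shared, path.Substring(shared));
            previous = path;
        }
    }

    // Rebuild the original paths from the (shared, suffix) pairs.
    public static IEnumerable<string> Decode(IEnumerable<(int Shared, string Suffix)> coded)
    {
        string previous = "";
        foreach (var (shared, suffix) in coded)
        {
            previous = previous.Substring(0, shared) + suffix;
            yield return previous;
        }
    }
}
The (shared, suffix) pairs are typically much shorter than the raw paths and still compress well if a general-purpose codec is applied on top.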
Is there any reason why XML such as this:
<person>
<firstname>Joe</firstname>
<lastname>Plumber</lastname>
</person>
couldn't be compressed like this for client/server transfer:
<person>
<firstname>Joe</>
<lastname>Plumber</>
</>
It would be smaller - and slightly faster to parse.
Assuming that th...
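To make the idea concrete, the proposed transformation is just a rewrite of every named closing tag; a rough C# sketch (a naive regex pass that ignores CDATA sections, comments, and processing instructions):
using System;
using System.Text.RegularExpressions;

class TagShortener
{
    // Replace every named closing tag such as </firstname> with the anonymous form </>.
    static string ShortenClosingTags(string xml) => Regex.Replace(xml, @"</[^>]+>", "</>");

    static void Main()
    {
        string xml = "<person><firstname>Joe</firstname><lastname>Plumber</lastname></person>";
        Console.WriteLine(ShortenClosingTags(xml));
        // prints <person><firstname>Joe</><lastname>Plumber</></>
    }
}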
I have enabled IIS 6's built in HTTP compression for the following types of files:
HcFileExtensions="htm
html
txt
css
js"
HcScriptFileExtensions="asp
dll
exe
aspx
asmx
ascx"
I am unclear, however, whether it's appropriate to add the axd extension so that my WebResource.axd files will be compressed.
Lastly, are ther...
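For reference, adding axd would go through the same metabase properties quoted above; a sketch using adsutil.vbs (verify the list against your own metabase first and restart IIS afterwards; the DEFLATE branch mirrors the GZIP one):
cscript adsutil.vbs set W3SVC/Filters/Compression/GZIP/HcScriptFileExtensions "asp" "dll" "exe" "aspx" "asmx" "ascx" "axd"
cscript adsutil.vbs set W3SVC/Filters/Compression/DEFLATE/HcScriptFileExtensions "asp" "dll" "exe" "aspx" "asmx" "ascx" "axd"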
Hello all,
I'm writing a compression library as a little side project, and I'm far enough along (my library can extract any standard gzip file, as well as produce compliant, though certainly not yet optimal, gzip output) that it's time to figure out a meaningful block termination strategy. Currently, I just cut the blocks off after every...
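One baseline to compare a fixed cut-off against, offered only as a rough sketch and not as what zlib does: track the byte histogram of the block so far and of a recent window, and terminate the block when the entropy of the recent input drifts far enough from the block-wide statistics that a fresh Huffman code would plausibly pay for the extra block header. The window size and threshold below are arbitrary placeholders, and a real implementation would work on the literal/length symbol alphabet rather than raw bytes:
using System;

static class BlockCutHeuristic
{
    const int Window = 4096;        // size of the recent-input window (arbitrary)
    const double Threshold = 0.35;  // entropy drift, in bits per byte, that triggers a cut (arbitrary)

    // Shannon entropy of a byte histogram, in bits per byte.
    static double Entropy(int[] counts, int total)
    {
        double h = 0;
        for (int i = 0; i < counts.Length; i++)
        {
            if (counts[i] == 0) continue;
            double p = (double)counts[i] / total;
            h -= p * Math.Log(p, 2);
        }
        return h;
    }

    // Caller keeps two histograms: one since the block started, one over the last Window bytes.
    public static bool ShouldCut(int[] blockCounts, int blockTotal, int[] windowCounts, int windowTotal)
    {
        if (blockTotal < Window || windowTotal < Window) return false;
        double drift = Math.Abs(Entropy(blockCounts, blockTotal) - Entropy(windowCounts, windowTotal));
        return drift > Threshold;
    }
}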
I'm trying to back up a database in SQL Server 2008 and have the backup compressed using the new compression feature. However, when I run the following code:
Backup Database <Database> To Disk = 'C:\Backup' With Compression
I get this error message:
Backup Database With Compression is not supported on S...
I happened across this article about hardware-based hard drive encryption and realized that not only would this be a great way to protect your data, but it would also speed up the applications we use to encrypt that data.
This led me to wonder...
Would it be possible to do the same thing for compression so that all of the data i...
From MSDN:
http://msdn.microsoft.com/en-us/library/system.io.compression.deflatestream.aspx
This class cannot be used to compress files larger than 4 GB.
Do you know of any other implementations for .NET without the 4 GB limit?
Thanks
NOTE: I really need to decompress a GZ-format file with content larger than 4 GB. Do you know any co...
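As an illustration only, assuming a third-party library such as SharpZipLib (whose behaviour on multi-gigabyte archives you would want to verify yourself), streaming decompression of a .gz file looks roughly like this:
using System.IO;
using ICSharpCode.SharpZipLib.GZip;

class DecompressLargeGz
{
    static void Main(string[] args)
    {
        // args[0] = input .gz path, args[1] = output path
        using (var input = File.OpenRead(args[0]))
        using (var gz = new GZipInputStream(input))
        using (var output = File.Create(args[1]))
        {
            var buffer = new byte[81920];
            int read;
            while ((read = gz.Read(buffer, 0, buffer.Length)) > 0)
                output.Write(buffer, 0, read);
        }
    }
}
Because everything is streamed, memory use stays flat regardless of the uncompressed size.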
I'm working on a project using C++, Boost, and Qt. I understand how to compress single files and bytestreams using, for example, the qCompress() function in Qt.
How do I zip a directory of multiple files, including subdirectories? I am looking for a cross-platform (Mac, Win, Linux) solution; I'd prefer not to fire off a bunch of new pr...
I publish a Windows Forms application using ClickOnce. The installation is quite big considering the overall size of this app: something over 15 MB. If I compress the locally built application, it squeezes down to 2.5 MB.
Can ClickOnce deployment be compressed somehow?
If not, is anyone using IIS compression to speed up transfers? Would t...
Does anybody know of a library or piece of software out there that will locate irregularities in text? For example, let's say I have...
1. Name 1, Comment
2. Name 2, Comment
3. Name 3 , Comment
5. Name 10, Comment
This software or library would first cut up portions of text that it would find similar (much like a piece of compression...
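One compression-based way to make "similar" measurable, sketched under the assumption that gzip output length is an acceptable proxy for information content (the normalized compression distance idea): a line whose structure deviates from the rest compresses comparatively badly when concatenated with them. On lines as short as the example, gzip's header overhead dominates, so in practice you would apply this to larger chunks; the code only shows the shape of the approach.
using System;
using System.IO;
using System.IO.Compression;
using System.Linq;
using System.Text;

class IrregularityByCompression
{
    // Length of the gzip-compressed UTF-8 encoding of s.
    static int GzipLength(string s)
    {
        using (var ms = new MemoryStream())
        {
            using (var gz = new GZipStream(ms, CompressionMode.Compress))
            {
                byte[] bytes = Encoding.UTF8.GetBytes(s);
                gz.Write(bytes, 0, bytes.Length);
            }
            return ms.ToArray().Length;
        }
    }

    // Normalized compression distance: close to 0 for very similar strings.
    static double Ncd(string a, string b)
    {
        int ca = GzipLength(a), cb = GzipLength(b), cab = GzipLength(a + b);
        return (cab - Math.Min(ca, cb)) / (double)Math.Max(ca, cb);
    }

    static void Main()
    {
        string[] lines =
        {
            "1. Name 1, Comment",
            "2. Name 2, Comment",
            "3. Name 3 , Comment",
            "5. Name 10, Comment",
        };
        // Flag the line with the highest average distance to all the others.
        string mostIrregular = lines
            .OrderByDescending(l => lines.Where(o => !ReferenceEquals(o, l)).Average(o => Ncd(l, o)))
            .First();
        Console.WriteLine("Most irregular: " + mostIrregular);
    }
}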
Hello. I'm resizing JPEGs using the Graphics.DrawImage method (see the code fragment below). Can anyone confirm that this will not affect the compression of the new image?
I have seen this thread, but I am talking specifically about compression of JPEGs.
private byte[] getResizedImage(String url, int newWidth)
{
Bitmap bm...
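For what it's worth, GDI+ only applies an explicit JPEG quality setting if you pass one at save time; otherwise the encoder picks its own default. A hedged sketch of the usual EncoderParameters route (the method name and the quality value in the comment are placeholders, not part of the question's code):
using System.Drawing;
using System.Drawing.Imaging;
using System.Linq;

static class JpegSave
{
    // Save an image as JPEG with an explicit quality setting, e.g. SaveJpeg(bitmap, "out.jpg", 90L).
    public static void SaveJpeg(Image image, string path, long quality)
    {
        ImageCodecInfo jpegCodec = ImageCodecInfo.GetImageEncoders()
            .First(c => c.MimeType == "image/jpeg");
        using (var parameters = new EncoderParameters(1))
        {
            parameters.Param[0] = new EncoderParameter(Encoder.Quality, quality);
            image.Save(path, jpegCodec, parameters);
        }
    }
}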
Hi!
I'm building an index which is just several sets of ordered 32-bit integers stored contiguously in a binary file. The problem is that this file grows pretty large. I've been thinking of adding some compression scheme, but that's a bit outside my expertise. So I'm wondering: what compression algorithm would work best in this case? Also...
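Since the sets are already sorted, the textbook starting point is delta encoding followed by a variable-byte code: store each gap from the previous value in 7-bit groups. A minimal sketch under that assumption (no blocking, skip pointers, or handling of truncated input):
using System.Collections.Generic;
using System.IO;

static class PostingCodec
{
    // Write sorted 32-bit integers as gaps, each gap as a 7-bits-per-byte little-endian varint.
    public static void Encode(IEnumerable<uint> sorted, Stream output)
    {
        uint previous = 0;
        foreach (uint value in sorted)
        {
            uint delta = value - previous;   // small, because the input is sorted
            previous = value;
            while (delta >= 0x80)
            {
                output.WriteByte((byte)((delta & 0x7F) | 0x80));  // continuation bit set
                delta >>= 7;
            }
            output.WriteByte((byte)delta);
        }
    }

    // Read the stream back into the original ascending values.
    public static IEnumerable<uint> Decode(Stream input)
    {
        uint previous = 0;
        int b;
        while ((b = input.ReadByte()) != -1)
        {
            uint delta = 0;
            int shift = 0;
            while ((b & 0x80) != 0)
            {
                delta |= (uint)(b & 0x7F) << shift;
                shift += 7;
                b = input.ReadByte();
            }
            delta |= (uint)b << shift;
            previous += delta;
            yield return previous;
        }
    }
}
More elaborate schemes (Golomb/Rice coding, PFOR, or a general-purpose compressor layered on top) can do better, but plain delta + varint already captures most of the win when the gaps are small.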
I'm extending a utility class that bundles a set of images and .xml description files. Currently I keep all the files in a directory and load them from there. The directory looks like this:
8.png
8.xml
9.png
9.xml
10.png
10.xml
...
...
50.png
50.xml
...
Here's my current constructor. It is lightning fast and does what I need it to...
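If the end goal is to replace the loose directory with a single compressed bundle, one hedged option, assuming a .NET environment with System.IO.Compression available and keeping the same n.png / n.xml naming, is a plain zip archive; loading one pair would look roughly like this (the names are illustrative, not from the question's code):
using System.IO;
using System.IO.Compression;

static class BundleReader
{
    // Load the image bytes and the xml text for entry number n from a zip bundle.
    public static (byte[] Png, string Xml) LoadPair(string bundlePath, int n)
    {
        using (ZipArchive archive = ZipFile.OpenRead(bundlePath))
        using (var png = new MemoryStream())
        {
            archive.GetEntry(n + ".png").Open().CopyTo(png);
            using (var reader = new StreamReader(archive.GetEntry(n + ".xml").Open()))
                return (png.ToArray(), reader.ReadToEnd());
        }
    }
}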
JPEG is a lossy compression scheme, so decompression-manipulation-recompression normally reduces the image quality further for each step. Is it possible to rotate a JPEG image without incurring further loss? From what little I know of the JPEG algorithm, it naively seems possible to avoid further loss with a bit of effort. Which common i...
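A hedged sketch of one .NET route: GDI+ exposes a lossless JPEG transform through Encoder.Transformation. My understanding is that it only stays truly lossless when both dimensions are multiples of the MCU size (typically 8 or 16 pixels), so treat this as an illustration to verify rather than a guarantee:
using System.Drawing;
using System.Drawing.Imaging;
using System.Linq;

static class LosslessJpegRotate
{
    public static void Rotate90(string sourcePath, string destPath)
    {
        ImageCodecInfo jpegCodec = ImageCodecInfo.GetImageEncoders()
            .First(c => c.MimeType == "image/jpeg");
        using (Image image = Image.FromFile(sourcePath))
        using (var parameters = new EncoderParameters(1))
        {
            // Ask the encoder to transform the compressed data instead of re-encoding pixels.
            parameters.Param[0] = new EncoderParameter(Encoder.Transformation,
                                                       (long)EncoderValue.TransformRotate90);
            image.Save(destPath, jpegCodec, parameters);
        }
    }
}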
I'm looking for a description of an algorithm that can be used to decompress RAR files. I don't need to create new archives, only decompress existing ones.
Wotsit.org has a description of the RAR file format (version 2), but it does not describe the decompression algorithm.
Also, does anyone know whether RAR versio...
Hi,
I am trying to implement GZip compression for my ASP.NET page (including my CSS and JS files). I tried the following code, but it only compresses my .aspx page (I found this out via YSlow):
HttpContext context = HttpContext.Current;
context.Response.Filter = new GZipStream(context.Response.Filter, CompressionMode.Compress);
HttpContext.Cur...
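A hedged, fuller version of the same idea for comparison: check the client's Accept-Encoding and set the Content-Encoding header alongside the filter. Note that under IIS 6, .css and .js requests are usually served as static files and never reach ASP.NET at all, which is one reason only the .aspx output ends up compressed. The module name is a placeholder:
using System.IO.Compression;
using System.Web;

// Register in web.config under <system.web><httpModules>; the class name here is arbitrary.
public class GzipModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            HttpContext context = ((HttpApplication)sender).Context;
            string accept = context.Request.Headers["Accept-Encoding"] ?? "";
            if (accept.Contains("gzip"))
            {
                context.Response.Filter = new GZipStream(context.Response.Filter,
                                                         CompressionMode.Compress);
                context.Response.AppendHeader("Content-Encoding", "gzip");
            }
        };
    }

    public void Dispose() { }
}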
Do any of you know a technique to identify algorithms in already compiled files, e.g. by testing the disassembly for certain patterns?
The little information I have is that there is some (not exported) code in a library that decompresses the content of a Byte[], but I have no clue how it works.
I have some files which I believe to be...
Do any of you know a lossless compression algorithm that produces headerless output?
For example, one that does not store the Huffman tree used to compress the data? I am not talking about hard-coded Huffman trees; I'd like to know whether there is any algorithm that can compress and decompress input without storing some metadata in its output. Or is t...
I am developing an application using ASP.NET 2.0 (C#), in which I am trying to implement compression of my files so that the performance of my website improves.
For that, I have added code to my Global.asax file to compress all requests (.aspx, .js, .css). But when I run my application, it works well the first time, then the C...
Our shop is constantly running out of disk space, because we have a mandate from the developers and management to keep all of the nightly builds, as it greatly aids debugging.
Each build generally has few changes. However, almost every file is different from one build to the next, because the compiler and linker insert date/time stamps...