
Compression Algorithms

Xcelerate
This is a discussion of lossless compression algorithms - for example, which are best for which purposes.

I know the LZW algorithm works by building a dictionary of byte sequences that recur in the input, then replacing each repeated sequence with its dictionary index (a number in the 256-4095 range). Unfortunately, this is a rather simple algorithm that anyone could think of, and I can't believe you have to pay royalties to use it because it's patented.
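The dictionary-building step described above can be sketched in a few lines of Python. This is a simplified illustration, not real LZW: an actual implementation packs the codes into 12-bit fields and resets the dictionary when it fills up.

```python
def lzw_compress(data: bytes) -> list[int]:
    """Compress bytes into a list of dictionary codes (simplified LZW)."""
    # Start with all 256 single-byte sequences already in the dictionary.
    dictionary = {bytes([i]): i for i in range(256)}
    next_code = 256
    result = []
    current = b""
    for byte in data:
        candidate = current + bytes([byte])
        if candidate in dictionary:
            current = candidate  # keep extending the longest known match
        else:
            result.append(dictionary[current])  # emit code for longest match
            dictionary[candidate] = next_code   # remember the new sequence
            next_code += 1
            current = bytes([byte])
    if current:
        result.append(dictionary[current])
    return result

# Repetition pays off: 8 input bytes become 5 codes.
codes = lzw_compress(b"ABABABAB")
```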

The RLE algorithm exploits runs of repeated bytes: for instance 244, 244, 244, 244, 244 becomes 255, 244, 5 (escape byte, value, count), with a literal 255 byte represented as 255, 0. This is a fairly simple algorithm as well.
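A minimal Python sketch of that escape-byte scheme. The run-length cap of 254 and the 3-byte threshold are my own choices for illustration, not part of any particular RLE spec:

```python
def rle_compress(data: bytes, escape: int = 255) -> bytes:
    """Run-length encode: a run becomes (escape, value, count);
    a literal escape byte becomes (escape, 0)."""
    out = bytearray()
    i = 0
    while i < len(data):
        # Count the current run (cap at 254 so the count fits in one byte).
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 254:
            run += 1
        if data[i] == escape:
            # Each literal escape byte is written as (escape, 0).
            for _ in range(run):
                out += bytes([escape, 0])
        elif run >= 3:
            out += bytes([escape, data[i], run])  # run is worth encoding
        else:
            out += bytes([data[i]] * run)  # short runs stay literal
        i += run
    return bytes(out)

# The example from the post: five 244s become three bytes.
encoded = rle_compress(bytes([244, 244, 244, 244, 244]))
```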

If anyone else knows of any compression algorithms, please let me know.
ocalhoun
JPEG
MPEG 1, 2, 3, 4
zip
tar
Really, there are innumerable compression algorithms, many of them optimized for a certain type of data.
Xcelerate
JPEG and MPEG are both lossy compression algorithms, and both are good for their purpose. I mostly meant lossless algorithms, because it's somewhat easier to discuss how they actually work.
exeption
7z is great!

Visit sourceforge.net
Nyizsa
 Xcelerate wrote: JPEG and MPEG are both lossy compression algorithms

Not entirely true! In most cases they are made lossy by the quantization matrix. That is how the 1:20 - 1:40 compression ratio is achieved; it would only be around 1:2 - 1:3 if a quantization matrix were used that kept the result lossless.
JPEG and MPEG compression have the DCT (Discrete Cosine Transform) in common, and MPEG also implements motion prediction.
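To make the quantization point concrete, here is a toy 1-D sketch in Python. Real JPEG/MPEG work on 8x8 two-dimensional blocks with a full quantization table; the single divisor `q` here is a simplification of mine:

```python
import math

def dct(block):
    """1-D orthonormal DCT-II of a block (the building block of JPEG/MPEG)."""
    n = len(block)
    return [
        sum(x * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
            for i, x in enumerate(block))
        * (math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n))
        for k in range(n)
    ]

def idct(coeffs):
    """Inverse of dct() above (DCT-III with the same normalisation)."""
    n = len(coeffs)
    return [
        sum(c * (math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n))
            * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
            for k, c in enumerate(coeffs))
        for i in range(n)
    ]

samples = [52, 55, 61, 66, 70, 61, 64, 73]
coeffs = dct(samples)

# Without quantization the transform is lossless (up to float rounding):
lossless = idct(coeffs)

# With quantization (divide, round, multiply back) detail is discarded,
# and the round trip no longer reproduces the input exactly:
q = 20
quantized = [round(c / q) * q for c in coeffs]
lossy = idct(quantized)
```

The rounding step is where the information is thrown away; everything else in the pipeline is reversible.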
Anyway, I think that for simple files, one of the most effective compression methods is Huffman coding. (Do a search for it!) It is implemented in ARJ, for example.
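For the curious, here is a compact Huffman-coding sketch in Python. It's a toy version: it only builds the code table, whereas a real compressor would also store the table in the output and pack the bits:

```python
import heapq
from collections import Counter

def huffman_codes(data: str) -> dict[str, str]:
    """Build a Huffman code table: frequent symbols get shorter bit strings."""
    # Heap entries: (frequency, tiebreaker, {symbol: code_so_far}).
    heap = [(freq, i, {sym: ""})
            for i, (sym, freq) in enumerate(Counter(data).items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # two least-frequent subtrees
        f2, _, right = heapq.heappop(heap)
        # Merging prepends a bit: 0 for the left subtree, 1 for the right.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

codes = huffman_codes("abracadabra")
encoded = "".join(codes[ch] for ch in "abracadabra")
# 'a' occurs 5 times out of 11, so it gets a 1-bit code; the whole string
# fits in 23 bits instead of the 88 bits of plain 8-bit text.
```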
Daniel15
 exeption wrote: 7z is great! Visit sourceforge.net

Yeah, 7-Zip's LZMA is the best compression algorithm I've seen. It compresses better than most other algorithms.

ashok
What about RAR? And I think ZIP is also a lossy compression algorithm...
exeption
ZIP and RAR are not lossy.
Lossy means the compression algorithm discards detail/data according to the extent of compression. Lossy compression algorithms are generally used in media compression, e.g. images, video, audio etc.
e.g.:
JPEG etc. (GIF's LZW compression is itself lossless, though reducing an image to a 256-colour palette loses detail)
DivX, Xvid, MPEG etc. (remember, AVI is just a container format, not actually a codec)
MP3, Ogg Vorbis etc.

ZIP and RAR are lossless compression algorithms. If they were lossy, how could we compress valid data with them, like source code, documents, programs and so on?!
dkill001
I have used the following lossless compression programs for data in the past:
zip
rar
lha
arj
gzip (?tar)
jar
cab
ace
ice

I don't know how they work - I'm not really interested in that aspect, sorry.
Nyizsa
tar is not a compressed file format; it only stores data. However, it can be (and usually is) passed through the gzip filter (= compressed).
cab is (mostly) the Windows way of storing installable program data.
jar uses the zip algorithm, but includes some info about the Java program it contains so that it is executable.
The others simply compress the data and store it.
There is a compression method for every kind of data, so the list is endless.
Shirish
hi friends

you can use run length encoding

then there are hash trees used

in fact discrete mathematics can be employed to create even more powerful compression algorithms
onur
WINUHA

You can compress 100 MB to 30 or 40 MB

http://www.klaimsoft.com/winuha/

leodv
Generally, all zipping programs use lossless algorithms, or else you could never get back the data you zipped. Huffman coding is a great algorithm, but it was created a long time ago; I suppose there must be other compression algorithms currently being developed by companies and institutes.
Nyizsa
 onur wrote: You can compress 100MB to 30 or 40 MB

Depends on what type of data you have. For text files it can be even less, but try to compress an MPEG movie... you will be surprised. Or try making a 100 MB file consisting of only one character (quite easy with some programming knowledge) and compressing that! Again - you will be surprised!
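You can see both surprises with Python's built-in zlib module (DEFLATE, the same algorithm family used by ZIP and gzip), scaled down to 1 MB so it runs quickly:

```python
import os
import zlib

# Highly repetitive data: one character repeated, like the 100 MB example
# above but scaled down to 1 MB.
repetitive = b"a" * 1_000_000
small = zlib.compress(repetitive)  # shrinks to around a kilobyte

# Patternless data: random bytes give the algorithm nothing to exploit,
# much like an already-compressed MPEG movie.
random_data = os.urandom(1_000_000)
large = zlib.compress(random_data)  # comes out slightly LARGER than the input

print(len(small), len(large))
```

Already-compressed files behave like the random case, which is why zipping an MPEG or MP3 rarely saves anything.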
djclue917
Free Lossless Audio Codec (FLAC)

The name says it all...
Animal
 djclue917 wrote: Free Lossless Audio Codec (FLAC) The name says it all...

FLAC is actually a pretty advanced codec.

Firstly, it's open source, which is always good in terms of development.

Secondly, it's (as it states) a Lossless compression method for audio. It'll compress a CD of full-quality music to about half its original size.

Finally, and most crucially, it's been specifically designed to use a low amount of processor power to decode. This is good from a PC processor point of view, but when running on portable devices such as iPods and iRiver players, it's excellent in terms of battery life. Also, these devices tend to have fairly low-performance processors (a 4G iPod runs at 85MHz), so it allows them to play back the audio in real time. If you're going to go lossless, FLAC is an excellent choice - it is open source, as I said, so you will always be able to use it, unlike Apple's proprietary Lossless Audio Codec and WMA Lossless.

Believe it or not, you can actually get your iPod to play FLAC files. You need to download iPod Linux which works like a second firmware on your iPod, and it also gives you access to more games and stuff. Check out their site for disclaimers / full details. Might be worth a go if you're bored with your old iPod and can't afford a new one.
mike1reynolds
When it comes to video compression, nothing that I've seen beats Sorenson Squeeze. It is a product, not a format, but it will do amazing compression for most formats, AVI, MPEG, MOV, it will even write the output in Flash's FLV and SWF formats. I don't like the SWF output because, even if looping is set on, it won't loop. Instead I write in FLV format and import it into flash.

As an example, I had a 40-frame animation, each frame being 60K-70K in JPG format, so that adds up to around 2.5M. After Sorenson compressed it, it was about 140K, about the size of just two JPG frames.
Nyizsa
 mike1reynolds wrote: As an example, I had a 40 frame animation, each frame being 60K-70K in JPG format, so that adds up to around 2.5M. After Sorenson compressed it it was about 140K, about the size of just 2 JPG frames.

A typical example of differential encoding. Your first JPEG was stored in full, then some information about what needs to change in it, and maybe some parts of the other frames. A clever idea, isn't it?
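A toy version of that idea in Python: store the first frame in full, then only the changed pixels of each following frame. The 6-pixel frames and the (index, value) format are just for illustration, not how any real codec lays out its data:

```python
def delta_encode(frames):
    """Store frame 0 in full, then only (index, new_value) pairs for
    pixels that changed - a toy version of differential encoding."""
    encoded = [list(enumerate(frames[0]))]  # key frame: every pixel
    for prev, curr in zip(frames, frames[1:]):
        diff = [(i, v) for i, (p, v) in enumerate(zip(prev, curr)) if p != v]
        encoded.append(diff)
    return encoded

def delta_decode(encoded):
    frames = [[v for _, v in encoded[0]]]
    for diff in encoded[1:]:
        frame = frames[-1][:]   # start from the previous frame
        for i, v in diff:
            frame[i] = v        # apply only the recorded changes
        frames.append(frame)
    return frames

# Three 6-"pixel" frames where very little changes between frames:
frames = [[9, 9, 9, 9, 9, 9],
          [9, 9, 5, 9, 9, 9],
          [9, 9, 5, 9, 2, 9]]
enc = delta_encode(frames)
# Frames 2 and 3 are each stored as a single changed pixel.
```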