> > No, I meant that if you used some kind of binary encoding to begin
> > with instead of XML (e.g., the "TEXT" element is encoded as 17, the
> > "transformation" attribute is encoded as 27, etc.), and then
> > compressed that, I wonder if it would be smaller than using GZIP to
> > compress an XML file.
>
> It might be a little smaller, but I was saying that history has shown
> that using binary encoding formats like that makes debugging a
> nightmare (e.g., EDI).
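
For what it's worth, the idea is cheap to try. Here is a minimal Python
sketch, using the token values 17 and 27 from the question as a
stand-in for a real code table: replace the known names with one-byte
codes, then DEFLATE both versions. Since DEFLATE already exploits the
repeated names, the difference is often small.

    import zlib

    # Hypothetical one-byte codes from the discussion above.
    TOKENS = {b"TEXT": bytes([17]), b"transformation": bytes([27])}

    def tokenize(xml: bytes) -> bytes:
        # Replace each known element/attribute name with its code.
        for name, code in TOKENS.items():
            xml = xml.replace(name, code)
        return xml

    xml = b'<TEXT transformation="rotate">hello</TEXT>' * 100
    print(len(zlib.compress(xml, 9)))            # compressed raw XML
    print(len(zlib.compress(tokenize(xml), 9)))  # compressed tokenized form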
Theory:
From an information-theory point of view, the amount of information in
the data is the same no matter how you encode it. That gives a lower
bound, the entropy of the data, below which no lossless compressor can
go.
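
A rough way to estimate that bound is the zeroth-order byte entropy,
H = -sum(p * log2 p) over the byte frequencies. Note this only bounds
codes that treat bytes independently; compressors that model longer
contexts can do better. A small Python sketch (the filename is just a
placeholder):

    import math
    from collections import Counter

    def entropy_bits_per_byte(data: bytes) -> float:
        # Zeroth-order Shannon entropy: H = -sum(p * log2(p)).
        n = len(data)
        return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

    data = open("sample.xml", "rb").read()   # placeholder input file
    h = entropy_bits_per_byte(data)
    print("byte-wise floor: about %.0f bytes" % (h * len(data) / 8))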
My view:
Some compression algorithms are better than others, and the result also
depends on the parameters you feed to the algorithm. In general, I would
say you can get 90% or more of the possible compression with standard
tools; squeezing out the remaining few percent takes a lot of effort.
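
To see how much the parameters matter, comparing zlib's compression
levels on the same input is enough (again, the filename is just a
placeholder); level 9 typically gains only a few percent over the
default level 6, at a noticeably higher CPU cost:

    import zlib

    data = open("sample.xml", "rb").read()   # placeholder input file
    for level in (1, 6, 9):
        # Same data, same algorithm, only the effort parameter changes.
        print(level, len(zlib.compress(data, level)))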
Reto