
FrozenVoid in the wild

Name: Anonymous 2017-05-30 19:15

Name: Anonymous 2017-06-08 8:01

>>13
https://www.reddit.com/r/frozenvoid/wiki/pigeonhole_principle
The pigeonhole principle states that a lossless compressor must be a 1:1 correspondence between inputs and outputs, and a set cannot be put in 1:1 correspondence with a strictly smaller set.
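As a quick counting illustration of the principle (my own sketch, not from the linked wiki): there are strictly fewer bit strings shorter than N bits than there are strings of exactly N bits, so no lossless scheme can shrink all of them:

```python
# Count all bit strings shorter than N bits versus those of exactly N bits.
# Pigeonhole: 2^N inputs cannot map 1:1 into only 2^N - 1 shorter outputs.
N = 16
shorter = sum(2**k for k in range(N))  # strings of length 0 .. N-1
exact = 2**N                           # strings of length exactly N
print(shorter, exact)  # 65535 65536 -> at least one N-bit file cannot shrink
```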

Why this doesn't work.
Suppose a subset of the input set is compressible.
If we only compress the compressible subset and leave the input unmodified when it is incompressible, the mapping is no longer a full 1:1 correspondence, but 1:1 minus m, where m is the compressible subset.
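A toy sketch of where this pass-through scheme runs into trouble (`naive_compress` is my own hypothetical, using zlib purely for illustration): a compressed output can coincide with a raw pass-through input, so two different inputs share one output and the mapping is no longer reversible:

```python
import zlib

def naive_compress(data: bytes) -> bytes:
    # Compress only when it helps; pass incompressible input through unchanged.
    packed = zlib.compress(data)
    return packed if len(packed) < len(data) else data

original = b"A" * 100              # highly compressible
packed = naive_compress(original)  # shrinks to a short zlib stream
# Feeding that short stream back in: it no longer shrinks, so it passes through.
assert naive_compress(packed) == packed
# Two different inputs (original and packed) now map to the same output,
# so a decompressor given `packed` cannot tell which one it was handed.
```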

Now, what if we could move items from the incompressible subset into the compressible subset?
Given a reversible transform X->Y->X,
we could try all possible transforms and parameters until X lands in the compressible subset. The key idea is that, with a large enough search space,
almost all data could be converted into a compressible form of the same size.
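To make the search idea concrete, here is a minimal sketch under my own assumptions (a one-byte XOR key as the reversible transform and zlib as the compressibility test; the thread specifies neither):

```python
import zlib

def xor_bytes(data: bytes, key: int) -> bytes:
    # XOR with a constant byte is trivially reversible: applying it twice restores X.
    return bytes(b ^ key for b in data)

def search_transform(data: bytes):
    # Scan the (tiny) parameter space for a variant that compresses,
    # counting the 1-byte key as metadata that must also be stored.
    for key in range(256):
        packed = zlib.compress(xor_bytes(data, key))
        if len(packed) + 1 < len(data):
            return key, packed
    return None  # whole search space exhausted without a win

def restore(key: int, packed: bytes) -> bytes:
    return xor_bytes(zlib.decompress(packed), key)

found = search_transform(b"hello world " * 20)
if found:
    key, packed = found
    assert restore(key, packed) == b"hello world " * 20
```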

The cost of the transform: the main cost is the time required to search through the transformation parameters. Since each transform is an operation on the entire file, the time grows proportionally to the file size.
The extra metadata introduced by storing the transform is negligible until the critical size.
Critical size: when the transform-parameter data is longer than the savings gained by compression, compression is no longer viable and the file should be returned unmodified.
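The break-even condition described above fits in a one-liner (the function name and sizes are mine, for illustration):

```python
def worth_transforming(original_size: int, compressed_size: int, param_bytes: int) -> bool:
    # Viable only while the savings exceed the cost of storing the transform
    # parameters; below this critical size, return the file unmodified.
    return compressed_size + param_bytes < original_size

assert worth_transforming(1_000_000, 900_000, 16)  # large file: 16 bytes of metadata is trivial
assert not worth_transforming(20, 18, 4)           # tiny file: metadata eats the savings
```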

So the question is: how can N inputs into decompression produce N+M outputs?
The question is invalid, as the majority of files will not compress at all, or would require a search space impossible to exhaust before the heat death of the universe.
The idea is that the subset of compressible files grows, making the set of files of size N more compressible than a simple straight test for compression shows (i.e. transformed variants of the data can reach into the compressible subset).

Name: Anonymous 2017-06-08 8:58

>>14
More: most of the files in the set of 2^N are random garbage with no use whatsoever.
We only need to move a small minority of useful but incompressible files into the easily-compressible subset: the only overhead is the transformation parameters (which for large files would be trivial compared to the savings). The scheme begins to break when the file size becomes small enough that the parameter metadata is larger than the compression savings.
