> > context modelling.
> Ah... Unfortunately there's no time on my (very slow) target platform for that.
There is time on any platform, depending on the implementation. CM is any method based on the assumption that yet-unknown data has different properties depending on the known context. Alternatively, CM can be described as a model of a (Markovian) source with a hidden state, on which the statistical properties of the generated data depend.
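As a minimal sketch of that definition (my illustration, not anything from the post): an adaptive order-1 byte model, where the previous byte is the "known context" and the statistics of the next byte depend on it. The function names and the Laplace smoothing are my own choices.

```python
# Sketch: adaptive order-1 context model.
# The previous byte is the context; each context keeps its own
# byte frequencies, so the model's statistics depend on the context.
import math
from collections import defaultdict

def order1_code_length(data: bytes) -> float:
    """Ideal code length (bits) if the predictions fed an arithmetic coder."""
    counts = defaultdict(lambda: [1] * 256)  # Laplace-smoothed counts per context
    bits = 0.0
    ctx = 0
    for b in data:
        c = counts[ctx]
        p = c[b] / sum(c)        # predicted probability of this byte in this context
        bits += -math.log2(p)    # ideal coding cost for this symbol
        c[b] += 1                # adapt: update statistics of this context
        ctx = b                  # the known context for the next symbol
    return bits

sample = b"abababababababab" * 8
print(order1_code_length(sample))  # far below the raw size...
print(8 * len(sample))             # ...of 8 bits per byte
```

Even this trivial model beats the raw 8 bits/byte on contextual data, and it can be made as fast or slow as the implementation allows (e.g. replacing the `sum(c)` with an incrementally maintained total).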
Although imho the whole idea of Markov-model convergence is misleading. I mean, it really only makes sense for a deterministic state machine, but it is commonly applied to cases where the probability "converges" to some value far from 0 or 1, and that is considered "stationary". But how can it be truly stationary if the data was generated by a pretty much deterministic source (like, e.g., executables are), and the probability distribution only looks like some smooth curve because the context is far from what it should be? So that's why there's no truly stationary data, and probability distributions can only be locally optimal.
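A toy illustration of that point (my construction, under my own assumptions): a strictly deterministic alternating bit source. With the wrong (too short) context the adaptive probability "converges" to 0.5 and looks stationary; with the right context it goes to 0/1 and the coding cost collapses.

```python
# Deterministic source, two models: wrong context vs. right context.
import math

data = [0, 1] * 500  # strictly deterministic alternating source

def avg_cost(order: int) -> float:
    """Average adaptive code length (bits per bit) with `order` context bits."""
    ones, total = {}, {}
    cost = 0.0
    for i in range(order, len(data)):
        ctx = tuple(data[i - order:i])
        p1 = (ones.get(ctx, 0) + 1) / (total.get(ctx, 0) + 2)  # smoothed estimate
        p = p1 if data[i] == 1 else 1 - p1
        cost += -math.log2(p)
        ones[ctx] = ones.get(ctx, 0) + data[i]
        total[ctx] = total.get(ctx, 0) + 1
    return cost / (len(data) - order)

print(avg_cost(0))  # ~1 bit/bit: probability sits near 0.5, looks "stationary"
print(avg_cost(1))  # ~0 bits/bit: the right context reveals the determinism
```

The order-0 model's p≈0.5 is exactly the "curve far from 0 or 1" in the paragraph above: it isn't a property of the source, only of the mismatched context.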
And btw, LZ is not a model, or even a compression method, actually. It's just a (redundant) transformation whose results can be easily compressed (similar to BWT in that way), but afaik there isn't even a single example of a type of data generated like that.
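To make the "transformation" claim concrete (a toy sketch of my own, greedy matching, byte offsets, no entropy coding): LZ77 just rewrites the input as (offset, length, literal) tokens and is trivially invertible; the tokens themselves still need a coder, and the transform models nothing by itself.

```python
# Toy LZ77: a lossless transformation into (offset, length, literal) tokens.
def lz77_parse(data: bytes, window: int = 255):
    tokens, i = [], 0
    while i < len(data):
        best_off, best_len = 0, 0
        for j in range(max(0, i - window), i):  # greedy longest-match search
            k = 0
            while i + k < len(data) - 1 and data[j + k] == data[i + k]:
                k += 1
            if k > best_len:
                best_off, best_len = i - j, k
        tokens.append((best_off, best_len, data[i + best_len]))
        i += best_len + 1
    return tokens

def lz77_invert(tokens) -> bytes:
    out = bytearray()
    for off, length, lit in tokens:
        for _ in range(length):
            out.append(out[-off])  # copy from the already-decoded window
        out.append(lit)
    return bytes(out)

msg = b"abracadabra abracadabra"
assert lz77_invert(lz77_parse(msg)) == msg  # pure reversible transformation
```

Note that nothing here assigns probabilities to anything; that only happens when a model/coder is put on top of the token stream, which is the distinction the paragraph above is making.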
Well, I just wanted to say that CM can run at any speed, depending on the specific implementation. Even traditional implementations can compress faster than non-greedy LZ, though their decompression is usually symmetric. But then again, there's no rule against an asymmetric CM with faster-than-LZ decoding.
> Or Context Mixing! (As noted at LTCB)
> Context Modeling is really wide term: PPM, CM, CTW, ...
Well, I do agree with the last line.
Btw, I have no real basis for this, but I feel that "modeling" means something different from "modelling".
2013-08-09 00:25:29 >