Is it possible to "compress" a list of large numbers using their prime factors?
On a computer I can have integers of arbitrary size thanks to GMP, so they are represented in base 2 in memory.
I'm wondering whether, in theory, I could use less memory by storing only the prime factors and their exponents. I suspect this is useless for most numbers, but I wonder if it works for a very small minority of numbers with large prime factors.
How can I prove myself wrong?
No, it is not possible in general: no algorithm can compress every $n$-bit string, nor reduce the expected length of a uniformly random bit string.
Your scheme may compress some numbers, but it must then use more memory to store others. If your data is random, there is no benefit in using your compression scheme.
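To see this concretely, here is a minimal sketch (plain Python with trial-division factoring; `factored_bits` is a hypothetical cost measure that ignores delimiter overhead, so it flatters the factorization encoding). It compares the bits in the binary representation of $n$ with the bits needed to store its (prime, exponent) pairs:

```python
def factorize(n):
    """Trial-division factorization: returns a list of (prime, exponent) pairs."""
    factors = []
    d = 2
    while d * d <= n:
        e = 0
        while n % d == 0:
            n //= d
            e += 1
        if e:
            factors.append((d, e))
        d += 1
    if n > 1:
        factors.append((n, 1))
    return factors

def factored_bits(n):
    """Optimistic bit cost of storing the factorization (no delimiters counted)."""
    return sum(p.bit_length() + e.bit_length() for p, e in factorize(n))

# A number with a small prime factor and a huge exponent compresses well:
n = 2**1000
print(n.bit_length(), factored_bits(n))   # 1001 vs. 2 + 10 = 12

# But a prime must store itself plus an exponent, i.e. it expands:
p = 1000003
print(p.bit_length(), factored_bits(p))   # 20 vs. 20 + 1 = 21
```

Prime powers like $2^{1000}$ are exactly the "very small minority" from the question, and by a counting argument at most a fraction $2^{-k}$ of $n$-bit numbers can be shortened by $k$ or more bits, so the expansions dominate on random input.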
Incidentally, the fact that no such compression scheme exists can be used to prove that there are infinitely many prime numbers: if there were only finitely many, the factorization encoding would compress every number, which is impossible.
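A sketch of that counting argument, under the (false) assumption that there are only $k$ primes $p_1, \dots, p_k$:

```latex
% Every n \le N would factor as p_1^{e_1} \cdots p_k^{e_k} with each e_i \le \log_2 N,
% so there would be at most (\log_2 N + 1)^k distinct values available to represent
% N different numbers:
N \le (\log_2 N + 1)^k .
% The right-hand side is polylogarithmic in N, so the inequality fails for large N;
% hence the number of primes cannot be finite.
```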
There are excellent lecture notes on Kolmogorov Complexity by Alexander Shen that discuss "incompressibility"; in particular, Shen writes about your compression algorithm in Section 18.
https://www.google.com/patents/US6373986
I've done this too; it is not that hard. It works because the primes become the character set of a language: instead of base-2 or base-10 digits describing the result, you end up with on the order of prime^prime possible symbols, and the decompressor/program/language becomes the context in which they are interpreted. Within that set of prime^prime possible values, every encoded value uniquely represents some final value. The symbols may be referenced by other sets or other 'languages', but as long as a value does not exceed the maximum possible value for the set, the mapping remains unique — within that set, and not beyond prime^prime.