Does the Riemann hypothesis guarantee that integer factorization is difficult?
In an exchange of comments at Is there any mathematical conjecture that is successfully applied in the real world in spite of being yet unproven?, user R.J. Etienne claims that
RH guarantees that integer factorisation is difficult.
I was not able to identify any argument in support of this claim in the remainder of their comments. Since the question was asked to find material for a thesis, I thought it would be good to resolve this claim here.
There are related questions at Proving the Riemann Hypothesis and Impact on Cryptography and Would a proof to the Riemann Hypothesis affect security?. If I understand the accepted answers correctly, they basically say that resolving the Riemann hypothesis could lead to new insights that might in turn yield better integer factorization algorithms, and that this would be more likely if the hypothesis were proved false, since a disproof would likely involve insights into an unexpected regularity in the primes, which could conceivably be exploited for factorization.
This seems plausible to me, but it is far from the strong claim that the Riemann hypothesis guarantees that integer factorization is difficult, which would be quite surprising to me. As far as I know, neither would proving the Riemann hypothesis prove that integer factorization is difficult, nor is it by any means guaranteed that disproving it would lead to better integer factorization algorithms; but I’m not an expert in this area, and we have several experts here who can probably say more about this than I.
I should perhaps point out one concrete argument that R.J. Etienne did provide, which I seem to have misunderstood:
The fast prime number tests important in cryptography have so far only been proven under the assumption that the Riemann hypothesis is true.
I thought that “prime number tests” referred to primality tests (in this case, the argument would be invalid, both because integer factorization cannot be reduced to primality testing and because the AKS primality test has been proved to have polynomial time complexity without assuming the Riemann hypothesis), but they later stated that they were not referring to primality tests.
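The best-known example of such a result is Miller's deterministic primality test: it runs in polynomial time unconditionally, but its correctness has so far only been proven under the generalized Riemann hypothesis, via Bach's bound that every odd composite $n$ has a Miller witness $a \le 2(\ln n)^2$. Here is a minimal sketch with the GRH-dependent bound made explicit (the function name is mine):

```python
from math import log

def is_prime_miller_grh(n: int) -> bool:
    """Deterministic Miller test.

    Correctness (i.e. never answering 'prime' for a composite) is only
    proven under the generalized Riemann hypothesis, via Bach's bound:
    every odd composite n has a Miller witness a <= 2*(ln n)^2.
    """
    if n < 2:
        return False
    if n in (2, 3):
        return True
    if n % 2 == 0:
        return False

    # Write n - 1 = 2^s * d with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1

    bound = min(n - 2, int(2 * log(n) ** 2))  # GRH-dependent witness bound
    for a in range(2, bound + 1):
        x = pow(a, d, n)
        if x == 1 or x == n - 1:
            continue                      # a is not a witness
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break                     # a is not a witness
        else:
            return False                  # a is a Miller witness: n is composite
    return True
```

For example, `is_prime_miller_grh(561)` correctly reports the Carmichael number $561 = 3 \cdot 11 \cdot 17$ as composite (the base $a = 2$ is already a witness). AKS, by contrast, needs no such assumption, which is why I thought the argument was invalid.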
One more remark: Of course few problems are really guaranteed to be difficult, since we don’t even know whether NP-complete problems can be solved in polynomial time; so I’m taking “difficult” in the claim to mean something like “NP-hard”.
Solution 1:
Long Comment
Every prime greater than 2 can be written as a difference of squares in only one way, e.g. $3=2^2-1^2$, $5=3^2-2^2$, etc. Multiplying two odd primes gives a composite product $N$ that can be expressed as the difference of two squares in two ways, corresponding to the two factorizations $N=1\times N$ and $N=p_1 \times p_2$.
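For instance, with $N = 21 = 3 \times 7$ the two representations are $21 = 11^2 - 10^2$ (from $21 = 1 \times 21$, with $11 = \frac{21+1}{2}$ and $10 = \frac{21-1}{2}$) and $21 = 5^2 - 2^2$ (from $21 = 3 \times 7$, with $5 = \frac{3+7}{2}$ and $2 = \frac{7-3}{2}$).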
I think that the basic underlying problem in reconstructing the original difference of squares, and hence $p_1$ and $p_2$, is the loss of information that, implicitly or explicitly, occurred when they were multiplied in the first place.
(For the algebra of multiplying numbers represented by the difference of two squares see Prime number sieve using difference of two squares)
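To make "reconstructing the original difference of squares" concrete, here is a minimal sketch of Fermat's factorization method, which searches upward from $\lceil\sqrt N\rceil$ for an $a$ such that $a^2 - N$ is a perfect square (the function name is mine; this is an illustration, not the method of the linked post):

```python
from math import isqrt

def fermat_factor(n: int) -> tuple[int, int]:
    """Recover a nontrivial difference-of-squares representation of an odd
    composite n, i.e. n = a^2 - b^2 = (a - b)(a + b)."""
    a = isqrt(n)
    if a * a < n:
        a += 1                       # start the search at ceil(sqrt(n))
    while True:
        b = isqrt(a * a - n)
        if b * b == a * a - n:       # a^2 - n is a perfect square
            return a - b, a + b      # the factors are a - b and a + b
        a += 1                       # otherwise try the next a
```

For the example above, `fermat_factor(21)` finds $a = 5$, $b = 2$ and returns $(3, 7)$. The search takes roughly $(p_1 + p_2)/2 - \sqrt{N}$ steps, so it is only fast when the two prime factors are close together; for factors of similar size but not unusually close, this direct search for the lost representation is hopeless.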
The statement "RH guarantees that integer factorisation is difficult." when seen from the information point of view, seems to me then equivalent to; the RH guarantees that there is no short cut available to reconstructing the explicit information lost when two primes are multiplied using the uniquely equivalent difference of two squares representation.
However, also from the information point of view, I think it is the so-called "random" nature of the distribution of the primes that primarily guarantees there is no shortcut to reconstructing the information that is implicitly lost in the ordinary process of multiplication.
However, what does the so-called "random" nature of the distribution of the primes really mean in terms of this question?