
Re: Quantum computers cannot compute transcendentals?
Posted: Jan 18, 2014 12:45 AM


On 01/01/2014 11:23 AM, David Bernier wrote:
> On 12/24/2013 04:24 AM, David Bernier wrote:
>> On 12/24/2013 02:16 AM, David Bernier wrote:
>>> On 12/20/2013 05:37 PM, quasi wrote:
>>>> Jimmy Weissmuller wrote:
>>>>> Clark Smith wrote:
>>>>>> 7 wrote:
>>>>>>>
>>>>>>> The quantum 'logic gates' I have cannot in any shape
>>>>>>> or manner compute a transcendental number directly.
>>>>
>>>> Can you represent the transcendental number e _exactly_
>>>> in finite time _without_ a computer?
>>>>
>>>> That depends on what you mean by "represent".
>>>
>>> [...]
>>>
>>> In algebra-type manipulations, we can replace an expression by
>>> an expression with the same value; the hard part is first deriving
>>> identities (e.g. arctangent formulas for pi), but there is
>>> also the non-trivial question of which of numerous equal
>>> expressions to use in a substitution ...
>>>
>>> In more fine-tuned substitution questions, it might be good
>>> to target a specific precision, such as 100,000 places after the
>>> decimal point in computing the real and imaginary parts
>>> of zeta(0.5 + i*14.13), and if using the Euler-Maclaurin method,
>>> I've found I need to include, say, the term with B_70000, the
>>> 70,000th Bernoulli number:
>>>
>>> < http://en.wikipedia.org/wiki/Bernoulli_number >
>>>
>>> These mysterious numbers are rational, with rather slow-growing
>>> denominators, and at least exponential growth in magnitude for the
>>> even-indexed Bernoulli numbers.
>>>
>>> For instance, |B_70000| is of the order of 10^252887.
>>> Now if it multiplies a number of the order 10^(-350887),
>>> the product is of order 1/(10^98000).
>>>
>>> If we're targeting 100,000 decimals after the point, it seems to
>>> me that only the 2000-3000 most significant digits of
>>> |B_70000| are needed ...
>>>
>>> So, if our new problem is to compute |B_70000| closely
>>> enough so that the leading 3000 digits (the most significant)
>>> are all correct, it could be much easier than using e.g. Simon
>>> Plouffe's method (designed to compute the Bernoullis exactly).
>>>
>>> His method for B_6 = 1/42 would involve using
>>>
>>> 2*zeta(6)*(6*5*4*3*2*1)/((2*Pi)^6)
>>>
>>> or 2*zeta(6)*6!/((2*Pi)^6) [exact],
>>> with zeta(6) ~= 1 + 1/(2^6) + ... + 1/(k^6), small k,
>>> and additional constraints from theorems on the denominator or
>>> fractional part, etc., to pin down B_6 exactly as a rational.
>>>
>>> For B_70000,
>>>
>>> (2*70000!/((2*Pi)^70000))*(1 + 1/2^70000)
>>>
>>> should deliver perhaps the 20,000 most significant digits
>>> of B_70000.
>>>
>>> So, for efficient computation to some predetermined number of
>>> decimals in Euler-Maclaurin summation (which is a general method),
>>> the even Bernoulli numbers grow so fast, and the terms they yield
>>> can be so small, that "a fraction" of the precision involved
>>> in computing the exact rational value is sufficient.
>>>
>>> Anyway, I find "algebraic" substitutions in standard and special
>>> function computations quite fascinating.
>>
>> [...]
>>
>> I've experimented with B_120000, which is negative and of absolute
>> value of the order of 10^461608, with the objective of obtaining
>> an approximation with ~= 200,000 significant digits.
>>
>> This is what I came up with to be issued to PARI/gp as a command
>> (to get |B_120000|, in fact):
>>
>> ? D=exp(log(bernfrac(120000)));
>>
>> The terminal semicolon suppresses output.
>>
>> The timing was about 6.6 seconds.
>>
>> It does indeed seem to deliver the leading 200,000 digits, but
>> expresses them in scientific notation, which is fine.
>>
>> On the other hand, bernreal(.), which delivers a floating-point
>> number, isn't working; it needs more memory:
>>
>> ? bernreal(120000);
>>   ***   at top-level: bernreal(120000)
>>   ***                 ^
>>   *** bernreal: not enough memory
>>   ***   Break loop: type 'break' to go back to GP
>> break>
>>
>> I haven't counted the claimed 200,000 displayed digits of
>> |B_120000|, but it ends as follows:
>>
>> ...................570339600007904973943801586581893 E376755
>> [ precision 200,000 significant digits ].
>>
>> Now I see that most of the time is spent computing the Bernoulli
>> number as a rational:
>>
>> ? bernfrac(120000);
>> time = 5,137 ms.
>>
>> But it's much improved if I use the formula that I believe Plouffe
>> used to compute Bernoulli numbers, together with PARI/gp's
>> built-in floating-point factorial:
>>
>> ? V=120000; C=factorial(V); C=2*C; C=C/((2*Pi)^V); C=C*zeta(V);
>> time = 293 ms.
>>
>> The formula used can be found as Equation (41) of this MathWorld
>> page:
>>
>> < http://mathworld.wolfram.com/BernoulliNumber.html > .
>>
>
> It's indeed possible to save time in Bernoulli number computation by
> using MathWorld's Equation (41).
>
> This is for use in the Euler-Maclaurin formula for zeta(.).
>
> One part of Euler-Maclaurin for computing zeta(.) involves
> adding up many terms in the partial sum:
>
> sum_{n = 1 ... N-1} 1/n^s, to compute zeta(s);
>
> the analytic formula for the even-index Bernoulli numbers doesn't
> help here.
>
> I timed the case N = 16300, nu = 37000 with 50,000 decimals
> at s = the first non-trivial zeta zero:
>
> http://www.dtc.umn.edu/~odlyzko/zeta_tables/zeros2
>
> so s = 1/2 + i*14.13472514173469...
>
> The time was 3h 22min for 42,570 decimals accuracy.
>
> cf.:
> < http://meditationatae.wordpress.com/2014/01/01/experimentingwitheulermclaurinsummationforzetafunction/ >
>
> Copied from there:
>
> "One term that requires a lot of time to
> compute is sum_{n = 1, ... N-1} n^(-s) ; the analytic formula for the
> Bernoulli numbers doesn't help with that 'partial sum' term."
>
> So, it's an ongoing experiment to see what's possible.
> Also, a test of the PARI/gp calculator's "consistency",
> by trying many formulas ...

[...]
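The quoted B_6 recipe (evaluate 2*zeta(6)*6!/((2*Pi)^6) numerically, then pin the result down as an exact rational) can be checked at toy scale in Python. The `limit_denominator` cutoff below is my illustrative stand-in for the "additional constraints from theorems on the denominator", not Plouffe's actual constraint:

```python
import math
from fractions import Fraction

# zeta(6) from a short partial sum; the tail beyond k = 199 is below 1e-12,
# far smaller than the gap between candidate rationals with small denominators.
z6 = sum(1 / k**6 for k in range(1, 200))

# Formula quoted above: B_6 = 2 * zeta(6) * 6! / (2*pi)^6
b6_float = 2 * z6 * math.factorial(6) / (2 * math.pi) ** 6

# Pin down the exact rational: the nearest fraction with a small denominator.
b6_exact = Fraction(b6_float).limit_denominator(1000)
print(b6_exact)  # 1/42
```

Because the float approximation is accurate to far better than 1/(42*1000), the nearest small-denominator fraction is unambiguously 1/42.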
I've gone back to the Quad-core Intel in the meantime. You can notice the absence of a *.sig in the 1 January post, due to using the AMD 5000+ alternative PC ...
I already had a tentative rho_1 of zeta, with its imaginary part to 90,000 digits after the point, from earlier on.
From Section 6.4 of Edwards' book on the Riemann zeta function, we have in Euler-Maclaurin for zeta:
- the partial sum term, of N-1 summands
- two miscellaneous terms that count for little work
- the B_2, B_4, ..., B_{2 nu} terms involving even-index Bernoulli numbers.
Using the analytic formula (41) at MathWorld, http://mathworld.wolfram.com/BernoulliNumber.html
we can compute Bernoulli numbers from values of zeta.
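Taking log10 of that formula also gives the size estimates used throughout this thread. A small stdlib-only Python sketch (dropping the zeta(2n) factor, which is ~1 for large 2n, so this is an estimate, not an exact value):

```python
import math

def log10_abs_bernoulli(two_n):
    """Estimate log10 |B_{2n}| from |B_{2n}| = 2 * (2n)! * zeta(2n) / (2*pi)^(2n),
    with the zeta(2n) factor dropped (it is ~1 for large 2n)."""
    return (math.log(2.0) + math.lgamma(two_n + 1)
            - two_n * math.log(2 * math.pi)) / math.log(10)

# Sizes quoted earlier in the thread:
print(log10_abs_bernoulli(70000))    # about 252887, i.e. |B_70000| ~ 10^252887
print(log10_abs_bernoulli(120000))   # about 461608
```

`math.lgamma(two_n + 1)` is ln((2n)!), so the whole estimate stays in ordinary double precision even though the numbers themselves have hundreds of thousands of digits.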
I'm closing in on zeta(rho_1) precise to 90,000 digits;
still, the partial sum term of N-1 summands takes a lot of time, and more digits (say 150,000) will require a larger N.
So, the good news is that the Bernoulli terms go faster than with exact rational evaluation through PARI/gp's bernfrac(.) function.
The bad news is that for the remaining piece:

- the partial sum term, of N-1 summands

the time is the same as before, for identical N.
? g = sum(X=1,N-1,exp(-s*log(X)));
time = 19h, 45min, 50,152 ms.
Almost 20 hours with N = 35000.
I kept N as is, and increased 'nu' or 'khyber' in PARI/gp:
khyber = 74733 (this is 'nu' )
? t=0; q=s/(2*N*exp(s*log(N))); q=q/(Pi*Pi); \\ setting up 'q'
? for(X=1,khyber,
    t = t + ((-1)^(X-1))*zeta(2*X)*q;                  \\ Bernoulli-number term
    q = q*((s+2*X-1)*(s+2*X)/(N*N)); q = q/(4*Pi*Pi) ); \\ loop (***)
? t = t + exp((1-s)*log(N))/(s-1) + exp(-s*log(N))/2;   \\ misc. terms
? z2 = t + g; \\ adding the partial-sum term 'g'
time = 6h, 59min, 40,015 ms.
This results in 89,972 decimals of accuracy (z2 comes out almost zero, as it should at a zero of zeta).
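The same three pieces (partial sum, the two miscellaneous terms, the even-Bernoulli loop) can be sketched at ordinary double precision in Python. The values of N and nu, and the use of exact small Bernoulli rationals via the standard recurrence, are my own toy choices; this is a low-precision analogue of the 90,000-digit PARI/gp run, not a reproduction of it:

```python
import math
from fractions import Fraction

def bernoulli_list(m):
    # B_0..B_m (B_1 = -1/2 convention) via the standard recurrence
    # B_n = -1/(n+1) * sum_{k<n} C(n+1, k) * B_k
    B = [Fraction(1)]
    for n in range(1, m + 1):
        B.append(-sum(Fraction(math.comb(n + 1, k)) * B[k] for k in range(n))
                 / (n + 1))
    return B

def zeta_em(s, N=25, nu=12):
    """Euler-Maclaurin approximation of zeta(s), assembled from the
    three pieces named in the post (cf. Edwards, Section 6.4)."""
    B = bernoulli_list(2 * nu)
    t = sum(n ** (-s) for n in range(1, N))        # partial sum, N-1 summands
    t += N ** (1 - s) / (s - 1) + N ** (-s) / 2    # two miscellaneous terms
    for k in range(1, nu + 1):                     # B_2 .. B_{2*nu} terms
        prod = 1                                   # s(s+1)...(s+2k-2)
        for j in range(2 * k - 1):
            prod *= s + j
        t += float(B[2 * k]) / math.factorial(2 * k) * prod * N ** (-s - 2 * k + 1)
    return t
```

At double precision, zeta_em(2 + 0j) matches pi^2/6 to machine accuracy, and zeta_em at s = 0.5 + i*14.134725141734693 comes out essentially zero, mirroring the "almost zero" check above.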
B_{2n} = (-1)^(n-1) * 2 * ((2n)!) / ((2*Pi)^(2n)) * zeta(2n)

(MathWorld, Bernoulli numbers, Eqn. (41))
As indicated previously,

|B_{120,000}| ~= 10^461608,

but for some purposes, such as Euler-Maclaurin, we don't need 461,000 significant digits of B_{120,000}; this includes its use in `` \\ loop (***) '' above.
The summands there, past nu = 1000, are very small in absolute value, so 90,050 significant digits in every multiplicand should be about OK.
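This truncation argument can be illustrated at toy scale with Python's decimal module. The precisions below (80, 40 digits, a 1e-30 summand) are shrunk stand-ins for the 90,050-digit working figures, purely illustrative:

```python
from decimal import Decimal, Context, getcontext

getcontext().prec = 80

# A "full-precision" factor, and a copy rounded to 40 significant digits,
# mimicking keeping ~90,050 digits of a Bernoulli factor instead of all of them.
full = Decimal(1) / Decimal(3)
truncated = Context(prec=40).plus(full)   # unary plus rounds to 40 digits

# If the factor multiplies a summand of size ~1e-30, the error committed by
# the truncation lands near 1e-71, far below a 1e-40 accuracy target.
small = Decimal("1e-30")
err = abs(full * small - truncated * small)
print(err)   # on the order of 1e-71
```

The general point: a multiplicand only needs about as many significant digits as the target accuracy divided by the size of the term it sits in, which is why "a fraction" of the exact-rational precision suffices.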
Also, Fredrik Johansson has computed rho_1 to 1 million bits, and has also computed the Stieltjes constants and Keiper-Li coefficients to record precision, according to his arXiv preprint:
http://arxiv.org/abs/1309.2877
It points to a repository of zeta zeros at 20,000-digit precision (see inside).
David Bernier
 http://www.bibliotecapleyades.net/sociopolitica/last_circle/1.htm

