On Feb 8, 2013, at 5:10 AM, Richard Fateman wrote:
> On 2/6/2013 6:28 PM, John Doty wrote:
> [... snip ...]
>>
>> All this demonstrates is that you don't understand how to use
>> Mathematica. All you know how to do is fight against it. You don't
>> understand when to use rules versus when to use the various algebraic
>> tools. Your code is pointless: no trajectories calculated, no bridges
>> designed, and it doesn't enable anything that Mathematica can't
>> already do.
>
> There is nothing that cannot be done in assembler that can be done in
> Mathematica. By your reasoning Mathematica itself is pointless.
>
> There are deficiencies in Mathematica, as demonstrated by questions
> posted in this very newsgroup. You can dismiss all such questions as
> claptrap because people are misusing it, and that is certainly the
> view of some contributors. I think that such a view is unhelpful, at a
> minimum.
>
>>>> Considered as software failures, both of these occurred in
>>>> application code and were not the result of programming language
>>>> deficiencies.
>>>
>>> Was the code not written in a programming language? Even if it was
>>> written in assembler, that too is a programming language. I expect
>>> that the bug occurred because the programmer did not realize the
>>> semantics of the code.
>>
>> In the Therac-25 case, the coders appeared not to understand the
>> operation and hazards of the radiation therapy machinery very well.
>
> That's not my understanding at all.
My understanding is that in the Therac-25 case, when a counter
overflowed, a higher-energy dose, a lethal one, was administered
without the machine's operator knowing it. On one side it was a clear
programming error: either not understanding that the 8-bit counter
would sooner or later overflow, or negligence in not doing an error
check before incrementing the counter. On the other side it was also a
clear systems engineering error: relying on software control alone and
taking out all the hardware stops that would otherwise have prevented
the administration of the lethal dosage. So I think both of you are
right in some sense, and both of you are wrong in another :-)

The best to both of you from Nemo :-)

János
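The overflow failure described above is easy to sketch. This is an illustrative reconstruction only, not the actual Therac-25 code (which was PDP-11 assembler); the names are hypothetical. The published accident analyses describe a one-byte flag that was incremented on every setup pass instead of being set, so on every 256th pass it wrapped to zero and a safety check of the form "if flag is nonzero, run the test" was silently skipped:

```python
# Minimal sketch of an 8-bit counter that wraps, assuming the failure
# mode described above: incrementing instead of setting a flag.

def increment_8bit(counter):
    """Increment an 8-bit counter, wrapping around at 256."""
    return (counter + 1) & 0xFF

flag = 0
skipped_checks = 0
for _ in range(1000):          # 1000 setup passes
    flag = increment_8bit(flag)
    if flag == 0:              # the "impossible" wrapped state
        skipped_checks += 1    # a "flag != 0" safety check is bypassed

print(skipped_checks)          # 3 -- wraps on passes 256, 512, and 768
```

The fix János alludes to is equally small in either direction: check for overflow before incrementing, or keep a hardware interlock that does not depend on the counter at all.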
>> In the Ariane case, I have heard that the coders concluded that
>> overflow was impossible in the context of Ariane IV, but that
>> analysis was not redone for Ariane V when the software was reused.
>
> The re-use of software is usually tied to the programming language.
>
>> In any case, the fundamental failure in both cases was in systems
>> engineering, not code.
>
> The language of the systems engineering was programming languages. A
> better programming language might have provided a more reliable
> framework for construction or re-use.
>
>> You shouldn't place much blame on the spark that sets you on fire
>> when you walk around in clothes soaked with gasoline.
>
> I think I agree with what you said here, in reverse. That is, ...using
> Mathematica is like walking around with your clothes soaked in
> gasoline.
>
>>>> Not all bugs are of equal importance. An error of 5.5E-79 in a
>>>> Bessel function is very unlikely to cause trouble in a practical
>>>> application.
>
> If that were the only bug or the most significant one, you might have
> a point. Except:
> 1. There are many errors that can be prompted at the 1st decimal digit
> of an answer.
> 2. There are other numbers around. If A is supposed to be equal to B,
> but A-B is 5e-79, you can with one operation make that error much
> bigger. Just multiply by 10e79. Now you could say that 10e79 is an
> unlikely number, but how would you know?
>
>>> One of the marvels of computing today is that it is possible to do
>>> so much in such a short time. One can execute billions of
>>> instructions a second. If only one in a million does the wrong
>>> thing, and is wrong only by a tiny percent, you can accumulate a
>>> whopping mistake in a second.
>>
>> 5.5E-79 is rather smaller than a tiny percent. The best physical
>> measurements are good to a few parts in 10^16.
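Fateman's second point, that a tiny discrepancy is only one multiplication away from being large, can be demonstrated in two lines. This is a toy sketch using the exact numbers from the message above; both operands are ordinary double-precision values, nowhere near overflow or underflow:

```python
# One multiplication turns a "negligible" discrepancy into an
# order-one error. Numbers taken from the message above.
a_minus_b = 5.5e-79        # A and B are "equal", but differ by this
scale = 10e79              # i.e. 1e80, the factor suggested above
error = a_minus_b * scale
print(error)               # roughly 55: no longer negligible

# For reference, double precision spans ~2.2e-308 to ~1.8e308, so
# neither 5.5e-79 nor 1e80 is an exotic value.
```

Whether such a scale factor ever appears is application-dependent, which is exactly Fateman's "how would you know?" question.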
>> Assuming teraflop arithmetic, adding numbers with errors of 5.5E-79,
>> it'll take you about 4E45 times the age of the universe to have
>> errors add up to 1E-16.
>
> Ah yes. Assuming you don't multiply or divide.
>
>>>> I've been using Mathematica to do practical work since version 1,
>>>> and I've never encountered a bug in its numerics.
>>>
>>> I guess it is my turn to wonder if YOU know much about Mathematica.
>>
>> The difference is that I *use* it. You *fight* it.
>
> If finding bugs is fighting, sure. If criticizing design features and
> sometimes suggesting changes is fighting, sure.
>
>>>> Crazy results from numerical codes are a normal occurrence, I don't
>>>> find Mathematica to be unusually hazardous here.
>>>
>>> I guess I disagree on this point.
>>
>> But you don't actually use numerical codes. You don't design bridges,
>> compute trajectories, or study turbulence. So your opinion is
>> uninformed.
>
> Um, who is uninformed? See
> http://adsabs.harvard.edu/abs/1998IJMPC...9..509F
> a paper entitled
> Symbolic Computation of Turbulence and Energy Dissipation in the
> Taylor Vortex Model
>
>> Well, I'm not so sure that using Scheme is so good: it means that few
>> potential collaborators are willing to try to read my code. To the
>> average engineer or scientist, Lisp screams "forget your application
>> and pay attention to the cool computer science".
>
> Just suggests that the average engineer is a sucky programmer. That's
> not so surprising.
>
>> Now, I actually know enough of that CS crap that I can see past it to
>> reality, but most don't. I primarily use Scheme to access a
>> collection of useful functions that are actually written in C, but
>> have Scheme (Guile) interfaces. I think it would be much better if
>> those interfaces were in a language friendlier to engineers, like
>> Python. Scheme's fine for me, with my eccentric background, but not
>> for most others.
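Doty's back-of-the-envelope can be checked directly. Under the simplest assumption, that the per-addition errors accumulate linearly at 10^12 additions per second, the multiple of the age of the universe comes out around 4e32 rather than 4E45 (presumably his estimate used different assumptions), but the qualitative conclusion is identical: the wait is absurd either way. A sketch of the linear-accumulation version:

```python
# Linear-accumulation check of the estimate above: how long, at a
# teraflop of additions each contributing 5.5e-79 of error, until the
# total reaches 1e-16 (the precision of the best physical measurements)?
per_op_error = 5.5e-79
target = 1e-16
rate = 1e12                      # additions per second ("teraflop")
age_of_universe = 4.35e17        # seconds, ~13.8 billion years

additions_needed = target / per_op_error    # ~1.8e62 additions
seconds_needed = additions_needed / rate    # ~1.8e50 seconds
multiple = seconds_needed / age_of_universe
print(f"{multiple:.1e}")                    # ~4.2e32 universe ages
```

Fateman's rejoinder is that this model only covers addition: a single multiplication by a large factor, as in his A-B example, bypasses the slow accumulation entirely.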
>>> Though Common Lisp (a Lisp dialect) is used in space computations --
>>> the Hubble telescope.
>>
>> My old colleague Mark Johnston developed SPIKE for Hubble, but it has
>> been and is used for other missions as well. I was involved in
>> getting the ASCA and Chandra space observatories to adopt it for
>> their operational planning. The last time I talked to Mark (a long
>> time ago), he seemed rather unhappy with CL as an implementation
>> language for a practical AI application. I don't think he'd choose CL
>> again if he had to rewrite SPIKE.
>
> I think the phrase "practical AI application" is still, for most
> people, an oxymoron. There are plenty of practical CL applications.
> While people differ on their favorite languages for different
> purposes, my experience with students at Berkeley is that the time for
> them to write major parts of a compiler in C or C++ is LONGER than the
> time for them to learn Lisp and then write the same program in Lisp.
> That's not rocket science, though.
>
> But then they are probably not average engineers.