On 1/16/2013 8:54 PM, W Craig Carter wrote:
> On Jan 17, 13, at 12:14 PM, Richard Fateman wrote:
>
>> On 1/15/2013 10:39 PM, Murray Eisenberg wrote:
>>> On Jan 14, 2013, at 11:31 PM, David Bailey <firstname.lastname@example.org> wrote:
>>>
>>> It all depends on just what you want somebody to accomplish when learning his/her first programming language.
>>
>> There are several issues here.
>> For a starter, 4-year colleges generally do not offer credit for "a course to teach you to program in language X" in a department of computer science.
>> There may be such courses in physics, statistics, etc. departments, but these should be classified as utility courses, akin to "how to use the microwave oven in the lunchroom".
>
> Was this analogy meant to be pejorative?

Not especially, just illustrative.
I have no problem with physicists (etc) teaching each other programming, as long as they don't think that they know "computer science" as a consequence of knowing a little FORTRAN.
I have met physical scientists who assume that a computer scientist knows LOTS of programming languages, not just FORTRAN. I have seen programming languages designed by physicists who tried to learn by osmosis, but made some fundamental errors.
Microwave ovens can be fairly complicated, with power settings, times, and extra features like convection, buttons for defrost, and "popcorn".... But I would not confuse "how to use a microwave" with general knowledge of electromagnetism, signal generation and propagation in the microwave part of the spectrum, and biological effects of such. Anyway, that's the idea.
> I teach such a course and receive current and a posteriori feedback on the benefits of learning to rapidly construct a model, compute it, and visualize it.

I hope it is not called Introduction to Computer Science.

> Equally important, the students learn math and a means to acquire more math on their own---and quickly.

So the course is something like "Introduction to Mathematical Modeling in Material Science", perhaps "... with an introduction to programming language X ...". That's fine. There's a course at UC Berkeley in Engineering, E 7, which teaches simple numerical analysis, introduces modeling, etc. Its relation to the computer science curriculum tends to evolve, partly because engineers think that their students don't need to know more about computers than they do.

> I teach fundamental concepts of my discipline by using novice programming, numerical analysis, symbolic algebra: voila, canonical discipline knowledge and transferable skills in one course.

Eh, sounds like E7, except that last I heard it didn't include symbolic algebra, but made some fuss about object-oriented programming.
> The "microwave usage" analogy diminishes the importance of an indispensable tool to an applied scientist or engineer.

I think a microwave oven is indispensable too :)
> I have physical science colleagues who consider programming and linear algebra to be superfluous because they use spreadsheet tools.

I think that is justifiable, for some people. Many mathematicians consider computers quite worthless because to them, mathematics has to do with creating proofs, and computers (mostly) don't do this.
Some professional engineers lived their whole careers using tables in handbooks. Times change, of course, but not all engineers do terribly novel things.
It is unlikely that an engineer will be called upon to write one, much less a collection, of different sorting programs. So a course on algorithms that provided such an experience would seem to be, vocationally speaking, unnecessary, though the exercise might be intellectually enlightening nevertheless.

> I consider their myopic view marginally more perverse than (paraphrasing) `classifying computer language tools as microwave ovens for those who don't build microwaves or understand their operating principles'.

It is too late at night for me to figure this out, but I suspect we are agreeing, mostly.