I studied Linear Programming and the Dantzig Simplex Algorithm as well.

Coding an algorithm oneself is a way to help it sink in. 

Having a crummy language, though, can get in the way.

We have a handsome traditional math notation, evolved over the centuries, that is pretty good at expressing concepts and workflows, but it does not inherently "do" anything; that's left to the reader.  And in so many cases, the "doing" becomes sheer drudgery.  Mathematics then conveniently changes the label to "engineering", i.e. if you want to *do* anything that takes a human hours and hours' worth of computation, then you must be an "engineer".

I find these distinctions (in the English language) to be more of an obstacle these days than any crummy programming language, now that computer languages have become far easier and more interactive.  In APL, for example, which was just coming on-line at Princeton in my undergraduate days (1970s), you could invert a matrix with a single operator.  But that might be too "black boxy" for beginners, as it's the technique of matrix inversion they need to follow.
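Python and NumPy came much later, but they offer a minimal sketch of that same one-operator economy (APL packs the whole inversion into its monadic ⌹, the "domino" operator):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    A_inv = np.linalg.inv(A)          # the whole "black box" in one call
    print(np.round(A @ A_inv, 10))    # recovers the identity matrix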

Wayne's early edition Linear Algebra text (which I got a copy of) has lots of code in the back for doing linear algebra.  The language (BASIC, I think) is not taught in the main text, and the listings in the back are somewhat dense and uncommented.  You're just supposed to type them in and cross your fingers you didn't make too many typos.

But again, technology has moved on, and therefore pedagogy (potentially).  Languages can be *interactive* now, meaning you enter a line, hit return, and get a response.  Statistics courses typically build around a command line of some kind.  You learn to use some tool interactively even as you build the concepts.  Many college courses follow this model.
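To sketch what "interactive" means here, a short Python session (any REPL would do; the prompt-and-response rhythm is the point, not the particular language):

    >>> 2 + 3 * 4
    14
    >>> 10 / 4
    2.5
    >>> len("mathematics")
    11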

'Digital Mathematics and Programming in Python' is a good example of a high school text that weaves language learning with traditional math notation.  I find such synergy useful because the two notations play off each other.  You see it in "regular math" (say with a sigma, i = 0 to n, of some function of x), then you see it in Python (say with a for loop, also with a function of x).

You go back and forth quickly and *both* sink in, because they're "in dialog" (Python and traditional math notation are covering the same concepts in close proximity).  It's like seeing in stereo.
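A minimal sketch of that stereo effect, with f(x) = x*x chosen just for illustration -- the sigma reads "sum f(i) for i = 0 to n" and the Python says the same thing as a loop:

    def f(x):
        return x * x                   # f(x) = x squared, for illustration

    n = 5
    total = 0
    for i in range(n + 1):             # i = 0, 1, ..., n mirrors the sigma's bounds
        total += f(i)
    print(total)                       # 55 = 0 + 1 + 4 + 9 + 16 + 25

    print(sum(f(i) for i in range(n + 1)))   # the same sum, more idiomatically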

These options have not all been with us for the last 60 years.  It's not like Wayne's Linear Algebra book "should have" been done in Python.  Python didn't exist yet.

Then there's Mathematica, the big behemoth that stays with traditional math notation as much as possible, yet automates whatever is algorithmic or "by the book" (rule-governed), which includes algebraic simplification, integration as a symbolic activity, and so on.
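SymPy, a Python library, isn't Mathematica, but it makes for a minimal sketch of that rule-governed automation:

    from sympy import symbols, integrate, simplify, sin, cos

    x = symbols('x')
    print(integrate(x**2, x))                # x**3/3, integration as symbol manipulation
    print(simplify(sin(x)**2 + cos(x)**2))   # 1, algebraic simplification by rule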

I think rather than putting all our eggs in one basket and saying "here's the best new way to do things," our approach should be "let's do many experiments."

To some here, the "many experiments" model seems unethical because there's a "chalk 'n talk" method (direct instruction) that's proven to work.  I don't think so.  For some people it works, in some circumstances. 

Plus one might argue one learns different things. 

If you learn calculus or trig or statistics in concert with learning a computer language (say Mathematica, or R, or Pandas, or APL, or J...), then you end up with this added language skill as well, so there's value added we should take into account (it's apples and oranges, in some ways, to compare calculus with Mathematica versus calculus without).

I'm not suggesting any "winner take all" approach here.  That goes for my little tetrahedrons too.  Just because something works, and people get value from it, doesn't mean we should replace all other similar / parallel / overlapping curricula with this one working one.  Lots of things 'work'.

I'm against 'one size fits all'. 

I'm pro 'let's experiment'.

I'm against "we've tried that for 60 years and it hasn't worked so lets give up.' 

Many interesting experiments have produced positive results for many people (including the late Jerry Uhl's Calculus&Mathematica at the University of Illinois -- where I'll be tomorrow, visiting one of the spin-offs of that effort), and the tools continue to change, so it's anything but a static picture.

Kirby