On Thu, Jan 7, 2010 at 11:45 AM, Steve Cooke <firstname.lastname@example.org> wrote:

> I'd move straight to Church's lambda notation. I'm only half joking.
You could do that, and I used to argue with Herman on sci.math about whether we should do that. Alonzo Church, that is.
> It is more universal in the computing world than "dot notation" syntax,
> and it is solidly in both the CS and math worlds. CS equations behave
> more like math equations. Imperative computer languages will say
> "x = x + 1", but does that ever make sense?
"Computing world" is not defined, so you would have to be right in your own private language I suppose.
Yes, we should teach about these different paradigms, without digressing for too long. I'm pretty adamant about NOT just looking at one computer language, even if we concentrate on one more than the others.
There's this language called J, a descendant of APL, that I often toss in as my "evil twin" for Python. It's hardly a dot notation, though (unlike APL) it is written entirely in ASCII. I'm really not a master of J by any stretch of the imagination. Python I'm comfortable with, the way a math teacher might be, without spending hours a day writing reams of code. (I do have reams of free code at my website, so it's easy to assess how accessible it is, though most of it isn't in 3.x yet.)
So is APL part of the Unicode standard, then? Let's go see...
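One quick way to look, from Python itself (a sketch, assuming a Unicode-aware terminal; the sampled code points sit in the APL Functional Symbols run, U+2336 through U+237A, of Unicode's Miscellaneous Technical block):

    import unicodedata

    # Sample a few code points from the APL functional symbol range
    for codepoint in (0x2373, 0x2374, 0x2375, 0x237A):
        char = chr(codepoint)
        print("U+{:04X}  {}  {}".format(codepoint, char,
                                        unicodedata.name(char)))

    # U+2373  ⍳  APL FUNCTIONAL SYMBOL IOTA
    # U+2374  ⍴  APL FUNCTIONAL SYMBOL RHO
    # U+2375  ⍵  APL FUNCTIONAL SYMBOL OMEGA
    # U+237A  ⍺  APL FUNCTIONAL SYMBOL ALPHA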
OK, a bit of a hodgepodge, but I suppose I'm seeing most of 'em: