>I personally believe that the 'quantitative techniques' (and tools/processes) are not really relevant or appropriate to apply to human beings and their problem-solving or learning processes.

I'm sympathetic to your view, but I think some amount of monitoring is essential to ascertain that kids are being given what amounts to something like standardized treatment, i.e., that they are getting what we pay for. (Now, the choice of what treatment is desired is another matter, but you could hardly argue for choice if you don't know what is being offered in the first place.) Another type of monitoring is needed to determine if Johnny can, in fact, read.
Imagine a world in which junior could piece together a learning trajectory using a combination of on-line and classroom resources. Age matters, but in a mixed-age setting one has older appointed escorts for bus trips if necessary (Portland uses its Metro / TriMet for school transport -- different regions have different practices when it comes to transportation, some depending more on "soccer mom" volunteer fleets).
The article, though, paints a picture of monitoring and data collection in clearly excessive amounts. So, digging deeper, I would want to know whether all that data gathering can even be rationalized as supporting one of the above two goals. If not, I would immediately be suspicious of its necessity.
Our world: Parents are often left out of helping with choices because (a) there's no real choice and (b) they're not connoisseurs of the differences. Choosing your education should be more like wine tasting, where you develop your palate and savor the alternatives. We usually don't expect parents / guardians to have the time to look at textbooks and compare them. They don't get to hear many competing pitches either.
Other world: theaters host these Ignite.gov type events where schools each get five minutes to pitch their STEM offerings. There's quite a spectrum. Parents and kids pack in, eager to sample the warez. Portland has some schools that use a lot of open source software, such as Python. LEP focuses on entrepreneurship. Few of the schools expect to see the same student body from day to day, as the trajectories have become much more interesting. Students bus around more using their student IDs. Skyscrapers downtown have more classrooms. Department stores are teaching more of what we used to call "vocational" skills (like designing clothing), but that's going back to the Age of Stupidity (1900 - 2012). We don't divide "real world vocational blue collar" from "other-worldly professional white collar" in the stupid way they used to.
For the remaining data collection, I would ask at what point it becomes so intrusive as to be absolutely counter-productive. If one is in the hospital, blood is drawn for testing and monitoring. But if it's drawn every half hour, it may start to counteract your healing.
Students willingly and happily take lots of tests, otherwise known as computer game simulations, some of which are just simple memory games, like flash cards. Almost every kid in Portland learns 1000 Kanji, for example, considered part of basic literacy, in addition to learning the Romaji (ASCII alphabet / Latin-1). Lots of testing on those Kanji. Lots of stats pouring into computers on a daily basis as fun games all over town get played, over and over. Often from home. No shortage of statistics, and a lot of data sets that have been suitably anonymized, FERPA having a similar effect as HIPAA in teaching us how to code a firewall between aggregate stats and identities (not saying it's impenetrable in all cases, plus students may access and share their own data, just as students can broadcast their own scores today, if they know them).
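A minimal sketch of what that "firewall between aggregate stats and identities" could look like in Python (everything here -- the salt, field names, and cohort threshold -- is a hypothetical illustration, not a claim about how any district actually does it): raw identified scores stay on one side; only salted pseudonyms and cohort-level aggregates ever cross to the other.

```python
import hashlib
import statistics

SALT = b"school-district-secret"  # hypothetical secret, kept server-side only

def pseudonymize(student_id: str) -> str:
    """One-way pseudonym: identities never cross into the stats side."""
    return hashlib.sha256(SALT + student_id.encode()).hexdigest()[:12]

def aggregate(scores_by_student: dict, min_cohort: int = 5):
    """Export only cohort-level stats; suppress groups too small to hide in."""
    if len(scores_by_student) < min_cohort:
        return None  # too few students: any stat could re-identify someone
    all_scores = [s for scores in scores_by_student.values() for s in scores]
    return {
        "n_students": len(scores_by_student),
        "mean": statistics.mean(all_scores),
        "median": statistics.median(all_scores),
    }

# Raw (identified) game scores stay behind the firewall...
raw = {"alice": [80, 90], "bob": [70], "carol": [85, 95], "dan": [60], "eve": [75]}
# ...only pseudonymous keys and aggregates ever leave it.
pseudonymous = {pseudonymize(name): scores for name, scores in raw.items()}
stats = aggregate(pseudonymous)
```

As the parenthetical above notes, this kind of separation isn't impenetrable -- salted hashing and small-cohort suppression raise the cost of re-identification; they don't make it impossible.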
So, where I agree with you is that in situations like education, which have traditionally been based on close relationships (though that is now slipping away), injecting a lot of data gathering can interact negatively with what is going on. That's another way of viewing the original article's claim that, in the past, the system was "loose": I think another way of saying that is that it was based on trust. The more intrusive the monitoring, the more trust leaves the scene, with negative consequences.