Our Latest Myth: Adaptive Learning

I’ve long been fascinated with what I can only call (pardon my Marx) the ideology of bourgeois individualism that underlies so much of U.S. education. It really shows up when you talk about grading and commenting on papers. Students need, it is said, what is called “individual” help. Of course, students are members of cultures, and so the help we give is often as collective as it is individual. There’s nothing unique or individual about the conventions of writing. Most students need “collective” help with their writing; they need to understand that it’s not all personal expression.

Facebook writing has its conventions as much as college writing does. We don’t always teach individual expression; as often as not we teach the collective traditions and standards that transcend individuals and that make communication possible. Yet acknowledgment of our collective existence is one of the taboos of pedagogy. It’s not simply pedagogy, either; it’s morality, too. If we don’t use “individualized” instruction, we are teaching poorly, or so it is said, but more importantly, we are doing something wrong. We are denying a student’s humanity.

Our humanity, of course, is more than individual. Americans, though, don’t like to be thought of as members of a class, although we don’t mind putting others into categories or groups. If current politics teaches us anything, it teaches us that we fear our collective identity. These were my admittedly cranky thoughts as I read “Why You Should Root for College to Go Online.” The public universities do need to move more quickly into more substantive online programs. They don’t need to get bogged down in the bourgeois muddiness of so-called adaptive learning.

It’s the marketing, stupid!

I’ve said before that we, those of us who love computers and new communication technologies and who adopted them early and often, have frequently been very wrong in our initial assumptions. In the late 1990s we thought that multitasking was a technologically enhanced way to work and learn and play. As it turns out, brains don’t work or play or learn that way at all.

Or, rather, brains can work and learn and play that way, but only by severely limiting the quality of the work or play or learning. It’s probably fine to have the radio on in the background as you write, but you can’t email with one hand while answering questions in an online classroom with the other; both the email and the forum postings will be littered with errors at best. Focus matters.

We also believed that our students were increasingly what we called “digital natives” who would not struggle to learn these new technologies in the way we had. This raises some interesting questions. Here’s how one writer, Arthur Goldstuck, puts it:

How is it possible that the typical child is so much more adept at using gadgets than the typical adult? How did we come to stereotype the neighbour’s 12-year-old son as the expert who will sort out our computers, cellphones and TV programming? (“The Myth of the Digital Native”)

In my experience, this idea never held water. At first, I did meet at least some students, mostly boys, who were fascinated with computers and so knew a lot about them. Very quickly, though, it became clear that students’ interests were very different from my own as a college teacher. I knew about the web and HTML; they knew about MySpace and video games. Facebook didn’t change that at all.

Goldstuck argues that the difference is developmental. At 15 you are more capable of learning than at, say, 50. That may be true. I think he’s also missing the obvious: a lot of the difference has to do with marketing. Young people, who are arguably more vulnerable to ads, are interested in certain technologies because that’s what they have been sold. That may not help education at all.

Teachable Moments

I don’t mind memorials, of course, and there were a lot of heroes killed on September 11, 2001. I admire the firefighters who, as the cliché goes, ran to the disaster when everyone else was running away. The passengers on Flight 93 probably taught al Qaeda an important lesson: you can’t quite trust crazy Americans to sit quietly and accept their fates. A few might charge the cockpit. Yesterday, though, was like a marathon of the big lie.

A big lie is a lie repeated so often that people forget that it is a lie. One of the worst, which I heard on National Public Radio, is the notion that we “were at war, but didn’t know it until those planes hit the World Trade Center.” That’s untrue in a dozen ways. Al Qaeda isn’t a state and can’t be at war with anyone. When it declared war, it was trying to justify a violent criminal conspiracy. It’s still a lie. This is not just splitting hairs; the difference matters.

We are at war with much of the rest of the world, especially the Middle East. As horrible as 9/11 was, it pales next to what a country with our resources can do. This has been true from the so-called Spanish-American War, in which we committed near genocide in the Philippines, to our current and often very violent occupations of both Iraq and Afghanistan. These wars are not clarified by the so-called al Qaeda war; they are obscured by it.

Perhaps we should also think of the day after the memorial as an important teaching moment, in which we try to come to terms with imperialism and with the choice that was made in our name to respond with two real wars to a war that was more metaphorical than real. We should try to imagine another history entirely, in which we fought al Qaeda on our terms, within the law and the criminal justice system, perhaps at times using military means.


The (Academic) Mindset List

I know I am being a party-pooper, but I find this so-called Mindset List endlessly irritating. First, there’s the weirdly inflated claim by its authors (it’s the public relations team speaking here, no doubt) that the list is “a globally reported and utilized guide to the intelligent if unprepared adolescent consciousness.” In truth, the list says almost nothing interesting or revelatory, especially this year, unless you see it as a reflection of a very insulated academic culture forever afraid that outside those ivory walls the world has left it behind.

It’s a sentimental nod to the quaint idea of the professor lost in his or her books. Who can afford that anymore? A few items on the list, mostly about women, seem to suggest substantive change, but most of it is just plain silly: “O.J. Simpson has always been looking for the killers of Nicole Simpson and Ronald Goldman.” Or: “Jim Carrey has always been bigger than a pet detective.” What 18-year-old knows who O.J. Simpson is? What academic saw “Ace Ventura: Pet Detective”? This isn’t a description of consciousness or of epochal events; it’s a list of marketing’s biggest hits.

The list, as Henry Ford said of history, is bunk. Very little of it has any impact on students’ educations or on how we communicate with them. Students need a list to explain to them what has happened to education in the last two or three decades: “Standardized tests have made teaching critical thinking an uphill battle; science has been conflated with religious irrationality; professors have almost never had full-time jobs; tenure has always been a dirty word.” These are the things that will continue to have a profound influence on “the adolescent consciousness.”