Not That Different

When I was writing my dissertation, one of my advisers, Dr. Syverson, used to gently tease me about my overuse of the word crisis. Academics, she said, always feel that academia is in crisis. It’s true, and yet I still believe that academia is nearing some sort of profound change, even if that change is less revolutionary than evolutionary. It’s a big, sluggish set of institutions, and nothing happens quickly.

What happened to the U.S. postal system is happening to education: the public monopoly is over, for good or ill. It was a bad idea to allow the fully unregulated growth of online private education. Too often, it allowed the industry to fall victim to its own worst instincts. Careful regulation might have slowed growth, but it would have prevented a lot of problems. Now we have a lot of ground to make up.

As this slow-motion crisis unfolds, it’s interesting to see what sorts of ideas and models are held up as potential solutions. The most typical, as exemplified by Jeff Selingo in the Chronicle of Higher Education, is business (“Think Different? Not in Higher Ed“). Universities, Selingo says, ought to innovate like Apple. After three or four decades of emulating business, this claim seems silly at best.

The Chronicle also posted an article this week on a very different model, used at Syracuse, rooted not in business but in public service (“Syracuse’s Slide“). Even more interesting, this model– it is not new so much as a return to another tradition– is ignored by Selingo, even though it is discussed just a few clicks away. As the title suggests, universities should think differently, but not that differently.

Coming in from the Cold

The ongoing consolidation of the online higher education system, especially in the for-profit sector, is one of the most important developments of the last twenty years. Yet, like the emergence of the internet in the early-to-mid 1990s, it remains almost completely invisible in the mainstream– I am tempted to say lamestream– media. I think it’s under-reported even in the education media.

There’s a lot to be concerned about in the emerging online system– arguably, the most transformative development of the internet so far– yet the emergence of the new institutions seems to be happening without much public discussion, much less scrutiny. The discussion that is going on, such as in Inside Higher Ed (“Going Off on Online Rankings“), seems so lost in the trees that it never considers the forest.

The U.S. News and World Report’s rankings of online schools are significant because they signal the first stages in the maturation of the online industry, led by for-profits but increasingly joined by public schools. The final shape of the system– its ratio of for-profit to not-for-profit institutions– has yet to be determined, mostly because the online system so radically widens the pool of potential students.

We need answers, or at least a debate. Will the new system make lifelong learning a practical reality? It’s not a part of the Republican or Democrat deadbeats’ agendas, but ironically that absence may signal its significance. Just as importantly, is this emerging system going to reproduce the traditional system’s exploitative labor policies, massive debt, and alienating mass consumption?

Our Latest Myth: Adaptive Learning

I’ve long been fascinated with what I can only call (pardon my Marx) the ideology of bourgeois individualism that underlies so much of U.S. education. It really shows up when you talk about grading and commenting on papers. Students need, it is said, “individual” help. Of course, students are members of cultures, and so the help we give is often as collective as it is individual. There’s nothing unique or individual about the conventions of writing. Most students need “collective” help with their writing; they need to understand that it’s not all personal expression.

Facebook writing has its conventions as much as college writing does. We don’t always teach individual expression; as often as not we teach the collective traditions and standards that transcend individuals and that make communication possible. Yet acknowledgment of our collective existence is one of the taboos of pedagogy. It’s not simply pedagogy, either; it’s morality, too. If we don’t use “individualized” instruction, we are teaching poorly, or so it is said, but more importantly, we are doing something wrong. We are denying a student’s humanity.

Our humanity, of course, is more than individual. Americans, though, don’t like to be thought of as members of a class, although we don’t mind putting others into categories or groups. If current politics teaches us anything, it teaches us that we fear our collective identity. These were my admittedly cranky thoughts as I read “Why You Should Root for College to Go Online.” The public universities do need to move more quickly into more substantive online programs. They don’t need to get bogged down in the bourgeois muddiness of so-called adaptive learning.

It’s the marketing, stupid!

I’ve said before that we– those of us who love computers and new communication technologies and who adopted them early and often– have often been very wrong in our initial assumptions. In the late 1990s we thought that multitasking was a technologically enhanced way to work and learn and play. As it turns out, brains don’t work or play or learn that way at all.

Or, rather, brains can work and learn and play that way, but only by severely limiting the quality of the work or play or learning. It’s probably fine to have the radio on in the background as you write, but you can’t email with one hand while answering questions in an online classroom with the other; both the email and the forum postings will be littered with errors at best. Focus matters.

We also believed that our students were increasingly what we called “digital natives,” who would not struggle to learn these new technologies in the way we had. This raises some interesting questions. Here’s how one writer, Arthur Goldstuck, puts it:

How is it possible that the typical child is so much more adept at using gadgets than the typical adult? How did we come to stereotype the neighbour’s 12-year-old son as the expert who will sort out our computers, cellphones and TV programming? (“The Myth of the Digital Native“)

In my experience, this idea never held water. At first, I did meet at least some students, mostly boys, who were fascinated with computers and so knew a lot about them. Very quickly, though, it became clear that students’ interests were very different from my own as a college teacher. I knew about the web and HTML; they knew about MySpace and video games. Facebook didn’t change that at all.

Goldstuck argues that the difference is developmental: at 15 you are more capable of learning than at, say, 50. That may be true. I think he’s also missing the obvious, though: a lot of the difference has to do with marketing. Young people, who are arguably more vulnerable to ads, are interested in certain technologies because that’s what they have been sold. That may not help education at all.