One of my courses is new this semester, so despite all the advice people have been getting (see Profacero's post, for example, or the comments in the Another Damned Notorious Writing Group threads) about not spending time on course prep and saving it for writing instead, you have to do something unless you want to go into a classroom with egg on your face.
That's why I spent a writing day the other day dreaming up assignments and exercises for this and my other courses. The downside: losing a day of writing. The upside: now these things are done, and I don't plan to revisit them. If you've taught for years, you know what works and what doesn't, and if you make a mistake and write a bad part of an assignment, you adjust your expectations and fix it the next time.
What keeps me going, though, is the back-of-the-mind metaphor that now that I've done this course once, I never have to do it again. It's in the bank, so to speak. Realistically, a course is never really done; you always think of what could be done differently or better the next time you teach it. Still, you don't have to invent every single thing from scratch, the way you do with a new course, and it's unlikely that you'll be asked to share materials with others who teach the course in later semesters (although I would). It becomes your course, and you are identified with it--for this moment, anyway--whether it succeeds or not.
In an online course, you're still working with a banking model, but the push from higher-ups is different. In effect, if the course has been taught by someone else before, there's a strong pressure for you not to change anything--assignments, readings, syllabus--and to use what has been done before. Instead of making deposits, you're supposed to withdraw from the account that someone else has established.
Now, in theory, this would be a huge timesaver, since you don't have to put all that time into creating new assignments and could spend it on writing instead. But if you are stubbornly perverse about teaching your own material, as I am, and if you see ways to improve the course, as I did, you ignore the pressure and design the course the way you want it. The difference is that this time you're banking the course not on your own computer but on the university's server, and unless you keep extra copies of the materials yourself, all your work could be lost if later instructors don't want to teach the course the way you designed it (and why should they?). You've banked it, but you don't own it. Even if it stays, it's not really associated with you as a teacher, and it can disappear.
We don't really "own" courses, of course, and all that banking imagery just makes the loss of time for other work easier to justify. It's just a different feeling. In one, I'm putting aside material that I can draw on later, and it has my name on it. In the other, I'm developing material anonymously for a collective pool of materials. Both have their advantages, but I'm struck by how different they seem.
Saturday, October 29, 2011
Wednesday, October 26, 2011
The no-laptops-in-class experiment, a midterm report
Like a lot of teachers, for years I've had some students whose faces I've never seen although I stand in front of them (or, during group work, beside them) several times a week for 16 weeks. Why? Because their faces are buried behind a laptop screen, and if I call on them unexpectedly--and it's always unexpectedly, because they rarely seem aware of what's going on in class and never raise their hands--the shocked look they give is so universal that it doesn't give me a sense of their personalities.
(Startled Stu Dent) "What?"
"What did this quotation mean?"
"What quotation? What page are we on?" and so on.
This semester, emboldened by all the "laptops are a distraction" editorials by faculty AND students that Margaret Soltan keeps posting, I banned them (along with cell phones, etc.). Just did it. Put it in the syllabus and everything, along with the requisite proviso about exceptions.
One big general exception: if there's scheduled group work, everyone can bring a laptop (or cell phone, or whatever) and use it to look things up, and everyone who wants to seems to do so. If they don't have a laptop, they can use mine up at the front of the room.
So far, so good. Some impressions:
- Class participation seems to be better in all the classes. At the very least there aren't 3-5 people permanently checked out of class, as there used to be when laptops were allowed.
- It cuts down considerably on the Laptop Two-Step of calling on someone:
(Startled Stu Dent) "What?"
"What did this quotation mean?"
"What quotation? What page are we on?" and so on.
- I can catch their eyes before I call on them by name, so they can get ready and not embarrass themselves by seeming clueless.
- Even if they zone out, they come back more quickly than they used to with laptops.
- If they're doodling or taking notes, it's a lot easier for them to break away from doing that and look up to answer a question.
- Of course, they could kill me on evaluations for not allowing their digital native selves to flourish in a wireless and connected environment, but I'm more interested in what they're learning, which seems to be (as gauged anecdatally by discussion and quizzes) more than in previous iterations of the class.
Yes, I could have done all that "incorporating Twitter" and being constantly fact-checked by students that a lot of edutech people advocate, but that might be better for large lecture classes. If it's a discussion, I want students to discuss. Is that unreasonable?
The thing is, I know it's hard to break away from a computer screen. It's hard for me, and, to judge by the people I see shopping at Zappos, checking email, and looking up the speaker's quotations on Wikipedia during conference presentations at MLA, it's hard for other people, too. I figure that for three hours a week in class, we can all look at each other and talk about literature without a digital intermediary. It's not too much to ask.
Sunday, October 23, 2011
Facebook and scholarly communities: a minor rant
I am on Facebook. On Twitter. On Google Plus. I know I'm in a minority on this, but I hate having to check them for work-related things. There are two reasons for this, one personal and one ideological.
The personal one is that when people post calls for papers and invitations for professional events, those places end up being just one more X#$%& place that I have to go to in case there's an announcement. It's not enough to check your email and the official site and the CFP at U Penn and Google Reader and any random blogs that the organization might be running. Oh, no. Now you have to click on the cheery "Follow us on Twitter! Like our page on Facebook!" links. If you find Facebook not only a distraction but kind of depressing (I know, this isn't a universal reaction), you just might be the kind of person who doesn't want to be forced to go there to get professional news.
The more important reason is ideological, and it's a two-parter.
- First, who has time to keep track of all this? When do all those posters have time to write anything of substance?
- Second, I'm uneasy about how much this gets into "closed web" territory. Right now, most things are announced in multiple venues, so even if you are a Facebook grump and don't log in much, you will still get the message. (I leave Twitter out of this because in looking at my Twitter stream, I realize that if you're not posting 4-6 times a day at a minimum and linking to "must-read" articles in each tweet, you're not really "on" Twitter.) But sooner or later, people are going to get tired of posting everything to 6-7 venues just to be sure that everyone gets it. They're going to post to the place where the people are, and that will be Facebook and Twitter. And if you're not on there, or, more important, following/liking/friends with the right individuals on there, you won't get the message. And that ought to be giving us pause, even if we're fans of social media.
Wednesday, October 19, 2011
Books as benevolent zombies
If you follow the links from More or Less Bunk's question "How do you skim an e-book" (my answer: you can't, and I own a bunch of them), you'll find a whole lot of articles on libraries getting rid of books. This is not a new event, of course, but it was a little chilling to read (at The Chronicle) that "It is no longer appropriate to treat most print resources as protected objects, or the college library as a museum for books," in part because the sight of too many books just frightens our little chicken-hearted students to death by being too "daunting." Books are not just dead but scary. They're zombies.
Huh? Are we talking about the same students who thrive on vampire, zombie, and torture porn movies and bloody video games? They're daunted by a stack of books? Seriously? And if they're "daunted," isn't it our job to show them how to get over it?
In my classroom, we're doing more library work than ever before, and the students seem to be really engaged by it. Maybe I'm fortunate that Northern Clime's librarians enjoy showing the library to students. By "showing" I don't mean forcing students to sit passively in a room watching as a librarian conducts Boolean searches and drones on for an hour that seems like a year. No, I mean getting them into the stacks to look at and leaf through the books. Some librarians like to say that e-books are the future, but really, bound books are the great undead, springing back to life in the hands of readers.
Let's take some zombie-age books as an example. Librarians like the one at the Chronicle say that books after 1850 aren't rare (although some seem to be doing their level best to make them so), and some say that Google Books makes getting these books less of a problem.
Well, let's see. This week I needed to read a reasonably obscure novel from 1870. Yep, Google Books had it, or part of it: only every other page had been scanned. Descending into the entombed depths of the library, I found a copy of the original novel, from 1870, along with a number of other first editions on the shelves by this author. If this library were following the "books scare students" model of dubious library best practices, these would've been gone a long time ago. Instead, they were right there, waiting for someone to bring them back to life.
Thursday, October 13, 2011
Do digital natives crave digital books?
We all know the drill: our students love their computers, what with being digital natives and all, so we need to invest heavily in ebooks. Over at IHE, Barbara Fister bravely looks at this particular flavor of heavily-promoted Kool-Aid and discovers something a little different:
This is fresh in my mind because I just attended an interesting day-long virtual conference on ebooks in libraries. In fact, I was a panelist for a session on marketing ebooks to students in academic libraries. Sadly, what I had to say probably wasn't what the audience came for. Our students aren't interested in ebooks . . . . I don't know what students make of all this, but one thing that Project Information Literacy discovered in their latest study is that students are not as excited about gadgetry and electronic sources as we tend to assume. When project teams interviewed 560 undergraduates studying in libraries at ten institutions, they found students were keeping it simple. Most of them had only one or two electronic devices with them: a phone and a laptop. Most of them were focused on getting an assignment done or were studying for a class. Most of them had only a couple of webpages open in a browser, and they weren't the same websites; they were browsing all over the place. (emphasis added)

This reminds me of the big push to use Facebook in classes a few years back. The thinking was that since students live in Facebookland, they would love love love to have their teachers in there friending them and pushing class-related posts at them in their out-of-class spare time. From articles I've read, students were not exactly thrilled about this togetherness concept dreamed up by dewy-eyed teachers. They understood that a social space was a social space and a learning space was a learning space, and they were okay with having boundaries between the two.
The connection I'm seeing is this: students may live in computerland, as we do, and they certainly communicate with us in that way, but that doesn't mean that they use computers as we do nor should they necessarily want or need to.
We can lead these horses to water, but we ought to stop trying to make them drink--that is, turn them into mini versions of us. Instead of force-feeding them our notions of what they should want based on starry-eyed notions of what "digital natives" do, why don't we pay attention to what they actually want? Sure, we need to expand their horizons beyond enotes and Wikipedia, but we can do that in ways that meet them halfway.
Actually--and this is another heretical thought--I'm starting to wonder if the students use the physical library more than we do. A little anecdata: I was at our library today, as I am most weeks, and it was full of students studying in groups. Once again I was the only faculty-age person there except for a librarian here and there. I know--this proves nothing. Still, I wonder if the atmosphere of the books has at least something to do with it.
Friday, October 07, 2011
Ever hear of MOOC?
MOOC? It means "Massive Open Online Course," and I read about it in the comments over at the Chronicle--because I don't have time to sit through a long fanboy podcast when I'm supposed to be working, although apparently writing a blog post is just fine, time-wise.
To say, as the Chronicle diplomatically does, that a MOOC "poses challenges to traditional education models" is putting it mildly. I clicked through to the courses linked in the article and comments and learned this:
- MOOC courses are offered to up to 10,000 students at a time.
- You can learn something, or not, and participate, or not, and do the readings, or not. (Okay, so this is the way some students approach a traditional course.)
- You don't get credit for the course, as far as I can tell. I've looked extensively at the materials for several sites and couldn't find a mention of it. I don't think it costs anything to take a MOOC course, but again, I couldn't find out from the sites.
- The content for these courses seems to be people talking about social media or education through social media, so in a way, the course is more or less a performance of the subject matter.
- People who are experts in the content area come in and curate parts or lead discussions of their content area, so you have social media people talking to people who are accessing the course using social media. Students end up practicing what they are talking about.
- I can't judge the subject matter, since I never took an education course, but it is a very different content base from what we're dealing with in literature, history, psychology, or traditional disciplines in the sciences. The Mother of all MOOCs has modules on "collective learning," "connecting our learning," "learning in times of abundance," "triangulating our learning," and so on.
- I learn a lot from the experts when I go on a site like fountainpennetwork.com every so often. I don't get credit for going there, but there are many people contributing to a knowledge base. Do discussion forums on specific topics count as a MOOC, or does the subject matter have to be education?
- How would this work for a subject in which there is specific, rigorous content on which students need to be evaluated?
- How would students respond if you tell them, "Hey, kids, here's a swell course for you to take. You'll learn as much as you want to learn and spend a lot of time doing it, but you won't get any college credit for it"?
- According to trusty Wikipedia, the principles of MOOC are to (1) gather information; (2) remix content; (3) repurpose the content; and (4) feed it forward. This is presented as revolutionary, but how is this different from what we have students do in class every single day?
- I'm reminded of the "reinventing the educratic wheel" post at Historiann's, where some university thinks it has invented something new by promoting class discussion and group work instead of the dusty old lectures that it imagines rule college courses--an idea that, like the "Paul is dead" legend, gets dragged out every month or so as a dead horse to beat.
- See also "The University of Wherever" at the NYTimes, linked from More or Less Bunk.
On the other hand, for learning about how social media works in education, maybe yes and maybe no. Maybe the learning and application is in itself the important part and the credentials aren't needed, although if that's so, why are experts named for each module?
Steve Jobs
I heard about Steve Jobs's death on all academics' primary information source, NPR, as I was driving home, and, like everyone else, was saddened by it. (See the tribute at Roxie's World.)
This won't be news to any of you, of course, but he did fundamentally change the way we communicate with each other. I'm thinking not just of the consumer electronics Apple pioneered under his watch but also of the difference he made in teaching. Back in the olden days, teaching with computers meant standing in a computer lab and teaching rows of students sitting at dumb terminals as they stared at a blinking amber cursor on a monochrome screen and tried to figure out what Function and Control keys were. Today, we teach students whom only draconian measures can separate from their iPhones and computers for the length of a literature class. I'm thinking of all the things we used to have to teach students about technology (FTP! Floppy disks! C:\ prompt! Save your file!) that are now either obsolete thanks to Steve Jobs or handled in an elegant, intuitive way.
As his Stanford commencement speech shows, he was an idealist as well as a perfectionist, and he was passionate about his work and encouraged others to be so as well. I never knew the man, of course, except through his products and the press coverage that erupted every time he walked out his front door, but he will be missed.
[See also the posts by Historiann and Tenured Radical, both of whom make good and less rose-colored points than I do. Oh, and let us not forget The Onion, via Dr. Koshary.]
Wednesday, October 05, 2011
A writing post at the Chronicle
Rachel Toor looked into my brain, I swear, to write "What Looks Like Productivity" over at The Chronicle. A sample:
We keep busy. There are conferences at which to give papers, articles to be crafted from those papers, chapters to be contributed to someone else's book. When you're faced with a project that seems overwhelming, like writing a book, those discrete tasks can look appealing. How long, you ask yourself, could it take to write a paper? An article won't take long, right? And then your procrastination projects are subject to the same delays as the thing you're avoiding.

She did inspire me--that, and the lesson from doing the exciting writing the other night. I realized that what I'd been doing was editing and more editing on a section I've already worked on for too long, with a mounting dread about writing about--well, let's say I am an authority on birdwhistles and have written a lot about them over the past couple of years. The section I was working on demanded that I go back and say something fresh about them, and I was dreading it.
I decided to let that part go for now and have moved on to dinosaur vocalizations, and, with the aid of my Tomato Master and my Wordmaster, I'm excited about writing again. One's the personal trainer and the other the Stairmaster of my writing right now. They're telling me time's up, so I have to get back to work, and what a refreshing phrase that turns out to be when you're excited about what you're doing.
Monday, October 03, 2011
Renegade writing
I've been reading Clio Bluestocking's posts on writing with mingled envy and excitement about the process--envy (in a good way) because she's writing so much and excitement because the other day, for the first time in a long time, I worked on a piece of writing that was interesting and exciting to me.
Mostly what I've been doing is editing and writing stuff for others: editing my own work, responding to others' work, and doing service work that I'm committed to doing. What it reminded me of was this: you can, and I did, spend 16 hours on something (a report, say), and no one will notice it or say anything about it, unless it doesn't get done. You can spend 5 hours responding to something (and I did), and what you'll hear by return email is, "Fine. Now how about this other task?"
If it's what you signed up to do, you put in the hours, and you mark them on Google Calendar so you can see the real number of hours that it takes. You vow to remember this when someone contacts you about another piece of work that's a distraction, the kind of thing you deludedly think won't take much time but always does, and you vow not to commit to this kind of work until you're willing to put in the hours it really takes. I've already turned down 2 such tasks this week.
No wonder working on that piece of writing felt like such a guilty pleasure. Reading things I hadn't read before as well as some I had, making connections, putting it together and writing the words on paper, staying up well into the night when it was just me and the ideas and the cool night air coming in through the window--I had forgotten how that felt, writing about something that I cared about and that I wasn't responsible to anyone else for writing.
I'm going to hold that feeling in mind as I turn to grading and, yes, more duty-writing.