This past weekend, I gave a presentation with Susan Miller-Cochran at the Southeast Missouri State University blended learning conference on critically adopting technology in a hybrid writing classroom. The presentation was well-received and I’ve even gotten some follow-up correspondence based on our discussion. What I want to talk about here, though, is an idea that came up in a conversation I had after our presentation with some of those in attendance.
One of the comments raised the issue this way: “Yes, technology is there, but why should I be teaching it in my course? A student getting an education at a university should be capable of using those tools of his/her own accord, not through me teaching it.” There are a few assumptions here: that course time should be focused solely on course content, not skill development; that if a faculty member learned on his/her own time, the student should be capable of doing so too; and that students these days are “digital natives” and probably know a lot of this anyway. I think the first two could be easily countered, so I’d like to focus on the last, and more complex, idea – that of “digital natives” in the university system.
Many educators question the assertion that students now are “digital natives” – that they grew up with this technology and thus are well-versed in it and can use it in ways that educators and/or older generations cannot. A recent post on Digital Media and Learning Central debunks four key myths in the “digital native” discussion: that digital natives are always young, that they were “born digital,” that they live digital lives (and thus have a hard time living/communicating without a screen or device), and that being connected = being digital. The best point made is that being a digital native means more than having access – it means being able to critique, evaluate, produce, amplify, respond, and so much more. And this is what educators are not seeing students do with their access.
Case in point: a critical component of digital literacy is sifting through and evaluating the content that is produced on a daily basis and can potentially be used in research and writing. There’s much out there on the web that is a wolf in sheep’s clothing. A good example of a lack of critique came in a debate about technology in education in my PhD class a few weeks ago. In playing an anti-technology role, a student cited from The John William Pope Foundation’s website. The point made drew immediate laughter and dismissal, with a quick reference made to the real mission of the organization and its founder. However, that brief comment also raised a more important issue. Much content on the web is disguised as something it is not, and without having students participate in analysis and critique of sites, we miss an important opportunity to teach them that not everything on the web is genuine, truthful, worthwhile, or credible for citing. (And this Pope Foundation example may be as good as any.)
Using word processors is another simple example from my own experience. After teaching students how to use our LMS (Moodle), a word processor is the next tool that I focus on in the class, showing them how to use rulers, change styles, create a hanging indent, save as a different file type, and insert comments. I generally have one or two students every semester who know how to insert a comment, but for every other student, it’s magic, and all of them need time to get the hang of using the rulers and styles. Another critical literacy for word processors? Naming files. That’s more of a rhetorical literacy for technology: how do you name a file so that your instructor knows it’s yours amongst the 22 others in the queue? How do you name it so that you know which project it is when you go back to revise? We need to teach students not only to use the tools adeptly, but also to think critically and rhetorically about how and why they are using them (and this isn’t a new idea; it comes from Stuart Selber several years ago).
Being native in a language means having fluency, and we’d all agree that simply having access to a language every day does not equal having fluency – so why do we conflate the two when it comes to technology? For now, we should call our students something other than digital natives. Digitally naive (while catchy) doesn’t seem all that appropriate. Must we call them anything? My parents’ and grandparents’ generations were literacy natives, but I don’t think anybody called them that. If we don’t label them as digital anything, then we may be more apt to think of them simply as students we have to teach and prepare for the world that awaits them, with all of the skill sets they need to do well when they get there.