November 8, 2024

A New Pedagogy?

Emily Bailey, University of Pittsburgh

I have to admit that despite how ridiculously excited I am to be nearing the final stages of the graduate school process—to get into the classroom and put some of this knowledge to practical use—I’m a bit afraid of the students who will be sitting on the other side of the desk. We’ve all experienced it in some capacity: a room full of eyes staring at us, brimming with expectations and, if we’re lucky, even interest. However, as I observe the next generation of students, I can’t help but be aware of a new set of expectations they have for their faculty—the kind of expectations that leave me wondering if we, as future instructors, are fully prepared.

As new faculty entering the field, we will be dealing with a population of students who have been using computers and other forms of technology on a constant basis since they were old enough to type with their tiny fingers. This is a generation of born “techies”—young men and women who can’t imagine the world before digital everything. They are their own academic breed, wanting not only to be informed but to be interested every step of the way. This can be problematic for practical purposes—preparing a multimedia lecture, for instance, is far more time-consuming than preparing a traditional one—and it seems to me that there is a delicate balance between using technology as a tool for learning and letting it become a mere distraction.

I will admit that although I might be considered part of this “new” generation of students, the rapid technological advances of the past decade have somewhat eluded me. I didn’t own a computer in my first two years of undergraduate school. My cell phone at that time was the size of a small brick, could only make phone calls, was fueled by prepaid cards, and was strictly for emergencies. There were no iPods, iPads, or smartphones, and Facebook was still restricted to the student bodies of a handful of colleges and universities—and yet I considered myself technologically savvy. I wrote all of my papers on a computer and did a considerable amount of research online. And all of this wasn’t that long ago—I graduated with my bachelor’s degree in 2006. Yet the generational gap between me and the students who have recently come under my tutelage sometimes feels as wide as the gap between me and my parents, or even my grandparents. I’ve even caught myself on a couple of occasions in the classroom starting to say something frighteningly close to “Well, back in my day. . . .”

Let me make it clear that I don’t think technology is a bad thing, nor am I against learning to use all of the tools available to us so that we can teach as effectively as possible. Quite the contrary—personally, I can’t imagine completing a PhD without twenty-four-hour access to Internet library archives and spellcheck. What I’d like to highlight instead is how incredibly important it is for our new generation of faculty not only to utilize the technologies available to us, but to use them thoughtfully and responsibly for their maximum benefit in the classroom. Beyond PowerPoint presentations, which have started to feel archaic, we are fortunate now to have constant access to our students and their work through Blackboard, digital grade books, and course blogs. What concerns me most, though, is not the resources that we have available, but those that students now have access to. I’ve watched some students’ expectations for their own work slacken dramatically with increased access to online materials, and I’ve begun to wonder how we can use these technological tools to enhance, rather than cheapen, their learning.

In my experience as a teaching assistant, fellow, and instructor, there has been a marked increase in plagiarism of material taken directly from online sources. While this makes such plagiarism easier for an instructor to catch, students need to know that copying material from a book or from the Internet, or reproducing someone else’s original words and ideas without citation, is not acceptable. Another disturbing phenomenon is also on the rise: more and more students seem to think that it is appropriate to use texting abbreviations on exams and (the horror!) in papers. Spelling and grammar are going by the wayside as students rely on their technology, rather than their own knowledge, to correct their mistakes. While this behavior is certainly in no way indicative of students as a whole, it has been prevalent enough in my experience to make me wonder about the future of higher education.

In other words, with technology has come a sort of apathy. As far as I’m aware, it is still a privilege to be accepted into a college or university (with over twenty million students enrolled as of the 2006 US Census). Yet the standards at many of these institutions seem to be falling short of the expectations placed on undergraduate and graduate scholars mere decades ago, and this may be due in part to the ease that technology affords. Short of circling and underlining in red pen, what is an instructor to do? How can we make learning interesting and productive for Generation Next without compromising academic standards? What can we do to help students stand apart from their equally technologically savvy peers as they move through higher education and into the job market? I think that it’s time for a little tough love.

As their instructors, it’s our responsibility to make students aware of their strengths as well as the areas in which they can improve. I can only speak to my own experiences, but many of the students I have encountered (some 800 now in my years of working as a teaching assistant and instructor) are intently focused on the almighty grade. How can we use technology to foster an academic environment in which students stay engaged with the material and the learning process, while also becoming effective critical thinkers and writers? I believe this can be done by reestablishing the high, but fair, standards that our esteemed institutions have traditionally expected of them. We need to demonstrate by example that technology is a tool for learning and not an easy way out, a medium for more knowledge and not the bare minimum. It seems to me that this will be one of the most significant challenges of our work as the next generation of faculty: helping to guide our students through a technologically enhanced education and world without sacrificing what it means to earn a college degree.

Emily Bailey is a PhD candidate in the department of religious studies at the University of Pittsburgh, working on her dissertation “Women, Food, and Power: The Protestant Moralization of Eating in America, 1830–1925,” in which she examines the intersections of food, faith, and gendered performances of power in the American Victorian period. Her broader research interests include religion in America, religion and gender, religion and food, and spirituality and sustainability.