Is AI Coming for Your Job and Your Classroom?

DISCLAIMER: This article was written by a human, grammatically proofed by an algorithm dressed up like an application, edited by another human, and grammatically reproofed by another algorithm. No AI was harmed in making these or any of our blogs.


I run my drafts through Microsoft Word's grammar and spelling tools, then through Grammarly, and then hand them to my human content editor to "tighten" the content. Does using Word's editing tools and Grammarly make you think less of my writing? Does it make you less likely to read this blog, or any other content you encounter online or in print?


I ask because, in November 2022, OpenAI released a text-generating artificial intelligence (AI) called ChatGPT. There has been a lot of noise about AI since, with reactions ranging from 'It is the greatest thing ever' to 'It will destroy humankind,' and plenty of opinions circulating in the academic world.


We know that students are aware of and using AI. 


In a recent survey, 48% of students said they have tried AI writing tools at least once, whereas 71% of instructors and administrators have never used these tools, and 32% reported being unaware of them.


Higher education is no stranger to technology lauded as game-changing, disruptive, and revolutionary. Think about it. We all use websites and email, but how many of us use custom-built classrooms with wired response pads at each student's desk? I would bet the answer is no one, even though, 20 years ago, these classrooms were lauded as the 'next great thing' in classroom instruction.

 

As I read articles about AI, I wondered whether text-generative AI is just another technological advance that will make educators' work easier, a technology that will ultimately cost them their jobs, or something in between.


I recently used ChatGPT 3.5, and I was impressed. One of the doctoral-level classes I taught covered career development and writing documents for faculty job applications. As a test, I asked ChatGPT to write a cover letter for an exercise physiology position at a particular university. Bingo!


In just a few seconds, ChatGPT delivered a good two-page cover letter. ChatGPT highlighted a few areas I needed to fill in about my teaching and research, but otherwise the letter was good, and much better than many of the cover letters I've seen over the years while serving on search committees. I was surprised by the quality. But I was also disenchanted, because I realized that what I had taught my former students had suddenly been made moot. Why take a class and struggle to write your own cover letter when you can give ChatGPT a few details and get a custom cover letter with just a few blanks to fill in?


After using ChatGPT, I believe the genie is out of the bottle, and there is no way it is going back in. AI is here to stay. According to a national survey of 2,000 two-year and four-year college students, 51% say they will continue using generative AI tools even if their instructors or institutions prohibit them.

 

Will AI ease or increase the burden of core faculty duties like teaching?

 

AI can make tedious academic tasks easier, such as initially sorting through faculty job applications or making student admission decisions, repetitive work that seems well suited to AI's (in most cases) non-judgmental nature. That may not be a bad thing. However, I believe the real issue with AI is how it affects the classroom.

 

The key to deciding whether AI should be permitted in assignments is whether its use hinders students' education. Calculators were once predicted to harm students, too. But not knowing how to derive a square root by hand never stopped me from knowing when a square root was needed or how to apply the answer. So, will a student who uses AI to write a paper be hindered from learning what they are supposed to learn and from applying it later in their lives?

 

What skills should a student learn that AI might prevent them from learning? Will AI keep students from learning how to write? If so, we already have good algorithm-based tools to check and correct spelling and grammar, both vital in composition. Perhaps the real concern with AI in classrooms is that it will do the thinking for students and keep them from developing their own thoughts about the topics they are writing about.

 

Regardless of what you think or theorize, students will try AI. It's a new, shiny, free tool. Maybe the best thing you can do is give it a try as well. OpenAI offers a free account for ChatGPT 3.5 (the more advanced ChatGPT 4.0 requires a paid subscription).

I believe that people in academia will experience both surprises and disenchantments as AI is used increasingly in their jobs and classrooms. That is where the challenge will lie: how do you help students keep learning to integrate and synthesize material in an environment where AI is used?


As AI makes its inroads into classrooms, I believe the teacher's task is to be clear about what they want students to learn. Is it essential for your students to know how to spell, compose, or derive square roots? Or are you primarily concerned with their ability to integrate and synthesize information, which adds to their knowledge and skill sets? I suspect most are worried about integration and synthesis. Beyond falling back on old-school evaluation methods such as short-answer essay exams on paper, understanding what students actually need to learn can help teachers focus on the essence of their teaching.

 

I don't know about you, but I would gladly have welcomed well-written and grammatically correct papers, as long as the students did their own integration and synthesis!

 

If you are looking for articles and practical suggestions to help educators learn how to use AI in the classroom, check out this article from the Chronicle. It offers tools to help ensure students aren't using AI in their papers, and background on why you may want to check.


Keep Moving Forward


Cheers,
