Beyond AI Bans: An End of Year AI Integration Pep Talk for Educators.

AI image showing a university professor burning AI inspired by the book Fahrenheit 451.

In December 2022, my first experience with AI was using ChatGPT to write a blog article about social media marketing. I’d been practicing and teaching social media for over a decade, yet ChatGPT wrote an impressive and scary-good article in less than a minute – something that may have taken me hours!

How did you feel after your first use of ChatGPT? Since then I’ve had ups and downs with Generative AI. From full embrace and cautious integration to dystopian fear and overt avoidance. It’s been a long journey, but I’ve learned much along the way.

The end of the year is a time for reflection.

What I find I need at the end of a long hard year is a pep talk. Anyone else? December alone gifted us “12 days of OpenAI” and major updates from most AI companies like Google, Anthropic, Perplexity, Meta, Apple, Microsoft, IBM, and xAI. I’m still processing what happened in Fall classes and have just two weeks to update courses for Spring.

I can relate to what AI expert Marc Watkins says in his latest Substack,

“I need a reset. Truly, we all do. For the past two years, educators have been asked to reevaluate their teaching and assessments in the wake of ChatGPT, adopt or refuse it, develop policies, and become AI literate. Except generative AI isn’t a normal or novel development within our field of study we can attend some conferences or webinars to understand its impact to keep up with it. None of this has been normal…”

University faculty are woefully behind.

I’ve accomplished much since Fall 2022: Two books, four research articles, three conference presentations, a top teaching paper award, and multiple AI presentations to professionals and faculty. Yet, negatives have me losing sight of the positives.

This fall my LinkedIn feed felt full of posts and comments about how far behind university professors are in AI. I know some critiques are valid. In my first adjunct appointment in 2009, a media professor still didn’t teach the Internet because “it was a fad.” Like any profession, dinosaurs exist.

University faculty are leading AI adoption.

However, the profs I mostly interact with are working hard to learn and keep up. For every head-in-the-sand professor, there are plenty trying to keep their heads above water with the pace of AI change. My workload has increased with AI, not decreased.

So it’s hard to read comments that generalize us all as behind and advocate for replacing us with AI teaching agents. The profs I follow, like Ethan Mollick and Marc Watkins, aren’t just teaching but innovating with AI in education and in their professional disciplines.

Professors are old and boring.

Despite many more positive comments and evidence of grads excelling, human tendency is to focus on the negative. Years ago, I got a student comment,

“I can’t believe someone old enough to be my dad is teaching social media.”

Another student once told me I need to update my headshot because I don’t look like the website photo anymore. Then there’s the student who said my voice is monotone and boring. Ouch! Despite being in the minority, those comments still hurt and I have trouble forgetting them years later.

Professors have wisdom from experience.

Does age equate to being behind? I have a much bigger picture of the world and have lived through many waves of tech advancements. I’ve also spent nearly two decades practicing marketing and now a decade researching and teaching it. A week ago I received this comment from a student’s internship report,

“My academic background in marketing, particularly courses in social media marketing and digital, laid a solid foundation for this internship. Concepts learned in these courses proved instrumental in creating effective social media posts. Without these courses, my social content would have not been as effective or efficient.”

Great, right? Yes, but I still struggle to get the negative out of my head. I know I’m not auditioning for America’s Got Talent – I’m an educator, not an entertainer – so why can’t I let it go? Human brains have a negativity bias. We all tend to engage with, emphasize, and focus on the negative – something social media algorithms take advantage of to keep us scrolling.

So thanks to the grad from two years ago who recently gave me a LinkedIn shout-out for my project management software and HubSpot certificate integrations preparing him well. I also appreciate the student graduating this Spring who has had two internships and has already been hired into her dream sports marketing job. She thanked me for what she learned in my digital marketing and other classes to get her there.

We need grace, humility, and confidence.

Constructive criticism is key to learning and advancement, but you also can’t take it too much to heart. You’ll either be so discouraged you give up or you’ll become too timid to experiment for fear of the negative. I am in that moment right now.

I apologize to students and professionals in my field for the ways I was behind in AI advancement or days I wasn’t always engaging. Hopefully, there is room for grace. I’m also humble enough to take the things I can improve upon and implement them in this short window before next semester. To do this I need a boost of confidence.

So this is a pep talk to those profs and professionals who don’t have their head in the sand. You’re trying to keep your head above the water. I’m striving for humility to learn from critiques, grace for my failings, and confidence to head into the Spring semester – with the audacity to teach digital and social media marketing in my early 50s.

AI image generated using Google ImageFX from a prompt to show a university professor burning AI inspired by the book Fahrenheit 451. https://labs.google/fx/tools/image-fx

We need to be more human, more bold.

Speaking of audacity: it’s the motivation for my main article image, generated by Google’s ImageFX. My prompt? Show a university professor burning AI, inspired by Fahrenheit 451. My human fireworks are refusing to be replaced by AI teaching agents or young YouTubers selling top-10 strategies for social media success. Marketing thought leader Mark Schaefer inspired the image, saying,

“AI has helped create a marketing pandemic of dull. It’s not your fault. Your company probably rewards you for being boring. You’re Google-sufficient and optimized. They’re trying to keep you in their box. But the AI bots are coming. You need to do something, and you need to do it now. It’s time to unleash the HUMAN fireworks in your content. There is no choice. You need to be audacious.”

Thanks for leading us to the future, Mark (someone older than me). This is my audacious post that couldn’t be written by AI. AI can’t explain what it feels like to be a professor at this moment or a professional fearing the loss of their job. AI can’t know what it is to fear its own adoption, or what it is to have grace, humility, and confidence. Google’s AI Overview did give me a nice definition, though,

“A state of being confident in one’s abilities while also acknowledging limitations and approaching situations with kindness and respect.”

In bold confidence we also need caution.

While we have no choice in adopting AI, we have a choice in how. Human agency still exists. I don’t want to make the mistakes we made with social media. Have you read Haidt’s book, The Anxious Generation?

Between my period of AI avoidance (pushing off meetings with faculty development) and AI embrace (agreeing to a five-part AI integration workshop), I created a framework and process to strategically apply AI.

“Move fast and break things” may have helped develop AI, but I’d rather not. A benefit of academia I didn’t have in the fast-paced ad agency world is time for reflection. Marketing success is based on frameworks and processes. I needed that for integrating AI. The result was my summer AI blog series:

  1. Artificial Intelligence Use: A Framework For Determining What Tasks To Outsource To AI [Template]
  2. AI Task Framework: Examples of What I’d Outsource To AI And What I Wouldn’t.
  3. AI Prompt Framework: Improve Results With This Framework And Your Expertise [Template].
  4. More Than Prompt Engineers: Careers With AI Require Subject Matter Expertise [Infographic].
  5. Joy Interrupted: AI Can Distract From Opportunities For Learning And Human Connection.

How I integrated AI in Fall classes.

Coming out of summer I went through every class and assignment to specifically look for places where I felt AI would be helpful for student learning and where it would not. I tried AI for tasks in my assignments and shared what I found with students.

Example of how I gave students specific ways to use AI for one assignment.

Each assignment had an AI section giving students specific aspects of the assignment to use AI and how. There was no general ban, but also no OK for all-out use. Using AI for everything shortchanges the learning process as the infographic below illustrates.

This graphic shows that in stages of learning you go through attention, encoding, storage, and retrieval. You need your brain to learn this process not just use AI for the process.
Click the image for a downloadable PDF of this graphic.

I also had a consistent general AI statement on my syllabi (see below). I directed students on when and how to cite AI, and what AI to use with links and directions to use it. I sent them to Copilot for convenience and financial considerations as all students had access to GPT-4 and DALL-E 3 free with their university Microsoft 365 account.

Beyond AI-specific uses in assignments, I had a general AI use policy.

I cautioned about AI copyright issues. I also didn’t want them using AI to complete an entire assignment – which is why I use Turnitin’s AI checker. I never relied on it alone, and academia isn’t the only field using AI detection. A digital marketing professional who guest spoke last term told students that their agency uses AI in many ways but runs AI detectors on their writers’ work. If a client is paying for human-created content, they want to ensure they get it.

Student uses of AI in assignments.

AI helped students brainstorm and express their ideas. Groups in Integrated Marketing Communications created campaigns for brands like Qdoba. In a class with few graphic design or art students, DALL-E through Copilot enabled them to create customized storyboards of their TV ads and YouTube bumper ads.

A custom storyboard for the Qdoba student team's IMC campaign using DALL-E via Copilot.

We talked about AI content being great for selling ideas, but there may be copyright issues with publishing it. There’s also a potential consumer backlash, as highlighted in recent Ad Age articles and Harris Polls.

Example Copilot prompt to find social media influencers.
Students used Copilot to find influencers for their brand social media projects following the prompt framework below.

In social media marketing, students used AI to generate variations of social content captions. Our social media simulation requires many organic posts that must vary for engagement and reach (as with real social posts). Students wrote the main message but let AI create versions to word counts for each social platform. For a brand’s social strategies, they used AI to research influencers, get hashtag ideas, and create images to mock up brand social media posts.

I also taught them prompts to get better results. Using the prompt framework below got me and my students much better output, and colleagues at other universities tell me the framework is improving their students’ results as well.

AI Prompt Framework Template with 1. Task/Goal 2. AI Persona 3. AI Audience 4. AI Task 5. AI Data 6. Evaluate Results.
Click the image to download a PDF of this AI Prompt Framework Template.
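To make the framework concrete, here is a minimal sketch of how its parts could be assembled into a single prompt. The field names and wording below are my illustrative assumptions, not the exact template from the downloadable PDF; the sixth step, evaluating results, stays with the human expert.

```python
# Sketch: assemble a prompt from the framework's first five parts.
# Wording and example values are illustrative, not the official template.

def build_prompt(goal, persona, audience, task, data):
    """Combine the framework's parts into one prompt string."""
    return (
        f"Goal: {goal}\n"
        f"Act as {persona}.\n"
        f"Your audience is {audience}.\n"
        f"Task: {task}\n"
        f"Use only this background data:\n{data}\n"
        "If information is missing, say so instead of guessing."
    )

prompt = build_prompt(
    goal="Create platform-specific versions of one social media caption",
    persona="a social media copywriter for a fast-casual restaurant brand",
    audience="college students who follow the brand",
    task=("Rewrite the main message as a 280-character X post and a "
          "125-character Instagram caption."),
    data="Main message: Our new winter menu launches Monday.",
)
print(prompt)
```

Keeping the parts as separate inputs, rather than one free-form paragraph, makes it easier for students to see which element of the framework they left out when a result disappoints.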

What’s to come for the new year?

In my next post, I’ll share my plans for the Spring. Recent AI developments have opened up more possibilities. I’ll explain how I’m using NotebookLM as an AI tutor for one class. I’ll share how I’m going beyond Copilot to leverage new AI capabilities in Adobe Express and Google’s ImageFX.

I’ll also get deeper into new multimodal capabilities of AI with videos exploring live audio interactions in NotebookLM’s Audio Overview and a demonstration of live video conversations with Gemini 2.0 as it “sees” what‘s on my screen.

Banning AI and being behind in AI are the furthest things from my mind. I want to contribute to how AI can and should (or should not) advance marketing practice and teaching, to better prepare us all for the AI revolution.

What have been your struggles and successes with AI?

100% Human Created!

More Than Prompt Engineers: Careers with AI Require Subject Matter Expertise [Infographic].

This graphic shows that in stages of learning you go through attention, encoding, storage, and retrieval. You need your brain to learn this process not just use AI for the process.

This is the fourth post in a series of five on AI. In my last post, I proposed a framework for AI prompt writing. But before you can follow a prompt framework, you need to know what to ask and how to evaluate the response. This is where subject matter expertise and critical thinking skills come in. It’s one reason we need to keep humans in the loop when working with large language models (LLMs) like ChatGPT (Copilot), Gemini, Claude, and Llama.

Photo by Shopify Partners from Burst

Will we all be prompt engineers?

Prompt engineering is promoted as the hot, new, high-paying career. Learning AI prompt techniques is important but doesn’t replace being a subject matter expert. The key to a good prompt is more than format. As I described in my post on AI prompts, you must know how to describe the situation, perspective, audience, and what data to use. The way a marketer or manager will use AI is different from how an accountant or engineer will.

You also must know enough to judge AI output, whether it’s information, analysis, writing, or a visual. If a prompt engineer doesn’t have subject knowledge, they won’t know what AI got right, what it got wrong, and what is too generic. AI is not good at every task, and it mixes generic and wrong responses in with the right ones. With hallucination rates of 15% to 20% for ChatGPT, former marketing manager Maryna Bilan says AI integration is a significant challenge for professionals that risks a company’s reputation.

AI expert Christopher S. Penn says, “Subject matter expertise and human review still matter a great deal. To the untrained eye, … responses might look fine, but for anyone in the field, they would recognize responses as deeply deficient.” Marc Watkins of the Mississippi AI Institute says AI is best with “trained subject matter experts using a tool to augment their existing skills.” And Marketing AI Institute’s Paul Roetzer says, “AI can’t shortcut becoming an expert at something.”

Prompt engineering skills are not enough.

As a college professor, this means my students still need to do the hard work of learning the subject and discipline on their own. But their social feeds are full of AI influencers promising learning shortcuts and easy A’s without listening to a lecture or writing an essay. Yet skipping the reading and having GPT take lecture notes, answer quiz questions, and write your report is not the way to get knowledge into your memory.

Some argue that ChatGPT is like a calculator. Yes and no. This author explains, “Calculators automate a . . . mundane task for people who understand the principle of how that task works. With Generative AI I don’t need to understand how it works, or even the subject I’m pretending to have studied, to create an impression of knowledge.”

My major assignments are applied business strategies. I tell students that if they enter my assignment prompt into ChatGPT and it writes the report for them, then they’ve written themselves out of a job. Why would a company hire them when it could enter the prompt itself? That doesn’t mean AI has no place. I’ve written about outsourcing specific tasks to AI in a professional field, but you can’t outsource learning the base discipline knowledge.

AI can assist learning or get in the way.

I know how to keep humans in the loop in my discipline, but I can’t teach students if they outsource all their learning to AI. Old-fashioned reading, annotating, summarizing, writing, in-person discussion, and testing remain important. Once students have the base knowledge, we can explore ways to use generative AI to supplement and shortcut tasks, not skip learning altogether. We learn through memory, and scientists have studied how memory works. Used the wrong way, AI can skip all stages of learning.

Click the image for a downloadable PDF of this graphic.

I remember what it was like being a student. It’s very tempting to take the second path in the graphic above – the easiest path to an A and a degree. But that can lead to an over-reliance on technology, no real discipline knowledge, and a lack of critical thinking skills. The tool becomes a crutch for something I never learned how to do on my own. My performance is dependent on AI performance, and I lack the discernment to know how well it performed.

Research skills in searching databases, evaluating information, citing sources, and avoiding plagiarism are needed to discern AI output. The AI search engine Perplexity promises reliable answers with complete sources and citations, but a recent article in WIRED finds that it makes things up, and Forbes accuses it of plagiarizing its content.

A pitch from OpenAI selling ChatGPT Edu says, “Undergraduates and MBA students in Professor Ethan Mollick’s courses at Wharton completed their final reflection assignments through discussions with a GPT trained on course materials, reporting that ChatGPT got them to think more deeply about what they’ve learned.” This only works if the students do the reading and reflection assignments themselves first.

Outsourcing an entire assignment to AI doesn’t work.

A skill I teach is situation analysis. It’s a foundation for any marketing strategy or marketing communications (traditional, digital, or social) plan. Effective marketing recommendations aren’t possible without understanding the business context and objective. The result of that situation analysis is writing a relevant marketing objective.

As a test, I asked ChatGPT (via Copilot) to write a marketing objective for Saucony that follows SMART (Specific, Measurable, Achievable, Relevant, Time-bound) guidelines. It recommended boosting online sales by targeting fitness enthusiasts with social media influencers. I asked again, and it suggested increasing online sales of trail running shoes among outdoor enthusiasts 18-35 using social media and email.

Then I asked it to write 20 more and it did. Options varied: focusing on eco-friendly running shoes for Millennials and Gen Z, increasing customer retention with a loyalty program, expanding into Europe, increasing retail locations, developing a new line of women’s running shoes, and increasing Saucony’s share of voice with a PR campaign highlighting the brand’s unique selling propositions (USP). It didn’t tell me what those USPs were.

Which one is the right answer? The human in the loop would know based on their expertise and knowledge of the specific situation. Generated with AI (Copilot) ∙ July 2, 2024 at 3:30 PM

I asked Copilot which is best. It said, “The best objectives would depend on Saucony’s specific business goals, resources, and market conditions. It’s always important to tailor the objectives to the specific context of the business. As an AI, I don’t have personal opinions. I recommend discussing these objectives with your team to determine which one is most suitable for your current needs.” If students outsource all learning to LLMs how could they have the conversation?

To get a more relevant objective I could upload proprietary data like market reports and client data and then have AI summarize it. But uploading Mintel reports into LLMs violates licensing terms, and many companies restrict this as well. Even if I work for a company that has built an internal AI system on proprietary data, its output can’t be fully trusted. Ethan Mollick has warned that the many companies building talk-to-your-documents RAG systems with AI need to test the final LLM output, as it can produce many errors.

I need to be an expert to test LLM output in open and closed systems. Even then, I’m not confident I could come up with truly unique solutions based on human insight if I didn’t engage with the information on my own. Could I answer client questions in an in-person meeting after only a brief review of AI-generated summaries and recommendations?

AI as an assistant to complete assignments can work.

For the situation analysis assignment, I want students to know the business context and form their own opinions. That’s the only way they’ll learn to become subject matter experts. Instead of outsourcing the entire assignment, AI can act as a tutor. Students often struggle with the concept of a SMART marketing objective. I get a lot of wrong formats no matter how I explain it.

I asked GPT whether statements were marketing objectives that followed SMART guidelines. I fed it right and wrong statements, and it got all of them correct. It also did an excellent job of explaining why each statement did or did not adhere to SMART guidelines. Penn suggests “explain it to me” prompts: tell the LLM it is an expert in a specific topic you don’t understand and ask it to explain the topic in terms of something you do understand. This is using AI to help you become an expert versus outsourcing your expertise to AI.
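A quick sketch of what such a prompt could look like follows. The wording and the example topics are my own assumptions in the spirit of Penn’s suggestion, not his exact template.

```python
# Illustrative "explain it to me" prompt builder; phrasing is an
# assumption, not an official template from Penn.

def explain_it_to_me(topic, known_domain):
    """Ask the LLM, as an expert in `topic`, to explain it via `known_domain`."""
    return (
        f"You are an expert in {topic}. I don't yet understand {topic}, "
        f"but I do understand {known_domain}. Explain {topic} to me using "
        f"concepts and analogies from {known_domain}."
    )

print(explain_it_to_me("SMART marketing objectives",
                       "setting personal fitness goals"))
```

A student who struggles with SMART objectives could swap in any domain they already know well, which is exactly the move that requires them to bring some of their own knowledge to the exchange.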

ChatGPT can talk but can it network?

Last spring I attended a professional business event. We have a new American Marketing Association chapter in our area, and they had a mixer. It was a great networking opportunity. Several students from our marketing club were there mingling with the professionals. Afterward, a couple of the professionals told me how impressed they were with our students.

These were seniors and juniors. They had a lot of learning under their belts before ChatGPT came along. I worry about the younger students. If they see AI as a way to outsource the hard work of learning, how would they do? Could they talk extemporaneously at a networking event, interview, or meeting?

Will students learn with the new AI tools that summarize readings, transcribe lectures, answer quiz questions, and write assignments? Or will they become subject matter experts who have used AI Task Frameworks and AI Prompt Frameworks to discern the beneficial uses of AI, making them assets to hire? In my next post, the final in this five-part AI series, I share the story that inspired this AI research and explore how AI can distract from opportunities for learning and human connection.

This Was Human Created Content!