How Will AI Agents Impact Marketing Communications Jobs & Education? See Google’s AI Reasoning Model’s “Thoughts” And My Own.

AI image generated using Google ImageFX from the prompt “Create a painting depicting the British army in red coats as AI robots coming into town to take people's jobs." https://labs.google/fx/tools/image-fx

In my last post, I warned of the AI agents coming to take our jobs like Paul Revere warning of the British coming. Large language model companies like OpenAI and Google, along with SaaS companies integrating AI, are promising increased autonomous action. Salesforce has even named its AI products Agentforce, which literally sounds like an army coming to take over our jobs!

Whether you are in marketing, advertising, PR, or corporate communications, or are a professor in these areas, it is important to remember that AI agents and the new reasoning models are not magical or human. They are simply really good prediction machines. But they are so good that AI will increasingly take over parts of our jobs now and potentially replace entire jobs in the not-too-distant future.

But they are not good at everything and not always right. That’s why you need to be involved in determining how AI will be used in your job. Don’t let AI happen to you. Make AI work for you.


Productivity gains are already happening with AI.

Ethan Mollick, author of Co-Intelligence: Living and Working with AI, recently shared a study that found 30% of U.S. workers use AI every day and that it is tripling their productivity (reducing a 90-minute task to 30 minutes). If you are not in that 30%, there is still time to catch up. In all honesty, as much as I write about AI and implement it in my classes, I don’t use it as much as I should for my everyday tasks.

That’s why I turned to Gemini for help with this post. I wanted to test a new reasoning model and see how it thinks but also use it as a research assistant. Writing an article like this takes a lot of time. In addition to testing the new Gemini “reasoning” model, I was looking for time savings in researching how AI agents may impact marcom jobs.

In this post, I look under the hood to see how AI crafts its responses while seeing what Google’s new reasoning model “thinks” about the future of marketing-related careers. Will AI agents take our jobs? If so, how soon? For my test, I gave Gemini 2.0 Flash Thinking a prompt that I know worries many in my field. Below is my prompt. I wanted a brutally honest assessment.

I asked Google’s reasoning model Gemini 2.0 Flash Thinking to give me a brutally honest look at the future of marketing jobs and how they will be impacted. https://aistudio.google.com/

What does AI think about AI agents taking our jobs?

First, let’s get to know the reasoning model I used. Google explains, “The Gemini 2.0 Flash Thinking model is an experimental model that’s trained to generate the ‘thinking process’ the model goes through as part of its response. As a result, the Flash Thinking model is capable of stronger reasoning capabilities in its responses than the Gemini 2.0 model.”

How do you see its thinking? In the screen capture of my prompt above, there is an option to click “Expand to view model thoughts” before you read the response. I did this to see its chain of thought, and I include the thought process in the screen capture below.

Gemini used a 10-step process to reach its final answer:

  1. Acknowledge the User’s Need
  2. Frame the Initial Message
  3. Structure the Timeline
  4. Brainstorm Areas of Impact (Current & Future)
  5. Assign Percentage of Impact – Now
  6. Incrementally Increase Percentages Over Time
  7. Directly Address Jobs Replacement – Hard Truths
  8. Focus on Skill Sets Needed for Survival and Success
  9. Maintain a “Brutal but Constructive” Tone
  10. Refine and Sharpen Language
Google’s Gemini 2.0 reasoning model showed me the thinking process for responding to my prompt. https://aistudio.google.com/

Seeing AI’s thought process and its self-correction.

Before my brutally honest prompt, I submitted a prompt asking for an honest yet reassuring answer to the same question. In the screen capture below, you can see how steps 1 and 2 in the thinking process varied from those above. I imagine that is similar to how I think when writing for different audiences. That is why tools such as personas are so useful to marketing professionals crafting content.

In that first prompt, I also saw an example of how it “self-corrected” in the process. An initial prediction that AI would automate 50% of marketing content within a year was second-guessed as Gemini talked to itself, saying, “That’s likely too high and broad. AI can automate some content creation tasks like basic … but not complex storytelling, brand voice development, or strategic content planning.” This self-correction resulted in it dropping that number down to 20-30%.

Gemini 2.0 Flash Thinking showed how it self-corrected a prediction about AI taking on 50% of content marketing tasks next year. https://aistudio.google.com/

Now let’s get to its final response. How worried should we be as marketers or the communications professionals who support marketing? What should we be doing to prepare ourselves and our students for this revolution? The response is broken into three “Brutal Truths.” From my research and study over the years, most of this feels accurate. Honestly, much of the first category is already happening and has been done for years by other forms of AI. So it is not surprising to me.

Brutal Truth 1: Some parts of your job will be replaced and some jobs will be eliminated.

Below is the screen capture of Gemini’s response. It predicts 5-20% of all tasks will be outsourced to AI in an “efficiency overhaul.” This includes mundane and repetitive tasks, basic content creation, and customer segmentation, plus lower-tier performance reporting and analytics. This fits what I know.

In the last two years, we’ve seen more basic content creation being done by AI, whether through LLMs like ChatGPT or AI integrations into SaaS platforms such as OwlyWriter AI in Hootsuite. For customer segmentation, I can see AI helping with data collection, but overall, segmenting audiences requires more human insight.

The final one is not a surprise. Auto-generating reports from previously set up dashboards has been around for years. The important part is knowing which KPIs matter in the first place – the realm of a seasoned human strategist. The new aspect may be auto-generating the initial language around the reports and a prompt overlay. But I still would not rely on AI to understand the full context.

Google’s reasoning model Gemini 2.0 Flash Thinking’s brutally honest truth number one about the future of marketing jobs and how they will be impacted. https://aistudio.google.com/

Brutal Truth 2: The demand shift is dramatic. Adapt or fade.

Below is the screen capture of Gemini’s second brutal truth: the demand shift will be dramatic. Gemini tells us to “adapt or fade.” After the brutal message, it does quickly try to reassure us, saying that marketing isn’t going away. But don’t feel too good about that reassurance, because it is followed by an all-caps pronouncement that marketing is changing RADICALLY.

Obviously, you want to position yourself in one of the high-demand areas such as strategic marketing visionaries (AI-augmented), creative directors and brand storytellers (AI-guided), data-driven insight interpreters and storytellers, AI marketing technologists and integrators, ethical AI marketing guardians, and human-connection and empathy experts. At first glance, I feel competent in many of these areas and confident in teaching my students these higher-level skills.

Once again, this doesn’t surprise me. My revelation in AI came when I stopped thinking of it as an all-or-nothing entity. The big scary redcoats coming became more manageable when I broke down my job into tasks and reclaimed my human agency to intentionally decide what to use AI for and what not to. What I learned I put into my AI Use Framework. It helped me, and it can help you break down anything into single tasks and their goals.

Google’s reasoning model Gemini 2.0 Flash Thinking’s brutally honest truth number two about the future of marketing jobs and how they will be impacted. https://aistudio.google.com/

Whether you follow my framework or not, I encourage everyone to do this exercise of breaking down your job into tasks and intentionally finding the things that can easily be automated by AI. You will be surprised by what you won’t mind handing to AI so you can spend more time on what you enjoy anyway. You’ll also discover things that could be automated but should be kept human, because the goal is to build relationships, and relationships can’t be automated.
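If you like to think in concrete terms, the exercise above can be sketched as a simple triage, here in Python. To be clear, the task names, attributes, and decision rule below are my hypothetical illustrations of the idea, not part of any official framework:

```python
# A hypothetical sketch of the "break your job into tasks" exercise.
# For each task you record two judgment calls: is it repetitive,
# and does it exist primarily to build human relationships?

def triage(task):
    """Decide whether a task is a candidate for AI automation."""
    if task["relationship_goal"]:
        return "keep human"          # relationships can't be automated
    if task["repetitive"]:
        return "automate with AI"    # mundane, repeatable work
    return "AI-assist, human leads"  # everything in between

# Example (hypothetical) task list for a marketing role
tasks = [
    {"name": "monthly performance report", "repetitive": True,  "relationship_goal": False},
    {"name": "client check-in call",       "repetitive": True,  "relationship_goal": True},
    {"name": "campaign strategy",          "repetitive": False, "relationship_goal": False},
]

for t in tasks:
    print(t["name"], "->", triage(t))
```

The point of the sketch is the order of the checks: the relationship question comes first, so even a repetitive task stays human when its goal is the relationship.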

The high-demand future list looks accurate. Those are all uniquely human-based skills, even if parts become AI-augmented or AI-guided. The key is to make this shift yourself now. If you don’t, AI will become the thing that happens to you, not the thing that you help shape and influence. Quickly find the tasks that can and should be outsourced to AI, and then start using it. Just don’t trust it for everything. No matter how confident it sounds, it doesn’t always get everything right. Use your discipline expertise to discern and verify results.

Brutal Truth 3: Upskilling is not optional. It is survival.

The third brutal truth reinforces what I said above. Upskilling is not optional. It’s about survival. AI innovation is coming quicker than any previous technology revolution. You can’t opt out (unless you’re retiring this year). Thus, you need to become AI literate, focus on strategy and creative thinking, embrace data, learn to work with AI, and specialize strategically.

I am not a history professor or war strategy expert, but I’ll make one final connection to the theme of my last two posts. Some factors that contributed to the colonists winning the American Revolution include being familiar with their home territory (your discipline), strong motivation (defend your livelihood), and fighting for something they believed in (human ability and agency).

The Continental Army was also willing to move away from traditional methods of battle. Your discipline, whether it’s marketing, communications, advertising, PR, teaching, or something else, may have a long tradition of doing things a certain way. Now is the time to find new methods to remain relevant and keep humans in the loop in light of the AI revolution.

Google’s reasoning model Gemini 2.0 Flash Thinking brutally honest truth number three about the future of marketing jobs and how they will be impacted. https://aistudio.google.com/

I’m trusting AI for these predictions, but I’ve been studying AI since 2022, and they seem accurate. They also match a similar prompt I tried in Anthropic’s Claude 3.7 and what SmarterX’s custom GPT JobsGPT 2.0 predicts. I previously shared JobsGPT along with my AI Use Framework to help break down jobs into tasks and decide what to outsource to AI. The new version forecasts AI’s impact on jobs by industry, profession, or college major – including job title, description, and skills required – helpful for professors’ curricula and professionals’ upskilling.

I asked JobsGPT 2.0 by SmarterX to forecast new jobs that could emerge for marketing majors as AI reshapes the industry from https://chatgpt.com/g/g-wg93fVwAj-jobsgpt-by-smarterx-ai

I feel good about what I’m doing in my classes. I’ve always focused on higher-level strategic thinking and creativity grounded in human insight and emotions through storytelling. Now I’m teaching students how to integrate AI into marketing, communications, and learning tasks. What can you do to help prepare for this future?

I asked Anthropic’s Claude 3.7 to forecast how marketing-related jobs will change with AI agents and make recommendations for professors. https://claude.ai/

This Was 50% Human Created Content!

More Than Prompt Engineers: Careers with AI Require Subject Matter Expertise [Infographic].

This graphic shows that the stages of learning are attention, encoding, storage, and retrieval. You need your brain to go through this process, not just AI to do the process for you.

This is the fourth post in a series of five on AI. In my last post, I proposed a framework for AI prompt writing. But before you can follow a prompt framework, you need to know what to ask and how to evaluate the response. This is where subject matter expertise and critical thinking skills come in. They are a reason we need to keep humans in the loop when working with large language models (LLMs) like ChatGPT (Copilot), Gemini, Claude, and Llama.

Photo by Shopify Partners from Burst

Will we all be prompt engineers?

Prompt engineering is promoted as the hot new high-paying career. Learning AI prompt techniques is important but doesn’t replace being a subject matter expert. The key to a good prompt is more than format. As I described in my post on AI prompts, you must know how to describe the situation, perspective, audience, and what data to use. The way a marketer or manager will use AI is different from the way an accountant or engineer will.

You also must know enough to judge AI output, whether it’s information, analysis, writing, or a visual. If prompt engineers don’t have subject knowledge, they won’t know what AI got right, what it got wrong, and what is too generic. AI is not good at every task, mixing generic and wrong responses in with the right ones. With hallucination rates of 15% to 20% for ChatGPT, former marketing manager Maryna Bilan says AI integration is a significant challenge for professionals, one that risks a company’s reputation.

AI expert Christopher S. Penn says, “Subject matter expertise and human review still matter a great deal. To the untrained eye, … responses might look fine, but for anyone in the field, they would recognize responses as deeply deficient.” Marc Watkins of the Mississippi AI Institute says AI is best with “trained subject matter experts using a tool to augment their existing skills.” And Marketing AI Institute’s Paul Roetzer says, “AI can’t shortcut becoming an expert at something.”

Prompt engineering skills are not enough.

As a college professor, this means my students still need to do the hard work of learning the subject and discipline on their own. But their social feeds are full of AI influencers promising learning shortcuts and easy A’s without listening to a lecture or writing an essay. Yet skipping the reading while having GPT take lecture notes, answer quiz questions, and write your report is not the way to get knowledge into your memory.

Some argue that ChatGPT is like a calculator. Yes and no. This author explains, “Calculators automate a . . . mundane task for people who understand the principle of how that task works. With Generative AI I don’t need to understand how it works, or even the subject I’m pretending to have studied, to create an impression of knowledge.”

My major assignments are applied business strategies. I tell students that if they enter my assignment prompt into ChatGPT and it writes the report for them, then they’ve written themselves out of a job. Why would a company hire them when it could enter the prompt itself? That doesn’t mean AI has no place. I’ve written about outsourcing specific tasks to AI in a professional field, but you can’t outsource learning the base discipline knowledge.

AI can assist learning or get in the way.

I know how to keep humans in the loop in my discipline, but I can’t teach students if they outsource all their learning to AI. Old-fashioned reading, annotating, summarizing, writing, in-person discussion, and testing remain important. Once students have the base knowledge, we can explore ways to use generative AI to supplement and shortcut tasks, not skip learning altogether. We learn through memory, and scientists have studied how memory works. Used the wrong way, AI can skip all stages of learning.

Click the image for a downloadable PDF of this graphic.

I remember what it was like being a student. It’s very tempting to take the second path in the graphic above – the easiest path to an A and a degree. But that can lead to an over-reliance on technology, no real discipline knowledge, and a lack of critical thinking skills. The tool becomes a crutch for something I never learned how to do on my own. My performance is dependent on AI’s performance, and I lack the discernment to know how well it performed.

Research skills in searching databases, evaluating information, citing sources, and avoiding plagiarism are needed to discern AI output. The LLM-powered search engine Perplexity promised reliable answers with complete sources and citations, but a recent article in WIRED found that it makes things up, and Forbes has accused it of plagiarizing its content.

A pitch from OpenAI selling ChatGPT Edu says, “Undergraduates and MBA students in Professor Ethan Mollick’s courses at Wharton completed their final reflection assignments through discussions with a GPT trained on course materials, reporting that ChatGPT got them to think more deeply about what they’ve learned.” This only works if the students do the reading and reflection assignments themselves first.

Outsourcing an entire assignment to AI doesn’t work.

A skill I teach is situation analysis. It’s a foundation for any marketing strategy or marketing communications (traditional, digital, or social) plan. Effective marketing recommendations aren’t possible without understanding the business context and objective. The result of that situation analysis is writing a relevant marketing objective.

As a test, I asked ChatGPT (via Copilot) to write a marketing objective for Saucony that follows SMART (Specific, Measurable, Achievable, Relevant, Time-bound) guidelines. It recommended boosting online sales by targeting fitness enthusiasts with social media influencers. I asked again, and it suggested increasing online sales of trail running shoes among outdoor enthusiasts 18-35 using social media and email.

Then I asked it to write 20 more and it did. Options varied: focusing on eco-friendly running shoes for Millennials and Gen Z, increasing customer retention with a loyalty program, expanding into Europe, increasing retail locations, developing a new line of women’s running shoes, and increasing Saucony’s share of voice with a PR campaign highlighting the brand’s unique selling propositions (USP). It didn’t tell me what those USPs were.

Which one is the right answer? The human in the loop would know based on their expertise and knowledge of the specific situation. Generated with AI (Copilot) ∙ July 2, 2024 at 3:30 PM

I asked Copilot which is best. It said, “The best objectives would depend on Saucony’s specific business goals, resources, and market conditions. It’s always important to tailor the objectives to the specific context of the business. As an AI, I don’t have personal opinions. I recommend discussing these objectives with your team to determine which one is most suitable for your current needs.” If students outsource all learning to LLMs how could they have the conversation?

To get a more relevant objective, I could upload proprietary data like market reports and client data and then have AI summarize it. But uploading Mintel reports into LLMs violates licensing agreements, and many companies restrict this as well. Even if I work for a company that has built an internal AI system on proprietary data, its output can’t be blindly trusted. Ethan Mollick has warned that companies building talk-to-your-documents RAG systems with AI need to test the final LLM output, as it can produce many errors.

I need to be an expert to test LLM output in open and closed systems. Even then, I’m not confident I could come up with truly unique solutions based on human insight if I didn’t engage with the information on my own. Could I answer client questions in an in-person meeting after only a brief review of AI-generated summaries and recommendations?

AI as an assistant to complete assignments can work.

For the situation analysis assignment, I want students to know the business context and form their own opinions. That’s the only way they’ll learn to become subject matter experts. Instead of outsourcing the entire assignment, AI can act as a tutor. Students often struggle with the concept of a SMART marketing objective. I get a lot of wrong formats no matter how I explain it.

I asked GPT whether given statements were marketing objectives that followed SMART guidelines. I fed it right and wrong statements. It got them all correct. It also did an excellent job of explaining why each statement did or did not adhere to SMART guidelines. Penn suggests “explain it to me” prompts: tell the LLM it is an expert in a specific topic you don’t understand and ask it to explain the topic in terms of something you do understand. This is using AI to help you become an expert versus outsourcing your expertise to AI.
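For readers who want to see what “following SMART guidelines” can mean mechanically, here is a toy heuristic checker in Python. It only pattern-matches two of the five criteria (Measurable and Time-bound); the example objectives and regex patterns are my own simplistic assumptions, and real evaluation of Specific, Achievable, and Relevant still requires human (or well-prompted LLM) judgment:

```python
import re

def looks_smart(objective: str) -> dict:
    """Toy heuristic: flag whether an objective states a measurable
    quantity and a time bound. Specific, Achievable, and Relevant
    cannot be checked by pattern matching and need human judgment."""
    # Measurable: any number, percentage, or dollar figure
    measurable = bool(re.search(r"\d+\s*%|\$\d|\d+\b", objective))
    # Time-bound: a deadline word followed later by a quarter, period, or year
    time_bound = bool(re.search(
        r"\b(by|within|in)\b.*\b(q[1-4]|month|year|week|20\d\d)",
        objective, re.I))
    return {"measurable": measurable, "time_bound": time_bound}

good = ("Increase online sales of trail running shoes by 15% among "
        "18-35 outdoor enthusiasts by Q4 2025.")
vague = "Become the most loved running shoe brand."
print(looks_smart(good))   # both flags True
print(looks_smart(vague))  # both flags False
```

Notice that the checker passes an objective that a strategist might still reject: a statement can be measurable and time-bound yet irrelevant to the business context, which is exactly the gap subject matter expertise fills.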

ChatGPT can talk but can it network?

Last spring I attended a professional business event. We have a new American Marketing Association chapter in our area, and they had a mixer. It was a great networking opportunity. Several students from our marketing club were there mingling with the professionals. Afterward, a couple of the professionals told me how impressed they were with our students.

These were seniors and juniors. They had a lot of learning under their belts before ChatGPT came along. I worry about the younger students. If they see AI as a way to outsource the hard work of learning, how would they do? Could they talk extemporaneously at a networking event, interview, or meeting?

Will students learn with the new AI tools that summarize reading, transcribe lectures, answer quiz questions, and write assignments? Or will they learn to be subject matter experts who have discerned, via AI Task Frameworks and AI Prompt Frameworks, the beneficial uses of AI, making them an asset to hire? In my next post, the final in this five-part AI series, I share a story that inspired this AI research and explore how AI can distract from opportunities for learning and human connection.

This Was Human Created Content!