New Conversation: Professional Development for AI in Education

2025-01-03

In December 2024, Z.W. Taylor interviewed Andria Layton, a high school English teacher from South Mississippi. They discussed Andria's innovative approach to introducing artificial intelligence (AI) to educators through a professional development program. Layton, who has witnessed the rapid evolution of technology in education, shared her insights into the challenges and opportunities presented by AI. Having taught in both under-resourced and well-funded districts, she has been pushing for AI integration not only to support student learning but also to enhance teacher self-efficacy and administrative efficiency. This conversation explores the barriers to AI adoption in education, the ethical considerations, and the potential for AI to reshape the future of teaching and learning.

Z.W. Taylor:

So Andria, share a bit about your background and what got you interested in AI and education. I don’t know many people who are really that interested. They’ve heard about it, but what sparked your interest?

Andria Layton:

I teach 10th grade English at a high school in South Mississippi. I started in a very low-income area, and now I’m at a school with better funding, so it’s been interesting to see the difference in technology between the two schools. My interest in technology started because I grew up during a transformative time in tech evolution. I watched how Nokia phones turned into flip phones and eventually into smartphones. I also saw Myspace, which kind of inadvertently taught us how to code, evolve into Facebook, and then Facebook evolve into what it is now, along with platforms like X (formerly Twitter). So I’ve always been very interested in the way technology evolves.

Once I became an educator, I started to realize that technology in education evolves in a different, almost slower way. I started in a rural area where we didn’t have much technology at all, and then COVID-19 pushed us to adopt tech quickly. When COVID hit, we started getting Chromebooks. Because I was tech-savvy from my personal interest, I was the go-to person for things like setting up Google Meet or Google Classroom. It was a time when some people only knew how to compose an email, and others didn’t even know that. Now I’m at a school with a lot of technology, funding, and training, but there’s still a lot of pushback against technology. It’s something that really interests me, especially in the world of education.

Z.W. Taylor:

You’ve touched on a couple of interesting intersections. Let’s start by talking about technology broadly. For those reading this interview, many countries around the world have very socialized education systems, where resources are distributed more or less equally. Public schooling, including colleges and universities, is free—taxpayer-funded as a public benefit. In the U.S., however, the distribution of money really dictates what happens with technology in education. There have been plenty of studies showing that the wealthier the district, the more technology they have—more computers per student, more software, more cloud storage, and even entire IT teams, directors of digital initiatives, and data analysts to use advanced technology to make operations more efficient. This goes far beyond classroom instruction; it also impacts the administration of schools.

I’ve taught at a middle school where one of the buildings was being powered by an extension cord, literally. They ran an extension cord from the main building to a smaller space, where 20 desktop computers were plugged into a power strip. This was because the school was bursting at the seams, and they needed space fast. Meanwhile, just one county over, there was a one-to-one district with high-speed internet in every building, Ethernet wiring to teacher computers, and the capability to stream video in class without lag. These are things you might not think of as essential for 21st-century teaching and learning in public schools, but resources absolutely make a difference.

I hear you about the technological advancements in our lifetimes. For many in Generation X and the Boomer generation, it started with typewriters and basic keyboarding skills. It wasn’t until the ‘80s and ‘90s that we saw technological breakthroughs like personal computers, flash drives, and the internet. We’ve lived through full life cycles, like watching the Nokia brick phone evolve into smartphones and now foldable smartphones with touch screens. It’s incredible.

Building on this discussion of resources and technology—artificial intelligence in education is resource-intensive. The learning curve is steep if you want to program your own AI. Building your own AI is very different from just using something like ChatGPT. What got you interested in AI, and what need did you see in your district to leverage this technology? I know you’re developing an exciting PD program, but what need did you identify at the school level for this kind of technology?

Andria Layton:

Now that I’m in a much better-funded district with a lot more technology, there’s still a big pushback against AI. I think a lot of it comes from a lack of knowledge. Many teachers immediately view AI as a tool for student cheating, without considering the administrative benefits it could offer or how students will need to interact with AI since it’s not going away. If we’ve learned anything from the evolution of technology, it’s that we’re only going to move forward. So maybe we should teach students how to use it effectively and productively.

But even with the abundance of technology and fantastic training available, there’s still a significant knowledge gap. Some teachers jump on board, and a lot of our administration is supportive, but many still don’t understand what AI is or how it can be used. What’s interesting is that many people are using AI without realizing it. For example, learning platforms like Quizlet and others in public education already use AI, but teachers often don’t recognize that they’re interacting with it. This lack of understanding prevents them from using it effectively.

So the problem really starts with that pushback, and then you see the knowledge gap. We’re teaching a lot of different programs, but we’re not teaching what artificial intelligence actually is. If we understand what it is, it becomes easier to identify and leverage it for our benefit. This is a significant gap in public education, and even in a district with a lot of technology and funding, it’s still very much present. It spans across different demographics and continues to be an issue.

Z.W. Taylor:

Totally. I share your long-term vision that AI is here, and while there will be resistance, it’s going to happen—like the saying goes, "resistance is futile." The technology will keep advancing, and if we don’t learn how to use it, the people who understand it will have an advantage. It will become part of job descriptions, and those who don't embrace AI or have the professional development opportunities to learn it in a non-threatening way could fall behind. I think it’s crucial to create spaces where people can put their guard down and authentically learn about AI without fear.

When you think about the history of educational technology, at one point, the chalkboard was considered cutting-edge, and a battery-powered calculator or an abacus was considered cheating. That shows just how far we’ve come. And now, as you mentioned, the fear around generative AI, especially students providing prompts and AI generating responses, seems to overshadow the broader applications of AI. For example, platforms like Quizlet or adaptive testing use AI to adjust the difficulty of questions based on previous answers—that’s AI. It’s a real application of algorithms in educational settings.
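To make the adaptive-testing logic concrete, here is a minimal sketch in Python, assuming a simple "staircase" rule in which a correct answer raises the difficulty of the next question and a miss lowers it. Real adaptive engines use richer models, such as item response theory, and the answer history below is invented for illustration.

```python
# Minimal staircase-style adaptive quiz: difficulty rises after a correct
# answer and falls after a miss. Real adaptive-testing engines use richer
# models (e.g., item response theory); this only illustrates the core loop.

def next_difficulty(current: int, was_correct: bool,
                    lowest: int = 1, highest: int = 5) -> int:
    """Return the difficulty level for the next question."""
    step = 1 if was_correct else -1
    return max(lowest, min(highest, current + step))

# Hypothetical answer history: True = correct, False = incorrect.
history = [True, True, False, True, False, False]
level = 3  # start in the middle of the 1-5 range
for answered_correctly in history:
    level = next_difficulty(level, answered_correctly)
    print(f"correct={answered_correctly} -> next question at level {level}")
```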

From a communications perspective, AI is everywhere too. Drip campaigns, sending automated emails or text messages, are all powered by AI. You input data, and the system updates statuses or removes contacts without human intervention. Even something like sending a newsletter via MailChimp involves AI because the program does the work without direct human input. It’s not just generative AI—it’s broader, and many people don’t realize how much they’re already using it.

To transition a bit and keep this line of thinking going, as an English teacher, how do you feel about the generative aspect of AI? Do you ever have moments where you think, "This writing doesn’t match the student’s usual level," and then realize it’s something much more advanced, like Shakespeare? How do you feel about the generative nature of AI in your field?

Andria Layton:

Understanding AI helps a lot in the classroom because, having used and familiarized myself with it, I can spot when a student may be using it. It’s not just about catching cheating, but about recognizing the patterns that emerge, especially when students use free versions of AI tools. Free tools often have certain limitations, and the answers they generate can look very similar. When you know what that looks like, you can easily identify it. It's alarming, of course, because you’re dealing with students who may cheat, but this isn’t new—before AI, we had students plagiarizing from each other. When plagiarism became a bigger issue, tools like Turnitin were developed to help catch it. These tools aren’t perfect, but they help, and as AI evolves, we’ll need to keep evolving our methods to catch it.

Personally, I use a couple of programs to detect AI use, and while they’re not 100% accurate, they’re better than nothing. Ultimately, AI is not going anywhere, so the best way to address it, in my opinion, is to teach students about AI and how to use it appropriately. Some students will misuse it, just like some students will copy someone else’s homework. But if we teach them how to interact with AI in productive ways, like using it to generate ideas rather than writing entire essays, we can guide them toward more responsible usage. For example, in class, I could have students put their own writing into an AI tool to get feedback. Done the traditional way, with students exchanging papers and writing comments, peer feedback might take a full class period; the AI route is faster and helps them learn how to use AI constructively.

In my state, where we’re under pressure to hit specific standards due to state testing, time is critical. Using AI in this way can save class time and still help students improve their writing. I know that AI can be intimidating, especially for those who don’t understand it, and even for those of us who do, it can still feel scary. But the best way to deal with it is to work with it. It’s like when people in the printing industry feared the rise of the internet, or when those who built horse-drawn wagons thought cars would take their jobs. Technology evolves, and we have to be willing to evolve with it and make it work for us.

Z.W. Taylor:

Great point. I think a lot of the onus is also on the teacher. Speaking from experience and things I did as a young teacher, I used to recycle assignments and often ask for fairly simple tasks. In the grand scheme of Bloom’s Taxonomy, you want to push students to the very top, which is creation—moving beyond synthesis to create something new. However, generative AI, in many ways, isn’t really creating anything new. It’s synthesizing ideas from the texts it was trained on, so it’s not true creation.

That leads to the types of assignments students are being given. Teachers are often creating assignments that AI can easily generate. I think there needs to be a more progressive approach to creating assignments where, one, students can’t use AI, or two, where using AI is a requirement. Can we raise the bar? Can we design assignments that either don’t allow AI or actually require students to interact with AI to complete them? By doing so, students would gain exposure to this technology in a meaningful way, rather than just simple tasks like "Tell me about the War of 1812." Instead, we could ask them to do more complex things, like comparing the War of 1812 to the most influential book they’ve read, or analyzing how their experiences relate to a specific historical event. These kinds of assignments push students to engage with the material in a way that AI isn’t trained to replicate, requiring truly unique creation—human creation.

This type of assignment could coexist alongside AI use, but we need to rethink how we approach assignments. I know we didn’t plan to discuss this, but what’s your take on the complexity of assignments and the teacher’s role in understanding AI to make assignments relevant in the year 2025?

Andria Layton:

I think what you're saying is exactly right. One of the reasons teachers are hesitant to change is that they want things to be easy. The administrative load we carry takes up a lot of time, so when we recycle assignments, we know what to expect, and grading becomes easier. But the downside is that assignments that are easy to grade are also easier to cheat on, especially with the way technology is evolving.

We need to create more complexity in assignments, where even if students use AI, they still have to engage with it in a way that requires their own thoughts and some level of learning. Even if they use AI to cheat, they should still learn something in the process. The challenge, of course, is that this approach might make grading a bit less robotic—more complex assignments require more nuanced reading and grading. But I also think that if we learn how to effectively use AI in the classroom, we could reduce our workload in other areas, giving us more time to focus on these more engaging assignments.

Ultimately, this would be more beneficial for students. Even if they try to use AI, they would still need to figure out how to use it effectively, which means they’re still learning in the process.

Z.W. Taylor:

I think about this from the higher ed perspective as well. For most college students, the goal is employment, and research backs this up: Anthony Carnevale from the Georgetown University Center on Education and the Workforce has shown that most people pursue education to improve themselves and get a job. It’s a very employment-driven goal. Sure, a small percentage might go to college for the social experience or because they want to, but for the most part, 90% of students are pursuing higher education to earn a credential that qualifies them for a job and provides the necessary training and skills.

When we think about the jobs of the future, the ones that are more or less "AI-proof" are the ones that will require the kinds of work we’ve just talked about—critical thinking, creativity, communication, and the ability to understand resources and limitations that machines don’t understand. These are jobs that require human dynamics, problem-solving, and creative thinking. While these roles will undoubtedly be augmented by AI, they will likely be much safer from automation.

So, I think there needs to be a balance—learning alongside AI—to prepare the next generation for careers in a workforce where critical thinking jobs and leadership positions won’t be easily replaced by AI. Many of the manual labor jobs, yes, will be taken over by AI and robots in the next few decades. But the "thinking jobs" will remain, and it’s our responsibility to help students prepare for that future workforce reality. How do you feel about how this shift affects young people in finding careers?

Andria Layton:

I think that’s absolutely right. That’s been my argument all along, especially when pushing for technology in general, and when there’s been pushback—like at my previous school, where we just started integrating technology during COVID. My point was always that we have to teach kids how to use technology because without it, they’ll fall behind. College students won’t succeed if they don’t know how to operate a computer, and the same goes for AI. As I said earlier, AI is going to continue to evolve, and we have to learn to work alongside it.

There will definitely be jobs that seem to be lost due to AI, but I think it’s more accurate to say that they will be changed. New opportunities will open up, and we’ll have to adjust to what’s happening. It’s already happening, and we need to make sure students are prepared for that. As a teacher of a state-tested subject, it’s easy to fall into the mindset of just teaching to the test. But the truth is, students need to know how to think critically. They need to be able to read, comprehend, and understand multiple perspectives on an issue. These are skills they need outside of school. Now, those skills are going to include understanding and working with AI. They need to know how to interact with AI to move forward.

I don’t think we can avoid this reality, and I believe it’s doing students a disservice to say AI is bad, don’t use it, or ignore it altogether. We also do them a disservice by giving them simple assignments where they can just plug in information without really thinking. Both extremes are dangerous. To prevent these issues, we need to teach students what AI is, how it works, and how it can be used responsibly.

Z.W. Taylor:

Great transition. So, how do we prepare students better? And how do we do a better job of preparing the teachers? I have a little more inside knowledge than most people who might be reading this interview, but you have a great idea for providing professional development about AI to school teachers in various capacities. Talk to me about this idea. What’s the concept, and what would your best-case scenario look like? If you had the resources—maybe not infinite, but with the resources you may have available, like superintendent-level support, district-level support, and some financial and human resources—what would the ideal version look like?

Andria Layton:

So, I’m working on creating professional development specifically around AI. The goal is to focus on how it affects teachers’ self-efficacy in implementing AI tools in the classroom. I want to give teachers the tools they need to feel more comfortable and confident using AI. One thing I’ve noticed is that the understanding of what AI is has been lacking. So, in this professional development, I want to make that clear—not just by teaching them how to use ChatGPT, but by providing a foundational understanding of what AI is. This includes how it can improve teaching, how it can affect student engagement, and what bias and ethical issues are involved. The idea is that by building this basic knowledge, the fear surrounding AI will subside. Teachers will better understand what it is and how they might already be using it without realizing it, allowing them to use it more effectively.

So far, the feedback has been great. In an ideal scenario, I would love to roll this out in the district where I currently work. It would be fantastic to start there. The best-case scenario would be to get this approved by the superintendent and the school board. I envision delivering the professional development via Google Classroom since our school is already a Google school. That way, there’s no big learning curve. I’d like to make it asynchronous, so teachers can leverage their time more effectively without having to use their planning periods. They can complete it when it fits best into their schedules.

I’d like to create short videos, around 15 minutes long, since our attention spans aren’t what they used to be, especially with platforms like TikTok. These would be followed up with newsletter blasts and possibly discussion activities within Google Classroom to help teachers think about how they might implement these tools in their classrooms. The challenge is, of course, how to motivate teachers to complete this professional development if they’re not being required to do it during their planning periods. I’m hoping to work within the existing system where teachers can earn professional development credits, which could lead to days off. This would provide some incentive and make it more appealing.

Z.W. Taylor:

This sounds like gamification! But it also sounds like a great incentive. Please, go ahead.

Andria Layton:

Yeah, so I’m still working on some of the details of the professional development criteria, but it’s something that can be achieved on a district level. Our district already has a system in place, so it’s a realistic goal to align with that. I can work with the district to ensure it meets the criteria and also motivate teachers to complete the development since it will be asynchronous and on their own time. Of course, there will always be some who won’t engage, but ideally, we can motivate many of them to participate.

What I’d like to do is focus on specific areas, and I’m still finalizing those. Some of the key topics I plan to cover include teacher workload and how to make it more efficient, AI literacy, how AI can affect student engagement, and of course, addressing bias and ethical concerns. Bias and ethics are big issues when it comes to AI, and anyone who understands AI knows that this is a major concern. While we can’t fix these issues directly or teach people to create their own AI, we can raise awareness about them. Understanding how AI works and being mindful of these concerns will allow for better supervision, helping to mitigate potential problems.

From all the research I’ve done, the biggest takeaway is that effective supervision and awareness are key. When teachers understand AI better, they’ll use it more productively and with less fear. That leads to more effective and thoughtful implementation in the classroom.

Z.W. Taylor:

Totally, and I hear you when you talk about the uses and the ethical concerns regarding AI, especially from the generative perspective. You give an AI a prompt, and based on how it was trained, it might generate biased, racist, sexist, or homophobic responses. That’s definitely a concern. But I also see so many other use cases. For example, many schools run on WordPress, and you can customize the HTML and CSS to create a great website. You can generate code, and even in web development classes or computer science classes, AI tools can help students check their code. For instance, with tools like ChatGPT, Anthropic’s Claude, Google’s Gemini platform, and more, you can integrate AI into tools like Google Classroom. AI can spot errors in HTML code—like missing brackets or improperly referenced files—helping students troubleshoot their work in real-time.
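As one illustration of the kind of AI-assisted code checking described above, the sketch below sends a student's HTML to a chat-style model and asks it to flag errors. It assumes the `openai` Python package and an API key are available, and the model name is a placeholder rather than a recommendation.

```python
# Sketch: ask a chat-style model to flag problems in a student's HTML.
# Assumes the `openai` package is installed and OPENAI_API_KEY is set in
# the environment; the model name below is a placeholder.
from openai import OpenAI

student_html = """
<html>
  <body>
    <h1>My Page<h1>
    <img src="photo.jpg>
  </body>
</html>
"""  # contains an unclosed <h1> and a missing closing quote on src

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system",
         "content": "You are a patient web development tutor. List each "
                    "HTML error you find, with its line and a one-line fix."},
        {"role": "user", "content": student_html},
    ],
)
print(response.choices[0].message.content)
```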

I also think about my background in marketing and communications and all the things that could be automated using communication platforms like Banner, Infinite Campus, or Tk20—tools that many educational organizations use. For example, when a student enrolls, you can set it up so that an automated system sends a welcome email. Using a tool like Google Forms or SurveyMonkey to collect data, you can program the system to automatically adjust communications. If a student has a sibling, you can ensure only one email is sent to the parent. If the parent is marked as a single parent, or if the student is a commuter or comes from a low-income background, the communication system can be tailored to those specifics. This kind of automation relies on programming and human input, but once set up, it only requires data uploads (like a CSV file) to run. This type of AI doesn’t really involve ethical bias issues—it's just about streamlining processes and improving efficiency. The fear around generative AI often overshadows these more practical uses.
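A rough sketch of that CSV-driven routine, with hypothetical column names: enrollment rows are grouped by parent email so a household with siblings gets one message, and the wording is tailored by simple flags. The platforms mentioned above have their own interfaces; this only illustrates the underlying logic.

```python
# Sketch of the CSV-driven communication logic described above: group
# enrollees by parent email so households with siblings get one message,
# and tailor the wording with simple flags. Column names are hypothetical.
import csv
from collections import defaultdict

def build_welcome_emails(csv_path: str) -> dict[str, str]:
    """Return one tailored welcome message per parent email address."""
    households = defaultdict(list)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            households[row["parent_email"]].append(row)

    emails = {}
    for parent_email, students in households.items():
        names = ", ".join(s["student_name"] for s in students)
        lines = [f"Welcome! We are excited to enroll {names}."]
        # Tailor the message with per-household flags from the CSV.
        if any(s.get("commuter") == "yes" for s in students):
            lines.append("Commuter parking and bus information is attached.")
        if any(s.get("low_income") == "yes" for s in students):
            lines.append("You may qualify for free meals; see the enclosed form.")
        emails[parent_email] = "\n".join(lines)
    return emails

# Usage (hypothetical file): one send per household, not per student.
# for address, body in build_welcome_emails("enrollment.csv").items():
#     send_email(address, body)  # send_email is a stand-in for the real system
```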

I think there are many use cases for AI that go beyond teaching, particularly for district-level personnel and leaders. For example, think about how AI might impact high school counselors, attendance officers, or other administrative staff. These staff members could also benefit from AI tools that streamline their workflows and make their jobs more efficient. Have you thought about how AI might impact their everyday tasks?

Andria Layton:

My school is large for our area—I'd call it like a mini college campus—and our district is one of the largest in the region. From my perspective, I just see benefit after benefit. We have a lot of counselors, but many of them manage hundreds of students each. Right now, a lot of it’s still paper and pen, with counselors trying to remember students’ names or relying on long-term knowledge, like "I’ve been here for 20 years, and I know their mom." But here’s the issue: people who’ve been in these roles for 20, 30, 40, 50, or even 60 years eventually retire, and when they do, their replacements often struggle to step into those positions because the relationships built over the years aren’t easily transferred. So, what do we do about that?

That’s where technology, like what you were talking about, could really make a difference. Think about how much easier and more efficient the counselors' jobs could be if we could automate more of their tasks. We use a lot of Google tools in our district—Google Forms, Google Sheets, and other Google platforms—but there’s so much more out there that, frankly, I'm not 100% comfortable with or knowledgeable enough to use. However, if we had the right tools and training, it could significantly lessen the workload. In public education, everyone feels overworked and underpaid. What if we could reduce that workload and use technology to help us do our jobs more efficiently, allowing us to better support our students?

Moreover, in public education, because of high turnover—whether from retirements or people leaving the field—automated systems can help keep things running smoothly when someone new comes in. These tools would benefit everyone, but there's a fear of learning new technology. It can be overwhelming, and I understand that it can be scary. But I think necessity should take precedence, especially when the potential benefits are so clear.

Z.W. Taylor:

Yep, I agree. And that’s a really great point. In many other countries, teacher turnover isn’t an issue. In fact, in some countries, teachers are revered. Becoming a teacher is seen as a prestigious career, and they’re respected in society. However, in the United States, the situation is arguably the opposite. Teachers here are often questioned, and there’s a highly skeptical attitude toward public schools, especially nowadays.

What’s really interesting is the business continuity aspect of AI, and how it can be built into organizational operations to help schools withstand turnover. I don’t know the exact numbers nationwide anymore, but at one point, it was reported that up to 40% of staff could turn over in a single year—this includes teachers, administrators, and support staff. That’s a significant amount of turnover in any organization, and it really highlights the challenges schools face.

Andria Layton:

It's insane.

Z.W. Taylor:

I can definitely see how technology could help backfill some of the more mundane processes—things like sending emails or text message reminders to students who are absent or tardy, nudging them to check the portal, reminding teachers to check their emails, or reminding parents about parent-teacher conferences. Even reminders about upcoming events, like football games or volleyball matches, can be automated. These are basic tasks that don’t necessarily require human intervention anymore. Once they’re programmed, they can be set up and largely forgotten, which is a huge efficiency boost.
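A small sketch of that set-and-forget reminder logic, with hypothetical field names: a daily job filters the attendance records and fills a message template for each absence or tardy. In practice the records would come from the student information system and the send step would go through the district's messaging platform.

```python
# Sketch of the set-and-forget reminder logic: filter today's attendance
# records and fill a message template per case. Field names are hypothetical;
# a real version would read from the SIS and send via the district's
# messaging platform on a daily schedule (e.g., cron).

TEMPLATES = {
    "absent": "Hi {parent}, {student} was marked absent today. "
              "Please check the parent portal for details.",
    "tardy": "Hi {parent}, {student} arrived late today. "
             "Reply to this message if something looks wrong.",
}

def build_reminders(attendance_records: list[dict]) -> list[tuple[str, str]]:
    """Return (phone, message) pairs for every absence or tardy."""
    reminders = []
    for rec in attendance_records:
        template = TEMPLATES.get(rec["status"])
        if template:  # skip "present" and anything unrecognized
            body = template.format(parent=rec["parent_name"],
                                   student=rec["student_name"])
            reminders.append((rec["parent_phone"], body))
    return reminders

today = [
    {"status": "absent", "student_name": "Jo", "parent_name": "Sam",
     "parent_phone": "555-0101"},
    {"status": "present", "student_name": "Lee", "parent_name": "Kim",
     "parent_phone": "555-0102"},
]
for phone, message in build_reminders(today):
    print(phone, "->", message)  # a real job would hand this to an SMS gateway
```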

We’ve talked a bit about the challenges you've encountered while developing this, and I know you’re still in the early stages. You mentioned fear as one of the challenges. What other challenges have you faced in this process? I can think of a few, but from your perspective, what else is challenging, aside from the fear or reluctance toward AI?

Andria Layton:

As I mentioned, fear is a huge barrier, but there’s also a big learning curve, and you really have to want to be a part of it. There’s just so much research out there across so many different areas, and figuring out which areas are the most beneficial can be overwhelming. Not everyone would immediately think of something like coding, which you pointed out, as relevant, but it could be incredibly beneficial. If this is a professional development program for the entire school, think about it: we have computer science, technology, and other fields that could benefit from it.

There are so many different areas to consider, and not all of them are immediately obvious within the context of education. The challenge is trying to avoid overwhelming people with too much at once. It’s about narrowing it down to niche areas that address specific needs, and finding the easiest way to scale it down into something digestible. That’s something I’m working on now—it’s a lot, and I’m honestly feeling overwhelmed at times. But I’m pushing through the hard work now to make it more manageable and easier to digest for everyone else later. That’s the whole goal of the professional development.

Z.W. Taylor:

You’re in a communication sub-discipline, where you’re expected to teach things like writing, public speaking, critical thinking, and reading comprehension. What’s interesting, though, is figuring out how to digest complex ideas and winnow them down so they don’t go over adults’ heads or, on the flip side, insult their intelligence by infantilizing them. You have to find that middle ground where they feel respected and like you’re going at their pace. I think your decision to make the professional development asynchronous is a great way to start. It allows people to take it at their own pace, which helps prevent overwhelming them. Sure, there may be some stopouts or people who don’t fully commit, but overall, it lets them choose their own adventure, which makes the whole process feel less intimidating.

Resource-wise, I know a little about your background, but Mississippi is a very impoverished state, and many school districts struggle to afford even basic needs. State-level leadership has made some interesting decisions about whether or not to accept federal funds for things like lunches, breakfasts, and pre-K. Mississippi has often been an outlier in that regard. Can you talk a little about the challenges in terms of human resources? Are there other motivated folks in districts who could take on this work? Then, of course, there’s the time and financial cost—your time, specifically. Your time is valuable, and you’re putting in a lot of work. Tell me about the resource constraints and what you think Mississippi can do moving forward. Where do we go from here?

Andria Layton:

I feel like this is Mississippi’s pressing question—resources. COVID really pushed technology into districts, but how does that look going forward? I haven’t done enough research to know what continuous funding for this looks like. But that’s a great question. I’m currently working on this as part of my dissertation, which is a small part of the bigger picture. I’m really passionate about it, and I’m interested in whether this kind of initiative exists elsewhere. I think there is a real desire and need for it.

Personally, my superintendent has expressed interest in AI in different meetings, and he sees it as imperative. Our district is also very proactive in moving funds around for different programs if possible. I’d love to believe that kind of support exists everywhere, but in more rural areas, that’s often not the case. In some places, the only reason there’s a computer in the classroom is because of COVID. What happens then? There will always be people and places that fall through the cracks, and that’s something you can’t always fix.

As for me, I’m trying to figure out how to monetize my time because it feels like there’s never enough. I’m just so used to being overwhelmed. I work full-time, I have two kids, and all my extra time goes into this dissertation or programs I sign up to work with, like the South Mississippi Writing Project. I also can’t say no to other programs or projects my school district asks me to take on. So, I’m constantly pouring myself into this. But outside of my personal effort, is there anyone else who could take this on? If so, what would that look like? It’s scary to think it might not be achievable for a lot of places, especially rural areas, but I want to be part of making it as accessible as possible. I’m just not sure how that will happen, given the financial decisions districts make and the skepticism around those choices.

Z.W. Taylor:

Yeah, and maybe this professional development could evolve into a "train the trainer" model. You create a framework, refine it over time, and then start taking it to state-level leaders. Each district could nominate their "AI steward" or AI ambassador—whatever title fits. That could be a really cool model. But, of course, you’re one person right now. Let’s talk about your one-person effort. You’re creating this professional development for school teachers about AI, but I think higher ed could benefit from it too. I work in higher ed, and I sought out resources like this, but we’re not really being provided much professional development on AI at the moment—let’s put it that way.

So, how does the research tie into the PD you’re developing? What are you interested in measuring, and what are some of the outcomes you’re hoping to achieve?

Andria Layton:

The research is overwhelming. There’s so much out there, and when I first chose this area of interest, I thought, "AI in rural Mississippi, there can’t be much out there." But it turns out the world is much bigger than just rural Mississippi, and there is a vast amount of information available. The challenge is figuring out what directly relates to our context. I see a lot of theories and discussions, especially from other countries, but not as much practical advice about what we can do with AI in the classroom.

The research shows that many teachers lack the knowledge they need about AI. There's a lot of information about AI being a collaborative tool, not something that will replace teachers' jobs. There’s also research showing how AI can affect teacher workload, either positively or negatively, and how it can impact student engagement. There's plenty of information about ethics and bias in AI, but not much about how to address those issues. That's where my work lands—figuring out what to do with all this information moving forward.

A while back, I took a survey class and conducted an unofficial survey. From the results, I saw that there was a lot of interest and willingness to learn more, but also a lot of misinformation—mainly concerns about students using AI to cheat. There were also a lot of responses that reflected a lack of understanding and a sense of not having enough time. I noticed a gap, even though we have a technology person and many great resources at our school. The research says, "Here’s what AI can do," but there’s still the question of, "What do we do about it in our school?" I think there’s a clear need for an intervention to help bridge that gap.

Z.W. Taylor:

So, the intervention would be the professional development, right? You’d start by measuring teachers' predispositions toward artificial intelligence, or even technology more broadly. Then, after the intervention, you could do a post-survey or some sort of follow-up measure. Before the professional development takes place, what exactly are you interested in measuring? Is it their attitudes toward technology? Or is it more about their actual use of technology? For example, you could ask questions like, "Do you have a smartphone? Do you have a home computer? Do you have Wi-Fi?" Is it more of a technology inventory, or are you focusing on something else? What are you hoping to measure at the beginning?

Andria Layton:

For my dissertation, I’ve developed a rough idea and some research questions. My plan is to conduct a pre-survey, an intervention, and then a post-survey. I want to evaluate the knowledge that high school teachers have about incorporating AI into their classrooms. What do they know about using AI in the classroom? Do they know nothing, a little, or a lot? How self-efficacious do they feel about implementing AI? In other words, how confident do they report being in using AI in their teaching?

I’m also interested in measuring the extent to which teachers believe their knowledge and self-efficacy with AI affect student outcomes. Do they think that a lack of knowledge or self-efficacy impacts their students’ learning? And after the professional development, do they feel that what they learned has had an effect on student outcomes?

Finally, I want to understand the reported barriers to incorporating AI. What do teachers think is stopping them from using AI in their classrooms, both before and after the intervention?
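For readers curious what the pre/post comparison might look like in practice, here is one common analysis, a paired t-test on each teacher's self-efficacy score before and after the professional development. The scores below are invented, and the actual instrument and statistical test would depend on the dissertation's design.

```python
# Sketch of one common analysis for a pre-survey / intervention / post-survey
# design: a paired t-test on each teacher's self-efficacy score before and
# after the PD. The scores below are made up for illustration.
from scipy import stats

pre  = [2.1, 3.0, 2.5, 1.8, 2.9, 3.2, 2.4]  # 1-5 self-efficacy, before PD
post = [3.4, 3.6, 3.1, 2.9, 3.5, 3.8, 3.0]  # same teachers, after PD

result = stats.ttest_rel(post, pre)  # paired: each teacher is their own control
mean_gain = sum(p - q for p, q in zip(post, pre)) / len(pre)
print(f"mean gain = {mean_gain:.2f}, t = {result.statistic:.2f}, "
      f"p = {result.pvalue:.4f}")
```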

Z.W. Taylor:

Interesting. One thing I’m curious about is the role age plays here. If you grew up in the internet age, you’re likely more familiar with technology. I’d say people born in 1980 and onward, especially during their adolescence and young adulthood, grew up with computers and internet technology. On the other hand, people born before that may not have had the same exposure. A lot of cognitive and brain science research shows that we learn a lot during the first five years of life when the brain is like a sponge. But as we get older, particularly after age 22 or 23, when the prefrontal cortex is fully developed, it becomes tougher to integrate new things into our lifestyles, workflows, and professional lives.

What role do you think age plays in this? It’s a tough question, given the stereotypes about age and technology—especially about being “behind the times.” But education, as a profession, tends to have older professionals compared to many other sectors. How do you think this plays into your dissertation? Have you thought about how age might influence your research? And beyond age, are there any other demographics you’re interested in studying? I’m curious about gender, race, income status, or family status. Do any of these variables factor into the questions you want to answer in your research?

Andria Layton:

From the informal survey I did, one of the first things I noticed was the role of age. Not to play into any stereotypes, but older teachers seemed to resist AI more, and a lot of it came down to time. They felt like they didn’t have the time to learn something new, which might not necessarily be about age but about years of teaching. Teachers with more years of experience seem to feel more pressure, as they’ve accumulated a lot of responsibilities. The way teaching has evolved, especially with the rise of technology, has led to more expectations and policies. There’s a lot more pressure now than there was, say, 20 or 30 years ago.

Personally, after seven years of teaching, I still feel like I’m just getting started. But when I hear about teaching in the past, it seems like there was so much less paperwork and more focus on what was happening in the classroom. Now, there’s so much more paperwork and many more expectations. I think that’s why some teachers, especially those closer to retirement, resist learning new things like AI. But the flip side is, if something can make their job easier, it might be worth learning. So, there are two sides to that.

What I noticed, stereotypically, is that teachers with more experience often say they don’t have the time to learn AI. But I also wonder if that’s sometimes just an excuse. If we don’t want to do something, saying we don’t have the time is a convenient way to avoid it.

As for the dissertation side, there are a lot of aspects to consider. One thing that stands out to me is the role of income, especially since I’m from South Mississippi. The availability of resources and funding plays a big role in what’s possible, and that affects how AI and other technology are incorporated into education. I’m also interested in gender, especially since education is a woman-driven profession, but men are more commonly found in the science and math fields, where you see more technology being used. These are all factors I’d like to explore further.

I’ve thought about many of these things, but I have to narrow the scope for my dissertation. However, I’m excited about the research aspect, even if I’m a little terrified—actually, a lot terrified!

Z.W. Taylor:

It really has to start there, because I imagine when early humans first started conceptualizing machines, it must have been terrifying. Think about the first fire or the creation of the wheel—those were huge technological leaps. And over time, a lot of work went into refining those innovations, like adding spokes to the wheel and creating vehicles. Each step was a frontier that needed to be explored. I think the same is true for AI in education. It’s going to transform the field, and the question is: are you on board, or are you not?

Regarding age and years of experience in education, there’s probably an element of “I’ve survived this long, I’ll be fine,” where some teachers may not see it as necessary. But that attitude can be a real disservice to students. While the teacher may be fine without it, students in the future may not survive without it. They may need AI competencies, readiness, and literacy to stay competitive in the workforce over the next 15-20 years. The students you’re teaching now might go straight into the workforce or graduate college soon, and they’ll be expected to support themselves financially. Are teachers preparing them with the skills they need to be successful post-graduation?

As we near the end of our discussion, you mentioned that so far, administrators and district-level people seem supportive of your AI professional development program. If it goes well and becomes established, what comes next? How do you plan to stay on top of the rapidly changing technology? You’re already busy, and as you know, if you create a technological training tool, by the time it’s done, it can be obsolete. How do you plan to keep up with that?

Andria Layton:

I’m really excited about the professional development. I keep hearing that it’s going to be a lot of work, and it is—it already is. But I’m excited about what it could potentially do. The topic is incredibly exciting, though one petrifying thing about AI is that it moves faster than I do. I’ve noticed this even just within the scope of my professional development. A lot of the official research is already dated by the time it’s published. For example, some of the articles I’ve come across about tools like Google Gemini are already outdated because AI evolves so rapidly. It’s hard to stay on top of it.

A lot of what I’ve been doing, aside from official research, is staying involved in what I call "magazine news"—the snippets you see from articles on Google, the things that everyone tells you not to cite. But they’re up to date, like those from reputable sources like Forbes. I spend a lot of my personal time reading about AI developments as they happen. My husband is really into AI too, and he sends me updates whenever he sees something interesting. It’s important to stay on top of those social media-style updates since the information is moving that quickly.

For the professional development, I’d love to continue it. I’m going to invest a lot of time into this, and I would really like to see it evolve. I’m not sure exactly what that looks like yet, but I’d love to keep being part of it. I think it has to be an ongoing process. I took a program evaluation class that taught me how to evaluate whether a program is effective and what to look for. I think those skills will be key in noticing what’s working and what’s not, especially as AI continues to evolve.

As people go through the professional development, having that interactive model will help me see what’s working and what isn’t. But it definitely needs to evolve and remain a continuous process. I know it’s important to refer to official research, like the articles I get from ERIC and other sources—they’re foundational. But it’s just as important to stay aware of what’s happening in the news. AI is evolving faster than official publications can keep up with. For instance, ChatGPT now has a video function, and it can tell you if your shoes match your shirt. I knew about this a week ago, but there’s nothing published about it yet. Being able to understand both the official research and the real-time developments is key to creating something that remains relevant.