
3 questions to ask when adopting AI tools in schools or districts

School leaders considering a new AI-driven technology must be sure the use is safe and effective, explains Adam Geller, founder of Edthena. 



Insights is a SmartBrief Education Originals column that features perspectives from noted experts and leaders in education on the hot-button issues affecting schools and districts. All contributors are selected by the SmartBrief Education editorial team.


Artificial intelligence is all around us. And it takes many forms.


From facilitating student learning to supporting teachers with their practice, we can already see the impact of this technology in education. It is and will continue to be a game changer.

But, as with any new technology implementation, change and the unknown can be hard for educators to embrace. While there is nothing to be afraid of when it comes to AI, it is up to school and district leaders to be thoughtful and deliberate in their approach to implementing new AI tools in schools so those tools can be used effectively. What purpose will these tools serve? How will they enhance what’s already taking place in the classroom?

Below are three questions every leader should ask before implementing a new AI technology in their school or district this school year.

What guardrails are in place for AI tools in schools?

The amazing thing about generative AI tools is that they can create any type of content, from a haiku about photosynthesis to an essay about the Boston Tea Party with references to Beyoncé songs. (Yes, I promise, both are possible.)

However, these generative AI tools — including ChatGPT, Bard, Claude and Bing — are unstructured resources that can lead to unstructured learning. And no one wants teachers (or students) asking about civil rights leaders and ending up with a recipe for carrot cake.

While a free-flowing marketplace of ideas is important to society, our community standards have long dictated that we curate content within a school setting. It’s why there are content filters on the internet at school to block explicit web resources. And it’s why it is OK to pick AI tools that are moderated in some manner so they are appropriate for the K-12 setting.

So, if you’re considering a generative AI tool, it’s important to ask how the tool’s outputs are bounded and moderated.

How is the tool designed to solve a specific learning need?

While the use of AI may seem new, AI tools have been around for quite some time. I personally use AI frequently to complete different tasks, and I define “everyday AI” tools as ones specifically designed to help get a job or task done more easily. The jobs can be simple, like unlocking my phone with my face.

When it comes to implementing new AI tools in schools, ask what job needs to be done, and then go find the tool or tools that will help get that job accomplished.

Want students to practice early reading skills and phonics? Look for an AI tool that listens to and corrects their reading.

Want to improve students’ grammar? Look for an AI tool that provides inline editing, flags errors and offers helpful explanations for the rewordings.

Want to help teachers engage in more self-reflection and independent learning? Look for an AI-powered tool that can coach them as they go.

Curriculum and edtech consultant Monica Burns often says “tasks before apps” for classrooms. The same is true for picking AI-enabled tools. 

Identify the need within the learning context, and then seek out a solution that leverages AI to make things easier. 

How is personal information being protected?

AI tools are built on existing data and will often learn and improve the more they are used. This means that the tools will sometimes use your — or your teachers’ or your students’ — data.

Learning from teachers’ and students’ data is not necessarily a bad thing. In fact, it could be a good thing. For example, imagine a tool that helps teachers generate reading passages at various Lexile levels. The passages teachers mark as high quality would train the algorithm to produce higher-quality content over time. 

It is important, however, to understand how personal data will be used and whether the AI tools under consideration are designed to protect sensitive information.

For instance, many generative AI tools offer a “summarize my notes” feature. In most cases, the inputted notes are probably not sensitive in nature. Creating a summary of the week’s lessons for a parent newsletter could help save a teacher time in preparing their communications.

However, inputting observational notes on a student with special needs and asking the generative AI tool for help updating the IEP language is a whole other story. Personally identifiable information needs to be protected at all times, and this includes protecting it from becoming part of the data that a large language model might use to generate new text for another user.

Again, having your data help train AI tools in schools is not always a bad thing, but you have to do your homework before diving into usage.

We can’t block AI-powered tools, so let’s adopt the right ones

AI is quickly and continually advancing and evolving, but it is here to stay. So, while it is important to be cautious when implementing AI, it is also important to understand that students will now be growing up surrounded by the technology. 

Instead of disregarding AI or finding ways to block it, think about its place in today’s classrooms. 

How can AI tools in schools be used responsibly? Safely? Effectively?

As school and district leaders, familiarize yourself with the technology and ask the right questions so students and teachers alike can benefit from this powerful technology. 

 

Adam Geller is the founder of Edthena, the maker of the AI Coach PD platform for teachers. He also is the author of “Evidence of Practice: Playbook for Video-Powered Professional Learning.” Geller started his career in education as a science teacher in St. Louis and has overseen the evolution of Edthena tools as its founder since 2011.

Opinions expressed by SmartBrief contributors are their own. 
