Kara Alaimo is an associate professor of communication at Fairleigh Dickinson University. Her book, Over the Influence: Why Social Media Is Toxic for Women and Girls, and How We Can Take It Back, was published in 2024 by Alcove Press.
When your child heads back to school, chances are they'll be using artificial intelligence for their schoolwork.
Twenty-six percent of teens aged 13 to 17 said they had used ChatGPT for their schoolwork in a 2024 Pew Research Center survey. AI chatbots have only become more common since then, so that number may well be higher now.
As a professor, I have a word for what happens when students ask a chatbot to write a paper for them: cheating. Worse, it cheats them out of the opportunity to learn. Unfortunately, it's easy for kids to get away with, because tools for detecting AI-generated content are unreliable. So when educators grade a paper, they can't always tell whether AI was used.
That's why it's so important for parents to talk with their kids about when they should use AI this school year, and when they shouldn't.
"Make sure they're using AI as a learning tool rather than a shortcut," said Robbie Torney, senior director of AI programs at Common Sense Media, a nonprofit that advocates for healthy media options for kids.
Here's how to do that.
Use AI to brainstorm and tutor, not to think or write
First, parents should talk to their kids about how the goal of school is to "learn and grow," Torney said. If AI does their work for them, it "takes away that opportunity."
But AI can also help them learn. Torney suggested having kids use it as a tutor. "It's great for explaining difficult concepts and helping when they're stuck, but the original thinking and work should be their own," he said.
AI can also help with brainstorming ideas, Torney said, but students should do their own thinking and writing.
It's important to explain why these rules matter. "Our brains are like muscles," Torney said. "Kids don't develop skills unless they practice them."
According to Torney, it's ideal for kids to agree to these boundaries before they start using AI, with parents "checking in" regularly to make sure AI tools aren't replacing learning.
Don’t believe everything AI tells you – and explore it together
Chatbots sometimes tell users things that aren't true. These are known as hallucinations, and they happen all the time.
Chatbots also miss things. For example, my students recently submitted papers on AI (what else?). Many of them were strikingly similar, which always rings an alarm bell in my head that AI may have generated them. In this case, multiple students wrote that there are no federal laws to help victims of nude deepfakes, which is no longer true.
That's why it's important to teach kids to fact-check the information they get rather than accepting AI answers at face value. One way to do that is to take material they've received at school (a lesson on photosynthesis, for example) and compare it with what a chatbot says about the subject.
It's great to experiment with this together. And parents shouldn't feel intimidated by doing this just because they don't fully understand how AI works. Most people don't.
"You don't need to be an AI expert to help your kid use AI wisely, and staying involved by asking questions and exploring together can teach them the skills they'll need in the future," Torney said.
That matters because chatbots are probably here to stay. "Accessing information via AI interfaces will become increasingly common for kids," Torney said.
Don’t take personal advice from chatbots – or share personal information
According to Torney, it's easy for kids to forget that AI chatbots are just technology. "We know that younger kids can't always tell the difference between fantasy and reality and are more likely to think of AI as a real person or friend," he said.
One concern is that chatbots trained to carry on romantic conversations may engage in sexual conversations with kids. Chatbots can also give them bad advice, encourage harmful thoughts, or even end up replacing their relationships with other people.
So it's a good idea to remind kids that AI isn't human. If a chatbot gives an answer that makes it seem otherwise, such as saying "I like your ideas," Torney suggested parents point out that it's just programmed to respond that way.
Torney also warned that kids can accidentally disclose personal information through chatbots. If your child uploads a photo of your home, he said, and the system uses it as part of its training set, it could be shown to other users. That's why it's important to talk about why they should never share personal information with AI tools.
Finally, set clear family rules for when chatbots can be used. Torney suggested considering allowing kids to use chatbots only in shared spaces like the family room, and establishing tech-free times, such as during meals or before bedtime.
Your child will probably try to use AI for their schoolwork this year. Chatbots are now so ubiquitous that knowing how to use them is a life skill for our kids.
That's why we need to teach our kids to use AI in ways that help them learn. One way to do that is by using chatbots together.
Kids also need to know not to rely on AI platforms for advice. Even if chatbots sound human, they aren't real, but the consequences of letting AI get in the way of learning certainly are.