A student's guide to AI
For teachers to share with their class. Clear, honest, and practical — covering what AI is, what it is not, and how to use it without getting yourself into trouble.
What is AI, actually?
When people say "AI" today, they usually mean tools like ChatGPT, Claude, Gemini, or Microsoft Copilot. These tools can write text, answer questions, summarise documents, and generate code.
They work by predicting what word should come next, based on patterns learned from enormous amounts of text. They do not think, feel, understand, or know things the way humans do. They generate plausible-sounding responses — and sometimes those responses are wrong.
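The "predicting the next word" idea can be sketched in a few lines of code. This is a toy illustration only: real AI models use neural networks trained on enormous amounts of text, not simple counts, and the example text below is made up. But the core idea of predicting what comes next from patterns seen before is the same.

```python
from collections import Counter, defaultdict

# Toy example text. A real model learns from billions of words.
text = "the cat sat on the mat and the cat slept on the mat the dog sat on the rug"
words = text.split()

# For each word, count which words follow it and how often.
followers = defaultdict(Counter)
for current, nxt in zip(words, words[1:]):
    followers[current][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the text."""
    if word not in followers:
        return None
    return followers[word].most_common(1)[0][0]

print(predict_next("sat"))  # prints "on" - in this text, "sat" was always followed by "on"
```

Notice that the predictor has no idea what a cat or a mat *is*. It only knows which words tend to follow which. That is why plausible-sounding output is not the same as true output.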
AI is genuinely useful. It is also genuinely limited. Understanding both is what matters.
AI gets things wrong — and sounds confident when it does
This is the most important thing to understand. AI tools can produce incorrect facts, invented statistics, and fictional references — all stated with complete confidence. This is called a "hallucination".
If you use AI to research something, you must verify what it tells you against a reliable source. Never cite an AI tool as a source of facts. Never assume what it tells you is accurate just because it sounds authoritative.
What is OK — and what is not
The line between using AI as a tool and using it to cheat depends on what you are being asked to demonstrate.
This is generally OK
- Using AI to help you understand a topic you find difficult
- Asking AI for feedback on a draft you have written yourself
- Using AI to brainstorm ideas — then developing them yourself
- Asking AI to explain what a word or concept means
- Using AI to practise a skill (e.g. exam-style questions)
This is not OK
- Submitting AI-generated text as your own work in any assessed task
- Using AI to complete a CBA, project, or essay that is meant to show your own thinking
- Copying and pasting AI output without reading, understanding, or editing it
- Using AI to avoid thinking about something rather than to help you think
When in doubt, ask your teacher. Schools are still developing their policies on this, and what is acceptable varies by subject, task, and context.
Your personal data — stay safe
Never put personal information into an AI tool. This includes:

- your own name and address
- your friends' names
- your school's name combined with identifiable details
- anything private about another person
- anything you would not want a stranger to read
AI tools process your inputs on external servers. The free versions of most tools may use your conversations to improve their systems. Treat every AI conversation as if it could be read by anyone.
Using AI to actually learn — not to avoid learning
The students who get the most from AI are the ones who use it to go deeper, not to skip the surface. Here are four ways to use AI that will genuinely improve your understanding.
- The rubber duck. Explain a concept to the AI as if teaching it. Ask it where your explanation is incomplete. This is one of the best ways to find gaps in your own understanding.
- The examiner. Ask it to give you five exam-style questions on a topic you are revising, then answer them yourself without looking anything up first.
- The editor. Write something yourself first. Then paste it in and ask: "What is unclear? What is missing? What could be stronger?" Edit based on the feedback — do not replace your writing with the AI's.
- The explainer. When you do not understand something from class, ask the AI to explain it three different ways. The third explanation often clicks when the first two do not.
AI and the Leaving Certificate / Junior Cert
The SEC (State Examinations Commission) has confirmed that submitting AI-generated content in assessed work without acknowledgement is academic dishonesty. This includes CBAs, orals, practicals, and written examinations.
The examinations themselves are still taken in person, without AI access — so building your own knowledge and skills remains essential. AI can help you prepare. It cannot sit the exam for you.
Three ways AI can help you learn — not avoid learning
AI is most useful for students when it creates more opportunities to practise thinking — not fewer. These three roles give you a way to think about when and how to use it well.
AI as tutor
Ask for explanations, worked examples, analogies or simpler rewording when you are stuck on something. Use it like a patient study partner who never gets tired of explaining.
AI as feedback partner
Write something yourself first, then ask AI what is unclear, what is missing, or what one improvement would make the biggest difference. Do not ask it to rewrite your work — ask it to help you improve it.
AI as thinking scaffold
Use it to plan, compare, question and reflect. Ask it to challenge your argument, suggest counterpoints, or ask you questions you have not thought of yet. The goal is stronger independent thinking — not a finished answer to hand in.
What not to do
Do not ask AI to produce a finished answer and submit it as your own work. You are not practising anything, and the only person you are short-changing is yourself.
Two prompts that are worth keeping
These two prompts work in almost any subject and make AI genuinely useful for learning rather than shortcutting it.
WHEN YOU DO NOT UNDERSTAND SOMETHING

"I do not understand [topic]. Explain it three different ways: first simply, then with an example, then in more depth."
WHEN YOU WANT FEEDBACK ON SOMETHING YOU HAVE WRITTEN

"Here is something I wrote: [paste your draft]. What is unclear? What is missing? What could be stronger?"
Four things to understand about how AI actually works
- How to ask well. A vague question gets a vague answer. The more context and detail you give, the more useful the response.
- How to check what AI says. AI gets things wrong — confidently. Never copy a fact, statistic or quotation from AI without checking it against a reliable source first.
- How to use AI to support thinking, not replace it. The goal is stronger work that you actually understand. If you cannot explain what you submitted, the preparation failed — regardless of how good it looks.
- What responsible use looks like. You should know where AI help is appropriate in your school and where it crosses a line. If you are not sure, ask your teacher before using it — not after.