Full Case Study: High School Students Use AI to Teach Phishing Awareness
What Happened
At Eminence High School, a cybersecurity teacher ran a phishing simulation project in which students wrote fake phishing emails designed to show staff how phishing attacks work. (Education Week)
- The simulation was part of a classroom assignment on cybersecurity and ethical technology use. (Education Week)
- Students were allowed to use generative AI tools to help craft realistic phishing emails — mimicking techniques cybercriminals might use. (Education Week)
- The school’s tech team intentionally let these fake emails get delivered to staff inboxes so that the simulation could run. (Education Week)
Simulation Results
- The first phishing email, written without AI, succeeded in persuading 14 staff members to click on a fake link. (Education Week)
- A second email, crafted with AI assistance, more than doubled that number, showing how convincing AI-generated text can be. (Education Week)
- Staff exposure to these simulated attacks became a learning experience, helping them notice tactics attackers might use in real threats. (Education Week)
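In concrete terms, the gap between the two emails can be expressed as a click-through rate. The sketch below uses the article's reported click counts; the recipient count of 100 and the AI-assisted figure of 29 are illustrative assumptions, since the article says only that the AI email "more than doubled" the 14 clicks.

```python
def click_rate(clicks: int, recipients: int) -> float:
    """Fraction of recipients who clicked the simulated link."""
    return clicks / recipients

# Figures from the article: 14 staff clicked the non-AI email, and the
# AI-assisted email "more than doubled" that (29 is an assumed example).
# The recipient count of 100 is hypothetical; the article does not report it.
recipients = 100
baseline = click_rate(14, recipients)
ai_assisted = click_rate(29, recipients)
print(f"baseline: {baseline:.0%}, AI-assisted: {ai_assisted:.0%}")
```

Even under these assumed totals, the comparison makes the article's point: the same staff, facing a better-written message, clicked at more than twice the rate.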
Educational Purpose & Ethics
Before running the simulation:
- The teacher discussed ethical technology use with students — emphasizing that AI can be used for both good and bad purposes. (Education Week)
- Students and parents signed a technology acceptable‑use agreement that allowed the project to proceed. (Education Week)
This context was important to ensure the students understood why they were creating deceptive emails — not to harm, but to educate others about security. (Education Week)
Why It Matters: Comments & Expert Views
Teacher’s Perspective
- The teacher leading the project said there’s no way around it — everyone needs basic cybersecurity education for society to stay safe. (Education Week)
- She believes students gain valuable skills whether they pursue cybersecurity careers or simply need to defend themselves online. (Education Week)
Tech Leader’s Comment
- The district’s technology director, commenting on the project, joked that while the simulation was “successful,” it was not success in a malicious sense but success as a learning tool. (Education Week)
Broader Educational View
- This case reflects a growing belief in education circles that students should learn cyber‑risk concepts early, not just technical skills but social engineering tactics. (Education Week)
- Using AI in simulations shows how real attackers might use similar tools — and helps both students and staff learn how to recognize signs of phishing and manipulation before they fall for real attacks. (Education Week)
Broader Context on Phishing Awareness and AI
This student project aligns with emerging trends in cybersecurity training:
- AI tools are increasingly used to create realistic phishing simulations that help people learn to spot threats — both in schools and workplaces. (Proofpoint)
- Research shows that phishing simulations can be powerful awareness tools when combined with proper training and ethical framing — though they must be used carefully so participants learn rather than simply feel tricked. (cybersecuritydive.com)
- In higher education and corporate environments alike, AI‑enhanced phishing tests and awareness campaigns are becoming part of standard cybersecurity education strategies. (dukechronicle.com)
Lessons From the Case
- AI can make training more realistic — both educators and defenders must adapt as AI gets better at mimicking human language. (Education Week)
- Ethics and consent matter — having clear agreements from students and staff ensures such exercises are responsible and instructional. (Education Week)
- Learning by doing works — students not only learned about phishing but took part in demonstrating why training is needed. (Education Week)
The detailed breakdown below revisits the case studies and key comments on how students used AI to raise cybersecurity awareness through phishing email simulations, drawing on recent reporting, teacher insights, and broader expert context:
Case Study 1 — High School Students Run AI‑Assisted Phishing Simulation
What the Project Was
At Eminence High School, a cybersecurity teacher designed a classroom activity where students created simulated phishing emails to raise awareness among staff about how easy it is to be tricked online.
- Students were allowed to use generative AI tools to write convincing phishing messages — reflecting how real attackers increasingly use AI.
- The school’s tech team deliberately delivered these simulated messages to staff email inboxes as part of the exercise.
What Happened in the Simulation
- A first phishing email generated without AI succeeded in getting 14 staff members to click a fake link.
- A second email crafted with AI help drew noticeably more clicks, and did so more quickly.
- All clicks were on simulated test links, not real malware or harmful sites.
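Because all clicks landed on harmless test links, the exercise depends on being able to tell who clicked without exposing anyone to real risk. One common way to do that, sketched below as a hypothetical (the article does not describe the school's actual tooling), is to embed a unique, signed token in each recipient's link; the URL and names here are illustrative.

```python
import hashlib
import hmac
import secrets
from urllib.parse import urlencode

# Hypothetical sketch: each recipient gets a unique, HMAC-signed token in
# their test link, so the tracking server can attribute a click while the
# link itself leads only to a harmless landing page.
SIGNING_KEY = secrets.token_bytes(32)  # kept server-side, never emailed

def make_tracking_link(recipient_id: str,
                       base_url: str = "https://sim.example.edu/t") -> str:
    """Build a per-recipient test link carrying a verifiable token."""
    token = hmac.new(SIGNING_KEY, recipient_id.encode(),
                     hashlib.sha256).hexdigest()[:16]
    return f"{base_url}?{urlencode({'u': recipient_id, 'k': token})}"

def verify_click(recipient_id: str, token: str) -> bool:
    """Check that a recorded click carries the token issued to that recipient."""
    expected = hmac.new(SIGNING_KEY, recipient_id.encode(),
                        hashlib.sha256).hexdigest()[:16]
    return hmac.compare_digest(expected, token)
```

Signing the token (rather than using a bare ID) means a forged or mistyped link cannot pollute the click data, which keeps the simulation's results trustworthy.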
Why It Worked
The simulation showed that:
- AI can make deceptive emails more convincing, even when written by students.
- Staff — like anyone — can be vulnerable to well‑written phishing messages.
Case Study 2 — Ethical Framework & Consent Process
Before running the simulation, the teacher and school took deliberate steps to frame it ethically:
Clear Rules and Agreements
- Parents and students signed an acceptable‑use agreement for technology that included consent for this exercise.
- The teacher discussed ethical technology use with students — emphasizing this was training, not mischief.
Why This Matters
This helped ensure:
- Participants understood why the exercise existed (to learn, not to deceive).
- Staff were not exposed to real security risks beyond awareness‑raising.
Comments From Educators and Experts
Teacher Leading the Project
The cybersecurity teacher said the simulation helped teachers experience how phishing works — an experience that reading about it doesn’t fully provide.
She described it as an effective wake‑up call about how easy it is to fall for realistic threats.
School Technology Director
The technology director remarked that while results showed many clicks on test links, the real success was in starting a conversation about cybersecurity best practices.
Broader Cybersecurity Educators
Security education experts note:
- Simulations can be more impactful than lectures because they put people in the role of decision‑maker.
- Using AI to craft simulated phishing messages reflects real tactics attackers may use, making training more realistic. (General cybersecurity training research)
What These Cases Teach
1. AI Can Be a Double‑Edged Sword
- Tools that make writing easier can also help craft deceptive messages.
- Teaching people how these tools can be misused helps improve real‑world defenses.
2. Active Simulations Boost Awareness
- Staff often think they know what phishing looks like — but clicking behavior shows how deceptive real wording can be.
- Realistic simulations help close that awareness gap.
3. Ethics and Consent Are Key
- Without clear consent and ethical framing, such exercises could harm trust.
- Getting buy‑in from students, parents, and staff ensures the learning goal stays clear.
Expert Perspectives on Phishing Simulations
Cybersecurity training research supports the value of simulations when combined with:
- Immediate feedback for participants
- Follow‑up lessons on how to recognize red flags
- Ongoing awareness campaigns
In other words, simulations should be part of (not the whole of) cybersecurity education.
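As one illustration of the immediate feedback recommended above, a follow-up lesson could run the email a participant clicked through a simple phrase checker and show which red flags they missed. The sketch below is a toy heuristic; the flag list is invented for illustration and is nothing like a real detection product.

```python
# Toy red-flag checker for post-click feedback. The categories and phrases
# are illustrative assumptions, not a real phishing-detection ruleset.
RED_FLAGS = {
    "urgency":    ["urgent", "immediately", "within 24 hours"],
    "credential": ["verify your password", "confirm your account"],
    "pressure":   ["account will be suspended", "final notice"],
}

def flag_phrases(email_text: str) -> list[str]:
    """Return the red-flag categories whose phrases appear in the text."""
    text = email_text.lower()
    return [name for name, phrases in RED_FLAGS.items()
            if any(p in text for p in phrases)]

sample = ("URGENT: verify your password immediately "
          "or your account will be suspended.")
print(flag_phrases(sample))  # ['urgency', 'credential', 'pressure']
```

Showing participants the specific flags in the message they fell for turns the "you clicked" moment into a lesson rather than a trick, which is exactly the framing the research above calls for.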
