Executive Summary
Teacher’s Pet is a browser-based companion that nurtures self-regulated learning (SRL) by combining real-time feedback, playful rewards, and intuitive teacher analytics. Drawing on my senior thesis research in metacognition, I began with a small-scale Chrome extension that detected off-task behaviors. Over time, this prototype evolved into a more holistic system featuring the AI avatar “Capy,” a rewards-based “pet shop,” and a teacher dashboard that highlights both engagement patterns and targeted interventions. By uniting responsive prompts, reflective problem-solving exercises, and concise data insights, Teacher’s Pet aspires to spark lasting study habits, deeper curiosity, and an enduring passion for learning in all students.
Overview
Project Motivation
For the past six years, I have focused on leveraging technology to address educational inequities. While many digital platforms successfully deliver content, few encourage the level of introspection and deliberate practice needed to transform knowledge into long-lasting skill. My senior thesis showed how ongoing feedback can strengthen a student’s ability to plan, monitor, and evaluate their learning, yet such interventions seldom appear within everyday classroom tools.
Teacher’s Pet fills this gap by seamlessly integrating responsive encouragement and reflective prompts into each student’s digital workspace. As AI leaders continue to discuss generative models as personal tutors (as in Jensen Huang’s recent emphasis on ChatGPT, Gemini, and Perplexity), significant gaps remain for younger learners, who need direct scaffolding to recognize what they do not know, ask the right questions, and decide when to seek help. Teacher’s Pet addresses these needs by combining real-time, individualized support with the analytics teachers need to spark a culture of metacognitive awareness.
My Role and Goals
I oversaw Teacher’s Pet from early brainstorming to iterative pilot testing. Drawing on metacognitive research, I designed Capy to detect telltale behaviors like rapid guessing or multiple answer changes and to respond with timely guidance. Simultaneously, I mapped out the teacher dashboard, ensuring that instructors received straightforward action items instead of overwhelming data dumps.
In shaping this project, I pursued three overarching goals:
- Promote self-regulation by encouraging students to pause, reflect, and adapt their strategies as they complete assignments, instead of racing through tasks or relying on rote memorization.
- Support educators by providing immediate, high-level engagement data so teachers can identify emerging issues and guide students more effectively.
- Advance equitable access by building everything as a simple browser extension, reducing adoption barriers and eliminating the need for additional hardware or software installations.
Development and Iteration
Teacher’s Pet began as a Chrome extension that detected non-academic browsing and provided brief interactive activities to re-engage learners (see demo video above). This initial prototype gave my team and me valuable insights into how timely prompts might redirect students back to their academic tasks. Encouraged by those results, we introduced a second iteration that detected periods of extended inactivity and offered additional support for students who seemed to be struggling (see demo video below). In each version, we tested prototypes with graduate students, middle school students, middle school teachers, and CMU faculty, simulating how Teacher’s Pet could guide learners toward deeper focus.
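The first prototype’s off-task check can be sketched roughly as follows. This is a minimal illustration, not the actual implementation: it assumes a teacher-configured allowlist of academic hosts (the site names and the `isOffTask` helper are hypothetical), against which a content script would compare the active tab’s URL.

```typescript
// Illustrative allowlist of academic hosts; in practice a teacher would
// configure this per class. The entries here are placeholders.
const academicHosts = new Set([
  "classroom.google.com",
  "docs.google.com",
  "khanacademy.org",
]);

function isOffTask(url: string): boolean {
  // Treat unparseable URLs (e.g. browser-internal pages) as on-task
  // so students are never interrupted over browser chrome.
  let host: string;
  try {
    host = new URL(url).hostname;
  } catch (err) {
    return false;
  }
  // A site is on-task if it matches an allowlisted host or a subdomain of one.
  for (const allowed of academicHosts) {
    if (host === allowed || host.endsWith("." + allowed)) return false;
  }
  return true;
}
```

A content script could run this check on tab-change events and, when it returns true, trigger one of the brief re-engagement activities described above.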
These early interventions demonstrated the potential for a fully integrated learning ecosystem. Motivated by that promise, we broadened Teacher’s Pet into a layered platform that combines AI-driven feedback with user-friendly design. We adopted a “Wizard of Oz” testing method to refine each element in real time: rather than relying on a fully automated backend, we manually delivered Capy’s messages to better understand which prompts resonated most with users. To gauge early effectiveness, we ran a brief pilot in which one group used a standard online quiz and another used Teacher’s Pet (see demo video below). While the sample size was modest, many participants noted that Capy’s interventions inspired them to think more carefully before answering. Educators who observed the pilot study also highlighted how valuable it could be to pair real-time feedback with rewards and concise teacher insights.
The Student Experience
In Idle Mode, when students go inactive or browse unrelated sites, Capy gently appears with a “Need help?” prompt (see image below). At that point, learners can try a simpler problem, review a short tip, or complete a reflective exercise that realigns them with the task.

In Working Mode, if a learner switches answers frequently or responds too quickly, Capy prompts them to double-check their reasoning (see image below). This intervention encourages explicit monitoring and reflection over impulsive guessing.

After an assignment is submitted, Teacher’s Pet awards productivity points based on consistent focus and mindful activity (see image below). Students can trade these points for virtual items in the “pet shop,” reinforcing the idea that steady effort leads to tangible rewards.

Together, these modes push students to adopt more deliberate study habits without overwhelming them. In line with Zimmerman’s cyclical SRL model, the system encourages students to monitor their performance and evaluate their learning strategies, thereby building a stronger foundation for independent problem-solving.
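The Working Mode heuristics described above can be sketched as a small classifier over answer events. This is a hedged sketch only: the event shape, thresholds, and flag names are assumptions for illustration, not the values or structures used in our pilot.

```typescript
// Each answer event records when it occurred (relative to when the question
// was shown) and which option the student picked.
interface AnswerEvent {
  timeMs: number;  // milliseconds since the question appeared
  choice: string;  // the selected option
}

type BehaviorFlag = "rapid-guess" | "frequent-switching" | null;

function checkAnswerBehavior(
  events: AnswerEvent[],
  minThinkMs = 3000, // assumed: first answers faster than this look like guesses
  maxSwitches = 2,   // assumed: more changes than this suggests second-guessing
): BehaviorFlag {
  if (events.length === 0) return null;
  // A near-instant first answer is treated as a rapid guess.
  if (events[0].timeMs < minThinkMs) return "rapid-guess";
  // Count changes of mind: each event whose choice differs from the previous.
  let switches = 0;
  for (let i = 1; i < events.length; i++) {
    if (events[i].choice !== events[i - 1].choice) switches++;
  }
  return switches > maxSwitches ? "frequent-switching" : null;
}
```

In a deployed version, a non-null flag would cue Capy to surface its “double-check your reasoning” prompt rather than interrupting unconditionally.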
Reward System
Beyond Capy’s guided prompts, Teacher’s Pet includes a gamified virtual “pet shop” to help students stay motivated over time. Students earn up to ten points per assignment, redeemable for virtual pets, accessories, or other creative items (see image below).

By tying these small, fun rewards to consistent effort, we aim to sustain engagement past the initial novelty period and reinforce the idea that reflective practice pays off.
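The point award above can be illustrated with a simple scoring sketch. The ten-point cap comes from the text; the even split between focused time and mindful answering is an assumption, and the `SessionStats` shape and `awardPoints` helper are hypothetical.

```typescript
// Per-assignment activity summary (illustrative field names).
interface SessionStats {
  activeMs: number;          // time spent working in the assignment tab
  idleMs: number;            // time idle or on unrelated sites
  flaggedQuestions: number;  // questions flagged for guessing or switching
  totalQuestions: number;
}

function awardPoints(s: SessionStats): number {
  const totalMs = s.activeMs + s.idleMs;
  if (totalMs === 0 || s.totalQuestions === 0) return 0;
  const focusShare = s.activeMs / totalMs;                          // 0..1
  const mindfulShare = 1 - s.flaggedQuestions / s.totalQuestions;   // 0..1
  // Assumed weighting: split the ten points evenly between consistent
  // focus and mindful (unflagged) answering.
  return Math.round(5 * focusShare + 5 * mindfulShare);
}
```

Any weighting that rewards steady effort over speed would serve the same goal; the key design choice is that points track process, not correctness.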
The Teacher Experience

On the educator side, we developed a dashboard that aggregates student metrics and suggests possible interventions, all within a visually intuitive layout. At the top, teachers can navigate among all of their sections or classes. Beneath that, a bar chart distinguishes active from inactive time, allowing educators to recognize patterns or potential trouble spots. For example, if a student’s weekly active hours dip suddenly, the system may suggest reaching out before the issue escalates.
A “Suggestions” panel to the left highlights topics that multiple students struggled with, as well as individual patterns like guess-heavy attempts or last-second answer changes. Clicking on these alerts reveals more detail, such as how many times a student switched an answer, whether they looked at Capy’s prompts, and how their overall performance shifted afterward. A scheduling feature on the right-hand side displays upcoming classes or meetings, giving teachers a practical way to incorporate new strategies.
During our pilot and departmental demos, educators found that these consolidated, context-rich views saved them from wading through extensive logs or building their own spreadsheets. Instead, they could spot immediate trends, assign follow-up tasks, or organize small-group sessions to close knowledge gaps.
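The “sudden dip” suggestion mentioned above could be implemented with a simple baseline comparison. This sketch is an assumption throughout: the 50% drop threshold and the `shouldSuggestOutreach` helper are illustrative, not what the dashboard actually computes.

```typescript
// Decide whether to surface an outreach suggestion for a student, given
// their weekly active hours (oldest week first, current week last).
function shouldSuggestOutreach(
  weeklyActiveHours: number[],
  dropRatio = 0.5, // assumed: flag when the current week falls below half the prior average
): boolean {
  if (weeklyActiveHours.length < 2) return false; // no baseline yet
  const current = weeklyActiveHours[weeklyActiveHours.length - 1];
  const prior = weeklyActiveHours.slice(0, -1);
  const average = prior.reduce((sum, h) => sum + h, 0) / prior.length;
  return average > 0 && current < average * dropRatio;
}
```

Comparing against each student’s own baseline, rather than a class-wide norm, keeps the suggestion sensitive to individual habits.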
Poster Session and Live Demo
At the culmination of our three-month research and prototyping process, we presented Teacher’s Pet at a departmental poster session and live demo.

During the live walkthrough, attendees observed our project's real-time interventions in response to off-task behavior, rapid guesses, and multiple answer changes, and saw how the teacher dashboard could record each event. We also shared our comparative analysis and early findings which, despite a modest sample size, indicated that students using Teacher’s Pet spent more time on thoughtful problem-solving and reported feeling more engaged, suggesting the platform’s potential to enrich day-to-day learning activities. Finally, we previewed initial prototypes for more robust off-task detection, prompting a discussion about tailoring these interventions for different age groups, ensuring student privacy, and maintaining a positive learning atmosphere.
Collectively, the feedback affirmed the project’s viability: researchers recommended longer pilot studies (e.g., semester- or year-long), while educators requested subject-specific prompts and additional customization options.
Reflections and Future Plans
Our preliminary tests and demos revealed that timely, well-placed interventions can encourage learners to question their approaches and refine their study habits. However, some students missed the full potential of the project, underscoring the importance of clear onboarding. Meanwhile, teachers expressed interest in multi-week or multi-month trials that could shed light on whether students continue to benefit after the initial novelty fades.
With these insights in mind, future work could replace our Wizard of Oz process with an AI-driven backend that adapts prompts to each learner’s unique habits. We also envision expanding the gamification elements to include cooperative “house” or “team” challenges, inspired by successful gamification strategies in traditional classrooms. Additionally, as generative AI continues to evolve, we would ideally incorporate teacher-led prompts that help younger or less confident students ask relevant questions and manage advanced AI tutors more effectively. Our ultimate aim is to preserve human teachers’ expertise while offering dynamic scaffolds that help students identify and address their own knowledge gaps.
Conclusion
I designed Teacher’s Pet with the conviction that thoughtful technology can amplify curiosity and build resilience in learners of all ages. By giving students opportunities to plan, monitor, and evaluate their thinking, while streamlining the teacher’s role through clear, data-driven insights, the platform addresses a gap many generative AI solutions have yet to bridge. Indeed, as AI advances and more “personal tutors” emerge, systems like Teacher’s Pet may prove vital in ensuring all students, regardless of their initial level of curiosity or confidence, receive the structured support they need to thrive.
Although our design takes a playful approach to teaching self-regulation, its implications stretch much further. If we integrate metacognitive scaffolds into educational tools, we stand a chance of altering how students see themselves as learners. This shift could broaden the conversation around AI’s role in society, guiding us away from a future where only naturally inquisitive students flourish and toward one where every learner, equipped with reflective capabilities, can harness technology to reach new heights. By helping students recognize what they do and do not know, and by guiding them to ask the right questions, we can fuel a collective culture of innovation and continuous improvement that benefits us all.