Your Strongest Differentiator

Assessments & Academic Integrity

Detection doesn't work. Process does. Our assessment philosophy is built on transparency, verification, and authentic demonstration of learning.

Why We Don't Rely on Detection

AI detection tools are unreliable, punitive, and miss the point entirely.

Detection Tools Are Unreliable

AI detectors produce high false-positive rates—flagging human-written work as AI-generated and missing actual AI use. They're especially biased against non-native English speakers and students with learning differences.

Research shows detection accuracy as low as 50–70% in real-world conditions. That's not defensible.

They're Punitive, Not Pedagogical

Detection creates adversarial relationships between students and teachers. It assumes guilt, encourages evasion, and teaches students to game the system instead of learn.

We're not trying to catch cheaters—we're building transparent learning environments.

They Miss the Real Problem

Detecting AI use doesn't address why students use it inappropriately—unclear expectations, assessments that can be easily automated, or lack of skill development. Detection is a band-aid, not a solution.

Better pedagogy > better policing.

Our Approach: Process-Based Authenticity

Instead of trying to catch AI use after the fact, we build transparency and verification into the learning process itself.


Documentation

Students document every step of their AI use: prompts, iterations, failures, refinements, and decisions.

Tools Included:

  • Process log templates
  • Screenshot requirements
  • Iteration tracking sheets
  • Reflection prompts

Verification

Students complete checklists proving they understand their work—not just that they produced it.

Tools Included:

  • Verification checklists
  • Self-assessment rubrics
  • Concept check questions
  • Peer review protocols

Oral Defense

Students explain their work, reasoning, and process in live conversations that can't be AI-generated.

Tools Included:

  • Oral check question banks
  • Presentation rubrics
  • Walkthrough protocols
  • Defense scoring guides

How It Works in Practice

1. Student Completes Assignment

The student uses AI tools to research, draft, and revise—but documents every prompt, decision, and iteration in their process log.

2. Submits Verification Checklist

The student completes a checklist demonstrating understanding: "Can you explain this concept without AI?" "What would you change?" "Why did you make this choice?"

3. Participates in Oral Check

A brief (5-10 minute) conversation with the instructor: the student walks through their process, answers probing questions, and defends their reasoning.

4. Instructor Evaluates with Confidence

The instructor has a complete audit trail: the work, the process, the verification, and the defense. That is defensible evidence of authentic learning.

Result: You can confidently assign grades knowing students genuinely understand the material.

Complete Assessment Toolkit

Every curriculum includes detailed, ready-to-use assessment tools—no guesswork required.

Detailed Rubrics

Every assignment includes a comprehensive rubric with clear criteria for evaluation—defend every grade you assign.

What's Included

  • 4-5 level performance criteria (Exceeds → Needs Work)
  • Process quality assessment (not just final product)
  • Critical thinking & iteration evaluation
  • Transparency & documentation scoring
  • Reflection & metacognition criteria

Sample Rubric Criteria

Process Documentation (25%)

Complete log of prompts, iterations, and decisions with clear reasoning

Critical Evaluation (25%)

Identifies AI limitations, evaluates outputs, demonstrates independent thinking

Final Product Quality (30%)

Meets assignment requirements with evidence of refinement and iteration

Oral Defense (20%)

Articulates process, defends choices, demonstrates understanding
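For instructors who tally scores in a spreadsheet or script, the sample weights above combine into a final grade as a simple weighted sum. A minimal sketch follows; the dictionary keys, the 0-4 scoring scale, and the `weighted_score` helper are illustrative assumptions, not part of the curriculum materials:

```python
# Illustrative sketch only: weights come from the sample rubric above;
# the 0-4 scale and criterion names are assumptions for this example.
RUBRIC_WEIGHTS = {
    "process_documentation": 0.25,
    "critical_evaluation": 0.25,
    "final_product_quality": 0.30,
    "oral_defense": 0.20,
}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores (0-4) into one weighted total on the same 0-4 scale."""
    return sum(RUBRIC_WEIGHTS[criterion] * score for criterion, score in scores.items())

# Example: strong process and defense, somewhat weaker product.
total = weighted_score({
    "process_documentation": 4,
    "critical_evaluation": 3,
    "final_product_quality": 3,
    "oral_defense": 4,
})
print(round(total, 2))  # 3.45
```

Because the weights sum to 100%, the result stays on the same scale as the individual criterion scores, which keeps grades easy to explain to students.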

Oral Check Question Banks

Structured protocols for brief verbal assessments that verify genuine understanding—impossible to AI-generate.

Process Questions

  • "Walk me through your first attempt."
  • "What didn't work and why?"
  • "Show me where you got stuck."
  • "How did you decide to revise?"

Understanding Questions

  • "Explain this concept in your own words."
  • "What would happen if...?"
  • "Why did you choose this approach?"
  • "What's an alternative method?"

Critical Thinking Questions

  • "What are the limitations here?"
  • "How would you verify accuracy?"
  • "What biases might exist?"
  • "What's missing from this analysis?"

Note: Oral checks are brief (5-10 minutes) and can be conducted during class, office hours, or via video. Question banks include 20-30 questions per assignment—instructors select 3-5 based on student work.

Audit Trail & Process Logs

Students maintain transparent records of their entire AI interaction—creating a verifiable trail of their learning process.

What Students Document:

Initial Prompts

First attempt + what they were trying to achieve

AI Responses

Screenshots or copy-paste of outputs

Iterations & Refinements

How prompts evolved, what changed and why

Failures & Dead Ends

What didn't work, lessons learned

Decision Points

Why they chose one path over another

Final Reflection

What they learned, what they'd do differently

Why This Works

Process logs make AI use transparent, not hidden. Students can't submit AI-generated work without understanding it—because they have to explain every step. Instructors can trace the learning journey and identify where students struggled, grew, or need support.

See It In Action

Sample Artifacts

Real examples of student work, process logs, rubrics, and oral check protocols (student information redacted).

Sample Process Log

Complete student documentation of AI-assisted research project with prompts, iterations, and reflections.

Includes:

  • 12 documented iterations
  • Screenshot evidence
  • Decision rationale
  • Self-assessment checklist

Annotated Rubric

Detailed rubric for AI ethics policy memo with scoring examples and instructor notes.

Features:

  • 5-level performance criteria
  • Process & product scoring
  • Sample student responses
  • Common pitfalls guidance

Oral Check Protocol

Complete question bank and scoring guide for 10-minute oral defense with sample responses.

Contents:

  • 25 question bank options
  • Scoring criteria (1-4 scale)
  • Follow-up prompts
  • Time management guide

Verification Checklist

Student self-assessment tool with comprehension checks and understanding verification.

Sections:

  • "Can you explain...?" questions
  • Comprehension verification
  • Limitation identification
  • Improvement recommendations

Student Portfolio

Curated semester portfolio with reflective essay and complete process documentation.

Artifacts:

  • 5 best works with logs
  • Growth reflection (2000 words)
  • Iteration examples
  • Oral defense transcript

Policy Memo Example

Student-written AI ethics policy with instructor comments showing assessment in action.

Highlights:

  • Stakeholder analysis
  • Risk assessment
  • Evidence-based recommendations
  • Rubric scoring annotations

Full sample artifacts with complete rubrics, scoring guides, and student work examples are available in the curriculum preview. Request access to explore complete assessment materials.

Ready to See the Full Assessment System?

Request a curriculum preview to explore complete rubrics, sample artifacts, oral check protocols, and assessment guides.