MEDEX Northwest AI Policy — The One PA Program with Rules
MEDEX Northwest is the only PA program in our survey that publishes its own AI policy beyond CASPA's central rule. It permits AI for non-substantive editing only, prohibits substantive AI use, and reserves the right to use AI detectors.
Short answer: Of the 20 prominent PA programs we surveyed for AI policies in April 2026, University of Washington MEDEX Northwest is the only one with a published, program-specific AI policy that goes beyond simply mirroring CASPA's central rule. MEDEX permits limited AI use for non-substantive editing (spelling and grammar) only, prohibits AI for substantive content, and reserves the right to use AI detection tools — even though PAEA itself has cautioned PA programs that detection tools are unreliable. If MEDEX Northwest is on your application list, treat its policy as binding alongside the CASPA central certification.
This article exists because MEDEX is genuinely an outlier, and the tension between PAEA's anti-detection guidance and MEDEX's pro-detection stance is the single most interesting policy story in PA admissions for 2026.
What MEDEX Northwest actually says
The verbatim policy, as published on the MEDEX Northwest applicants page:
"Applicants are responsible for following all institutional policies on the use of artificial intelligence (AI) tools, and all written content submitted through CASPA must be the applicant's own original work. Limited use of AI or other tools for non substantive editing — such as spelling or grammar correction — is permitted, but the final submission must accurately reflect the applicant's own writing, experiences, and voice. MEDEX may use tools that detect AI generated or AI modified content, and may use AI supported systems during admissions review."
There are four distinct things happening in that paragraph. Each one matters.
1. The "follow all institutional policies" clause is a backstop
MEDEX explicitly defers to the CASPA certification and any other institutional policies. So everything in CASPA's central rule still applies — the strict prohibition on AI-written content, the prohibition on modification, the affirmative attestation. MEDEX is layered on top of CASPA, not in place of it. If CASPA bans something, MEDEX bans it. MEDEX adds extra rules but doesn't subtract any.
2. The "non-substantive editing" carve-out is more lenient than CASPA central
This is the most interesting part. CASPA's central certification reads:
"I am strictly prohibited from using Generative AI to create, write and/or modify any content, in whole or part, submitted in CASPA."
The CASPA central rule technically prohibits any modification — even a Grammarly suggestion that fixes a misplaced comma could be read as "modifying content." Most PA admissions readers and PAEA itself have tacitly accepted that grammar/spell-check tools are not the target of this rule, but the strict reading bans everything.
MEDEX's policy explicitly carves out a narrower allowed zone: "Limited use of AI or other tools for non substantive editing — such as spelling or grammar correction — is permitted." MEDEX is willing to be more explicit about what it tolerates than CASPA central is. That actually gives MEDEX applicants slightly more clarity than applicants to other programs, who have to guess where the central rule's boundaries lie.
But — and this is critical — the carve-out is narrow. "Non-substantive" is the operative word. Spelling and grammar correction are non-substantive. Rephrasing a sentence is substantive. Restructuring a paragraph is substantive. Asking an AI for feedback and incorporating any of it into the final draft is substantive. The MEDEX rule is "you can run a spell check; you cannot have an AI rewrite a sentence."
3. "Final submission must accurately reflect the applicant's own writing, experiences, and voice"
This is the authentication test. Even within the non-substantive editing carve-out, the final essay has to read as authentically yours. If you used a grammar tool aggressively enough that the prose started to sound like an AI's voice rather than yours, MEDEX's reviewer is empowered to flag it under this clause. The standard is not "did you technically only use Grammarly" — it's "does the submission reflect your own writing, experiences, and voice." The MEDEX language puts the burden on the applicant to maintain authenticity, not on MEDEX to prove violation.
4. The detection clause — and the contradiction
The last sentence is the unique one:
"MEDEX may use tools that detect AI generated or AI modified content, and may use AI supported systems during admissions review."
Two things to notice here. First, MEDEX explicitly reserves the right to run AI detection tools on submitted content. Second, MEDEX also reserves the right to use AI to help review applications ("AI supported systems during admissions review"). The program is using AI on its own end while requiring applicants not to use AI on theirs.
That second part is not unusual — admissions offices across higher education are increasingly using AI for application triage, scoring assistance, and routine document processing. What's unusual is MEDEX being upfront about it.
But the first part — reserving the right to use detection tools — puts MEDEX out of step with PAEA's central guidance.
The PAEA contradiction
PAEA, the Physician Assistant Education Association that operates CASPA, has been remarkably honest about the unreliability of current AI-detection tools. From PAEA's published guidance for member programs (What Your Program Should Know About AI and Admissions):
"PAEA will not investigate an applicant if the only evidence the applicant did not write their personal essay comes from AI detection software. The Association's position is that current AI detection tools are simply not reliable enough yet."
PAEA goes further and recommends in-person essay-writing during interviews as the preferred verification mechanism — get the applicant to write a short essay in front of you and compare voice and structure to their submitted personal statement.
This is the most pro-applicant position any centralized health-professions service has taken on AI detection. PAEA has read the research on false positives and false negatives and concluded that the technology is not trustworthy enough to use as the sole basis for a serious admissions decision. We've covered the false-positive problem in our deep dive on flagxiety — students who wrote every word of their essay themselves are still getting flagged by detection tools because the tools cannot reliably distinguish careful student writing from AI output, especially for non-native English speakers.
So when MEDEX explicitly says "MEDEX may use tools that detect AI generated or AI modified content," they are reserving a right that PAEA central has cautioned against using. The contradiction is not technically a violation — PAEA's guidance is non-binding for member programs, and individual programs are free to set their own enforcement policies. But it is the most notable disagreement in the PA admissions policy landscape right now.
What MEDEX applicants should actually do
Practical guidance if MEDEX Northwest is on your application list:
Treat the policy as binding
The MEDEX policy is published on the program's admissions page. It is therefore part of the application contract you accept when you submit. Even if PAEA centrally would not investigate based on detection-only evidence, MEDEX has reserved the right to take its own actions. For your MEDEX submission specifically, plan as if AI detection will run.
Use spell check and grammar correction only
The MEDEX carve-out for non-substantive editing covers spell check and grammar correction. It does not cover:
- AI rewriting or rephrasing
- AI feedback on structure or tone that you incorporate
- AI suggestions for word choice, even single words
- AI brainstorming where the AI's output ends up reflected in your final essay
If you would not be comfortable telling a MEDEX admissions reader exactly which AI tool you used and exactly which words came from it, do not use it.
Preserve drafting evidence
If you are ever questioned, the strongest defense against a false-positive AI detection result is your drafting history. Save your Google Docs revision history. Keep dated drafts in a folder. Save any handwritten notes. The CASPA central rule and the MEDEX policy both put the burden of authenticity on you; being able to show a trail of human revisions is the most concrete way to discharge that burden. PAEA-funded research has actually measured how detectors perform on PA application essays — and the false-positive rate on human writing is the exact reason this drafting evidence matters. See our breakdown of the PAEA AI detection research for the numbers.
Write the new CASPA AI essay carefully
Beginning with the 2026-2027 cycle, CASPA includes a new AI and Technology essay (the "Situational Decision-Making Question") that asks applicants to think critically about AI in healthcare. We have a complete guide to that essay including the verbatim prompt and seven worked angles. For MEDEX applicants specifically: this essay is doubly fraught because MEDEX is more likely than other programs to run detection tools on it, and the topic is the AI policies you must comply with. Do not use AI to write the essay about AI. The discipline of writing it without AI assistance is part of what MEDEX is testing.
Why MEDEX is the outlier — a hypothesis
We have not interviewed MEDEX leadership, so this is speculative. But the hypothesis worth considering is that MEDEX serves a population unusually likely to test the limits of the policy — in both directions.
MEDEX Northwest is a program of the University of Washington School of Medicine's Department of Family Medicine, with a long history of training PAs for primary care and underserved communities in the Pacific Northwest, Alaska, and the Western US. Its applicant pool skews older, more clinically experienced, and more non-traditional than many MD-school-affiliated PA programs: more military-experienced applicants, more career-changers, and more applicants from rural and underserved backgrounds where access to writing support is uneven.
In that population, a detection tool becomes a real risk: detectors are known to produce false positives for non-native English speakers, for applicants whose writing is unusually structured or formal, and for applicants who write the same way they speak in clinical settings. PAEA's caution is that detection tools are unreliable in general; MEDEX has chosen to reserve the right to use them anyway, presumably having decided that the risk of false negatives (an AI-generated essay that goes undetected) outweighs the risk of false positives (a human-written essay incorrectly flagged).
This is a defensible position. It is also out of step with the rest of the PA admissions ecosystem. We will be watching how MEDEX's policy evolves through the 2026-2027 cycle, especially after the new CASPA AI and Technology essay rolls out and admissions readers see how applicants engage with it.
Where MEDEX fits in the broader PA policy landscape
We did the survey work on the other 19 prominent PA programs in our program-by-program survey of PA school AI policies. The summary: 18 of the other 19 are silent at the program level, and the one with informal advisory language is the University of Iowa pre-health advising office (not the Iowa PA program admissions office). MEDEX is alone among the 20 in publishing actual program-specific rules.
If you are building your PA application list and want to compare AI policy stances across your target programs, that survey article is the place to start. MEDEX is the most stringent program-level policy we found. Every other program defers to CASPA's central rule. For an always-current list of every PA program's published AI position in one place — including any policies that emerge after our initial 20-program survey — see our PA program AI policies aggregator for 2026.
Related Reading
Medical school essays hub: Medical School Essays — The Complete Guide to AMCAS, AACOMAS, CASPA & TMDSAS — every medical school essay guide on the site, organized by application system, topic, and applicant profile.
The CASPA + AI policy cluster:
- PA School AI Policies 2026 — Why 18 of 20 Programs Are Silent — the umbrella survey this article expands
- CASPA AI Certification Decoded — What It Actually Bans — clause-by-clause read of the central rule
- CASPA AI and Technology Essay 2026-2027 — Prompt + 7 Angles — the new AI essay this cycle
- Can You Use ChatGPT for Your Medical School Application? AMCAS, AACOMAS, CASPA, TMDSAS Compared — the four-system comparison
- What Is Flagxiety? The AI Detection Anxiety Reshaping How Students Write — false positives and detection accuracy
- Medical School AI Policies 2026: AMCAS Rules & 60+ Schools — our AI policy hub for medical admissions
The CASPA writing cluster:
- Sample CASPA Essay Analysis: How 40+ Applicants Got Into Top PA Programs
- The CASPA Life Experiences Essay: What the New Prompt Actually Asks
- CASPA Personal Statement Topics to Avoid