With the rising popularity of AI tools in education, many OET candidates are turning to ChatGPT as a study partner—especially for the Writing sub-test. But the big question remains:
Can ChatGPT actually help you pass OET Writing as a nurse?
To find out, I decided to run a simple experiment: I wrote 10 OET referral letters, all with the help of ChatGPT. I followed the official format, used real OET-style case notes, and asked ChatGPT to:
Review each letter
Suggest improvements
Provide model answers and vocabulary upgrades
Explain corrections step by step
My goal was to see if ChatGPT could help me:
Write faster and more confidently
Understand common writing patterns
Reduce grammar and structure mistakes
Hit the Band B standard required for registration
In this report, I’ll share:
How I structured my 10-day practice routine
How ChatGPT performed across each writing task
What kinds of feedback were helpful—and what weren’t
The exact phrases and sentence patterns I learned
And most importantly: whether my writing actually improved
I’ll also talk honestly about the limitations of using ChatGPT alone: the times it gave vague advice, misunderstood clinical tone, or missed context from the case notes.
If you’re a nurse preparing for OET Writing and wondering whether AI is a useful tool—or just a distraction—this experiment will give you real insight into what ChatGPT can (and can’t) do for your success.
Let’s begin with how I set up the challenge—and what I expected going in.
As a nurse preparing for the OET, I quickly discovered that the Writing sub-test was the most difficult part to prepare for. Why? Because unlike Listening or Reading, which offer clear answers and ready-made materials, Writing requires personalized feedback, clinical vocabulary, and a sense of professional tone—all within 45 minutes and 180–200 words.
But unless you’re enrolled in a prep course or working with a tutor, it’s hard to get reliable correction. I didn’t have access to a professional teacher. That’s when I thought:
Can ChatGPT act as my writing coach?
I wanted to know if I could:
Use ChatGPT to review my OET referral letters
Get useful corrections and explanations
Improve my writing over time—even without a human tutor
So I created a simple experiment:
Duration: 10 consecutive days
Task: Write 1 OET referral letter each day based on sample case notes
Tool: Use ChatGPT for feedback, rewrites, grammar checks, and vocabulary suggestions
Goal: Reach the clarity, tone, and structure expected of a Band B or higher
I used the same base prompt every day:
“Please review this OET referral letter written by a nurse. Correct any grammar or tone issues, improve the structure if needed, and explain your suggestions step by step.”
Sometimes I followed up with:
“Can you rewrite this letter at a Band B level?”
“List 5 key phrases I can reuse in future writing.”
“Was anything inappropriate or unclear?”
Once I committed to this challenge, I knew the key to success was consistency and structure. So I created a daily routine that I could repeat for ten days without feeling overwhelmed. Each day, I focused on writing one complete OET referral letter using ChatGPT as my writing coach.
Here’s how my daily routine looked:
First, I gathered case notes for the day. I used either:
Realistic case notes from past OET samples (available online)
AI-generated case notes created with prompts like:
“Create OET-style case notes for a nurse writing about a patient with diabetes complications.”
I made sure the scenarios were varied—different conditions, age groups, and referral purposes.
I treated each session like a mini mock test:
Set a timer for 45 minutes
No grammar tools or Google—just me and the notes
Typed my draft in Word or Google Docs
Aimed for 180–200 words, proper structure, and clinical tone
This helped me simulate real test conditions and manage pressure.
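Hitting the 180–200 word window consistently was half the battle. Since I typed my drafts, a tiny script could give me an instant length check before pasting the letter into ChatGPT. Here is a minimal sketch (the function name and messages are my own, not part of any OET tool):

```python
def check_length(draft, lo=180, hi=200):
    """Count the words in a draft letter and compare the count
    against the OET target range (180-200 words by default)."""
    n = len(draft.split())  # split on whitespace; rough but good enough for a draft
    if n < lo:
        return n, f"{lo - n} words under the {lo}-{hi} target"
    if n > hi:
        return n, f"{n - hi} words over the {lo}-{hi} target"
    return n, "within target"
```

Word-processor counts can differ slightly (hyphenated words, numbers with units), so treat this as a rough gauge, not an examiner's count.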
After completing my draft, I copied the letter into ChatGPT and used my base prompt:
“Please review this OET referral letter. Correct grammar, improve tone, and explain why each change is made.”
Depending on the day, I also asked for:
Rewriting at Band B level
Highlighting unclear or inappropriate phrases
Listing 5 reusable clinical expressions
Summarizing the tone and structure
I kept a spreadsheet where I logged:
Mistakes I made (e.g., too casual tone, vague details)
Phrases ChatGPT suggested (e.g., “responded well to treatment”)
Grammar rules or format tips
Confidence score (1–5) after each session
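If you prefer scripting to a spreadsheet app, the same log works as a plain CSV file. The sketch below is one way to do it (the column names and function are my own invention, matching the items I tracked): it appends one row per session and writes the header only on first use.

```python
import csv
import os

# One column per item tracked: mistakes, suggested phrases, tips, confidence (1-5)
FIELDS = ["date", "mistakes", "suggested_phrases", "tips", "confidence"]

def log_session(path, entry):
    """Append one practice session (a dict keyed by FIELDS) to a CSV log,
    writing the header row only if the file does not exist yet."""
    write_header = not os.path.exists(path)
    with open(path, "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            writer.writeheader()
        writer.writerow(entry)
```

A file like this is easy to review at the end of the ten days: sort by confidence score, or search for a mistake type to see whether it stopped recurring.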
To my surprise, ChatGPT turned out to be a very efficient writing assistant—especially in the areas where OET candidates tend to struggle most. While it didn’t replace a real teacher, it made daily writing smoother, more focused, and less intimidating.
Here’s what it did best:
Every time I submitted a letter, ChatGPT helped reinforce the standard OET letter structure:
Purpose in the opening
Relevant medical background
Current condition and treatment
Requested action or follow-up
If I mixed up the order or included too much irrelevant information, it gently suggested how to fix it. Over time, I began to apply the format automatically.
Example:
“You might want to state the purpose more clearly in the first sentence.”
“This paragraph could be better placed after the clinical history section.”
One of the hardest parts of OET Writing is sounding professional but not robotic. ChatGPT caught my casual phrases and offered better clinical alternatives.
Example:
❌ “She’s doing better now.”
✅ “She is currently clinically stable and recovering well post-treatment.”
These rewrites helped me internalize the right tone and vocabulary.
Every day, I asked ChatGPT to extract 3–5 phrases I could reuse. These turned into my “mini phrasebook” for future writing.
Some examples:
“He presented with…”
“She was admitted for management of…”
“I am referring her for continued care regarding…”
“He has a past medical history of…”
“Your ongoing management is kindly requested.”
This saved me time and improved my fluency.
Instead of just rewriting the letter, ChatGPT explained why it made each change:
Grammar
Clarity
Formality
Relevance
This made me learn from each correction, not just copy it.
While ChatGPT helped me write faster and more confidently, it definitely had some blind spots—especially in areas that require deep clinical reasoning or true test-level accuracy. Here’s what I learned about its limitations:
Even when I provided complete case notes, ChatGPT occasionally misunderstood the clinical situation or gave advice that didn’t match the patient’s condition.
Example:
Case note: “Patient recovering from stroke, history of hypertension.”
ChatGPT suggestion: “He should avoid sugary foods to control diabetes.”
❌ There was no mention of diabetes at all.
Lesson: AI doesn’t always “read between the lines” like a human examiner would. You still need to check the logic.
When I asked, “Is this letter Band B?”, ChatGPT almost always said yes—regardless of how strong or weak my writing was. It didn’t provide a clear score breakdown (e.g., purpose, tone, grammar, cohesion) like an official OET assessor would.
Verdict: Helpful as a coach, but not reliable as a grader.
Sometimes, ChatGPT rewrote my letter so heavily that it lost my original intent or tone. In trying to “improve” the writing, it occasionally made it sound too robotic or too far from what I was trying to express.
I had to learn to say:
“Thanks, but I’ll keep my version for this part.”
The OET Writing test penalizes irrelevant or excessive detail. ChatGPT didn’t always know which parts of the case notes to leave out, and often kept everything.
That’s a judgment you still need to make yourself.
After 10 days of using ChatGPT as my only writing coach, I can confidently say:
Yes—ChatGPT can help you improve your OET Writing skills.
But no—it shouldn’t be your only tool if you’re aiming for Band B or higher.
Here’s what I really gained from the experiment:
Routine and discipline: Having a “coach” on demand made it easier to practice every day.
Better structure: I learned how to organize my letters more clearly and professionally.
Confidence: The daily repetition helped reduce my writing anxiety.
Useful language bank: ChatGPT gave me dozens of Band B-level phrases and clinical expressions.
And here’s what it couldn’t give me:
Band-level feedback: ChatGPT can’t truly assess whether your writing meets OET scoring criteria.
Case-based reasoning: It sometimes gave generic or off-topic suggestions.
Personalized advice: A human tutor could have pushed me to explain my choices and refine my logic.
Exam conditions: I had to create my own time pressure and didn’t get experience with hand-writing or official templates.
Use ChatGPT as:
A daily practice partner
A grammar checker and rephraser
A source of model letters and vocabulary
A tool to explain why something is correct or incorrect
But combine it with:
Feedback from teachers or experienced OET candidates
Practice under test-like conditions
Time spent reviewing the official scoring criteria and samples
Bottom line: ChatGPT is a great assistant—but not a certified OET tutor.
It can take you from “unsure” to “almost ready,” but if you want to pass confidently, you still need human guidance.
Still, if you’re a nurse on a budget or just getting started, using ChatGPT is one of the smartest, cheapest, and most flexible ways to start improving today.