Build Something Real with Words and Code
Our twelve-month program walks you through natural language processing from scratch. You'll work with actual text data, solve real problems, and leave with projects you can point to.
View Course Schedule
What Actually Happens During the Year
Most people think NLP means throwing data at pre-trained models and hoping something useful comes out. That works until it doesn't. Then you're stuck without understanding why your chatbot keeps insulting customers or your sentiment analyser thinks complaints are compliments.
We start at the beginning. How tokenisation works. Why word embeddings matter more than you'd think. The difference between sequence-to-sequence models and what actually makes transformers tick. Not the trendy stuff that everyone blogs about. The foundations that let you debug problems at three in the morning.
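To give you a taste of why those foundations matter, here's a minimal sketch (not course code; the function names and the example comment are made up for illustration) of how much difference a tokenisation choice makes before anything clever happens:

```python
import re

def whitespace_tokenise(text):
    # The "obvious" approach: split on spaces and hope for the best.
    return text.split()

def basic_tokenise(text):
    # Slightly more careful: lowercase, keep contractions together,
    # split punctuation into its own tokens. Still naive, but enough
    # to show why preprocessing decisions ripple through everything.
    return re.findall(r"[a-z0-9]+(?:'[a-z]+)?|[^\sa-z0-9]", text.lower())

comment = "Arvo the NBN's carked it AGAIN... thanks heaps, @provider!!"

print(whitespace_tokenise(comment))
# ['Arvo', 'the', "NBN's", 'carked', 'it', 'AGAIN...', 'thanks', 'heaps,', '@provider!!']

print(basic_tokenise(comment))
# ['arvo', 'the', "nbn's", 'carked', 'it', 'again', '.', '.', '.',
#  'thanks', 'heaps', ',', '@', 'provider', '!', '!']
```

Whether "AGAIN..." and "again" should count as the same token is exactly the kind of decision that looks trivial now and explains a broken model later.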
You'll spend months writing text processing code that feels basic and occasionally frustrating. But that's when you learn how language behaves when it's not perfectly formatted Wikipedia articles.
By month six, you're working with messy datasets from actual Australian businesses. Social media comments with creative spelling. Customer emails that switch between casual and formal mid-sentence. Medical records with abbreviations that mean six different things.
The final semester involves building something substantial. Previous cohorts have created tools that classify support tickets, extract information from legal documents, and analyse sentiment in regional dialects. One student built a system that helps non-native speakers understand Australian workplace idioms, which was surprisingly useful.
Real code reviews, real problems
Who'll Be Teaching You
Callum Threlfall
Computational Linguistics
Spent eight years building text analysis systems for government agencies. Now teaches the stuff that textbooks skip over because it's too practical.
Freya Bannister
Neural Architecture
Built conversational systems for healthcare providers. Knows every way attention mechanisms can fail and how to fix them without retraining everything.
Sienna Bamford
Applied Implementation
Worked on recommendation systems that actually had to process Australian English. Teaches deployment strategies that survive contact with production.
Poppy Godwin
Industry Practice
Led NLP projects at three different startups. Covers the bits between proof-of-concept and something customers can actually use without breaking.
How the Curriculum Unfolds
Foundation Phase
- Text preprocessing and tokenisation methods
- Statistical language models and n-grams
- Vector representations and embeddings
- Named entity recognition fundamentals
- Building your first text classifier (a small sketch of the idea follows this list)
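For a rough sense of what the Foundation Phase builds toward, here's a minimal bag-of-words classifier sketched with scikit-learn. The tickets and labels are invented for illustration; the real projects use actual business data.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny made-up training set: support tickets labelled by topic.
tickets = [
    "my invoice is wrong and I was charged twice",
    "the app crashes every time I open settings",
    "please update my billing address",
    "login button does nothing on the mobile app",
]
labels = ["billing", "bug", "billing", "bug"]

# Bag-of-words counts feeding a Naive Bayes classifier: roughly the
# level of model covered before anything neural enters the picture.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(tickets, labels)

print(model.predict(["I was billed twice this month"]))  # most likely ['billing']
```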
Architecture Phase
- Recurrent networks for sequence processing
- Attention mechanisms and transformers (sketched in code after this list)
- Pre-trained models and transfer learning
- Sequence-to-sequence architectures
- Fine-tuning strategies for specific tasks
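The Architecture Phase spends a lot of time on attention. Here's a rough sketch of scaled dot-product attention, the computation at the core of the transformer material; the shapes and names are illustrative only, not the course's own implementation.

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(query, key, value, mask=None):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = query.size(-1)
    scores = query @ key.transpose(-2, -1) / d_k ** 0.5
    if mask is not None:
        # Masked positions get -inf so softmax assigns them zero weight.
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)
    return weights @ value, weights

# Toy example: batch of 1, sequence of 4 tokens, 8-dimensional vectors.
q = torch.randn(1, 4, 8)
k = torch.randn(1, 4, 8)
v = torch.randn(1, 4, 8)
output, attn = scaled_dot_product_attention(q, k, v)
print(output.shape, attn.shape)  # torch.Size([1, 4, 8]) torch.Size([1, 4, 4])
```

Once you can write and inspect this by hand, the attention weights stop being a mystery box and start being something you can actually debug.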
Application Phase
- Working with domain-specific corpora
- Building conversational interfaces
- Information extraction pipelines
- Model evaluation beyond accuracy scores (see the sketch after this list)
- Deployment and monitoring strategies
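To show what "evaluation beyond accuracy scores" means in practice, here's a minimal sketch with made-up labels. Complaints are rare, the model never predicts them, and accuracy still looks respectable:

```python
from sklearn.metrics import accuracy_score, classification_report

# Invented ground truth and predictions: complaints are 10% of traffic,
# and this model simply never predicts the "complaint" class.
y_true = ["other"] * 90 + ["complaint"] * 10
y_pred = ["other"] * 100

# Accuracy looks healthy...
print(accuracy_score(y_true, y_pred))  # 0.9

# ...but per-class precision and recall show it misses every complaint.
print(classification_report(y_true, y_pred, zero_division=0))
```

That gap between the headline number and the per-class breakdown is the difference between a demo and something a business can rely on.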
Why This Takes a Full Year
You can learn the syntax of PyTorch in an afternoon. Understanding why your language model keeps generating nonsense takes months of working through failures.
We run two intakes annually. The autumn cohort starts in late March 2025, giving students time to settle in before diving deep. The spring intake begins mid-September 2025 for those who need the first half of the year to sort things out.
Most students keep working while they study. Classes run Tuesday and Thursday evenings, with Saturday morning workshops for hands-on projects. You'll need about fifteen hours weekly outside class time, though some weeks demand more when deadlines approach.
The program doesn't promise you'll land a six-figure role immediately after graduation. It does give you skills that matter when businesses need someone who understands how language actually works in code.
Ask About Enrolment