The road to COMP4020: providing sharp tools

comp4020

INFO

This post is part of a series I’m writing as I develop COMP4020/8020: Rapid Prototyping for the Web. See all posts in the series.

I was at the Australasian Computing Education Conference (ACE) in Melbourne last week presenting my work on LLMs Unplugged and (unsurprisingly) a lot of the discussion was around the use of LLMs in the classroom.[1]

One fun anecdote from Claudia Szabo was about how she gets her small kids to help in the kitchen: she gives them sharp knives and teaches them how to use them. And she wants the same for her students in the classroom.

One of my main concerns for teaching COMP4020 later this year (starting July 2026) is that I want my students to have a) access to the latest agentic LLMs and b) sufficient token budgets to use them properly. It’s not fair to expect students to shell out US$200 of their own money for a Claude/ChatGPT/Gemini Max plan; I have to provide them with a strong baseline (although my current plan is to allow students to use different models if they wish; it’s too hard to police anyway).

GitHub Copilot does have free student accounts, but they’re limited to 300 “requests” (read: prompts) per month, or even fewer (by a factor of almost 10!) if students choose Claude Opus 4.6. It’s just not enough for the sort of class I want to run or the workflows I’m planning to teach.

I’m not saying my students need Gas Town levels of tokens, but they need to be able to regularly sit down and have long, interactive back-and-forth sessions exploring different ideas and implementations.

So, my current options are:

  1. get Anthropic/OpenAI/Google, or a company that resells those models (like Microsoft/GitHub/Amazon), to sponsor the class. I’m thinking that I’ll need approximately 200 seats (that’s the number of students I’m expecting) and they’d need at least the $20/mo pro-level plan, ideally the $100/mo one, or similar amounts of API credits

  2. get a partner with datacenter-grade hardware (e.g. Canva, LambdaLabs, or ANU’s very own NCI) to host an open weight model for us, e.g. Qwen3.5, MiniMax M2.5, Kimi K2 or GLM-5 using vLLM
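For option #2, the serving side would probably be vLLM’s OpenAI-compatible server. A minimal sketch, where the model ID, GPU count and port are illustrative assumptions rather than a recommendation (and the actual checkpoint would be whichever open-weight model the hosting partner can run):

```shell
# Serve an open-weight model behind an OpenAI-compatible API with vLLM.
# Model ID and --tensor-parallel-size are illustrative; choose them to
# match the checkpoint and GPU count the hosting partner provides.
vllm serve Qwen/Qwen2.5-72B-Instruct \
  --tensor-parallel-size 4 \
  --port 8000

# Students then point any OpenAI-compatible client at the endpoint, e.g.
# curl http://localhost:8000/v1/chat/completions \
#   -H "Content-Type: application/json" \
#   -d '{"model": "Qwen/Qwen2.5-72B-Instruct",
#        "messages": [{"role": "user", "content": "hello"}]}'
```

The nice part of the OpenAI-compatible API is that students’ tooling (Copilot-style extensions, agent frameworks, plain HTTP clients) wouldn’t need to care whether the tokens come from a frontier lab or our own cluster.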

Option #1 is my slight preference because those folks are really good at serving these models at scale. If we have to self-host then there’s a risk that I’m on the hook if the model goes down one hour before the assessment deadline. But it wouldn’t be a disaster.

If you work at any of the above companies and would like to get in touch, email me. I can offer the good vibes and publicity that come with supporting the next generation of software developers and computer scientists in their learning. And I’m happy to share all the course materials online, including a shout-out to whichever model you end up providing. You’ll also have an opportunity to meet (if you like) the students, who are awesome and will be highly skilled and looking for work in the near future. And finally, you’ll have my gratitude 😃


  1. There was a running joke where the few presenters whose papers weren’t about LLMs made a point of that fact in their intro, and the rest of the presenters sort of gave an apology that theirs was yet another paper on LLMs. But I digress. ↩︎

Cite this post
@online{swift2026comp4020SharpTools,
  author = {Ben Swift},
  title = {The road to COMP4020: providing sharp tools},
  url = {https://benswift.me/blog/2026/02/17/comp4020-sharp-tools},
  year = {2026},
  month = {02},
  note = {AT-URI: at://did:plc:tevykrhi4kibtsipzci76d76/site.standard.document/2026-02-17-comp4020-sharp-tools},
}