The rollout of AI-powered road safety cameras across Australia has triggered a predictable explosion in fines: 184,000 infringements in Western Australia since October 2025, over 130,000 in New South Wales in a single year, and 114,000 in Queensland. Governments are raking in tens of millions in revenue while drivers face hefty penalties — often hundreds of dollars plus demerit points — for seatbelt and phone offences captured in a single still image.
Call it what it is: automated revenue collection dressed up as road safety. This isn't thoughtful enforcement — it's Big Brother surveillance on steroids, designed to extract money from everyday drivers with minimal human oversight and even less real-world context.
The Numbers Don't Lie: It's About the Cash
In WA alone, these cameras have already generated over $29 million in fines in just months, with some drivers hit with $20,000+ for repeated seatbelt issues. NSW pulled in over $100 million from camera-issued offences. Yet when drivers appeal, a staggering 60% of reviewed seatbelt fines in WA are withdrawn — meaning the system is wrong (or overly punitive) far too often. Millions in fines have been quietly cancelled after public backlash.
If the goal were genuinely safer roads, we'd see serious crashes and fatalities fall in step with the camera rollout. Instead, the focus is on volume: more cameras, more automated detection, more money flowing into government coffers. Classic revenue-raising behaviour — governments have found a high-tech cash cow that operates 24/7 without needing police on the street.
AI + Still Images = Injustice at Scale
These systems use computer vision on single snapshots. The AI flags a potential offence, a human reviewer (supposedly) checks it, and the fine drops in the mail weeks later. But real driving is dynamic. A passenger adjusts their seatbelt momentarily. A driver reaches for something safely. A medical exemption or temporary injury applies. None of this context survives in a blurry still image taken from the roadside.
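The structural flaw is easy to see in miniature. The following is a toy Python sketch — purely illustrative, not any vendor's actual system — contrasting a verdict drawn from one snapshot with a verdict drawn from a short window of observations. The function names, the window, and the tolerance threshold are all hypothetical.

```python
# Toy illustration only: no real enforcement system works exactly like this.
# Each "frame" records whether a seatbelt appears correctly worn at that instant.

def single_frame_verdict(frame: bool) -> bool:
    """Flag an offence from one snapshot, as the cameras effectively do."""
    return not frame  # belt not visible in this one instant => offence

def sequence_verdict(frames: list[bool], tolerance: float = 0.2) -> bool:
    """Flag an offence only if the belt is off for most of a short window."""
    off_fraction = frames.count(False) / len(frames)
    return off_fraction > tolerance

# A driver adjusts their belt for one instant out of ten observed.
window = [True] * 9 + [False]

print(single_frame_verdict(window[-1]))  # True: fined on the one unlucky frame
print(sequence_verdict(window))          # False: momentary adjustment, no offence
```

The same behaviour produces opposite verdicts depending only on how much context the system is allowed to see — which is exactly what a single roadside still image throws away.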
Appealing requires time, stress, phone queues, paperwork, and sometimes court — all while you're presumed guilty. Many drivers simply pay up to avoid the hassle. This isn't justice; it's a bureaucratic trap that punishes ordinary people while governments profit.
Human police officers can exercise discretion in real time — educate, warn, or consider circumstances. AI cameras cannot. They optimise for detection and revenue, not nuanced safety.
Big Brother Is Watching Inside Your Car
These aren't traditional speed cameras. Many peer directly into vehicles to monitor seatbelts and phone use. That's invasive surveillance of a private space — your car — enabled by AI. Governments promise images are deleted if no offence is found, but public trust is low, given questions over data retention, hacking risks, and mission creep. Once the infrastructure is in place, expanding it to more behaviours (distraction detection, passenger compliance, and so on) becomes trivial.
This fits a broader pattern: ever-expanding government monitoring of citizens under the banner of "safety" or "the greater good." Combined with other tracking — automatic number-plate recognition, phone location data, and the rest — it builds a digital panopticon on the roads.
Safety Theatre, Not Real Solutions
True road safety comes from better road design, driver education, fatigue management, tackling aggressive driving, and targeting high-risk repeat offenders — not fining millions for momentary lapses captured on camera. Studies and real-world experience with automated enforcement often show revenue gains far outpacing safety improvements. Fatalities don't plummet just because fine revenue soars.
Governments love these systems because they're efficient cash generators with plausible deniability ("It's for safety!"). Drivers bear the cost, the frustration, and the demerit points that can cost jobs and licences.
Time to Push Back
AI cameras have their place for clear-cut offences like speeding in fixed zones. But the aggressive expansion into interior vehicle monitoring, coupled with massive revenue hauls and high error/appeal rates, reveals the true priority: filling budgets on the backs of motorists.
This isn't safety innovation — it's predatory governance. Drivers deserve transparent, proportionate enforcement that considers real-world context, not an automated fine factory that treats every citizen as a potential revenue source.
Governments should scrap the most invasive uses, publish full accuracy and revenue data, and redirect focus to genuine safety measures. Until then, these AI cameras remain what they clearly are: Big Brother revenue raisers in high-tech disguise. Australians should demand better.