Robo-compliance in employment services

This post is a summary of an article that will appear in the Australian Journal of Social Issues later in the year. It builds on earlier analysis from @simonecasey of the ways that digitisation has automated decisions in employment services and reshaped compliance with mutual obligation requirements. It explores the shift from human decision-making to self-activation and automated compliance, analysing its impact on job seekers and employment services staff. This shift raises concerns about power dynamics, transparency and fairness in welfare administration, especially as digital technologies become central to enforcing conditional welfare policies.

Since the late 1990s, Australia has adopted a disciplinary welfare model, where access to welfare payments is conditioned on meeting strict mutual obligation requirements. These policies aim to reduce welfare costs and increase workforce participation, but they have also resulted in increased surveillance and control over job seekers. Initially, these policies were enforced through employment services workers, who monitored compliance and determined penalties for non-compliance.

However, over the past decade, the digitisation of employment services has extended this disciplinary approach. Technologies such as online self-service platforms and automated decision-making systems (ADMs) have replaced many human interactions, making job seekers responsible for tracking and reporting their own compliance. This has led to a shift in power from employment services workers to automated systems, raising concerns about fairness and accessibility.

Self-Activation and Automated Compliance

Self-activation is a key feature of the new digital employment services model. It refers to the process where job seekers must independently manage and report their compliance data. Instead of employment services workers verifying participation in job-seeking activities, job seekers are required to log into online portals and manually enter proof that they have met their obligations.

The introduction of self-activation means that compliance is no longer assessed by human discretion but instead by pre-programmed digital rules. If a job seeker fails to enter the required data within the system’s deadline, they may face automated penalties, including payment suspensions and sanctions.

The Targeted Compliance Framework (TCF) and Workforce Australia

The Targeted Compliance Framework (TCF) was introduced in 2018 to further automate the enforcement of mutual obligation requirements. Under the TCF, job seekers are required to self-report their compliance through a digital dashboard, which tracks progress toward meeting pre-set targets. The system automatically issues demerit points and payment suspensions for non-compliance. A job seeker who accumulates five demerit points enters the penalty zone, where they risk losing welfare payments for up to four weeks.
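The demerit-and-penalty-zone mechanism described above is, at its core, a simple threshold rule. The sketch below is an illustrative simplification of that rule only; the class, names and the automatic-suspension behaviour are assumptions for illustration, not the department's actual system.

```python
# Illustrative sketch of the TCF demerit logic as described above.
# All names and behaviours here are simplifications for illustration.

PENALTY_ZONE_THRESHOLD = 5  # demerit points before entering the penalty zone


class ComplianceRecord:
    def __init__(self):
        self.demerits = 0
        self.payment_suspended = False

    def record_failure(self):
        """Automated response to a missed requirement: add a demerit
        and suspend payment, with no human review in the loop."""
        self.demerits += 1
        self.payment_suspended = True

    def in_penalty_zone(self):
        return self.demerits >= PENALTY_ZONE_THRESHOLD


record = ComplianceRecord()
for _ in range(5):          # five missed requirements
    record.record_failure()

print(record.in_penalty_zone())  # True: risk of losing payments for up to four weeks
```

The point of the sketch is that the entire decision reduces to a counter and a threshold; there is no branch in which individual circumstances can be weighed.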

In 2022, the Workforce Australia model expanded these self-activation processes. Workforce Australia consists of two main service types:

  1. Provider Services – for job seekers with greater employment barriers who receive support from a service provider.

  2. Online Services – for those expected to find work independently, with minimal human assistance.

Job seekers in Online Services must sign an Online Job Plan, which automatically assigns mutual obligation requirements. Compliance is enforced through the Points-Based Activation System (PBAS), under which participants must meet a monthly target of up to 100 points by completing activities such as job applications, training or volunteer work. Failure to meet the required points results in automated payment suspensions.
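The PBAS rule can be sketched as a points tally checked against a monthly target. The point values below are invented for illustration (actual PBAS values are set by the department and vary by activity); only the shape of the rule — sum the points, suspend if under target — reflects the description above.

```python
# Minimal sketch of the points logic the PBAS description implies.
# Point values are illustrative assumptions, not the real PBAS schedule.

POINT_VALUES = {"job_application": 5, "training_course": 25, "volunteer_work": 20}


def monthly_points(activities):
    """Tally the points earned from a month's reported activities."""
    return sum(POINT_VALUES.get(a, 0) for a in activities)


def payment_suspended(activities, target=100):
    """The automated rule: a bare threshold check, with no human review."""
    return monthly_points(activities) < target


activities = ["job_application"] * 10  # 50 points reported
print(payment_suspended(activities))   # True: short of the 100-point target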

Impacts of Automation on Decision-Making Power

1. Shift from Human Discretion to Robo-Compliance

Previously, employment services workers had some discretion in assessing compliance. They could consider individual circumstances, such as illness or caring responsibilities, before applying penalties. However, with self-activation and automated compliance, decision-making is now embedded in IT systems, reducing opportunities for human intervention.

For example, if a job seeker forgets to log a required activity, the system automatically suspends payments, regardless of personal circumstances. This shift from human discretion to rigid algorithms has led to a system of Robo-compliance, where decisions are made without meaningful consideration of individual needs.

2. Increased Burden on Job Seekers

Self-activation transfers administrative responsibilities from employment services providers to job seekers. They must now navigate digital systems, understand complex compliance rules, and ensure that their data entries align with system expectations. Many participants struggle with digital literacy, creating additional barriers to receiving support.

Statistical data from Workforce Australia reveals that only 56–62% of Online Services users completed their PBAS requirements on time. This suggests that roughly two in five job seekers struggle to meet their compliance obligations, increasing their risk of payment suspensions and financial hardship.

Automation Errors and Systemic Failures

Case Study: IT Error in the TCF

A notable example of automation failure occurred in the Targeted Compliance Framework (TCF), where a programming error resulted in 1,300 job seekers receiving erroneously calculated financial penalties over a five-year period. The IT system failed to reset individuals’ compliance status after they had met their obligations, causing them to be wrongly penalised with one-, two-, or four-week payment suspensions.
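The failure mode described — a compliance counter that is never reset after obligations are met, so later penalties escalate against stale state — can be reconstructed in miniature. This is an illustrative guess at the kind of bug involved, not the department's actual code; the class and method names are invented.

```python
# Simplified reconstruction of the reset failure described above.
# Escalating suspensions (one, two, four weeks) are applied per the
# failure count; the bug is that meeting obligations never resets it.

PENALTY_WEEKS = [1, 2, 4]  # escalating suspension lengths


def next_penalty(prior_failures):
    return PENALTY_WEEKS[min(prior_failures, len(PENALTY_WEEKS) - 1)]


class JobSeeker:
    def __init__(self):
        self.failure_count = 0

    def miss_requirement(self):
        """Apply the next suspension length and record the failure."""
        penalty = next_penalty(self.failure_count)
        self.failure_count += 1
        return penalty

    def meet_obligations_buggy(self):
        pass  # BUG: compliance status never reset after re-engagement

    def meet_obligations_fixed(self):
        self.failure_count = 0  # correct behaviour: reset the counter


seeker = JobSeeker()
seeker.miss_requirement()        # 1-week suspension
seeker.meet_obligations_buggy()  # should reset the record, but doesn't
print(seeker.miss_requirement()) # 2 (weeks): a wrongly escalated penalty
```

Because the rule is applied automatically, the stale counter translates directly into wrongly escalated suspensions, and nothing in the loop surfaces the discrepancy to a human reviewer.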

The case study highlights the risks of relying on automated decision-making, particularly when system errors can go undetected for years, causing significant financial distress.

Lack of Transparency and Accountability

The shift to automated compliance has made it more difficult for job seekers to challenge decisions. Unlike interactions with human employment services workers, digital systems lack transparency, making it unclear how decisions are made or who is responsible for errors. The absence of human oversight means that participants have limited opportunities to negotiate their requirements or seek exemptions.

For example, while employment services workers can manually review some compliance decisions, their role has largely shifted to assisting job seekers in navigating digital platforms rather than making case-by-case assessments. This change raises serious concerns about fairness, particularly for job seekers with disabilities, limited digital literacy, or complex personal circumstances.

Future Research and Policy Implications

Further research is needed to understand the long-term effects of digital employment services on job seekers. Priorities include:

  • Greater scrutiny of algorithmic decision-making, ensuring that IT systems are designed with transparency and accountability.

  • Stronger protections against automation errors, preventing wrongful penalties like those seen in the TCF case study.

  • Improved support for job seekers, including alternative service models for those who struggle with digital compliance requirements.

  • A reconsideration of the balance between efficiency and fairness, ensuring that welfare recipients are not unfairly penalised by automated processes.

Conclusion

The digitisation of Australia’s employment services has fundamentally changed the power dynamics in welfare administration. While self-activation and automated compliance were introduced to improve efficiency, they have resulted in a rigid, punitive system that often lacks fairness and transparency. The case studies on Online Job Plans, PBAS, and TCF payment suspensions illustrate how decision-making power has shifted from human workers to algorithmic rules, making it more difficult for job seekers to navigate the system and avoid financial penalties.

Given the concerns raised, policymakers must ensure that digital employment services do not replicate the harms of previous automation failures like Robodebt. Future reforms should focus on introducing more human oversight, improving system transparency, and ensuring that automation does not undermine social security rights.

This article was summarised with the help of AI by @simonecasey, and the full free version can be found here.

Thank you to the patient reviewers who helped me get this over the line.

Power to Persuade