Robo-gov: How regulators are using machines to make calls
When Anton Joseph Wilson, a property developer, was banned from managing companies by the corporate watchdog last year, it said he had already churned through nine collapsed firms owing an estimated total of $45 million.
The ban, which cited tax issues with two entities, was announced by the Australian Securities and Investments Commission in a press release and showed the corporate cop enforcing the law. But it was also something of a Pyrrhic victory because, as ASIC’s press announcement noted, Wilson had already been banned from managing companies, due to an undischarged bankruptcy, when he ran six of the failed firms.
In theory, governments’ increasing use of artificial intelligence and automation should surface patterns of suspicious behaviour, helping under-resourced regulators stop wrongdoing faster. But on Monday, this masthead revealed that ASIC is now using an automated system to decide whether to investigate liquidators’ allegations of wrongdoing against directors, with more than 97 per cent of allegations not formally pursued over the past five years.
ASIC’s use of “artificial intelligence”, or at least systems that might not be as smart as ChatGPT but still make decisions that would have once been solely the work of a human, is growing as it tries to police a flood of reports, complaints and tip-offs across the country’s corporate landscape. A 2022 report from the Financial Regulator Assessment Authority, which oversees the regulators, described three “artificial intelligence and machine learning” projects that have drawn little attention.
One monitors for suspicious sharemarket trades, picking out about 200 a day from the roughly 200,000 it checks.
Another reviews around 800 company prospectuses ASIC receives each year from firms that hope to list on the sharemarket and helps to rate their risk.
A third trial system analyses and helps triage the reports that financial service providers, such as banks and insurers, are required to lodge if they breach their licences. It includes reports of the kind of issues that led to the banking royal commission, such as giving customers misleading information, failing to disclose fees or overcharging on loans. These breach reports used to number about 4000 a year but rose to almost 9000 in the most recent nine-month reporting period after changes to the law.
“Using a representative sample of breach reports, the AI trial provided 65 per cent accuracy in triaging breaches to the correct team,” the authority’s report notes. That figure was encouraging enough that ASIC planned to make the trial permanent.
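The 65 per cent figure is a standard classification accuracy: the share of breach reports the system routed to the same team a human reviewer would have chosen. A minimal sketch of how such a figure is computed (team names and sample data here are invented for illustration):

```python
# Hypothetical illustration of triage accuracy: the fraction of breach
# reports the automated system routes to the same team as a human
# reviewer. Team labels and data are invented, not ASIC's.
human_labels = ["credit", "insurance", "credit", "advice", "deposits"]
model_labels = ["credit", "advice", "credit", "advice", "deposits"]

correct = sum(h == m for h, m in zip(human_labels, model_labels))
accuracy = correct / len(human_labels)
print(f"{accuracy:.0%}")  # 4 of 5 match -> prints "80%"
```

On ASIC’s representative sample, the equivalent calculation came out at 65 per cent.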
ASIC did not directly address questions on these automated programs but in a general statement a spokesman said claims it used artificial intelligence were incorrect.
“The automation of some processes should not be confused with the use of artificial intelligence,” the spokesman said. “Like other regulators globally, we think AI has a role to play in the future. When we do use AI it will be done responsibly, fairly and safely, guided by principles that we will define and share transparently.”
ASIC receives huge volumes of data about possible misconduct, with 20,000 reports, complaints and other types of intelligence lodged this financial year to date, the spokesman said.
“We do not and cannot investigate every instance of possible misconduct that comes to our attention,” he said. “We have to make careful and sometimes difficult decisions, utilising our regulatory expertise, judgement and experience to assess a broad range of factors and information.”
UNSW Professor Toby Walsh, an AI expert, said government was a natural candidate to use artificial intelligence. It collected huge stores of data, was in a position of trust, and faced constant pressure to do more without spending any more money.
“Your mission expands the cheaper you can do things,” Walsh said.
ASIC is far from the only regulator relying on artificial intelligence and automated systems. Some state police forces use AI to help identify suspects from security camera footage and have trialled systems to try to assess the risk of repeated domestic violence offences. The tax office uses it to flag tax returns that appear out of the ordinary. Anti-money laundering agency AUSTRAC uses it to help find suspicious transactions. APRA, another regulator that oversees banks and superannuation funds, noted in its most recent annual report that it is piloting automated risk-ratings. AUSTRAC did not respond to a request for comment. APRA declined to comment.
These are likely the tip of the iceberg, according to a 2021 report from the NSW Ombudsman, which is an independent government oversight agency.
“A significant impediment to meaningful debate about the future governance of machine technology use by government is an almost complete lack of transparency about that use,” the report states.
Federal Finance Minister Katy Gallagher said: “The public service must act ethically, treat people with respect and make informed and evidence-based decisions. While automated decision-making is becoming more available, the Robodebt Royal Commission has highlighted the need to act cautiously.”
Hanging over perceptions of them all is the best-known automated government system: the unlawful welfare recovery program colloquially known as “robo-debt”, now so infamous that the royal commission it sparked delivers damaging new information almost daily.
Walsh did not want the government to be cowed from using automated systems by the failures of robo-debt, which was flawed from the start because it presumed that welfare recipients’ income was evenly spread whereas the law requires it to be measured week by week. He argues public servants are naturally cautious and should learn from the private sector’s approach of launching products and iterating on them quickly, albeit while staying conscious of the damage that incorrect government decisions can have.
“The thing that was so bad about robo-debt was that they stuck to their guns when the evidence started to arrive that this was causing harm,” Walsh said.
“If they had quickly iterated and said, ‘Well, wait a second, let’s change how we do the calculation or let’s reflect on whether we’re doing this right’ as soon as they started to have evidence in year one of the program, then most of robo-debt’s harms would have been averted.”
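The averaging flaw Walsh refers to can be shown numerically. By spreading a yearly income figure evenly across every fortnight, the scheme made someone who earned all their income in part of the year, and lawfully claimed benefits only while unemployed, look as if they had been overpaid. A simplified sketch, with entirely hypothetical payment rates and thresholds:

```python
# Simplified, hypothetical illustration of the robo-debt averaging flaw.
# A person earns $26,000 in the first half of the year, then nothing,
# and claims benefits only while unemployed. All figures are invented.

FORTNIGHTS = 26
INCOME_FREE_AREA = 300.0   # hypothetical fortnightly income threshold
TAPER = 0.5                # hypothetical 50c payment cut per $ over threshold
FULL_PAYMENT = 700.0       # hypothetical full fortnightly benefit

actual_income = [2000.0] * 13 + [0.0] * 13          # worked first half only
averaged_income = [sum(actual_income) / FORTNIGHTS] * FORTNIGHTS  # $1000/fn

def entitlement(fortnightly_income):
    """Benefit payable for one fortnight at a given income."""
    over = max(0.0, fortnightly_income - INCOME_FREE_AREA)
    return max(0.0, FULL_PAYMENT - TAPER * over)

# The person claimed only in the unemployed second half of the year.
paid = sum(entitlement(i) for i in actual_income[13:])      # lawful method
deemed = sum(entitlement(i) for i in averaged_income[13:])  # averaged method

print(f"lawfully paid ${paid:.0f}, averaging allows ${deemed:.0f}, "
      f"phantom debt ${paid - deemed:.0f}")
```

Measured fortnight by fortnight, as the law requires, the person was entitled to every dollar received; the averaged calculation manufactures a debt out of income earned before they ever claimed.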
UTS Professor Ed Santow, a former human rights commissioner who led a large-scale report into government AI, would prefer a different approach. He said there are areas where governments can experiment with AI, such as using it to improve train schedules, but others where it should be wary.
“In high-stakes decision-making it’s not safe to go out with what the tech world would call a ‘beta product’ and then just iterate on your customers because they are citizens who are receiving decisions that can be life-changing,” Santow said.
“The other difference from the corporate sector … is that in the corporate sector, people can generally shop elsewhere. Government has a monopoly. It’s the only game in town.”
If ASIC declines to pursue a director, there are often no alternatives with the resources to do it. In the Wilson case, ASIC did act, and provided funding for liquidators to do further work. The spokesman confirmed there had been no updates to the ban on Wilson, or to that on his wife Melinda Wilson, who was also banned. Melinda Wilson did not respond to phone calls and texts seeking comment from her and her husband, whose previous number and email are inactive. He did not respond to a LinkedIn message.
“We believe this ban represented an important action despite Mr Wilson’s bankruptcy,” the ASIC spokesman said. “ASIC has the power to disqualify a person from acting as director for up to five years and where a director is engaging in systemic or serious misconduct, even if they are already bankrupt, ASIC will take action to ban them for the protection of the public.”