ANZ develops algorithm to tackle abusive messages in payments

ANZ Bank has developed an algorithm that identifies when payment description fields in transactions are misused to harass or abuse victims.

The bank’s financial crime team developed the algorithm to prevent its digital payment services from being used to send abusive messages.

An ANZ spokesperson told iTnews the algorithm was built using “indicators of abusive payments published in an AUSTRAC financial crime guide”.
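ANZ has not published how the algorithm works. As a rough illustration only, a first screening pass over payment description text might resemble the Python sketch below; the indicator terms, tier names and escalation logic here are assumptions for illustration, not ANZ's actual rules or the AUSTRAC indicators.

```python
# Illustrative indicator lists only; the real AUSTRAC indicators and ANZ's
# scoring rules are not public.
HIGH_RISK_TERMS = {"kill you", "hurt you", "find you"}
ABUSE_TERMS = {"worthless", "hate you", "pay me or else"}


def classify_payment_message(description: str) -> str:
    """Return a risk tier for a payment description field.

    A very rough sketch of indicator-based screening: high-risk terms
    suggest a potential threat to life and are escalated, other abusive
    terms are flagged for review, and everything else passes.
    """
    text = description.lower()
    if any(term in text for term in HIGH_RISK_TERMS):
        return "escalate"          # potential imminent threat, refer for action
    if any(term in text for term in ABUSE_TERMS):
        return "flag_for_review"   # abusive, queued for investigation
    return "clear"


if __name__ == "__main__":
    print(classify_payment_message("I will find you"))  # escalate
    print(classify_payment_message("Rent for March"))   # clear
```

In practice such screening would also track repeat senders and feed flagged cases to human investigators, as the ANZ cases described above suggest.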

“In some cases, this algorithm identified potential imminent threats to life and these high-risk cases were escalated to ensure authorities were able to take action,” the spokesperson said.

“This algorithm recently identified an individual who repeatedly sent abusive and threatening payment messages to a victim.”

Once the individual was identified as a repeat offender, an internal investigation took place, ultimately leading to their arrest by law enforcement, the spokesperson continued.

ANZ is a founding member of AUSTRAC's Fintel Alliance, formed in 2017 to build the financial industry's capacity to address these issues and support law enforcement investigations.

NAB has also recently strengthened its measures to systematically block abusive messages in transaction description fields.

The bank said it managed to block 10,000 abusive transactions from 6,800 individual customers in March alone.

NAB’s method prevents payments whose description contains offensive words from being processed in its banking app until the message has been changed.
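NAB has not detailed how the in-app check works. A minimal sketch of pre-submission validation along those lines, with an assumed blocklist and exception name, could look like this:

```python
# Assumed blocklist and exception type for illustration; NAB's actual word
# list and app behaviour are not public.
BLOCKED_WORDS = {"exampleslur", "examplethreat"}


class AbusiveDescriptionError(ValueError):
    """Raised so the app can prompt the sender to edit the message."""


def validate_description(description: str) -> str:
    """Reject a payment description containing blocked words."""
    words = set(description.lower().split())
    if words & BLOCKED_WORDS:
        raise AbusiveDescriptionError(
            "Payment description contains blocked language; please edit it."
        )
    return description
```

The payment would only proceed once the description passes validation, matching the reported behaviour of holding the transaction until the message is changed.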

Westpac is also now running payment data analytics to detect “subtle threats and patterns of abuse” in description fields, while Commonwealth Bank has built machine learning models to look for patterns of abusive behaviour.

If you or someone you know is experiencing domestic or family violence, call 1800RESPECT (1800 737 732) or visit www.1800RESPECT.org.au.

For advice, guidance and support, call MensLine Australia on 1300 789 978 or visit www.mensline.org.au. In an emergency or if you feel unsafe, always call 000.