CommBank Opens Up Its Financial Abuse Detection AI Code to Banks Worldwide

Image credits: Commonwealth Bank

Today, the Commonwealth Bank of Australia (CBA) has taken a significant step towards combating technology-facilitated financial abuse by sharing its breakthrough AI model with financial institutions around the world. The model helps protect people experiencing coercive control from a partner by detecting digital payment transactions that carry harassing, threatening or offensive messages.

The state of financial abuse in Australia

Financial abuse is a form of coercive control used to keep a partner trapped in a relationship. It often goes unrecognised in Australia because it can be dismissed as a private family matter, and because people may be uncomfortable talking about finances. However, a survey of more than 10,000 Australians conducted by YouGov and CBA in May found that 1 in 4 adults have experienced financial abuse from a partner. Financial abuse can include running up significant debts in a partner’s name, using threats or intimidation to maintain control, or sabotaging a partner’s financial independence.

What is CBA doing to help?

In 2018, CBA discovered a pattern in which abuse perpetrators would send a high volume of low-value transactions, often worth just 1 cent, to their partner. These transactions were frequently weaponised as a channel for abusive messages. In response, CBA blocked over 1 million transactions by detecting profanities in the accompanying transaction descriptions.
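A filter of this kind can be as simple as a rule over the transfer amount and the words in the message. The sketch below is only illustrative: the word list, the low-value threshold and the function name are assumptions made for this example, not CBA's actual rules.

```python
# A minimal sketch of a rule-based filter for abusive low-value transfers.
# The blocked-word list, value threshold and structure are illustrative
# assumptions, not CBA's published implementation.

BLOCKED_TERMS = {"exampleprofanity1", "exampleprofanity2"}  # placeholder word list
LOW_VALUE_CENTS = 100  # only scrutinise transfers under $1.00

def should_block(amount_cents: int, description: str) -> bool:
    """Return True if a low-value transfer carries a blocked term in its message."""
    if amount_cents >= LOW_VALUE_CENTS:
        return False
    words = description.lower().split()
    return any(word in BLOCKED_TERMS for word in words)

print(should_block(1, "exampleprofanity1 you"))   # True: 1-cent transfer, abusive text
print(should_block(5000, "rent for March"))       # False: ordinary payment
```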

However, perpetrators soon found ways around this filter by using language that was less obviously crude but just as coercive, with phrases such as “I know where you live” or “If you don’t unblock me I’m turning up”. In 2021, CBA responded by developing a machine learning model that can identify these more nuanced forms of abusive messaging, analysing factors such as transaction frequency, transaction value and intent. As new patterns of abuse emerge in customer transactions, CBA manually updates the model’s guardrails to keep its detection capabilities effective.
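To give a sense of how such a model might weigh these factors, here is a minimal sketch that scores a sender's recent transfers to one recipient. The features, phrase list, thresholds and weights are assumptions made for illustration; CBA's production model is far more sophisticated and is not reproduced here.

```python
# Illustrative feature-based scoring over a sender's recent transfers:
# transaction frequency, transaction value, and a crude proxy for intent.
from dataclasses import dataclass
from typing import List

THREAT_PHRASES = ["i know where you live", "unblock me", "turning up"]  # placeholder cues

@dataclass
class Transaction:
    amount_cents: int
    description: str

def extract_features(recent: List[Transaction]) -> dict:
    """Summarise a sender's recent transfers to one recipient."""
    n = len(recent)
    avg_value = sum(t.amount_cents for t in recent) / max(n, 1)
    threat_hits = sum(
        any(phrase in t.description.lower() for phrase in THREAT_PHRASES)
        for t in recent
    )
    return {
        "tx_count": n,                            # transaction frequency
        "avg_value_cents": avg_value,             # low averages suggest message-only transfers
        "threat_ratio": threat_hits / max(n, 1),  # crude proxy for coercive intent
    }

def abuse_score(features: dict) -> float:
    """Combine the features into a 0-1 score using hand-picked illustrative weights."""
    score = 0.0
    if features["tx_count"] >= 10:          # many transfers to the same person
        score += 0.3
    if features["avg_value_cents"] < 100:   # mostly sub-$1 "message" payments
        score += 0.3
    score += 0.4 * features["threat_ratio"]
    return min(score, 1.0)

recent = [Transaction(1, "If you don't unblock me I'm turning up") for _ in range(12)]
print(abuse_score(extract_features(recent)))  # 1.0 for this high-risk pattern
```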

By leveraging technology to identify long-term patterns of systemic abuse, CBA can now assign abuse risk scores to customer accounts. Using these scores, the bank can step in to support the customer: offering to set up safe accounts, removing the perpetrator’s ability to transfer funds to the customer’s mobile number or email address through PayID, or, in some cases, terminating the perpetrator’s account. CBA is also collaborating with NSW Police to help compile evidence reports for survivors.
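As a rough illustration of how a risk score could be translated into the support steps described above, the sketch below maps score thresholds to actions. The thresholds and action names are hypothetical, not CBA policy.

```python
# Hypothetical mapping from an account-level abuse risk score to support actions.
def support_actions(risk_score: float) -> list:
    """Suggest escalating interventions as an account's abuse risk score rises."""
    actions = []
    if risk_score >= 0.4:
        actions.append("offer to set up a safe account")
    if risk_score >= 0.6:
        actions.append("block PayID transfers to the customer's mobile number or email")
    if risk_score >= 0.9:
        actions.append("review the perpetrator's account for termination")
    return actions

print(support_actions(0.95))  # all three interventions apply at a very high score
```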

Image credits: Commonwealth Bank

Sharing open-source code with banks worldwide

CBA’s AI model has set a new benchmark for financial institutions worldwide. The initiative is not about gaining a competitive advantage: the model is open for any other bank to use, and its source code will be freely accessible on GitHub, ensuring complete transparency.

By enabling other financial institutions to leverage this resource, CBA aims to empower the sector to take on greater moral responsibility and combat technology-facilitated abuse. This unified approach, CBA believes, will significantly enhance the protection of vulnerable customers. With these tools readily available, there is simply no reason for other institutions not to implement these essential measures.

Anyone worried about their finances because of domestic or family violence or coercive control can contact the CBA Next Chapter Team on 1800 222 387 for support – no matter who they bank with.

If you or someone you know is experiencing domestic or family violence, call 1800RESPECT (1800 737 732) or visit www.1800RESPECT.org.au.

In an emergency or if you’re not feeling safe, always call 000.

Alice Duthie: Alice is a writer for Women Love Tech and The Carousel. She is currently studying a Bachelor of Commerce at The University of Sydney, majoring in Marketing and Business Information Systems. Alice loves to cover all things tech-related, from reporting on the latest devices and apps on the market, to sharing inspirational stories about women working in STEM careers.
