Generative AI sprang to fame through the likes of OpenAI’s ChatGPT and Google’s Bard. Indeed, the former captured a mammoth one million users within five days of launching in late 2022.

At its core, the technology is a large language model that digests the properties and patterns found in vast amounts of data and spits out new content based on them.

Users have turned to the chatbots to write essays, create computer-generated art, and even seek relationship advice. In fact, research has found the majority of people have a high level of trust in the technology.

And it's no different in Australia. 

More than 70% of Aussies trust AI-generated written content, according to findings by the Capgemini Research Institute. It surveyed 10,000 consumers across 13 countries, including Australia, in April. 

Interestingly, more than half of all respondents were found to trust AI to assist with their financial planning. 

And there’s no shortage of places people can turn to get such advice.

Those interested in investing on the stock market might seek out what could be the next generation of robo-advisors – the likes of Alpha from investing platform Public, or FinChat, billed as “ChatGPT for investors”.

Unlike ChatGPT, which is only trained on data up to 2021, these products promise to provide up-to-date market information.

More recently, Wally entered the chat. The personal finance app launched WallyGPT in 70 countries around the globe. 

The free tool is said to be able to provide AI-generated financial advice based on a person’s individual financial situation.

How could AI help me meet my savings goals?

Futuristic, oui? Wally first launched in 2013 and has historically offered users the ability to track all their bank accounts, as well as their inflows and outflows, in a single place.

Now, WallyGPT can use that data to answer prompts like, ‘how much of my income am I currently saving?’ or broader questions like, ‘what’s an ETF?’. Both are offered as example prompts on the app’s website.

To address potential privacy concerns, Wally assures users that all data used to create responses is deleted within 30 days and isn’t used for training. Meanwhile, all conversations with the chatbot are said to be private and confidential.

Beyond the first-of-its-kind product, more ‘everyday’ chatbots can – and likely often do – respond to simple personal finance-related queries.

For the purpose of this story, I asked ChatGPT how long it would take me to save for a week-long holiday in Bali, travelling from Sydney, if I earned the median Australian weekly wage of $805 (as per Australian Bureau of Statistics (ABS) data collected in the 2021 Census) and set a third of that aside. It returned nearly 200 words on the subject and estimated my hypothetical trip would cost around $1,500.

I then asked it to shorten its response. This is the final answer it gave me:

With an income of $805 per week and saving around a third of that, you save approximately $268.33 per week. Assuming a trip cost of $1,500, it would take approximately 6 weeks to save enough for a week-long holiday to Bali from Sydney.
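For those who like to see the working, here’s a quick Python sketch of the arithmetic behind that answer, using the figures quoted above. It’s purely illustrative – not how ChatGPT actually produces its response.

```python
import math

weekly_income = 805.00   # median Australian weekly personal income (2021 Census)
savings_rate = 1 / 3     # setting a third of income aside
trip_cost = 1500.00      # ChatGPT's estimate for a week in Bali from Sydney

weekly_savings = weekly_income * savings_rate          # roughly $268.33 per week
weeks_needed = math.ceil(trip_cost / weekly_savings)   # round up to whole weeks

print(f"Weekly savings: ${weekly_savings:.2f}")
print(f"Weeks needed to save ${trip_cost:,.0f}: {weeks_needed}")  # -> 6 weeks
```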

“AI-powered personal finance apps and tools can help retail investors track their spending, budget effectively, and optimise their savings and investment strategies,” RMIT Associate Professor of Finance Angel Zhong told Savings.com.au.

However, as she noted, “individuals need to know about the limitations of AI in financial planning.”

The dark side of AI-powered personal finance

There are some notable downsides to turning to generative AI for personal financial management.

“Human advisors can offer critical analysis, consider the broader context, and provide guidance beyond what AI algorithms can offer,” Professor Zhong said.

AI can only go off patterns and correlations presented by historical data, so its ability to predict what the future might bring is limited.

On top of that, some operate as ‘black boxes’, leaving users unaware of how they reached a conclusion.

“This lack of explainability and transparency can raise concerns, particularly in financial advice where individuals need to understand the reasoning behind recommendations and be able to assess the risk factors involved,” she said.

Dig deeper and the ethics of AI, privacy concerns, and the potential for scammers to use the technology can also be called into question.

“Australians should be cautious of AI-based investment schemes or offers that promise unrealistic returns,” Professor Zhong said.

 “Vigilance and scepticism are essential in assessing the credibility and legitimacy of AI-powered wealth-building opportunities.”

Regulatory challenges

In addition, policing generative AI in the Australian financial sector – a notoriously tightly regulated space – will likely prove challenging.

It’s important to note that you need a licence to provide personalised financial advice down under.

ChatGPT has been found to be capable of passing law exams at the University of Minnesota Law School, the final exam of a typical MBA core course at the Wharton School of the University of Pennsylvania, and the United States Medical Licensing Exam.

Despite that, it doesn’t hold an Australian financial services licence. 

Not to mention, research has found ChatGPT requires human assistance to adhere to Aussie regulations.

That’s according to findings by Neilson & Co Wealth Management founder and director Ben Neilson, published in the Journal of Financial Regulation.

Neilson found the technology slashed time and costs that often accompany simple personalised financial advice. 

Thus, it could lower the barriers some Aussies face when looking to access financial guidance.

But when it came to providing more complex advice, particularly meeting all regulatory responsibilities, it didn’t quite cut the mustard.

“As AI evolves, there is a need for robust governance frameworks to ensure ethical use, transparency, and accountability,” Professor Zhong said.

“Regulatory changes or restrictions in AI development or deployment could impact wealth-building strategies that rely heavily on AI.”

As yet, there’s no consensus on how best to regulate the technology, according to Australian Securities and Investments Commission (ASIC) chair Joe Longo.

“To be clear, my and ASIC’s interest is – and will always be – the safety and integrity of the financial ecosystem,” Mr Longo told a forum last week.

“As with any new technology, to the extent that AI affects that ecosystem, to that extent we will be involved.”

AI in the banking space

If the concept of AI in the finance space has you shaking your head, I have some potentially bad news. Many of Australia’s major banks are increasingly turning to the technology.

Typically, their use of the technology has been concentrated in areas like cyber security, scam and fraud detection, and customer service.

They have, however, employed AI for other uses. CommBank’s Bill Sense and Benefits Finder services, for example, are founded on the technology.

But we might be on the cusp of a boom in banks using AI. Indeed, things appear to have sped up in recent years.

 For instance, Westpac has indicated its interest in introducing AI to its business lending process.

 CommBank is also leaning further into the technology as it looks to use it to better predict customer needs.

Meanwhile, ANZ is engaging with the tool behind the scenes. Its 4,000-strong team of software engineers have used it to help produce code and digest complex information.

 NAB CEO Ross McEwan and his leadership team recently took a trip to the United States where they met with leaders at various iconic technology companies.

 Mr McEwan said the bank will “crawl first” when it comes to AI, but showed interest in being among those paving the way.

“So you either get in front and use it to move the organisation forward, or I suspect that there will be companies that do it to us,” he said, expressing what appears to be the mood among the big four banks.

“I want to be at the leading edge of it.”

Image by Possessed Photography on Unsplash.