Generative AI can help improve financial literacy, but the current models aren’t sophisticated enough to serve as standalone advisers, says Wharton’s Michael Roberts. This episode is part of a series on “Financial Literacy.”

Transcript

The State of AI in Financial Literacy

Dan Loney: In this world where everything is being linked to artificial intelligence, we take a look now at whether AI could have an impact on financial literacy. It’s a pleasure to be joined in-studio by Michael Roberts, finance professor here at the Wharton School. Michael, before we dig into the AI side of it, I would like to get your thoughts on where you think financial literacy is in general right now.

Michael Roberts: I can start off by saying I’m not entirely sure what financial literacy even means at this point, but if I think about the research on the subject and my engagement with practitioners, I would describe it in several ways.

First, in general, it’s pretty poor. Think about financial literacy rates: they are about 57% in the U.S., but that hides a lot of variation in the population across gender, income, education, and age, where rates vary a great deal. I think the trajectory of financial literacy has been positive over time, but at a somewhat glacial pace. It’s not improving quickly enough. I think the recent government policy enactments we’ve seen over the last four or five years are a positive step forward, but they raise perhaps more concerns than they calm. Who is going to teach it? How is it going to be taught? What’s going to be taught? There are a lot of challenges.

Loney: Where does AI fit in around financial literacy?

Roberts: It’s hard not to see it as central, in one way or another, to financial literacy and to financial practice more broadly. And given how quickly it’s changing, perhaps what I’m about to say will be outdated by the time this is published. Nonetheless, I’ll take a stab.

Where it is right now is simply nowhere near where it needs to be in terms of a stand-alone, say, investment adviser or a financial guru for individuals. It takes a great deal of understanding of finance and an ability to prompt or engage with the AI to get meaningful answers out of it. When it does produce answers, they can be impressive, but they can also lead you astray. How long before we get to a point where anyone can just talk to AI and have it solve their financial problems? I have no idea, but it’s not going to negate the importance of what I call financial proficiency.

Loney: You’ve taken a look at this in terms of how generative AI could potentially have a role. What have you found out?

Roberts: I’ve been playing with it for about two years, ever since I had a bunch of students in one of my classes use it to solve an exam. It was in a different context, more programming and data science. Nonetheless, it was a quick wake-up call.

I’ve been feeding it questions, both standard financial literacy questions as well as questions from my exams, and it’s interesting. AI is financially literate by any measure we use in practice. From that perspective, it’s impressive. But when you ask it broad, open-ended questions like, “How should I save for retirement?” or “What should I invest in?” — and I’m sure there are regulations or restrictions preventing AI from acting as an unlicensed RIA (registered investment adviser) — it really doesn’t know where to begin. It provides a very broad overview. I don’t know when it’s going to be ready to be our financial guru. I don’t know how quickly AI technology will progress to that point.

Loney: What has to happen for it to get to that point? The one thing we do know is that as AI is being built up, we have to remember there are humans behind the scenes.

Roberts: What’s critical is that I don’t see AI as a panacea. I view it as a complement and an accelerant on our path toward financial literacy, whatever that may be, or financial proficiency. In other words, I think it’s going to be critical that people recognize that knowledge of finance and financial principles is not going to go away. It is going to be paramount for engaging successfully with AI, wherever it may be in the future.

Loney: But some people would wonder, given that we know financial literacy education is important and that it has been hard to implement in schools, whether AI can be the vehicle to help teach it along the way.

Roberts: I think it can, and it probably will be a wonderful aid sooner than we think. Sal Khan has created Khanmigo. There is no reason we can’t have personal AI teaching assistants for our classes now or very soon. But there is a difference between being a TA for a class and being someone who can answer individuals’ specific personal or professional financial questions. It’s important to recognize that distinction. Everyone is different. Me saving for retirement is different from you saving for retirement, and so on.

Communicating With and Interpreting AI

Loney: Right, and the dynamics of how each person feels about that are something AI cannot fully understand.

Roberts: Not without the help of the end user. That’s what I mean when I say it’s critical for individuals to understand how to communicate with AI. But to do so, you have to understand finance. You have to know what to ask it and, importantly, how to interpret what it’s spitting back at you.

Loney: I noticed in the research you’ve done in and around AI, there are also instances where you rephrase some of the questions to see whether the generative AI tool would understand them better.

Roberts: Yes, absolutely. I had to learn how to communicate with the AI in order to give it the opportunity to get the right answer. By the same token, I imagine, the AI is learning how I’m communicating with it in order to give me the answer I’m looking for.

Applications and Limitations of Generative AI in Finance

Loney: What you’re saying is somewhat of a theme that we’ve heard from other experts — that AI, in the scope of where financial literacy may be going, will be a path to help it along the way, but it won’t be the be-all, end-all.

Roberts: I think it has the opportunity to accelerate our progress, but it won’t be a replacement for financial literacy, financial proficiency.

Loney: You also talk in your work about the ability to properly interpret the answers being given by ChatGPT.

Roberts: Yes. It’s not just spitting out a number. It’s giving a somewhat long-winded explanation of the answer. If you don’t understand what it’s returning, the information is not helpful. It’s not usable.

It’s progressing so quickly that it’s difficult to keep up with. That’s why I say it’s hard to see a limitation, at least from my perspective, on what it’s going to be able to do, other than that I don’t currently see it as a replacement for the progress we need to make on the literacy side.

Loney: What has that research meant to your mindset around teaching finance and the component of having ChatGPT as a tool in that process?

Roberts: Oh, I’m incredibly excited. I take the view that ChatGPT or other gen AI models are just a fantastic tool in the toolkit that we need for addressing financial challenges. I embrace it. I encourage my students to embrace it. I want them to use it on tests or homework, if they choose. But what they quickly learn is that if they’re going to use it, they can’t shirk the work of understanding the finance.

Loney: How much has it benefited your students?

Roberts: It depends. For data science-related financial applications, when I teach my Data Science for Finance course, it’s hugely helpful because it almost completely eliminates the technical burden, in terms of programming and working with data. It has made things so much more efficient. On the banking and personal finance side, it’s a little less helpful because it struggles with broad, almost open-ended questions. It varies based on application at this point.

Loney: Is there anything that surprised you about your interaction with ChatGPT?

Roberts: Yes, sometimes it gives impressively insightful responses. In fact, it found an error on one of the financial literacy exams, which I thought was really impressive. On the other hand, it also surprises me with some of the silly mistakes it makes. That sort of volatility needs to be worked out, and I’m sure it will be as it learns through more interaction and data.