As I prepared to publish that post, I received a Google Alert for another research paper on exactly the same topic. I gave it a quick skim, and the conclusions were astonishing. This new paper suggested that financial literacy training produced no improvement in financial decision making! Everything I'd just read and written might have been moot. Naturally, I had to dive in and learn more.
Financial Literacy Programs In Doubt
The research paper, High School and Financial Outcomes: The Impact of Mandated Personal Finance and Mathematics Courses, by Shawn Cole, Anna Paulson, and Gauri Kartini Shastry out of the Harvard Business School, took a different approach to determining the efficacy of financial literacy training than the earlier paper. Instead of creating controlled, small-scale experiments to test efficacy as the German study did, this new paper reviewed large sources of publicly available statistical information. This alone may account for some of the difference. Two major conclusions were drawn, which I paraphrase below:
1) State-mandated financial literacy training programs showed no statistically significant increase in asset accumulation or credit management for students who took the courses as compared to students who completed high school immediately before the state mandate began.
2) Simply taking more math courses did increase both asset accumulation and credit management.

So what's going on here? How can two studies released within only a month of one another come to such dramatically different results? And what does it mean for people who engage in financial literacy training, or who engage policy-makers in discussions about including more such training?
Some Differences
Clearly, one major difference is the country in which the research was performed. The earlier study involved school children in Germany. This second study reviewed data available in the United States. I submit that it's possible the German financial literacy program was simply better!
Another major difference: the German study was experimentally focused. It was performed on a small sample and included a control group, and the results of the two groups were compared in a very controlled fashion. This second study was a statistical analysis of publicly available information, not a controlled experiment. Might this difference suggest that the results seen in an experimental setting wouldn't translate to real life?
A third, and crucial, difference is time frame. The earlier, German study was by design focused on very short-term impact and on self-reporting of financial change. This second study was long-term in nature and based on purely objective measures of financial success.
So, Does Financial Literacy Training Work?
This second study has left me begrudgingly questioning the value of financial literacy programs, at least those implemented in high schools. Like many financial planners, I've long been a big proponent of providing more financial literacy education to children, to adults, to anyone willing to listen. Yet, having read and thought about this study, I find the results very compelling.
Consider the differences. It's possible the German financial literacy program was better than every program in the second study. However, the second study analyzed the impact of many programs administered in a variety of states over a variety of time periods, so variability in program quality was already captured, and no differences emerged. Unless the German program crossed some threshold of quality beyond which efficacy increases dramatically, this doesn't seem like a good explanation. I'd call this one a draw.
The second difference may be even more damning. The German study was experimental and controlled in a lab-like environment. If the results in that controlled environment don't translate to real-life impact, it stands to reason that the experiment was somehow flawed or simply did not model real life appropriately. Real life is where it really matters. Real life (and the second study) wins this round.
And for the third difference, long-term impact is far more important than any short-term benefits of financial literacy training. It seems problematic to suggest that schools should spend already scarce resources on programs that offer only short-term benefits, particularly when alternatives (more mathematics classes) could address the same issue more effectively while providing other high-value benefits. The edge goes to the second study here as well.
I'm Skeptical
Despite this, I remain skeptical of the assertion that financial literacy training is not effective. It intuitively does not make sense to me. And I certainly would not suggest that someone with the opportunity to take such courses pass them up.
But my opinion is evolving. When I first read the study's conclusions a couple of days ago, I immediately looked for ways to explain them away. Today I'm willing to entertain the idea that financial literacy training may not be particularly effective, and that these programs may be a poor use of resources in our schools.
Challenging my preconceptions and learning: that's what this whole blogging voyage is about.
Let me know your thoughts...
##
Photo courtesy of ken2754@yokohama
I think the inefficiencies in financial literacy training stem from its focus on math and logic. However, I feel that a majority of our decisions are not based on logic, but rather on behavioral tendencies. I see the future of financial literacy training focusing more on behavioral science coupled with math to encourage smarter financial decisions.
I think there is likely truth in your comment, Todd. Even looking at the time frame from which the research project pulled data, it's clear the financial literacy programs would have been unlikely to have any behavioral element at all. We didn't really know about that piece yet!
This is purely anecdotal (at best), but I wonder how effective knowledge of behavioral biases would be in changing behaviors. Just considering my own decision-making, I'm fully aware of many behavioral biases yet catch myself having fallen for them all the time! The problem is, I can only rarely identify the bias on the front end. It's generally afterward, as I think about the decision I made.
Perhaps there's some research out there that can offer some insight into the effectiveness of behavioral based training.
Thanks, Nathan. If I locate any research that supports this discussion, I would be happy to send it your way.
There is a flaw in the conclusions of studies based on real-life data. It's best illustrated by this analogy: we are teaching people how to swim, then throwing them into a shark tank and concluding the swimming lessons aren't working.
Part of financial literacy in the US needs to include how to identify, avoid, and repel the sharks (which are not always people, but institutions and social pressures too). Creators and teachers of financial literacy curricula need to include financial self-defense along with the literacy.
As for the idea that financial literacy training doesn't work, I don't think so. A quick learning process can help people reach sound decisions sooner. Some people never buy a car because they simply cannot afford one, or because they live in cities where public transportation and conveniently located shops, schools, and businesses make having a car a luxury, not a necessity; their finances simply aren't enough to take that on. There are financial issues to resolve before you go car shopping, and a quick e-learning program can help resolve them.