As I prepared to publish that post, I received a Google Alert for another research paper on exactly the same topic. I gave it a quick skim, and the conclusions were astonishing. This new paper suggested that financial literacy training showed no improvement in financial decision-making! Everything I'd just read and written might have been moot. Naturally, I had to dive in and learn more.
Financial Literacy Programs In Doubt
The research paper, High School and Financial Outcomes: The Impact of Mandated Personal Finance and Mathematics Courses by Shawn Cole, Anna Paulson, and Gauri Kartini Shastry out of the Harvard Business School, took a different approach to determining the efficacy of financial literacy training than the earlier paper. Instead of creating controlled, small-scale experiments to test efficacy as the German study did, this new paper reviewed large sources of publicly available statistical information. This alone may account for some of the difference. Two major conclusions were drawn, which I paraphrase below:
1) State-mandated financial literacy training programs showed no statistically significant increase in asset accumulation or credit management for students who took the courses as compared to students who completed high school immediately before the state mandate began.
2) Simply taking more math courses did increase both asset accumulation and credit management.

So what's going on here? How can two studies released within only a month of one another come to such dramatically different results? And what does it mean for people who engage in financial literacy training and engage policy-makers in discussions about including more such training?
Clearly, one major difference is the country in which the research was performed. The earlier study involved school children in Germany. This second study reviewed data available in the United States. I submit that it's possible the German financial literacy program was simply better!
Another major difference: the German study was experimentally focused. It was performed on a small sample and included a control group, and the results of these two groups were compared in a very controlled fashion. This second study was performed through statistical analysis of publicly available information, not as a controlled experiment. Might this difference suggest that results seen in an experimental setting wouldn't translate to real life?
A third, and crucial, difference is time frame. The earlier German study was by design focused on very short-term impact and on self-reporting of financial change. This second study was long-term in nature and based on purely objective measures of financial success.
So, Does Financial Literacy Training Work?
This second study has begrudgingly left me questioning the value of financial literacy programs, at least those implemented in high schools. Like many financial planners, I've long been a big proponent of providing more financial literacy education to children, to adults, to anyone willing to listen. Yet, in reading and thinking about this study, I find the results very compelling.
Consider the differences. It's possible the German financial literacy program was better than every program in the second study. However, the second study analyzed the impact of many programs administered in a variety of states over a variety of time periods. Variability in program quality was already captured in that study without revealing any differences in outcomes. Unless the German program had crossed some threshold level of "better" where efficacy increases dramatically, this doesn't seem to be a likely explanation. I'd call this a draw.
The second difference may be even more damning. The German study was experimental and controlled in a lab-like environment. If the results in that controlled environment don't translate to real-life impact, it would stand to reason that the experiment was somehow flawed or simply did not model real life appropriately. Real life is where it really matters. Real life (and the second study) wins this round.
And for the third difference, long-term impact is far more important than any short-term benefits of financial literacy training. It seems problematic to suggest that schools spend already scarce resources on programs that only offer short-term benefits, particularly when an alternative (more mathematics classes) could address the same issue more effectively while providing other high-value benefits. The edge goes to the second study here as well.
Despite all this, I remain skeptical of the assertion that financial literacy training is not effective. It intuitively does not make sense to me. And I certainly would not suggest that someone with the opportunity to take such courses pass on them.
But my opinion is evolving. When I first read the study's conclusions a couple of days ago, I immediately looked for ways to explain them away. Today I'm willing to entertain the idea that financial literacy training may not be particularly effective, and that these programs may be a poor use of resources in our schools.
Challenging my preconceptions and learning: that's what this whole blogging voyage is about.
Let me know your thoughts...
Photo courtesy of ken2754@yokohama