
Thursday, January 31, 2013

Financial Literacy Training DOESN'T Work?

Earlier this week, I posted some thoughts and information regarding University of Munich research published last year studying the efficacy of financial literacy training. To sum up quickly, that research suggested a strong correlation between financial literacy training for high-school students and improved financial decision making, at least in the short term.

As I prepared to publish that post, I received a Google Alert for another research paper on exactly the same topic. I gave it a quick skim, and the conclusions were astonishing. This new paper suggested that financial literacy training produced no improvement in financial decision making! Everything I'd just read and written might have been moot. Naturally, I had to dive in and learn more.

Financial Literacy Programs In Doubt

The research paper, High School Curriculum and Financial Outcomes: The Impact of Mandated Personal Finance and Mathematics Courses by Shawn Cole, Anna Paulson, and Gauri Kartini Shastry out of the Harvard Business School, took a different approach to determining the efficacy of financial literacy training than the earlier paper. Instead of creating controlled, small-scale experiments to test efficacy as the German study did, this new paper reviewed large sources of publicly available statistical information. This alone may account for some of the difference. Two major conclusions were drawn, which I paraphrase below:
1) State-mandated financial literacy training programs showed no statistically significant improvement in asset accumulation or credit management for students who took the courses, as compared to students who completed high school immediately before the state mandate began.
2) Simply taking more math courses did increase asset accumulation and improve credit management.
So what's going on here? How can two studies released within only a month of one another come to such dramatically different results? And what does it mean for people who engage in financial literacy training and engage policy-makers in discussions about including more such training?

Some Differences

Clearly, one major difference is the country the research was performed in. The earlier study involved school children in Germany. This second study reviewed data available in the United States. I submit that it's possible that the German financial literacy program was simply better!

Another major difference: the German study was experimental in design. It was performed on a small sample and included a control group, and the results of the two groups were compared in a very controlled fashion. This second study was a statistical analysis of publicly available information, not a controlled experiment. Might this difference suggest that the results seen in an experimental setting wouldn't translate to real life?

A third, and very crucial difference, is time frame. The earlier, German study was by design focused on very short-term impact and on self-reporting of financial change. This second study was very long-term in nature and based on purely objective measures of financial success.

So, Does Financial Literacy Training Work?

This second study has begrudgingly left me questioning the value of financial literacy programs, at least those implemented in high schools. Like many financial planners, I've long been a big proponent of providing more financial literacy education: to children, to adults, to anyone willing to listen. Yet, having read and thought about this study, I find its results very compelling.

Consider the differences. It's possible the German financial literacy program was better than every program in the second study. However, the second study analyzed the impact of many programs administered in a variety of states over a variety of time periods, so variability in program quality is already captured in its data, and it revealed no differences. Unless the German program crossed some threshold of quality at which efficacy increases dramatically, this doesn't seem like a good explanation. I'd call this a draw.

The second difference may be even more damning. The German study was experimental and controlled in a lab-like environment. If the results in that controlled environment don't translate to real-life impact, it would stand to reason that the experiment was somehow flawed or simply did not model real life appropriately. Real life is where it really matters. Real life (and the second study) wins this round.

And for the third difference, long-term impact is far more important than any short-term benefits of financial literacy training. It seems problematic to suggest that schools should spend already scarce resources on programs that offer only short-term benefits, particularly when other programs (more mathematics classes) could address the same issue more effectively while providing other high-value benefits. The edge goes to this second study here, as well.

I'm Skeptical

Despite this, I remain skeptical of the assertion that financial literacy training is not effective. It intuitively does not make sense to me. And I certainly would not suggest that someone with the opportunity to take courses pass on this type of program.

But my opinion is evolving. When I first read the study conclusions a couple days ago, I immediately looked for ways to explain them away. Today I'm willing to entertain the idea that financial literacy training may not be particularly effective, and that these programs may be a poor use of resources in our schools.

Challenging my preconceptions and learning, that's what this whole blogging voyage is about.

Let me know your thoughts...

##

Photo courtesy of ken2754@yokohama

Tuesday, January 29, 2013

Financial Literacy Training Actually Does Work!

Financial literacy is one of those things that gets tons of lip service and media attention. Yet, nothing ever seems to come of it. Financial literacy education does not find its way into schools nor do any major government initiatives ever launch. Some new research out of my home country of Germany might be just the thing to help give this kind of education a kick in the tuckus!

Significant Impact

Research performed by Melanie Lührmann, Marta Serra-Garcia, and Joachim Winter out of the University of Munich Department of Economics suggests that financial literacy programs administered to teenage children had significant impact in a variety of areas. Their paper, The effect of financial literacy training: Evidence from an experiment with German high-school children, draws several conclusions that make a compelling case that financial literacy programs can be highly effective.

They sum up their work with the following statement:

While the jury is still out when it comes to the long-run behavioral impacts of financial literacy training for high-school students, the results of this study show that one such program is quite successful in raising teenagers' interest in financial matters and their subjective knowledge. Along with the objective knowledge and hypothetical behavior changes that we can already document over very short time horizons, these findings suggest that even a relatively short financial literacy training has the potential to help teenagers become more informed and sovereign consumers.

The study details a variety of ways financial literacy training positively impacts teenage children. These included an increased awareness that the purpose of advertising is to sell products, increased interest in and awareness of financial topics, and improved financial knowledge. Participants were also less likely to rate themselves as impulse shoppers.

Maybe Not Surprising 

This may come as no surprise to many readers. It would stand to reason that financial literacy training would have some positive impact on individual financial knowledge and decision-making. Yet, it is terribly difficult to effect change and to gain the ear of policy-makers without strong research showing that a benefit actually exists.

This study offers just such research and suggests that financial literacy training may have the impact many people have stated it would have. If you've been pushing for financial literacy education for your children, or if you're a financial planner who offers this type of education, use this research as support for your position. It's one thing to say you think it will be beneficial, but entirely another thing to have academic research supporting that claim.

Update: only moments before hitting “Publish” on this blog post, I received a Google Alert with another piece of research which seems to contradict this study, suggesting that financial literacy training actually does NOT have much impact. I’ll be reviewing and offering some thoughts on that research in this space in the near future.

Tuesday, January 22, 2013

Good Money, Bad Grades?

"I want to pay for my children to go to college."

It's a common goal clients share with financial planners. And planners generally run some calculations and, based on a variety of assumptions, determine the amount that needs to be saved in order to meet that goal. But recent research suggests doing so may not actually be helping clients' children. In fact, parents fully paying for a child's education could actually be harmful to the child's progress in college!

We Seem To Know This Somehow

There's been plenty of anecdotal evidence to suggest that children who have to fund part of their college educations perform better. I've discussed this with other planners in the past. Heck, I can speak from personal experience. My GPA rose dramatically (much more than indicated in the study below) once my parents limited the amount they paid toward my college education and I began working 20+ hours per week to make up the difference! But using anecdotal evidence to guide clients is a poor practice.

Research Confirms It

Now we have a strong, research-based indication that parents should allow space for their children to put some of their own money and effort "at risk" while attending college. A report in the Associated Press, Study: Parental support sends down college GPA, reviews a study that indicates greater amounts of parental funding are correlated with a lower average GPA for students. The results:
"...parents not giving their children any aid predicts a GPA of 3.15. At $16,000 in aid, GPA drops under 3.0. At $40,000, it hits 2.95..."
As noted in the article, it's important to remember that parents funding their children's college education has been shown to greatly increase the likelihood that a child actually attends and completes college. Financial planners who encourage clients not to save for college at all certainly would not be helping clients who want to see their children attend college.

Just Raise The Issue

However, financial planners may want to raise with clients the topic of how much to fund and what impact that funding may have on their children's academic outcomes. While it's not the role of a financial planner to try to shape a client's goals unless specifically asked to do so, a planner can (and likely should) help a client understand the impact of their financial decisions. This potential impact on college performance is just such an opportunity to educate our clients.

As financial planners, we have a duty to help our clients understand the numbers and help them make financial decisions to reach their goals. However, I strongly believe that we also have a duty to help our clients see beyond the numbers, to help them understand the impact their financial decisions have on a variety of factors in their lives beyond simple mathematical ones. Financial decisions can impact emotional and physical health. They can increase the likelihood of depression, or lead to more risk-taking.

Maybe a financial decision will lead to a child doing more poorly in college than expected.

Monday, January 14, 2013

When Nudges Don't Nudge


The idea of the nudge, as popularized by Richard H. Thaler, is an intriguing and exciting one for financial planners. It offers us a method to help our clients make good, but sometimes challenging, financial decisions they might not otherwise make. It gives financial planners a powerful tool in shaping behavior.

But what if nudges aren't universally applicable? What if sometimes a nudge doesn't do the trick? What if our reliance on nudges causes us to miss other opportunities to help our clients? What if sometimes a nudge doesn't nudge at all?

The Failed Nudge

A recent small-scale study illustrated that sometimes a nudge, in this case an opt-out default nudge, doesn't have any major impact. This study, A Nudge Isn't Always Enough, examines the likelihood that a group of low-income tax filers would direct a portion of their tax refunds into savings bonds.

The first group was given paperwork with the option to direct some of their refund into a savings bond program; it took a proactive election for this group to save money. The second group, the nudged group, was given paperwork that included a default option sending a portion of their refund to a savings bond. No extra work was required on their part, and they actually had to opt out of saving money. Research on nudges and defaults suggests there should have been some increase in the frequency or amount of saving in the nudged group compared to the first group.

The result...no difference in participation rates in the savings bond program. The group without the nudge and the group with the nudge participated in the savings bond program about 9% of the time. And the amount saved was no different between the two groups. The nudge had no impact.
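For readers who like to see the arithmetic behind "no difference," here is a minimal sketch of how one would check whether two participation rates like these actually differ. The roughly 9% rates mirror the result described above; the group sizes and exact counts are hypothetical placeholders I've invented purely for illustration, not figures from the study.

```python
# Hypothetical illustration: a simple two-proportion z-test comparing
# participation in a "no nudge" group and a "default nudge" group.
# The ~9% rates mirror the study's result; the sample sizes are made up.
from math import sqrt
from statistics import NormalDist

n_control, n_nudged = 1_000, 1_000    # hypothetical group sizes
saved_control, saved_nudged = 90, 92  # ~9% participation in each group

p1, p2 = saved_control / n_control, saved_nudged / n_nudged
p_pool = (saved_control + saved_nudged) / (n_control + n_nudged)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_control + 1 / n_nudged))
z = (p1 - p2) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"participation: {p1:.1%} vs {p2:.1%}, z = {z:.2f}, p = {p_value:.2f}")
```

With rates that close, a test like this finds nothing, which is consistent with the study's conclusion that the default alone didn't move the needle.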

We Do Still Choose

The study offers some thoughts about why this nudge failed. These include that the tax filers already had plans to spend their refunds and didn't want to change those plans. A second theory was that, because the savings bonds were locked in for one year based on this one-time decision, participants were hesitant to act. A third suggested financial circumstances were involved.

I suspect that given the financial demographics of this group (low-income), the reality was that many had already spent the refund they anticipated receiving. This was a group unlikely to have much discretionary income, one relying on a tax refund to cover certain necessary expenses: rent, groceries, and so on. For this group, had the nudge worked on a massive scale, it may have actually put people in a worse financial position. Those nudged into saving may have had to cover expenses already incurred with even less ideal resources: credit cards, payday loans, or some other means.

Nudge, Nudge, Wink, Wink

Nudges can be tremendous choice architecture devices that can be employed by financial planners and policymakers to shape behavior. But it must be understood that nudges don't work in all circumstances, and that sometimes a good nudge could actually result in a bad decision.

So, nudge away, but know that it doesn’t always have the intended outcome.

Wednesday, January 9, 2013

Why This Dollar May Be Worth More Than Other Dollars


Once again, I've run across research on financial decision-making that displays just how little about how we make decisions could be considered "common sense." New research suggests that the physical appearance of the money we have alters our spending behavior. We value clean, crisp bills more highly than worn-out, dirty bills. What's more, we spend more easily if the bills in our wallets are worn out and dirty than if they are clean and crisp! Not common sense.

A $20 bill has the exact same purchasing power whether it's brand spanking new out of the local ATM or has been sitting in a piggy bank, crumpled up and played with by a 2-year-old eating candied apples. Yet research published in the Journal of Consumer Research by Fabrizio Di Muro and Theodore J. Noseworthy illustrates that we do not treat those two bills the same. In their research paper, Money Isn't Everything, but It Helps If It Doesn't Look Used: How the Physical Appearance of Money Influences Spending, the authors identify three ways the physical appearance of money impacts us.

The first finding suggests that we spend money more freely when our bills are worn and spend less when we have crisp bills. The reason, implied in the research, that we look to get rid of dirty money? We have a tendency to view dirty, worn money as "contaminated," and we want to divest ourselves of that contaminant as quickly as possible. Further, we actually take pride in carrying clean, crisp bills and having them available for use in social situations (more on this in finding three).

The second finding suggests that this dirty-money phenomenon is so strong it can counteract another spending phenomenon. That other phenomenon, studied elsewhere, shows that, given the choice, we use smaller bills and exact change to make a purchase instead of breaking a larger bill. Yet, when faced with a dirty larger bill and clean smaller bills, we have a tendency to break the larger bill in order to get rid of the contaminant!

The third finding may be the most interesting of all. It suggests that while we initially want to hold on to clean, crisp bills and will spend less when carrying them, we will actually spend crisp bills more freely when we are being socially observed. We want to show off the crisp bills we've held on to in a social context. We become proud to spend these beautiful bills and make sure others get to see them!

What bizarre, interesting findings. What a clear illustration that financial decision-making is not common sense! The economic value of the bills remains the same. The cost of our spending remains the same. Yet the condition of our physical money has an impact on how we make financial decisions.

The implications are interesting. Do you ever grab your change and stuff it in a pocket or purse? You may have just made that money easier to spend. Going to a bar with friends? Bring crumpled-up, old bills if you want to spend less. Other times, take those worn bills to the bank and exchange them for crisp, clean ones, as long as you'll be spending them anonymously. Maybe…

Wednesday, January 2, 2013

Limiting Choices and Details in a Full Disclosure World

As financial planners, we live in a world where we are required to give our clients full disclosure on conflicts of interest and to make certain we do not omit vital information and details when providing advice. But what if those requirements actually lead our clients to potentially make worse decisions? What if more information leads to poorer financial decisions?

An interesting article by Ron Friedman, Ph.D., in Psychology Today makes precisely this assertion. Friedman writes:

Imagine that you are a loan officer at a bank reviewing the mortgage application of a recent college graduate with a stable, well-paying job and a solid credit history. The applicant seems qualified, but during the routine credit check you discover that for the last three months the applicant has not paid a $5,000 debt to his charge card account.
Do you approve or reject the mortgage application?

That scenario, with the exact $5,000 debt specified, is what the study's first group of participants (Group 1) was asked to evaluate. Group 2 saw the same paragraph with one crucial difference. Instead of learning the exact amount of the student's debt, they were told there were conflicting reports and that the size of the debt was unclear: it was either $5,000 or $25,000. Participants could decide to approve or reject the applicant immediately, or they could delay their decision until more information was available clarifying how much the student really owed. Not surprisingly, most Group 2 participants chose to wait until they knew the size of the debt.

Here's where the study gets clever. The experimenters then revealed that the student's debt was only $5,000. In other words, both groups ended up with the same exact information. Group 2 just had to go out of its way and seek it out.

The result? 71% of Group 1 participants rejected the applicant. But among Group 2 participants who asked for additional information? Only 21% rejected the applicant.

More information changed the decisions study participants made. The additional information didn't change the underlying facts, yet it dramatically altered the analysis of the decision, likely for the worse. But what does this mean in the context of financial planning? Can financial planners select what information to provide to clients and what information to leave out? Can the choices given to clients be limited in order to reduce the information overload that might cause clients to make a poor decision?

I suspect that in practice financial planners do this all the time. Certain information is deemed inconsequential or distracting by the financial planner, likely without even a conscious decision. Certain options are clearly so detrimental or unsuitable that the financial planner never even considers them for the client.

But I wonder if it would be a mistake for a financial planner to deliberately withhold information from clients based on a belief that some information might lead to a poor decision. Legal issues aside, is a financial planner equipped to determine which information is valuable and helpful and which is harmful? Worse yet, by withholding information, are a planner's personal biases and money scripts shaping the decision about what to share and what to withhold?

Instead of acting as gatekeepers of information, I suggest planners help clients understand that more information isn't always better. Planners can help clients understand the impact too much information can have on decision-making. Planners can help clients recognize when they are digging exceptionally deeply for information, and help them reflect on whether that information will actually improve the decision or may, in fact, harm it.

It seems clear that too much information is detrimental to good financial decision-making. And, at some level, financial planners must limit the amount of information they give clients, if for no other reason than a lack of time and patience (both on the part of the planner and client) to work through every piece of tangentially relevant information. But how do planners know what to share and what to hold back? What is critical for a good decision and what is harmful? How does a financial planner know when to caution a client that more information may be detrimental?

I suspect the answer lies somewhere in the 10,000 hours of practice that Malcolm Gladwell's Outliers identifies as required to master a skill. There may be no specific process or procedure for determining this, only learning through experience, observation of a mentor, and practice.