We've seen that there are huge differences in performance on the California marriage and family therapy licensing exams based on what graduate program the test-takers attended. We've also seen that for-profit MFT programs should not be dismissed simply because they aim to make money; Argosy graduates do particularly well on the exams, while University of Phoenix graduates do not. I've said before, though, that there are lots of things to consider when choosing a graduate program in MFT, and that graduates' exam performance should be only one of many such considerations. Indeed, there are some major problems with putting too much stock in exam data.

If you are looking at graduate programs, and are concerned about your prospective MFT program's exam pass rate, here are three reasons why you may want to ignore the exam data:
- Programs can and do improve. Exam data reflects students who graduated years earlier. Remember, it takes the average MFT intern in California more than four years to move from graduation to the licensing exams. That number is a bit lower for graduates of COAMFTE-accredited programs, primarily because they do more practicum hours while still in school. Nonetheless, if you are looking at MFT licensing exam data from 2009 and earlier (the link will take you to a searchable database of California programs), you will find very little information on anyone who graduated much past 2005, and nothing to tell you which programs have gotten better or worse since then. Consider the recently-COAMFTE-accredited programs at Chapman University and Hope International University. Their national accreditation should arguably make them more appealing (and thus competitive) programs for prospective students and faculty alike. That's important, and simply is not reflected in currently-available exam data.
- Programs seek to give students opportunities. Consider for a moment the state's worst-performing program, according to this table that appeared in Part I of this series: Pacific Oaks College. Based on the pass-rate statistic alone, one might presume that the Pacific Oaks program is not very good. But that conclusion can't safely be drawn from that data. Pacific Oaks, over the past few years, has specifically sought to provide opportunities to historically underserved populations, creating cohorts specifically for African-American Family Therapy and Latino Family Therapy. (This outreach is vital: Lots of evidence suggests that the mental health workforce is not meeting the needs of minority populations, either in California or around the US.) Students in these cohorts may lack the family, economic, and social support, as well as the earlier educational opportunities, that other students often have. Pacific Oaks goes to great lengths to remediate these earlier deficits, and may be doing more, with less, than programs that start with more economically- and educationally-advantaged students. When financial and accreditation concerns threatened to close the Pacific Oaks program in 2009, I was one of many who stood up in defense of keeping it alive, and I have no reservations about having done so.
- Programs have no control over what students do after graduation. A program can really only control what happens from the time students enter the program to the time they graduate -- and even then, programs have limited control over how well their students prepare themselves. A great supervisor can help an MFT Intern/Associate make up for deficiencies in their education, and help get them ready for licensing exams. Poor supervision may leave the Intern/Associate on their own to prepare, or even offer incorrect information that ultimately harms their chances of passing the exams. And of course, programs have no control over whether their graduates use MFT exam prep programs, although there is little evidence that these prep classes actually improve MFT exam pass rates.