Reading all these comments, I was just wondering out of interest whether the UMAT actually predicts good performance in medical school -- apparently, maybe not.
MedEntry responded vigorously to that one, saying that the study doesn't look at the more important question: a correlation between the UMAT and "good" doctors. In reality, MedEntry's response is skewed: while there is no evidence to suggest that the UMAT is pointless, there is also no evidence to suggest that it is beneficial when it comes to producing good doctors. Surprisingly, no one seems to have actually evaluated the process, despite decades of testing and coaching.
From a layman's search (Google), it looks like ACER based the UMAT on the psychometric testing literature rather than on any study of the test's actual benefit to the medical community. Also, although this is old (2007), line 24 says that they do try to limit gains from coaching/tutoring, which sounds like what has happened this time around.