The primary goal of any veterinary educational program is to produce graduates who become highly competent, successful, and satisfied professionals. To that end, veterinary schools seek to design their admissions and instructional practices in ways that help graduates obtain licensure and succeed in their subsequent careers. To design effective admissions and teaching processes, it is useful for colleges to understand how measures of knowledge or aptitude used for admissions or during veterinary school relate to subsequent success or failure. Thus, the present study was designed to assess relationships among several common measures of performance prior to and during veterinary school and scores on the NAVLE. The NAVLE is an electronically administered, 360-item, multiple-choice examination that is required for veterinary licensure in the United States and Canada. Veterinary schools have great interest in NAVLE scores because the NAVLE score directly determines a graduate's opportunity to gain licensure as a veterinary medical practitioner and because the NAVLE is the only currently available standardized measure of clinical ability used by all veterinary colleges in the United States and Canada. Measures of student performance evaluated in the present study consisted of GRE scores, UGPA, QE scores, and VGPA.
The National Board of Veterinary Medical Examiners, which administers the NAVLE, did not begin making raw NAVLE scores available to accredited veterinary schools until 2009. As a result, reviews of medical and general education scholarly indexes revealed no published studies exploring the relationship between NAVLE scores and other measures of performance prior to or during participation in the veterinary curriculum. Therefore, we reviewed literature from veterinary medical education and closely related areas to form hypotheses regarding relationships among these predictors and NAVLE scores.
The GRE measures knowledge and cognitive skills thought to predict aptitude for graduate study, and GRE score is a significant predictor of performance in master's and doctoral degree programs across a variety of disciplines.1 The 2009 comparative data report, shared among AVMA-accredited colleges of veterinary medicine, revealed that, of the 28 US veterinary schools and colleges, all but 3 reported GRE scores (verbal, quantitative, and analytical), whereas none of the 5 Canadian schools reported GRE scores. Only 1 US veterinary college reported the biology subject test score. A search of the literature revealed few references to the GRE's ability to predict performance in medical science programs, primarily because human medical schools use instruments such as the Medical College Admission Test or the Graduate Australian Medical School Admissions Test, which are specifically designed to predict performance in medical school. Two studies2,3 exploring the predictive power of GRE scores in veterinary medicine provided mixed results, with overall GRE score and scores on the various GRE subsections predicting VGPA to varying degrees. In a study2 involving 8 academic classes at Oklahoma State University, Confer found that scores for the GRE subsections (analytical, quantitative, or verbal) significantly predicted VGPA in most years; however, no particular subsection was a significant predictor across all years. In a subsequent study,3 Confer and Lorenz found that scores for the GRE biology subject test were a significant predictor of VGPA. Because GRE scores generally are positively associated with performance in graduate programs and are predictive of VGPA, we hypothesized that GRE scores would be predictive of all subsequent measures explored in our study, including QE scores, VGPA, and NAVLE scores.
Undergraduate grade point averages are widely available to schools and colleges of veterinary medicine and have been shown to predict grade point average in veterinary school2,3 and in similar settings such as colleges of medicine4–7 and colleges of osteopathic medicine.8 In human medical education, UGPA has also been shown to predict performance on licensing examinations and postgraduate clinical competency.4,5 Therefore, we expected that UGPA would be positively associated with subsequently measured variables, including VGPA, QE scores, and NAVLE scores.
Grades during medical school have been shown to predict subsequent clinical performance, as measured during clinical encounters (within the curriculum and during residency), as well as licensing examination scores.6,9 As a result, we hypothesized that VGPA would positively predict performance on the NAVLE, as measured by NAVLE score.
The QE is a 300-item, multiple-choice examination designed to assess basic proficiency in the areas of veterinary anatomy, physiology, pharmacology, microbiology, and pathology. It was created by the National Board of Veterinary Medical Examiners, the same organization that administers the NAVLE, to be administered to graduates of programs not accredited by the AVMA, as part of the PAVE. The PAVE is accepted by 31 states and territories in the United States as evidence that the educational component of licensure requirements has been met, and PAVE participants must pass the QE before they are allowed to proceed to the assessment of clinical skills proficiency. As is standard for commercially available standardized examinations, the QE has a number of forms, with different forms used each time the examination is administered. Content experts are employed to create items, and new items are reviewed prior to their incorporation in the examination. Care is taken to ensure that the psychometric properties of the test are equivalent from form to form and year to year.
A search of medical and general education research indexes revealed no studies exploring the relationship between QE scores and other predictors of academic, clinical, or professional performance. However, there are 2 reasons to believe that QE scores would be a good predictor of NAVLE scores. First, across a variety of knowledge domains, strong underlying domain knowledge has been shown to be characteristic of expertise.10 In the medical field, for instance, effective diagnosticians have been shown to have more elaborated underlying knowledge structures11 and more thorough knowledge representations12 than do their less effective counterparts. If basic science knowledge is essential to practicing medicine effectively and the QE measures basic science knowledge while the NAVLE measures clinical readiness, then QE scores should predict NAVLE scores. Second, the QE is similar to an examination used to measure basic science knowledge in human medical education that has been shown to predict subsequent clinical knowledge. That examination, step 1 of the US Medical Licensing Examination, assesses whether students understand and can apply important concepts of the sciences fundamental to the practice of medicine. The step 1 examination has been shown to predict performance on subsequent measures of clinical performance or clinical readiness, including clerkship performance6 and clinically oriented licensing examinations.9 Because the QE is similar to step 1 of the US Medical Licensing Examination in purpose and because the NAVLE is intended to measure clinical readiness, we hypothesized that QE scores would be predictive of NAVLE scores.
Overall, therefore, we hypothesized that GRE scores would directly predict QE scores, VGPA, and NAVLE scores and that GRE scores would also indirectly predict NAVLE scores through QE scores and VGPA. Similarly, we hypothesized that UGPA would directly predict QE scores, VGPA, and NAVLE scores and would also indirectly predict NAVLE scores through QE scores and VGPA. We also hypothesized that QE scores and VGPA would directly predict NAVLE scores. Because QE scores and VGPA reflect ability and knowledge specific to veterinary medicine, we hypothesized that they would be stronger predictors of NAVLE scores than UGPA or GRE scores.
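For clarity, these hypothesized relationships can be summarized as a recursive path model; the equations below are an illustrative shorthand (the coefficient symbols are our own notation and are not drawn from the study's analysis output):

\[
\begin{aligned}
\text{QE} &= \beta_{1}\,\text{GRE} + \beta_{2}\,\text{UGPA} + \varepsilon_{1}\\
\text{VGPA} &= \beta_{3}\,\text{GRE} + \beta_{4}\,\text{UGPA} + \varepsilon_{2}\\
\text{NAVLE} &= \beta_{5}\,\text{GRE} + \beta_{6}\,\text{UGPA} + \beta_{7}\,\text{QE} + \beta_{8}\,\text{VGPA} + \varepsilon_{3}
\end{aligned}
\]

Under this formulation, the hypothesized indirect effect of GRE scores on NAVLE scores is \(\beta_{1}\beta_{7} + \beta_{3}\beta_{8}\) (and, analogously, \(\beta_{2}\beta_{7} + \beta_{4}\beta_{8}\) for UGPA), and the total effect of each admission measure is its direct coefficient plus this indirect effect.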
Many studies exploring relationships among common predictors of academic performance and subsequent outcomes use multiple regression analysis. For the present study, we decided instead to use path analysis because it provides several advantages over multiple regression analysis.13 First, path analysis can explain the relationships between a set of independent variables (eg, GRE scores or UGPA) and a set of dependent variables (eg, QE scores, VGPA, or NAVLE scores). The path coefficient is similar in meaning to the standardized regression coefficient in a multiple regression analysis and represents the expected change in the dependent variable associated with a 1-unit change in the independent variable, while controlling for other predictors. Second, path analysis can examine indirect effects among variables, which is not possible with multiple regression analysis.13 An indirect effect means that the effect of a predictor on the dependent variable is mediated through the predictor's effect on another variable. For example, in this study, we investigated whether GRE scores and UGPA indirectly affected NAVLE scores through their effects on QE scores and VGPA. Third, path analysis can examine whether hypothesized theoretical models fit the data. In summary, path analysis provides a more complete understanding of the relationships among various variables than does multiple regression analysis.
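To illustrate this logic concretely, the sketch below estimates path coefficients for a recursive model of observed variables by fitting a series of ordinary least squares regressions on standardized data and computing an indirect effect as the product of coefficients along each mediating path. It is a minimal Python illustration with simulated data and hypothetical variable names; it is not the LISREL-based analysis used in the present study, and it omits the model-fit testing that a full path analysis would include.

```python
# Minimal path-analysis sketch (illustrative only; the study itself used LISREL).
# In a recursive model of observed variables, path coefficients equal the
# standardized regression coefficients from a series of OLS fits, and indirect
# effects are products of coefficients along each mediating path.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical simulated data; column names mirror the measures in the study.
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.standard_normal((200, 5)),
                  columns=["GRE", "UGPA", "QE", "VGPA", "NAVLE"])
df = (df - df.mean()) / df.std(ddof=0)  # standardize so coefficients are path coefficients

def paths(dep, predictors):
    """Standardized coefficients from an OLS regression of dep on predictors."""
    X = sm.add_constant(df[predictors])
    return sm.OLS(df[dep], X).fit().params

qe_eq    = paths("QE",    ["GRE", "UGPA"])
vgpa_eq  = paths("VGPA",  ["GRE", "UGPA"])
navle_eq = paths("NAVLE", ["GRE", "UGPA", "QE", "VGPA"])

# Indirect effect of GRE on NAVLE, mediated by QE and VGPA, plus the total effect.
indirect_gre = qe_eq["GRE"] * navle_eq["QE"] + vgpa_eq["GRE"] * navle_eq["VGPA"]
total_gre = navle_eq["GRE"] + indirect_gre
print(f"GRE -> NAVLE  direct: {navle_eq['GRE']:.3f}  "
      f"indirect: {indirect_gre:.3f}  total: {total_gre:.3f}")
```

With real data, the same decomposition would typically be obtained from structural equation modeling software (such as the LISREL program cited in the footnotes), which additionally provides the overall model-fit statistics mentioned above.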
GRE Graduate Record Examination
ISU Iowa State University
NAVLE North American Veterinary Licensing Examination
PAVE Program for the Assessment of Veterinary Education Equivalence
UGPA Undergraduate grade point average
UMN University of Minnesota
VGPA Overall grade point average during veterinary school
LISREL 8.54, Scientific Software International Inc, Lincolnwood, Ill.
SPSS 17.0, SPSS Inc, Chicago, Ill.
Kuncel NR, Wee S, Serafin L, et al. The validity of the Graduate Record Examination for master's and doctoral programs: a meta-analytic investigation. Educ Psychol Meas 2010; 70: 340–352.
Confer AW. Preadmission GRE scores and GPAs as predictors of academic performance in a college of veterinary medicine. J Vet Med Educ 1990; 17: 56–62.
Confer AW, Lorenz MD. Pre-professional institutional influence on predictors of first-year academic performance in veterinary college. J Vet Med Educ 1999; 26: 16–20.
Collin VT, Violato C, Hecker K. Aptitude, achievement and competence in medicine: a latent variable path model. Adv Health Sci Educ Theory Pract 2009; 14: 355–366.
Ferguson E, James D, Madeley L. Factors associated with success in medical school: systematic review of the literature. BMJ 2002; 324: 952–957.
White CB, Dey EL, Fantone JC. Analysis of factors that predict clinical performance in medical school. Adv Health Sci Educ Theory Pract 2009; 14: 455–464.
Wilkinson D, Zhang J, Byrne GJ, et al. Medical school selection criteria and the prediction of academic performance. Med J Aust 2008; 188: 349–354.
Evans P, Wen FK. Does the Medical College Admission Test predict global academic performance in osteopathic medical school? J Am Osteopath Assoc 2007; 107: 157–162.
Hamdy H, Prasad K, Anderson MB, et al. BEME systematic review: predictive values of measurements obtained in medical schools and future performance in medical practice. Med Teach 2006; 28: 103–116.
Pretz JE, Naples AJ, Sternberg RJ. Recognizing, defining, and representing problems. In: Davidson JE, Sternberg RJ, eds. The psychology of problem solving. Cambridge, England: Cambridge University Press, 2003; 3–30.
Chang RW, Bordage G, Connell KJ. The importance of early problem representation during case presentations. Acad Med 1998; 73: S109–S111.
Mertler C, Vannatta R. Advanced and multivariate statistical methods: practical application and interpretation. 3rd ed. Glendale, Calif: Pyrczak Publishing, 2005.
Byrne BM. Structural equation modeling with LISREL, PRELIS, and SIMPLIS: basic concepts, applications, and programming. Mahwah, NJ: Erlbaum, 1998.
Yates J, James D. Predicting the “strugglers”: a case-control study of students at Nottingham University Medical School. BMJ 2006; 332: 1009–1013.
Satorra A, Bentler PM. Scaling corrections for chi-square statistics in covariance structure analysis, in Proceedings. Bus Econ Statist Sect Am Stat Assoc 1988; 308–313.