Search Results

Showing 1–10 of 51 items for Author or Editor: Paul S. Morley


A study was designed to determine whether inactivated bovine respiratory syncytial virus (BRSV) vaccines induce the same types of antibody and cellular responses as a modified-live BRSV vaccine. Ninety mixed-breed, 5- to 6-month-old beef calves were randomly assigned to 1 of 6 groups with 15 animals/group. Calves in 5 of the groups were inoculated on days 0 and 14 with 1 of 4 inactivated virus vaccines or with a modified-live virus vaccine. The remaining 15 calves were maintained as unvaccinated controls. Immune responses were measured on days 0 and 24 by means of ELISA, virus neutralization assay, blocking ELISA for the BRSV fusion (F) protein, immunoblotting, and lymphocyte blastogenesis assay. All vaccines induced production of antibodies that recognized the F protein; however, the ratio of neutralizing antibody titer to change in BRSV-specific IgG antibody concentration (as determined by use of ELISA) was lower for calves that received an inactivated virus vaccine than for calves that received the modified-live virus vaccine. All of the vaccines induced lymphocyte proliferative responses to BRSV. Results suggest that commercially employed inactivation processes can alter functionally important epitopes on BRSV envelope glycoproteins, leading to production of predominantly nonneutralizing antibodies in immunized cattle.

Free access
in Journal of the American Veterinary Medical Association


Objective—To evaluate various sampling strategies for potential use in measuring prevalence of antimicrobial susceptibility in cattle.

Sample Population—500 isolates of non–type-specific Escherichia coli (NTSEC) isolated from the feces of 50 cows from 2 dairy farms (25 cows/farm and 10 isolates/cow).

Procedures—Diameters of inhibition zones for 12 antimicrobials were analyzed to estimate variation among isolates, cows, and farms and then used to determine sampling distributions for a stochastic simulation model to evaluate 4 sampling strategies. These theoretic sampling strategies used a total of 100 isolates in 4 allocations (1 isolate from 100 cows, 2 isolates from 50 cows, 3 isolates from 33 cows, or 4 isolates from 25 cows).

Results—Analysis of variance composition revealed that 74.2% of variation was attributable to isolates, 18.5% to cows, and 7.3% to farms. Analysis of results of simulations suggested that when most of the variance was attributable to differences among isolates within a cow, culturing 1 isolate from each of 100 cows underestimated overall prevalence, compared with results for culturing more isolates per cow from fewer cows. When variance was not primarily attributable to differences among isolates, all 4 sampling strategies yielded similar results.

Conclusions and Clinical Relevance—It is not always possible to predict the hierarchical level at which clustering will have its greatest impact on observed susceptibility distributions. Results suggested that sampling strategies that use testing of 3 or 4 isolates/cow from a representative sample of all animals better characterize herd prevalence of antimicrobial resistance when impacted by clustering.
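The clustering effect described in this abstract can be illustrated with a small stochastic simulation. The sketch below is hypothetical and not the authors' model: the zone-diameter distributions and the resistance cutoff are invented for illustration, with most of the variance placed at the isolate level, as in the reported variance composition.

```python
import numpy as np

rng = np.random.default_rng(42)

def estimated_prevalence(n_cows, isolates_per_cow, trials=1000):
    """Fraction of sampled cows with >= 1 resistant isolate, averaged over trials.

    Hypothetical model: cow-level mean zone diameters vary a little (SD 2 mm)
    while isolate-level diameters vary a lot (SD 4 mm), mirroring the finding
    that most variance lies among isolates within a cow.
    """
    estimates = []
    for _ in range(trials):
        cow_mean = rng.normal(20.0, 2.0, n_cows)               # cow-level means (mm)
        zones = rng.normal(cow_mean[:, None], 4.0,
                           (n_cows, isolates_per_cow))         # isolate-level diameters
        resistant = (zones < 15.0).any(axis=1)                 # invented cutoff: < 15 mm
        estimates.append(resistant.mean())
    return float(np.mean(estimates))

# Both strategies culture 100 isolates total, allocated differently.
one_per_100 = estimated_prevalence(100, 1)   # 1 isolate from each of 100 cows
four_per_25 = estimated_prevalence(25, 4)    # 4 isolates from each of 25 cows
```

Under this model, the 1-isolate strategy flags markedly fewer cows as carrying a resistant isolate, consistent with the abstract's conclusion that sampling 3 or 4 isolates/cow better characterizes herd prevalence when clustering occurs within cows.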

Full access
in American Journal of Veterinary Research


Objective—To assess perceptions of personnel working at a veterinary teaching hospital regarding risks of occupational hazards and compare those perceptions with assessments made by occupational safety experts.

Design—Cross-sectional study.

Study Population—A representative sample of personnel (n = 90) working at the veterinary teaching hospital at Colorado State University and a panel of 3 occupational safety experts.

Procedures—Hospital personnel ranked perceptions of 14 physical, chemical, and biological workplace hazards and listed the injuries, illnesses, and near misses they had experienced. The expert panel provided consensus rankings of the same 14 hazards for 9 sections of the facility. Risk perceptions provided by the 2 sources were compared.

Results—Risk perceptions did not differ significantly between hospital personnel and the expert panel for most of the site-specific comparisons (94/126 [75%]). Personnel perceived greater risks for some physical hazards (loud noises, sharps injuries, and ionizing radiation) and some chemical or materials exposures (insecticides or pesticides and tissue digester emissions). In contrast, the expert panel perceived greater risks for physical hazards (bite or crush and restraining and moving animals), chemical exposures (anesthetic waste gas), and biological exposures (Toxoplasma gondii, antimicrobial-resistant bacteria, and allergens).

Conclusions and Clinical Relevance—Participants and safety experts had similar perceptions about occupational risks, but there were important differences where hospital personnel apparently overestimated or underappreciated the risks for workplace hazards. This type of study may be useful in guiding development of optimal workplace safety programs for veterinary hospitals.

Full access
in Journal of the American Veterinary Medical Association


Objective—To examine the effects of perinatal vaccination on cellular and humoral responses in cows and on passive transfer of antibodies and cells to calves, and to assess the role of maternal antibodies in vaccination responses of neonatal calves.


Design—Prospective randomized controlled trial.


Animals—52 beef cows and their calves.


Procedure—Assigned cows were vaccinated twice during the last month of gestation. Assigned calves were vaccinated at day 10 after birth. Antibody concentrations and cellular responses to bovine respiratory syncytial virus (BRSV) and bovine herpesvirus type 1 (BHV-1) were measured in blood and colostrum of cows and in blood of calves. Calves were assessed for passive transfer of lymphocytes.


Results—At parturition, serum antibody concentrations to BRSV as well as BHV-1- and BRSV-specific blastogenic responses were significantly higher in vaccinated cows. After birth, calves from vaccinated cows had significantly higher concentrations of BRSV-specific serum antibodies, but not BHV-1-specific antibodies. Calves did not develop delayed-type hypersensitivity responses to BRSV. At weaning, lymphocytes from neonatally vaccinated calves had significantly higher values for virus-specific proliferation than did lymphocytes from unvaccinated calves; however, significant differences were not detected between groups after vaccination at weaning.

Clinical Implications—Administration of modified-live viral vaccines can boost systemic humoral and cellular responses to BRSV and BHV-1 in cows. Neonatal calves can be immunologically primed by vaccination with modified-live virus vaccines. Virus-specific memory cells persist in most calves until weaning. (J Am Vet Med Assoc 1996;208:393–400)

Free access
in Journal of the American Veterinary Medical Association


Objective—To evaluate the extent of environmental contamination with Salmonella enterica in a veterinary teaching hospital.

Design—Longitudinal study.

Samples—Environmental samples obtained from 69 representative locations within a veterinary teaching hospital by use of a commercially available electrostatic wipe.

Procedure—Environmental samples were obtained for bacteriologic culture, and antimicrobial susceptibility testing was performed on each environmental isolate. Environmental isolates were compared with isolates obtained from animals during the same period to investigate potential sources of environmental contamination.

Results—S enterica was recovered from 54 of 452 (11.9%) cultured environmental samples. Five different serotypes were recovered; the most common serotypes were S Newport and S Agona. Within the 5 serotypes recovered, 10 distinguishable phenotypes were identified by use of serotype and antimicrobial susceptibility patterns. Of the environmental isolates, 41 of 54 (75.9%) could be matched to phenotypes of isolates obtained from animal submissions in the month prior to collection of environmental samples.

Conclusions and Clinical Relevance—Results indicated that environments in veterinary hospitals can be frequently contaminated with S enterica near where infected animals are managed and fecal specimens containing S enterica are processed for culture in a diagnostic laboratory. Bacteriologic culture of environmental samples collected with electrostatic wipes is an effective means of detecting contamination in a veterinary hospital environment and may be beneficial as part of surveillance activities for other veterinary and animal-rearing facilities. (J Am Vet Med Assoc 2004;225:1344–1348)

Full access
in Journal of the American Veterinary Medical Association


Objective—To evaluate the efficacy of furosemide for prevention of exercise-induced pulmonary hemorrhage (EIPH) in Thoroughbred racehorses under typical racing conditions.

Design—Randomized, placebo-controlled, blinded, crossover field trial.

Animals—167 Thoroughbred racehorses.

Procedures—Horses were allocated to race fields of 9 to 16 horses each and raced twice, 1 week apart, with each of the 2 races consisting of the same race field and distance. Each horse received furosemide (500 mg, IV) before one race and a placebo (saline solution) before the other, with the order of treatments randomly determined. Severity of EIPH was scored on a scale from 0 to 4 after each race by means of tracheobronchoscopy. Data were analyzed by means of various methods of multivariable logistic regression.

Results—Horses were substantially more likely to develop EIPH (severity score ≥ 1; odds ratio, 3.3 to 4.4) or moderate to severe EIPH (severity score ≥ 2; odds ratio, 6.9 to 11.0) following administration of saline solution than following administration of furosemide. In addition, 81 of the 120 (67.5%) horses that had EIPH after administration of saline solution had a reduction in EIPH severity score of at least 1 when treated with furosemide.

Conclusions and Clinical Relevance—Results indicated that prerace administration of furosemide decreased the incidence and severity of EIPH in Thoroughbreds racing under typical conditions in South Africa.
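For readers unfamiliar with the effect measure, an odds ratio can be computed directly from a 2×2 summary table. The sketch below is illustrative only: the saline-arm count (120 of 167 horses with EIPH) comes from the abstract, but the furosemide-arm count is hypothetical, and the trial itself used multivariable logistic regression on matched crossover data rather than this crude calculation.

```python
def odds_ratio(cases_exposed, noncases_exposed, cases_unexposed, noncases_unexposed):
    """Crude odds ratio from a 2x2 table: (a/b) / (c/d) = ad/bc."""
    return (cases_exposed * noncases_unexposed) / (noncases_exposed * cases_unexposed)

# Saline arm (from the abstract): 120 of 167 horses developed EIPH.
# Furosemide arm: 60 of 167 is a hypothetical count used only for illustration.
crude_or = odds_ratio(120, 167 - 120, 60, 167 - 60)
print(round(crude_or, 2))  # → 4.55, crude OR for EIPH on saline vs furosemide
```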

Full access
in Journal of the American Veterinary Medical Association


Objective—To evaluate the potential association between Salmonella enterica shedding in hospitalized horses and the risk of diarrhea among stablemates, and to characterize gastrointestinal-related illness and death following discharge among horses that shed S enterica while hospitalized.

Design—Case-control study.

Animals—221 horses (59 that shed S enterica during hospitalization and 162 that tested negative for S enterica shedding ≥ 3 times during hospitalization).

Procedures—Information from medical records (signalment, results of microbial culture of fecal samples, clinical status at the time of culture, and treatment history) was combined with data collected through interviews with horse owners regarding formerly hospitalized horses and their stablemates. Data were analyzed to investigate risk factors for death and diarrhea.

Results—Occurrence of diarrhea among stablemates of formerly hospitalized horses was not associated with S enterica shedding in hospitalized horses but was associated with oral treatment with antimicrobials during hospitalization. Salmonella enterica shedding during hospitalization was not associated with risk of death or gastrointestinal-related illness in study horses ≤ 6 months after discharge, but shedding status and history of gastrointestinal illness were associated with increased risk of death during the preinterview period.

Conclusions and Clinical Relevance—Stablemates of horses that shed S enterica during hospitalization did not appear to have an increased risk for diarrhea, but commingling with horses that receive orally administered antimicrobials may affect this risk. Salmonella enterica shedding during hospitalization may be a marker of increased long-term risk of death after discharge. Risks are likely influenced by the S enterica strain involved and the biosecurity procedures used.

Full access
in Journal of the American Veterinary Medical Association


Objective—To characterize biosecurity and infection control practices at veterinary teaching hospitals located at institutions accredited by the AVMA.

Design—Cross-sectional survey.

Population—50 biosecurity experts at 38 veterinary teaching hospitals.

Procedures—Telephone interviews were conducted between July 2006 and July 2007, and questions were asked regarding policies for hygiene, surveillance, patient contact, education, and awareness. Respondents were also asked their opinion regarding the rigor of their programs.

Results—31 of 38 (82%) hospitals reported outbreaks of nosocomial infection during the 5 years prior to the interview, 17 (45%) reported > 1 outbreak, 22 (58%) had restricted patient admissions to aid mitigation, and 12 (32%) had completely closed sections of the facility to control disease spread. Nineteen (50%) hospitals reported that zoonotic infections had occurred during the 2 years prior to the interview. Only 16 (42%) hospitals required personnel to complete a biosecurity training program, but 20 of the 50 (40%) respondents indicated that they believed their hospitals ranked among the top 10% in regard to rigor of infection control efforts.

Conclusions and Clinical Relevance—Results suggested that differences existed among infection control programs at these institutions. Perceptions of experts regarding program rigor appeared to be skewed, possibly because of a lack of published data characterizing programs at other institutions. Results may provide a stimulus for hospital administrators to better optimize biosecurity and infection control programs at their hospitals and thereby optimize patient care.

Full access
in Journal of the American Veterinary Medical Association


Objective—To evaluate trends in feedlot cattle mortality ratios over time, by primary body system affected, and by type of animal.

Design—Retrospective cohort study.

Animals—Approximately 21.8 million cattle entering 121 feedlots in the United States during 1994 through 1999.

Procedures—Yearly and monthly mortality ratios were calculated. Numbers of deaths were modeled by use of Poisson regression methods for repeated measures. Relative risks of death over time and by animal type were estimated.

Results—Averaged over time, the mortality ratio was 12.6 deaths/1,000 cattle entering the feedlots. The mortality ratio increased from 10.3 deaths/1,000 cattle in 1994 to 14.2 deaths/1,000 cattle in 1999, but this difference was not statistically significant (P = 0.09). Cattle entering the feedlots during 1999 had a significantly increased risk (relative risk, 1.46) of dying of respiratory tract disorders, compared with cattle that entered during 1994, and respiratory tract disorders accounted for 57.1% of all deaths. Dairy cattle had a significantly increased risk of death of any cause, compared with beef steers. Beef heifers had a significantly increased risk of dying of respiratory tract disorders, compared with beef steers.

Conclusions and Clinical Relevance—Results suggested that although overall yearly mortality ratio did not significantly increase during the study, the risk of death attributable to respiratory tract disorders was increased during most years, compared with risk of death during 1994. The increased rates of fatal respiratory tract disorders may also reflect increased rates of nonfatal respiratory tract disorders, which would be expected to have adverse production effects in surviving animals. (J Am Vet Med Assoc 2001;219:1122–1127)
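The mortality ratios and relative risks above follow directly from their definitions; a minimal arithmetic sketch using the abstract's summary figures (illustrative only; the actual analysis modeled deaths with Poisson regression for repeated measures):

```python
def mortality_ratio(deaths, cattle_entering):
    """Deaths per 1,000 cattle entering the feedlot."""
    return 1000.0 * deaths / cattle_entering

def relative_risk(ratio_comparison, ratio_reference):
    """Ratio of two mortality ratios (crude relative risk)."""
    return ratio_comparison / ratio_reference

# 1999 vs 1994 yearly mortality ratios from the abstract:
print(round(relative_risk(14.2, 10.3), 2))  # → 1.38 overall (crude)
# The respiratory-specific relative risk reported by the study was 1.46.
```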

Full access
in Journal of the American Veterinary Medical Association