Objective—To evaluate trends in feedlot cattle mortality
ratios over time, by primary body system affected,
and by type of animal.
Design—Retrospective cohort study.
Animals—Approximately 21.8 million cattle entering
121 feedlots in the United States during 1994 through 1999.
Procedures—Yearly and monthly mortality ratios
were calculated. Numbers of deaths were modeled
by use of Poisson regression methods for repeated
measures. Relative risks of death over time and by
animal type were estimated.
Results—Averaged over time, the mortality ratio
was 12.6 deaths/1,000 cattle entering the feedlots.
The mortality ratio increased from 10.3
deaths/1,000 cattle in 1994 to 14.2 deaths/1,000
cattle in 1999, but this difference was not statistically
significant (P = 0.09). Cattle entering the feedlots
during 1999 had a significantly increased risk
(relative risk, 1.46) of dying of respiratory tract disorders,
compared with cattle that entered during
1994, and respiratory tract disorders accounted for
57.1% of all deaths. Dairy cattle had a significantly
increased risk of death from any cause, compared with
beef steers. Beef heifers had a significantly
increased risk of dying of respiratory tract disorders,
compared with beef steers.
Conclusions and Clinical Relevance—Results suggested
that although overall yearly mortality ratio did
not significantly increase during the study, the risk of
death attributable to respiratory tract disorders was
increased during most years, compared with risk of
death during 1994. The increased rates of fatal respiratory
tract disorders may also reflect increased rates
of non-fatal respiratory tract disorders, which would
be expected to have adverse production effects in
surviving animals. (J Am Vet Med Assoc 2001;219:1122–1127)
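The rate arithmetic behind these results can be sketched briefly. This is a minimal illustration, not the study's analysis: the denominators below are hypothetical, only the published per-1,000 mortality ratios (10.3 and 14.2) come from the abstract, and the published relative risks came from Poisson regression for repeated measures rather than this crude ratio of ratios.

```python
# year -> (deaths, cattle entering); denominators are hypothetical,
# chosen so the implied ratios match the published 10.3 and 14.2/1,000
data = {1994: (10_300, 1_000_000), 1999: (14_200, 1_000_000)}

def mortality_ratio(deaths, entered, per=1_000):
    """Deaths per `per` cattle entering the feedlot."""
    return deaths / entered * per

# Crude ratio of the two yearly mortality ratios (the study's Poisson
# regression additionally handled repeated measures and covariates)
crude_rr = mortality_ratio(*data[1999]) / mortality_ratio(*data[1994])
print(round(crude_rr, 2))  # 1.38
```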
Objective—To evaluate biosecurity practices of cow-calf operations.
Sample Population—2,713 cow-calf operations
were used in phase 1 of the study, and 1,190 cow-calf
operations were used in phase 2.
Procedure—Producers were contacted for a personal
interview between Dec 30, 1996 and Feb 3, 1997
regarding their management practices. Noninstitutional
operations with 1 or more beef cows were
eligible to participate in the study. Producers who participated
in the first phase of the study and who had
≥ 5 beef cows were requested to continue in the
study and were contacted by a veterinarian or animal
health technician who administered further questionnaires.
All contacts for the second phase of the study
were made between Mar 3, 1997 and Apr 30, 1997.
Additional data on use of various vaccines, testing of
imported cattle for brucellosis, Mycobacterium
paratuberculosis, bovine viral diarrhea, and tuberculosis
as well as potential for feed contamination were
collected during the second phase of the study.
Results—Producers commonly engaged in management
practices that increased the risk of introducing disease
to their cattle, such as importing cattle, failing to
quarantine imported cattle, and communal grazing.
Producers inconsistently adjusted for the increased risk
of their management practices by increasing the types
of vaccines given, increasing the quarantine time or
proportion of imported animals quarantined, or increasing
testing for various diseases in imported animals.
Conclusions and Clinical Relevance—Cow-calf
herds are at risk for disease exposure from outside
sources when cattle are introduced to the herd, and
producers do not always adjust management practices
such as vaccination schedules and quarantine
procedures appropriately to minimize this risk.
Veterinary involvement in education of producers
regarding biosecurity risks and development of rational
and economical biosecurity plans is needed. (J Am
Vet Med Assoc 2000;217:185–189)
Objective—To evaluate antimicrobial susceptibility of commensal Escherichia coli strains isolated from the feces of horses and investigate relationships with hospitalization and antimicrobial drug (AMD) administration.
Animals—68 hospitalized horses that had been treated with AMDs for at least 3 days (HOSP–AMD group), 63 hospitalized horses that had not received AMDs for at least 4 days (HOSP–NOAMD group), and 85 healthy horses that had not been hospitalized or treated with AMDs (community group).
Procedures—Fecal samples were submitted for bacterial culture, and up to 3 E coli colonies were recovered from each sample. Antimicrobial susceptibility of 724 isolates was evaluated. Prevalence of resistance was compared among groups by use of log-linear modeling.
Results—For 12 of the 15 AMDs evaluated, prevalence of antimicrobial resistance differed significantly among groups, with prevalence being highest among isolates from the HOSP–AMD group and lowest among isolates from the community group. Isolates recovered from the HOSP–AMD and HOSP–NOAMD groups were also significantly more likely to be resistant to multiple AMDs. Resistance to sulfamethoxazole and resistance to trimethoprim-sulfamethoxazole were most common, followed by resistance to gentamicin and resistance to tetracycline. Use of a potentiated sulfonamide, aminoglycosides, cephalosporins, or metronidazole was positively associated with resistance to 1 or more AMDs, but use of penicillins was not associated with increased risk of resistance to AMDs.
Conclusion and Clinical Relevance—Results suggest that both hospitalization and AMD administration were associated with prevalence of antimicrobial resistance among E coli strains isolated from the feces of horses.
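The abstract's group comparison used log-linear modeling; as a rough stand-in, a simple chi-square test on one drug's resistant/susceptible counts across the three groups looks like this. All counts below are invented for illustration, and the group labels reuse the abstract's names.

```python
import math

# group -> (resistant, susceptible) isolate counts; values are hypothetical
counts = {"HOSP-AMD": (40, 60), "HOSP-NOAMD": (25, 75), "community": (10, 90)}

rows = list(counts.values())
col_tot = [sum(r[j] for r in rows) for j in (0, 1)]
n = sum(col_tot)
chi2 = sum(
    (obs - exp) ** 2 / exp
    for r in rows
    for obs, exp in zip(r, [sum(r) * c / n for c in col_tot])
)
# For df = (3-1)*(2-1) = 2, the chi-square survival function is exp(-x/2)
p_value = math.exp(-chi2 / 2)
print(round(chi2, 2), p_value < 0.05)  # 24.0 True
```

With these made-up counts the prevalence gradient (highest in HOSP-AMD, lowest in community) is strongly significant, mirroring the direction of the published finding without reproducing its numbers.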
Objective—To determine current practices regarding
use of antimicrobials in equine patients undergoing
surgery because of colic at veterinary teaching hospitals.
Sample Population—Diplomates of the American
College of Veterinary Surgeons performing equine
surgery at veterinary teaching hospitals in the United States.
Procedure—A Web-based questionnaire was developed,
and 85 surgeons were asked to participate. The
first part of the survey requested demographic information
and information about total number of colic
surgeries performed at the hospital, number of colic
surgeries performed by the respondent, and whether
the hospital had written guidelines for antimicrobial
drug use. The second part pertained to nosocomial
infections. The third part provided several case scenarios
and asked respondents whether they would
use antimicrobial drugs in these instances.
Results—Thirty-four (40%) surgeons responded to
the questionnaire. Respondents indicated that most
equine patients undergoing surgery because of colic
at veterinary teaching hospitals in the United States
received antimicrobial drugs. Drugs that were used
were similar for the various hospitals that were represented,
and for the most part, the drugs that were
used were fairly uniform irrespective of the type of
colic, whereas the duration of treatment varied with
the type of colic and the surgical findings. The combination
of potassium penicillin and gentamicin was the
most commonly used treatment.
Conclusions and Clinical Relevance—Results of
this study document the implementation of recommendations
by several authors in veterinary texts that
antimicrobial drugs be administered perioperatively in
equine patients with colic that are undergoing surgery.
However, the need for long-term antimicrobial drug
treatment in equine patients with colic is unknown. (J
Am Vet Med Assoc 2002;220:1359–1365)
Objective—To evaluate bacterial and protozoal contamination of commercially available raw meat diets for dogs.
Design—Prospective longitudinal study.
Sample Population—240 samples from 20 raw meat diets for dogs (containing beef, lamb, chicken, or turkey), 24 samples from 2 dry dog foods, and 24 samples from 2 canned dog foods.
Procedure—Each product was purchased commercially on 4 dates approximately 2 months apart. Three samples from each product at each sampling period were evaluated via bacterial culture for non–type-specific Escherichia coli (NTSEC), Salmonella enterica, and Campylobacter spp. Antimicrobial susceptibility testing was performed on selected isolates. Polymerase chain reaction assays were used to detect DNA from Cryptosporidium spp, Neospora spp, and Toxoplasma spp in samples obtained in the third and fourth sampling periods.
Results—One hundred fifty-three of 288 (53%) samples were contaminated with NTSEC. Both raw and prepared foods contained NTSEC during at least 1 culture period. Salmonella enterica was recovered from 17 (5.9%) samples, all of which were raw meat products. Campylobacter spp was not isolated from any samples. In 91 of 288 (31.6%) samples, there was no gram-negative bacterial growth before enrichment and in 48 of 288 (16.7%) samples, there was no aerobic bacterial growth before enrichment. Susceptibility phenotypes were variable. Cryptosporidium spp DNA was detected in 3 samples.
Conclusions and Clinical Relevance—Bacterial contamination is common in commercially available raw meat diets, suggesting that there is a risk of foodborne illness in dogs fed these diets, as well as a possible risk for humans in contact with the dogs or their environments.
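The reported prevalences are simple proportions of the 288 samples; a confidence interval gives a sense of their precision. The Wilson score interval below is a standard choice but is not stated in the abstract; only the counts 153 and 288 are from the study.

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

print(round(153 / 288 * 100, 1))  # NTSEC prevalence: 53.1
lo, hi = wilson_ci(153, 288)
print(round(lo, 3), round(hi, 3))  # roughly 0.474 to 0.588
```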
Objective—To assess associations between herd management practices and herd-level rates of bovine respiratory disease complex (BRDC) in preweaned beef calves in US cow-calf operations.
Sample—443 herds weighted to represent the US cow-calf population.
Procedures—Producers from 24 states were selected to participate in a 2-phase survey; 443 producers completed both survey phases and had calves born alive during the study period. Data from those respondents underwent multivariable negative binomial regression analyses.
Results—Bred heifer importation was associated with lower BRDC rates (incidence rate ratio [IRR], 0.40; confidence interval [CI], 0.19 to 0.82), whereas weaned steer importation was associated with higher BRDC rates (IRR, 2.62; CI, 1.15 to 5.97). Compared with single-breed herds, operations with calves of 2-breed crosses (IRR, 2.36; CI, 1.30 to 4.29) or 3-breed crosses (IRR, 4.00; CI, 1.93 to 8.31) or composite-herd calves (IRR, 2.27; CI, 1.00 to 5.16) had higher BRDC rates. Operations classified as supplemental sources of income had lower BRDC rates (IRR, 0.48; CI, 0.26 to 0.87) than did operations classified as primary sources of income. Reported feed supplementation with antimicrobials was positively associated with BRDC rates (IRR, 3.46; CI, 1.39 to 8.60). The reported number of visits by outsiders in an average month also was significantly associated with herd-level BRDC rates, but the magnitude and direction of the effects varied.
Conclusions and Clinical Relevance—Management practices associated with preweaning BRDC rates may be potential indicators or predictors of preweaning BRDC rates in cow-calf production systems.
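Because negative binomial regression uses a log link, the reported IRRs act multiplicatively on the expected BRDC rate. The sketch below uses IRR point estimates from the abstract, but the baseline rate and the assumption that the effects combine for a given herd are illustrative only.

```python
baseline_rate = 0.05  # hypothetical BRDC cases per calf (not from the abstract)

# IRR point estimates reported in the abstract
irr = {
    "bred_heifer_import": 0.40,
    "weaned_steer_import": 2.62,
    "two_breed_cross": 2.36,
    "three_breed_cross": 4.00,
}

def predicted_rate(base, *factors):
    """Multiply the baseline rate by each factor's IRR (log-link model)."""
    rate = base
    for f in factors:
        rate *= irr[f]
    return rate

# e.g., a 3-breed-cross herd that also imports weaned steers
print(round(predicted_rate(baseline_rate, "three_breed_cross",
                           "weaned_steer_import"), 3))  # 0.524
```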
Objective—To assess the use of CSF testing with an indirect fluorescent antibody test (IFAT) for diagnosis of equine protozoal myeloencephalitis (EPM) caused by Sarcocystis neurona.
Sample Population—Test results of 428 serum and 355 CSF samples from 182 naturally exposed, experimentally infected, or vaccinated horses.
Procedure—EPM was diagnosed on the basis of histologic examination of the CNS. Probability distributions were fitted to serum IFAT results in the EPM+ and EPM- horses, and correlation between serum and CSF results was modeled. Pairs of serum-CSF titers were generated by simulation, and titer-specific likelihood ratios and post-test probabilities of EPM at various pretest probability values were estimated. Post-test probabilities were compared for use of a serum-CSF test combination, a serum test only, and a CSF test only.
Results—Post-test probabilities of EPM increased as IFAT serum and CSF titers increased. Post-test probability differences for use of a serum-CSF combination and a serum test only were ≤ 19% in 95% of simulations. The largest increases occurred when serum titers were from 40 to 160 and pre-test probabilities were from 5% to 60%. In all simulations, the difference between pre- and post-test probabilities was greater for a CSF test only, compared with a serum test only.
Conclusions and Clinical Relevance—CSF testing after a serum test has limited usefulness in the diagnosis of EPM. A CSF test alone might be used when CSF is required for other procedures. Ruling out other causes of neurologic disease reduces the necessity of additional EPM testing.
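The post-test probabilities in this study follow the standard odds form of Bayes' theorem: post-test odds equal pre-test odds times the likelihood ratio. The function below shows that arithmetic; the LR value in the example is hypothetical, since the abstract reports only simulated titer-specific LRs.

```python
def post_test_probability(pretest_p, lr):
    """Bayes via odds: post-test odds = pre-test odds x likelihood ratio."""
    pre_odds = pretest_p / (1 - pretest_p)
    post_odds = pre_odds * lr
    return post_odds / (1 + post_odds)

# e.g., 30% pre-test probability with a hypothetical titer-specific LR of 5
print(round(post_test_probability(0.30, 5.0), 3))  # 0.682
```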
Objective—To evaluate the effectiveness of various
sampling techniques for determining antimicrobial
resistance patterns in Escherichia coli isolated from
feces of feedlot cattle.
Sample Population—Fecal samples obtained from
328 beef steers and 6 feedlot pens in which the cattle resided.
Procedure—Single fecal samples were collected
from the rectum of each steer and from floors of pens
in which the cattle resided. Fecal material from each
single sample was combined into pools containing 5
and 10 samples. Five isolates of Escherichia coli from
each single sample and each pooled sample were
tested for susceptibility to 17 antimicrobials.
Results—Patterns of antimicrobial resistance for
fecal samples obtained from the rectum of cattle did
not differ from fecal samples obtained from pen
floors. Resistance patterns from pooled samples differed
from patterns observed for single fecal samples.
Little pen-to-pen variation in resistance prevalence
was observed. Clustering of resistance phenotypes
within samples was detected.
Conclusions and Clinical Relevance—Studies of
antimicrobial resistance in feedlot cattle can rely on
fecal samples obtained from pen floors, thus avoiding
the cost and effort of obtaining fecal samples from the
rectum of cattle. Pooled fecal samples yielded resistance
patterns that were consistent with those of single
fecal samples when the prevalence of resistance
to an antimicrobial was > 2%. Pooling may be a practical
alternative when investigating patterns of resistance
that are not rare. Apparent clustering of resistance
phenotypes within samples argues for examining
fewer isolates per fecal sample and more fecal
samples per pen. (Am J Vet Res 2002;63:1662–1670)
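One way to see why pooling works for resistance prevalences above about 2% but can miss rare phenotypes is the probability that a pool contains at least one resistant isolate. This is an illustrative calculation assuming independent samples, not the study's model.

```python
def pool_positive_prob(prevalence, pool_size):
    """P(a pool of `pool_size` samples contains >= 1 resistant isolate)."""
    return 1 - (1 - prevalence) ** pool_size

# At 2% prevalence a 5-sample pool contains a resistant isolate only ~10%
# of the time, so rare phenotypes are easily missed; at 10%, ~41% of pools do.
for p in (0.02, 0.10):
    print(p, round(pool_positive_prob(p, 5), 3), round(pool_positive_prob(p, 10), 3))
```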
Objective—To investigate Salmonella enterica infections at a Greyhound breeding facility.
Animal and Sample Populations—138 adult and juvenile dogs and S enterica isolates recovered from the dogs and their environment.
Procedures—The investigation was conducted at the request of a Greyhound breeder. Observations regarding the environment and population of dogs were recorded. Fecal, food, and environmental specimens were collected and submitted for Salmonella culture. Isolates were serotyped and tested for susceptibility to 16 antimicrobials. Isolates underwent genetic analyses by use of pulsed-field gel electrophoresis and ribotyping.
Results—S enterica was recovered from 88 of 133 (66%) samples of all types and from 57 of 61 (93%) fecal samples. Eighty-three (94.3%) of the isolates were serotype Newport, 77 (87.5%) of which had identical resistance phenotypes. Genetic evaluations suggested that several strains of S enterica existed at the facility, but there was a high degree of relatedness among many of the Newport isolates. Multiple strains of Salmonella enterica serotype Newport were recovered from raw meat fed on 1 day.
Conclusions and Clinical Relevance—S enterica infections and environmental contamination were common at this facility. A portion of the Salmonella strains detected on the premises was likely introduced via raw meat that was the primary dietary constituent. Some strains appeared to be widely disseminated in the population. Feeding meat that had not been cooked properly, particularly meat classified as unfit for human consumption, likely contributed to the infections in these dogs.