Search Results

You are looking at 1 - 10 of 19 items for

  • Author or Editor: David A. Dargatz

Abstract

Objective—To evaluate antimicrobial susceptibility of commensal Escherichia coli strains isolated from the feces of horses and investigate relationships with hospitalization and antimicrobial drug (AMD) administration.

Design—Observational study.

Animals—68 hospitalized horses that had been treated with AMDs for at least 3 days (HOSP–AMD group), 63 hospitalized horses that had not received AMDs for at least 4 days (HOSP–NOAMD group), and 85 healthy horses that had not been hospitalized or treated with AMDs (community group).

Procedures—Fecal samples were submitted for bacterial culture, and up to 3 E coli colonies were recovered from each sample. Antimicrobial susceptibility of 724 isolates was evaluated. Prevalence of resistance was compared among groups by use of log-linear modeling.

Results—For 12 of the 15 AMDs evaluated, prevalence of antimicrobial resistance differed significantly among groups, with prevalence being highest among isolates from the HOSP–AMD group and lowest among isolates from the community group. Isolates recovered from the HOSP–AMD and HOSP–NOAMD groups were also significantly more likely to be resistant to multiple AMDs. Resistance to sulfamethoxazole and resistance to trimethoprim-sulfamethoxazole were most common, followed by resistance to gentamicin and resistance to tetracycline. Use of a potentiated sulfonamide, aminoglycosides, cephalosporins, or metronidazole was positively associated with resistance to 1 or more AMDs, but use of penicillins was not associated with increased risk of resistance to AMDs.

Conclusions and Clinical Relevance—Results suggest that both hospitalization and AMD administration were associated with prevalence of antimicrobial resistance among E coli strains isolated from the feces of horses.

Restricted access
in Journal of the American Veterinary Medical Association

Abstract

Objective—To evaluate trends in feedlot cattle mortality ratios over time, by primary body system affected, and by type of animal.

Design—Retrospective cohort study.

Animals—Approximately 21.8 million cattle entering 121 feedlots in the United States during 1994 through 1999.

Procedures—Yearly and monthly mortality ratios were calculated. Numbers of deaths were modeled by use of Poisson regression methods for repeated measures. Relative risks of death over time and by animal type were estimated.

Results—Averaged over time, the mortality ratio was 12.6 deaths/1,000 cattle entering the feedlots. The mortality ratio increased from 10.3 deaths/1,000 cattle in 1994 to 14.2 deaths/1,000 cattle in 1999, but this difference was not statistically significant (P = 0.09). Cattle entering the feedlots during 1999 had a significantly increased risk (relative risk, 1.46) of dying of respiratory tract disorders, compared with cattle that entered during 1994, and respiratory tract disorders accounted for 57.1% of all deaths. Dairy cattle had a significantly increased risk of death from any cause, compared with beef steers. Beef heifers had a significantly increased risk of dying of respiratory tract disorders, compared with beef steers.

Conclusions and Clinical Relevance—Results suggested that although overall yearly mortality ratio did not significantly increase during the study, the risk of death attributable to respiratory tract disorders was increased during most years, compared with risk of death during 1994. The increased rates of fatal respiratory tract disorders may also reflect increased rates of non-fatal respiratory tract disorders, which would be expected to have adverse production effects in surviving animals. (J Am Vet Med Assoc 2001;219:1122–1127)
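The mortality ratios and relative risks above follow from simple arithmetic on deaths per head entering; the study's Poisson regression for repeated measures is not reproduced here. A minimal sketch of the crude calculations (the function names are illustrative, and the relative risk computed below is the crude overall 1999-vs-1994 ratio, not the adjusted respiratory-specific estimate the study reports):

```python
def mortality_ratio(deaths: int, cattle_entering: int) -> float:
    """Crude mortality ratio, expressed as deaths per 1,000 cattle entering."""
    return 1000 * deaths / cattle_entering

def relative_risk(ratio_a: float, ratio_b: float) -> float:
    """Crude relative risk of death for group A versus reference group B."""
    return ratio_a / ratio_b

# Figures from the abstract: 10.3 deaths/1,000 in 1994 vs 14.2 in 1999.
rr_1999_vs_1994 = relative_risk(14.2, 10.3)
print(round(rr_1999_vs_1994, 2))  # crude overall RR, ~1.38
```

The adjusted estimates in the study differ from these crude ratios because the regression accounts for repeated measures across feedlots and years.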

in Journal of the American Veterinary Medical Association

Abstract

Objective—To evaluate biosecurity practices of cow-calf producers.

Design—Cross-sectional survey.

Sample Population—2,713 cow-calf operations were used in phase 1 of the study, and 1,190 cow-calf operations were used in phase 2.

Procedure—Producers were contacted for a personal interview between Dec 30, 1996 and Feb 3, 1997 regarding their management practices. Noninstitutional operations with 1 or more beef cows were eligible to participate in the study. Producers who participated in the first phase of the study and who had ≥ 5 beef cows were requested to continue in the study and were contacted by a veterinarian or animal health technician who administered further questionnaires. All contacts for the second phase of the study were made between Mar 3, 1997 and Apr 30, 1997. During the second phase, additional data were collected on use of various vaccines; testing of imported cattle for brucellosis, Mycobacterium paratuberculosis infection, bovine viral diarrhea, and tuberculosis; and potential for feed contamination.

Results—Producers commonly engaged in management practices that increased risk of introducing disease to their cattle, such as importing cattle, failing to quarantine imported cattle, and communal grazing. Producers inconsistently adjusted for the increased risk of their management practices by increasing the types of vaccines given, increasing the quarantine time or proportion of imported animals quarantined, or increasing testing for various diseases in imported animals.

Conclusions and Clinical Relevance—Cow-calf herds are at risk for disease exposure from outside sources when cattle are introduced to the herd, and producers do not always adjust management practices such as vaccination schedules and quarantine procedures appropriately to minimize this risk. Veterinary involvement in education of producers regarding biosecurity risks and development of rational and economical biosecurity plans is needed. (J Am Vet Med Assoc 2000;217:185–189)

in Journal of the American Veterinary Medical Association

Abstract

Objective—To evaluate the effectiveness of various sampling techniques for determining antimicrobial resistance patterns in Escherichia coli isolated from feces of feedlot cattle.

Sample Population—Fecal samples obtained from 328 beef steers and 6 feedlot pens in which the cattle resided.

Procedure—Single fecal samples were collected from the rectum of each steer and from floors of pens in which the cattle resided. Fecal material from each single sample was combined into pools containing 5 and 10 samples. Five isolates of Escherichia coli from each single sample and each pooled sample were tested for susceptibility to 17 antimicrobials.

Results—Patterns of antimicrobial resistance for fecal samples obtained from the rectum of cattle did not differ from fecal samples obtained from pen floors. Resistance patterns from pooled samples differed from patterns observed for single fecal samples. Little pen-to-pen variation in resistance prevalence was observed. Clustering of resistance phenotypes within samples was detected.

Conclusions and Clinical Relevance—Studies of antimicrobial resistance in feedlot cattle can rely on fecal samples obtained from pen floors, thus avoiding the cost and effort of obtaining fecal samples from the rectum of cattle. Pooled fecal samples yielded resistance patterns that were consistent with those of single fecal samples when the prevalence of resistance to an antimicrobial was > 2%. Pooling may be a practical alternative when investigating patterns of resistance that are not rare. Apparent clustering of resistance phenotypes within samples argues for examining fewer isolates per fecal sample and more fecal samples per pen. (Am J Vet Res 2002;63:1662–1670)
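The > 2% prevalence threshold for pooling can be motivated with a simple probability sketch. Assuming each sample in a pool contributes independently (an idealization; the study in fact observed clustering of resistance phenotypes within samples), the chance that a pool contains any resistant source is:

```python
def pool_detection_probability(prevalence: float, pool_size: int) -> float:
    """Probability that a pooled sample contains at least one positive,
    assuming samples are independent (an idealization; the study observed
    clustering of resistance phenotypes within samples)."""
    return 1 - (1 - prevalence) ** pool_size

# At the 2% prevalence threshold noted above, a 5-sample pool
# only rarely contains a resistant source:
print(round(pool_detection_probability(0.02, 5), 3))  # → 0.096
```

Under this independence assumption, rare resistance phenotypes are seldom represented in small pools, which is consistent with pooling being reliable only when prevalence is not rare.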

in American Journal of Veterinary Research

Abstract

Objective—To determine current practices regarding use of antimicrobials in equine patients undergoing surgery because of colic at veterinary teaching hospitals.

Design—Survey.

Sample Population—Diplomates of the American College of Veterinary Surgeons performing equine surgery at veterinary teaching hospitals in the United States.

Procedure—A Web-based questionnaire was developed, and 85 surgeons were asked to participate. The first part of the survey requested demographic information and information about total number of colic surgeries performed at the hospital, number of colic surgeries performed by the respondent, and whether the hospital had written guidelines for antimicrobial drug use. The second part pertained to nosocomial infections. The third part provided several case scenarios and asked respondents whether they would use antimicrobial drugs in these instances.

Results—Thirty-four (40%) surgeons responded to the questionnaire. Respondents indicated that most equine patients undergoing surgery because of colic at veterinary teaching hospitals in the United States received antimicrobial drugs. Drugs that were used were similar for the various hospitals that were represented, and for the most part, the drugs that were used were fairly uniform irrespective of the type of colic, whereas the duration of treatment varied with the type of colic and the surgical findings. The combination of potassium penicillin and gentamicin was the most commonly used treatment.

Conclusions and Clinical Relevance—Results of this study document the implementation of recommendations by several authors in veterinary texts that antimicrobial drugs be administered perioperatively in equine patients with colic that are undergoing surgery. However, the need for long-term antimicrobial drug treatment in equine patients with colic is unknown. (J Am Vet Med Assoc 2002;220:1359–1365)

in Journal of the American Veterinary Medical Association

Abstract

Objective—To assess associations between herd management practices and herd-level rates of bovine respiratory disease complex (BRDC) in preweaned beef calves in US cow-calf operations.

Design—Cross-sectional survey.

Sample—443 herds weighted to represent the US cow-calf population.

Procedures—Producers from 24 states were selected to participate in a 2-phase survey; 443 producers completed both survey phases and had calves born alive during the study period. Data from those respondents underwent multivariable negative binomial regression analyses.

Results—Bred heifer importation was associated with lower BRDC rates (incidence rate ratio [IRR], 0.40; confidence interval [CI], 0.19 to 0.82), whereas weaned steer importation was associated with higher BRDC rates (IRR, 2.62; CI, 1.15 to 5.97). Compared with single-breed herds, operations with calves of 2-breed crosses (IRR, 2.36; CI, 1.30 to 4.29) or 3-breed crosses (IRR, 4.00; CI, 1.93 to 8.31) or composite-herd calves (IRR, 2.27; CI, 1.00 to 5.16) had higher BRDC rates. Operations classified as supplemental sources of income had lower BRDC rates (IRR, 0.48; CI, 0.26 to 0.87) than did operations classified as primary sources of income. Reported feed supplementation with antimicrobials was positively associated with BRDC rates (IRR, 3.46; CI, 1.39 to 8.60). The reported number of visits by outsiders in an average month also was significantly associated with herd-level BRDC rates, but the magnitude and direction of the effects varied.

Conclusions and Clinical Relevance—Management practices associated with preweaning BRDC rates may be potential indicators or predictors of preweaning BRDC rates in cow-calf production systems.

in Journal of the American Veterinary Medical Association

Abstract

Objective—To analyze the sulfur content of water and forage samples from a geographically diverse sample of beef cow-calf operations in the United States and to estimate frequency and distribution of premises where forage and water resources could result in consumption of hazardous amounts of sulfur by cattle.

Design—Cross-sectional study.

Sample Population—709 forage samples from 678 beef cow-calf operations and individual water samples from 498 operations in 23 states.

Procedure—Sulfur content of forage samples and sulfate concentration of water samples were measured. Total sulfur intake was estimated for pairs of forage and water samples.

Results—Total sulfur intake was estimated for 454 pairs of forage and water samples. In general, highest forage sulfur contents did not coincide with highest water sulfate concentrations. Overall, 52 of the 454 (11.5%) sample pairs were estimated to yield total sulfur intake (as a percentage of dry matter) ≥ 0.4%, assuming water intake during conditions of high ambient temperature. Most of these premises were in north-central (n = 19) or western (19) states.

Conclusions and Clinical Relevance—Results suggest that on numerous beef cow-calf operations throughout the United States, consumption of forage and water could result in excessively high sulfur intake. All water sources and dietary components should be evaluated when assessing total sulfur intake. Knowledge of total sulfur intake may be useful in reducing the risk of sulfur-associated health and performance problems in beef cattle. (J Am Vet Med Assoc 2002;221:673–677)
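Estimating total sulfur intake as a percentage of dry matter combines forage sulfur content with the sulfur delivered by sulfate in drinking water. A sketch of that arithmetic, assuming illustrative daily intakes rather than the study's actual parameters (sulfur is about 33.4% of sulfate by mass):

```python
# Sulfur is ~33.4% of sulfate by mass (S = 32.06 g/mol, SO4 = 96.06 g/mol).
S_FRACTION_OF_SULFATE = 32.06 / 96.06

def total_sulfur_pct_dm(forage_s_pct: float,
                        water_sulfate_mg_per_l: float,
                        water_intake_l: float,
                        dm_intake_kg: float) -> float:
    """Estimated total sulfur intake as a percentage of dry matter (DM).

    forage_s_pct: forage sulfur content, % of DM
    water_sulfate_mg_per_l: sulfate concentration of drinking water
    water_intake_l, dm_intake_kg: assumed daily intakes (illustrative)
    """
    forage_s_g = forage_s_pct / 100 * dm_intake_kg * 1000
    water_s_g = water_sulfate_mg_per_l * S_FRACTION_OF_SULFATE * water_intake_l / 1000
    return (forage_s_g + water_s_g) / (dm_intake_kg * 1000) * 100

# Hypothetical example: 0.25% S forage, 2,000 mg/L sulfate water, and
# 50 L water with 10 kg dry matter per day under high ambient temperature.
print(round(total_sulfur_pct_dm(0.25, 2000, 50, 10), 2))  # → 0.58, above the 0.4% threshold
```

As the example shows, even moderate forage sulfur can exceed the 0.4% threshold once high-sulfate water and hot-weather water intake are factored in, which is why the abstract stresses evaluating all water sources and dietary components together.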

in Journal of the American Veterinary Medical Association

Abstract

OBJECTIVE To identify geographic areas in the United States where food animal veterinary services may be insufficient to meet increased needs associated with the US FDA's Veterinary Feed Directive.

DESIGN Cross-sectional study.

SAMPLE Data collected between 2010 and 2016 from the US Veterinary Medicine Loan Repayment Program, the National Animal Health Monitoring System Small-Scale US Livestock Operations Study, and the USDA's National Veterinary Accreditation Program.

PROCEDURES Each dataset was analyzed separately to identify geographic areas with greatest potential for veterinary shortages. Geographic information systems methods were used to identify co-occurrence among the datasets of counties with veterinary shortages.

RESULTS Analysis of the loan repayment program, Small-Scale Livestock Operations Study, and veterinary accreditation datasets revealed veterinary shortages in 314, 346, and 117 counties, respectively. Of the 3,140 counties in the United States during the study period, 728 (23.2%) counties were identified as veterinary shortage areas in at least 1 dataset. Specifically, 680 counties were identified as shortage areas in 1 dataset, 47 as shortage areas in 2 datasets, and 1 Arizona county as a shortage area in all 3 datasets. Arizona, Kentucky, Missouri, South Dakota, and Virginia had ≥ 3 counties identified as shortage areas in ≥ 2 datasets.

CONCLUSIONS AND CLINICAL RELEVANCE Many geographic areas were identified across the United States where food animal veterinary services may be inadequate to implement the Veterinary Feed Directive and meet other producer needs. This information can be used to assess the impact of federal regulations and programs and help understand the factors that influence access to food animal veterinary services in specific geographic areas.

in Journal of the American Veterinary Medical Association

Abstract

Objective—To estimate the prevalence of Mycobacterium avium subsp paratuberculosis infection among cows on beef operations in the United States.

Design—Cross-sectional seroprevalence study.

Sample Population—A convenience sample of 380 herds in 21 states.

Procedure—Serum samples were obtained from 10,371 cows and tested for antibodies to M avium subsp paratuberculosis with a commercial ELISA. Producers were interviewed to collect data on herd management practices.

Results—30 (7.9%) herds had 1 or more animals for which results of the ELISA were positive; 40 (0.4%) of the individual cow samples yielded positive results. None of the herd management practices studied were found to be associated with whether any animals in the herd would be positive for antibodies to M avium subsp paratuberculosis.

Conclusions and Clinical Relevance—Results suggest that the prevalence of antibodies to M avium subsp paratuberculosis among beef cows in the United States is low. Herds with seropositive animals were widely distributed geographically. (J Am Vet Med Assoc 2001;219:497–501)

in Journal of the American Veterinary Medical Association