Objective—To evaluate the association between fecal excretion of Mycobacterium avium subsp paratuberculosis (MAP) by dairy cows in the periparturient period and detection of MAP DNA in colostrum specimens and on teat skin surfaces.
Animals—112 Holstein cows.
Procedures—Fecal specimens were collected 48 to 72 hours before parturition, and colostrum and teat swab specimens were collected immediately after parturition. MAP in fecal specimens was detected via microbial culture, and MAP DNA in colostrum and teat swab specimens was detected via a PCR assay targeting the genetic element ISMAP02. Logistic regression was used to model the relationship between MAP fecal shedding status and detection of MAP DNA in colostrum or teat swab specimens. Population attributable fractions for the proportion of colostrum and teat swab specimens containing MAP DNA were also calculated.
Results—The odds of detecting MAP DNA in colostrum or teat swab specimens from cows with MAP-positive (vs negative) fecal specimens were 2.02 and 1.87, respectively. Population attributable fraction estimates suggested that withholding colostrum from MAP-positive cows could reduce the odds of exposing calves to MAP in colostrum by 18.2% and that not permitting natural suckling by calves could reduce the odds of exposing calves to MAP on the teat surfaces of MAP-positive cows by 19.5%.
Conclusions and Clinical Relevance—Results underscored the need for strict adherence to practices that limit contact of calves with adult cows from the time of birth and promote hygienic colostrum handling to avoid possible contamination with MAP during colostrum harvest, storage, or feeding.
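As a back-of-the-envelope illustration of how population attributable fractions of the kind reported above can be computed, the Python sketch below applies Levin's formula, treating the reported odds ratios as approximations of relative risk. The exposure prevalence is a hypothetical placeholder, since the abstract does not report the proportion of fecal-culture-positive cows.

```python
# Illustrative sketch only: population attributable fraction (PAF) via
# Levin's formula, PAF = p(RR - 1) / (1 + p(RR - 1)), using the odds ratio
# as a stand-in for relative risk. The exposure prevalence below is a
# hypothetical placeholder, not a value reported in the study.

def levin_paf(prevalence: float, relative_risk: float) -> float:
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

p_exposed = 0.22      # assumed share of MAP fecal-culture-positive cows
or_colostrum = 2.02   # odds ratio reported for colostrum specimens
or_teat = 1.87        # odds ratio reported for teat swab specimens

print(f"PAF, colostrum: {levin_paf(p_exposed, or_colostrum):.1%}")
print(f"PAF, teat skin: {levin_paf(p_exposed, or_teat):.1%}")
```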
Objective—To estimate the risk of subclinical Mycobacterium avium subsp paratuberculosis (MAP) infection in cows that ingested MAP DNA–positive raw colostrum as calves, compared with risk in cows that ingested MAP DNA–negative raw colostrum as calves.
Animals—205 calves born in 12 commercial dairy herds.
Procedures—Each calf was separated from its dam within 30 to 60 minutes after birth and fed raw colostrum. For each calf, samples of the colostrum fed were collected and tested for the presence of MAP DNA by use of a nested PCR assay for the target gene ISMAP02. Calves fed colostrum positive or negative for MAP DNA were classified into exposed (n = 69) and unexposed (136) groups, respectively. Each calf was tested for MAP infection at 30, 42, and 54 months of age by use of a serum ELISA and bacterial culture of feces. Weibull hazard regression models were used to evaluate the association between exposure to MAP DNA–positive colostrum and time to testing positive for MAP infection.
Results—The hazard of MAP infection did not differ between the exposed and unexposed groups, whether infection was defined by positive results of the serum ELISA, bacterial culture of feces, or either diagnostic test (parallel interpretation).
Conclusions and Clinical Relevance—Heifer calves fed MAP DNA–positive colostrum were at no greater risk of MAP infection, compared with heifer calves fed MAP DNA–negative colostrum. This result contradicts findings from other studies and should be interpreted with caution.
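For readers who want to set up this kind of time-to-event analysis, a minimal sketch follows, assuming the Python lifelines package. The dataset, column names, and effect sizes are synthetic placeholders, not the study's data.

```python
# Minimal sketch: Weibull time-to-event model comparing calves fed MAP
# DNA-positive vs MAP DNA-negative colostrum. All data are simulated.
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(0)
n = 205                                      # mirrors the study's enrollment
exposed = rng.integers(0, 2, n)              # 1 = fed MAP DNA-positive colostrum
scale = np.where(exposed == 1, 60.0, 70.0)   # hypothetical time scales (months)
t = rng.weibull(1.5, n) * scale              # latent time to a positive MAP test
df = pd.DataFrame({
    "months": np.minimum(t, 54.0),           # follow-up censored at 54 months
    "positive": (t <= 54.0).astype(int),
    "exposed": exposed,
})

aft = WeibullAFTFitter().fit(df, duration_col="months", event_col="positive")
aft.print_summary()
# lifelines parameterizes the Weibull model as an accelerated failure time
# model; for a Weibull, the exposure coefficient converts to a hazard ratio
# as HR = exp(-coef * rho), where rho is the fitted shape parameter.
```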
Objective—To characterize the risk of interactions that may lead to the transmission of Mycobacterium bovis between cattle and white-tailed deer (Odocoileus virginianus) on farms in northwestern Minnesota.
Sample—53 cattle farms in northwestern Minnesota adjacent to an area where bovine tuberculosis-infected cattle and deer were detected.
Procedures—A semiquantitative deer-cattle interaction assessment tool was applied to the 53 cattle herds, and farm risk scores were analyzed on the basis of deer damage to stored feed.
Results—27 (51%) farms reported deer damage to stored cattle feeds within the year preceding the farm visit. The percentage of land that could serve as deer cover was strongly associated with deer damage to stored feeds on a farm, and the total risk score was significantly associated with the probability of a farm having deer damage. In a logistic regression model, the total risk score and the proportion of nonagricultural land around a farm predicted the likelihood of deer damage to stored feeds.
Conclusions and Clinical Relevance—Management practices on many farms in northwestern Minnesota allowed potential deer-cattle interactions. The on-farm risk assessment tool was valuable for prioritizing biosecurity risks among farms. Continued development of biosecurity practices is needed to prevent potential transmission of bovine tuberculosis between deer and cattle, especially on farms at higher risk of deer damage.
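A minimal sketch of the type of logistic regression described above, assuming the statsmodels package; the farm data, variable names, and coefficients are illustrative inventions, not the study's records.

```python
# Illustrative logistic regression: predicting deer damage to stored feeds
# from a farm's total risk score and surrounding nonagricultural land.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 53                                         # mirrors the number of farms
risk_score = rng.uniform(0, 100, n)            # total on-farm risk score
nonag_land = rng.uniform(0, 1, n)              # proportion of nonagricultural land
logit = -4.0 + 0.04 * risk_score + 3.0 * nonag_land  # assumed coefficients
damage = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # 1 = deer damage reported

X = sm.add_constant(pd.DataFrame({"risk_score": risk_score,
                                  "nonag_land": nonag_land}))
model = sm.Logit(damage, X).fit()
print(model.summary())
print("Odds ratios:", np.exp(model.params))    # per-unit change in each predictor
```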
Objective—To evaluate the effect of delayed exposure of dairy cattle to Mycobacterium avium subsp paratuberculosis (MAP) on the incidence of positive MAP test results and clinical Johne's disease (CJD).
Animals—79 cows not exposed to MAP as calves (unexposed cohort) and 260 cows exposed to MAP as calves (exposed cohort).
Procedures—Cows in the unexposed cohort were born into 5 MAP-uninfected herds and introduced at various ages into 5 MAP-infected herds where the exposed cohort cows were born and raised. Beginning when each cow was 24 months old, fecal and serum samples were collected annually from 2003 through 2006. Feces were cultured for MAP, and an ELISA was used to analyze serum samples for antibodies against MAP. Date and reason for culling were obtained from herd records. Incidence of positive culture and ELISA results and CJD was compared between unexposed and exposed cohort cows with Cox regression.
Results—Compared with exposed cohort cows, the hazard ratios for unexposed cohort cows having positive culture results, having positive ELISA results, and developing CJD were 0.12, 0.03, and 0.001, respectively, and those ratios increased by 2%, 6%, and 17%, respectively, for each month spent in an MAP-infected herd.
Conclusions and Clinical Relevance—Delayed exposure of cows to MAP resulted in lower incidences of positive culture and ELISA results and CJD, compared with incidences in cows exposed to MAP from birth. The hazard of testing positive for MAP or developing CJD increased with time spent in an MAP-infected herd, regardless of cohort.
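The sketch below shows how such a Cox regression might be set up with the Python lifelines package, using the unexposed-cohort indicator and months spent in an infected herd as covariates. All data and effect sizes here are synthetic placeholders, not the study's records.

```python
# Illustrative Cox proportional hazards model for time to a positive MAP test.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 339                                           # 79 unexposed + 260 exposed
unexposed = np.r_[np.ones(79), np.zeros(260)]
months_in_herd = rng.uniform(0, 48, n)            # assumed time in an infected herd
base_hazard = 0.02 * np.exp(-2.0 * unexposed + 0.02 * months_in_herd)
t = rng.exponential(1.0 / base_hazard)            # simulated event times (months)

df = pd.DataFrame({
    "months": np.minimum(t, 48.0),                # censor at end of follow-up
    "event": (t <= 48.0).astype(int),
    "unexposed": unexposed,
    "months_in_herd": months_in_herd,
})

cph = CoxPHFitter().fit(df, duration_col="months", event_col="event")
cph.print_summary()   # the exp(coef) column gives the estimated hazard ratios
```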
Objective—To determine the sensitivity of bacteriologic culture of pooled fecal samples for detecting Mycobacterium paratuberculosis infection, compared with bacteriologic culture of individual fecal samples, in dairy cattle herds.
Design—Cross-sectional study.
Animals—24 dairy cattle herds.
Procedures—Individual and pooled fecal samples were submitted for bacteriologic culture, and results were compared between these groups.
Results—Ninety-four percent and 88% of pooled fecal samples that contained feces from at least 1 animal with high (mean, ≥ 50 colonies/tube) or moderate (mean, 10 to 49 colonies/tube) concentrations of M paratuberculosis, respectively, were detected by bacteriologic culture of the pooled samples.
Prevalences of paratuberculosis determined by bacteriologic culture of pooled and individual fecal samples were highly correlated.
Conclusions and Clinical Relevance—Bacteriologic culture of pooled fecal samples provided a valid and cost-effective method for the detection of M paratuberculosis infection in dairy cattle herds and can be used to estimate prevalence of infection within a herd. (J Am Vet Med Assoc 2003;223:1022–1025)
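As a back-of-the-envelope companion to this result, the sketch below estimates animal-level prevalence from the proportion of positive pools, assuming pools of k animals and a perfectly sensitive and specific culture. The pool size and pool-positive rate are hypothetical; the study's values are not reproduced here.

```python
# Illustrative pooled-testing arithmetic: if each animal is infected
# independently with probability p, a pool of k animals is negative only
# when all k members are negative, so 1 - q = (1 - p)^k, where q is the
# pool-positive rate. Solving for p gives the estimator below.

def prevalence_from_pools(pool_positive_rate: float, pool_size: int) -> float:
    return 1.0 - (1.0 - pool_positive_rate) ** (1.0 / pool_size)

q = 0.30   # hypothetical fraction of pools that cultured positive
k = 5      # hypothetical number of animals per pool
print(f"Estimated animal-level prevalence: {prevalence_from_pools(q, k):.1%}")
```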
Objective—To evaluate longevity, milk production, and breeding performance in adult Holstein cows fed either a plasma-derived commercial colostrum replacer (CR) or raw bovine maternal colostrum (MC) at birth.
Design—Randomized controlled clinical trial.
Animals—497 heifer calves born in 12 commercial dairies located in Minnesota and Wisconsin.
Procedures—All calves were separated from their dams within 30 to 60 minutes after birth and systematically assigned to be fed either MC (control group [n = 261 calves]) or CR (treatment group [n = 236 calves]). Calves were observed from birth to adulthood (approx 54 months old), during which time death and culling events as well as milk yield and breeding performance data were collected. Time to death, time to culling, time to death or culling combined, time to first calving, and time to conception intervals were evaluated by use of proportional hazards survival analysis models. Number of times inseminated per conception and lifetime milk yield (up to 54 months old) were evaluated by use of general linear models.
Results—Cows fed CR as calves did not differ from cows fed MC as calves with respect to overall risk of death, culling, or death or culling combined (from birth to 54 months of follow-up and from first calving to 54 months old); lifetime milk yield; and breeding performance.
Conclusions and Clinical Relevance—No difference was detected in overall risk of death or culling, milk production, or reproductive performance between cows fed CR and those fed MC at birth.
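A minimal sketch of the kind of general linear model used for the milk yield comparison, assuming the statsmodels formula API; the data, herd effects, and the assumption of no true group difference are synthetic placeholders consistent with the reported finding.

```python
# Illustrative general linear model: lifetime milk yield regressed on
# feeding group, with source herd as a fixed effect. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 400
df = pd.DataFrame({
    "fed_cr": rng.integers(0, 2, n),              # 1 = colostrum replacer at birth
    "herd": rng.integers(0, 12, n).astype(str),   # 12 source herds
})
# Simulate yields with no true feeding-group effect, per the reported result.
df["milk_kg"] = 30_000 + rng.normal(0, 4_000, n)

fit = smf.ols("milk_kg ~ fed_cr + C(herd)", data=df).fit()
print(fit.summary().tables[1])   # the fed_cr coefficient estimates the group difference
```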
Objective—To estimate the relative risk of paratuberculosis (Johne's disease [JD]) in calves fed a plasma-derived colostrum-replacement (CR) product versus raw bovine maternal colostrum (MC).
Design—Randomized controlled clinical trial.
Animals—497 heifer calves born in 12 JD-endemic commercial Holstein dairy farms located in Minnesota and Wisconsin.
Procedures—Every calf was separated from its dam within 30 to 60 minutes after birth and systematically assigned to be fed raw bovine MC (control group, n = 261 calves) or CR (treatment group, 236 calves). The calves were monitored to adulthood and tested for Mycobacterium avium subsp paratuberculosis (MAP) infection by use of an ELISA to detect serum antibodies against MAP and bacterial culture for MAP in feces at approximately 30, 42, and 54 months of age. Weibull regression models were used to evaluate the effect of feeding CR (vs raw bovine MC) on the risk of MAP infection.
Results—Calves fed CR at birth were less likely (hazard ratio = 0.559) to become infected with MAP (as determined by use of an ELISA, bacterial culture, or both diagnostic tests), compared with the likelihood for calves fed MC at birth.
Conclusions and Clinical Relevance—This study revealed that feeding CR reduced the risk of developing MAP infection in Holstein calves born in JD-endemic herds, which implied that feeding raw bovine MC may be a source of MAP for calves. Plasma colostrum-replacement products may be an effective management tool for use in dairy herds attempting to reduce the prevalence of JD.
Objective—To determine growth, morbidity, and mortality rates in dairy calves fed pasteurized nonsaleable milk versus commercial milk replacer and to compare the economics of feeding pasteurized nonsaleable milk versus commercial milk replacer in dairy calves.
Animals—438 dairy calves.
Procedures—Calves were assigned at 1 to 2 days of age to be fed pasteurized nonsaleable milk or a commercial milk replacer until weaned. Body weight was measured at the time of study enrollment and at the time of weaning, and any medical treatments administered and deaths that occurred prior to weaning were recorded. A partial budget model was developed to examine the economics of feeding pasteurized nonsaleable milk versus commercial milk replacer.
Results—Calves fed conventional milk replacer had significantly lower rates of gain (–0.12 kg/d [–0.26 lb/d]), lower weaning weights (–5.6 kg [–12.3 lb]), higher risk for treatment during the summer and winter months (odds ratio [OR], 3.99), and higher risk of death during the winter months (OR, 29.81) than did calves fed pasteurized nonsaleable milk. The estimated savings of feeding pasteurized nonsaleable milk, compared with milk replacer, was $0.69/calf per day. The estimated number of calves needed to economically justify the nonsaleable milk pasteurization system was 23 calves/d.
Conclusions and Clinical Relevance—Results suggest that dairy calves fed pasteurized nonsaleable milk have a higher growth rate and lower morbidity and mortality rates than do calves fed conventional milk replacer. Feeding pasteurized nonsaleable milk could be an economically viable strategy for dairy calf producers. (J Am Vet Med Assoc 2005;226:1547–1554)
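As a worked illustration of the break-even arithmetic implied by these figures, the sketch below divides an assumed daily fixed cost of running the pasteurization system by the reported per-calf daily savings. The fixed-cost value is a placeholder chosen to be consistent with the reported 23 calves/d, not a figure taken from the study.

```python
# Break-even herd size: calves per day at which per-calf savings offset the
# pasteurization system's daily fixed cost.
import math

savings_per_calf_day = 0.69    # $/calf/d, as reported above
fixed_cost_per_day = 15.87     # $/d, assumed (equipment, labor, energy)

break_even = math.ceil(fixed_cost_per_day / savings_per_calf_day)
print(f"Calves needed per day to justify the system: {break_even}")
```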
Objective—To determine whether measurement of blood cardiac troponin I (cTnI) concentrations with a cage-side analyzer could be used to differentiate cardiac from noncardiac causes of dyspnea in cats.
Design—Prospective, multicenter study.
Animals—44 client-owned cats with dyspnea and 37 healthy staff-owned cats.
Procedures—Affected cats were examined because of dyspnea; treatment was administered at the attending clinician's discretion. Cats were judged to have a cardiac or noncardiac cause of dyspnea on the basis of results of physical examination, thoracic radiography, and echocardiography. Blood cTnI concentrations were determined with a cage-side analyzer in samples collected within 12 hours after admission of affected cats. Concentrations for healthy cats were obtained for comparison.
Results—5 enrolled cats were excluded from the study because of concurrent cardiac and respiratory disease. Of the remaining 39 cats with dyspnea, 25 had a cardiac cause and 14 had a noncardiac cause. The 25 cats with a cardiac cause of dyspnea had a significantly higher blood cTnI concentration than did the 37 healthy cats or the 14 cats with a noncardiac cause of dyspnea.
Conclusions and Clinical Relevance—Measurement of cTnI concentrations with a cage-side assay in emergency settings may be useful for differentiating cardiac from noncardiac causes of dyspnea in cats.
Objective—To evaluate use of crotalid antivenom, frequency of hypersensitivity reactions, and risk factors for hypersensitivity reactions and death in envenomed cats.
Design—Retrospective multicenter case series.
Animals—115 envenomed cats treated with antivenom and 177 envenomed cats treated without antivenom.
Procedures—Medical records from 5 institutions were searched by means of a multiple-choice survey with standardized answers to collect patient data, including signalment, diagnosis, antivenom administration criteria, premedication, product, dose, administration rate, hypersensitivity reactions, and mortality rate.
Results—95 of 115 (82.6%) cats received whole IgG antivenom, 11 (9.57%) received F(ab′)2 antivenom, and 4 (3.48%) received Fab antivenom. The majority (101/115 [87.8%]) of cats received 1 vial of antivenom. In all cats, the median dilution of antivenom was 1:60 (range, 1:10 to 1:250), administered over a median period of 2.0 hours (range, 0.3 to 9.0 hours). The mortality rate did not differ between cats that did (6.67%) and did not (5.08%) receive antivenom. A type I hypersensitivity reaction was diagnosed in 26 of 115 (22.6%) cats. The use of premedications did not decrease the incidence of type I hypersensitivity reactions or improve the mortality rate. Cats that had a type I hypersensitivity reaction were 10 times as likely to die as were those that did not have such a reaction.
Conclusions and Clinical Relevance—The mortality rate of cats treated with antivenom was low. The administration of premedications did not improve mortality rate or prevent hypersensitivity reactions. The only variable associated with mortality rate was development of a type I hypersensitivity reaction. The rate of antivenom administration should be further evaluated as a possible risk factor for type I hypersensitivity reactions.
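A sketch of the 2 × 2 calculation behind an association like the one reported between type I hypersensitivity reactions and death, assuming scipy. The cell counts are hypothetical, chosen only to yield an odds ratio near 10; the abstract reports the 26 reaction cases but not the full contingency table.

```python
# Illustrative odds ratio and Fisher exact test for a 2 x 2 table of
# hypersensitivity reaction status vs death. Counts are hypothetical.
from scipy.stats import fisher_exact

#                 died  survived
reaction_yes = [   5,     21   ]   # 26 cats with a type I reaction
reaction_no  = [   2,     87   ]   # remaining 89 antivenom-treated cats

odds_ratio, p_value = fisher_exact([reaction_yes, reaction_no])
print(f"Odds ratio: {odds_ratio:.2f}, Fisher exact P = {p_value:.3f}")
```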