Search Results

Showing 1–10 of 13 items for Author or Editor: Michael W. Sanderson.

Abstract

Objective—To determine whether tepoxalin alters kidney function in dogs with chronic kidney disease (CKD).

Animals—16 dogs with CKD (International Renal Interest Society stage 2 or 3) and osteoarthritis.

Procedures—Kidney function was assessed via serum biochemical analysis, urinalysis, urine protein-to-creatinine concentration ratio, urine γ-glutamyl transpeptidase-to-creatinine concentration ratio, iohexol plasma clearance, and indirect blood pressure measurement twice before treatment. Dogs received tepoxalin (10 mg/kg, PO, q 24 h) for 28 days (acute phase; n = 16) and an additional 6 months (chronic phase; n = 10). Recheck examinations were performed weekly (acute phase) and at 1, 3, and 6 months (chronic phase). Kidney function variables were analyzed via repeated-measures ANOVA.

Results—There was no difference over time for any variables in dogs completing both phases of the study. Adverse drug events (ADEs) resulting in discontinuation of tepoxalin administration included increased serum creatinine concentration (1 dog; week 1), collapse (1 dog; week 1), increased liver enzyme activities (1 dog; week 4), vomiting and diarrhea (1 dog; week 8), hematochezia (1 dog; week 24), and gastrointestinal ulceration or perforation (1 dog; week 26). Preexisting medical conditions and concomitant drug use may have contributed to ADEs. Kidney function was not affected in the 5 dogs with ADEs other than increased serum creatinine concentration. Discontinuation of tepoxalin administration stabilized kidney function in the dog with increased serum creatinine concentration and resolved the ADEs in 4 of the other 5 dogs.

Conclusions and Clinical Relevance—Tepoxalin may be used, with appropriate monitoring, in dogs with International Renal Interest Society stage 2 or 3 CKD and osteoarthritis.

Full access in American Journal of Veterinary Research

Abstract

OBJECTIVE To determine whether animal-to-animal and community contact patterns were correlated with and predictive for bovine respiratory disease (BRD) in beef steers during the first 28 days after feedlot entry.

ANIMALS 70 weaned beef steers (mean weight, 248.9 kg).

PROCEDURES Calves were instrumented with a real-time location system transmitter tag and commingled in a single pen. The location of each calf was continuously monitored. Contact between calves was defined as ≤ 0.5 m between pen coordinates, and the duration that 2 calves were within 0.5 m of each other was calculated daily. Bovine respiratory disease was defined as respiratory tract signs and a rectal temperature > 40°C. Locational data were input into a community detection program to determine daily calf contact and community profiles. The number of BRD cases within each community was determined. A random forest algorithm was then applied to the data to determine whether contact measures were predictive of BRD.
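
The contact definition used here (two calves within 0.5 m of each other, with contact duration accumulated daily from location fixes) can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline; the data layout, calf IDs, and sampling interval are assumptions.

```python
import math
from collections import defaultdict

CONTACT_DIST_M = 0.5    # contact threshold from the abstract
SAMPLE_INTERVAL_S = 5   # assumed fix interval of the real-time location system

def daily_contact_seconds(positions):
    """positions: {calf_id: [(x, y), ...]}, one coordinate fix per sample interval.
    Returns, for each pair of calves, the seconds spent within CONTACT_DIST_M."""
    totals = defaultdict(int)
    ids = sorted(positions)
    n_samples = min(len(fixes) for fixes in positions.values())
    for t in range(n_samples):
        for i, a in enumerate(ids):
            for b in ids[i + 1:]:
                ax, ay = positions[a][t]
                bx, by = positions[b][t]
                if math.hypot(ax - bx, ay - by) <= CONTACT_DIST_M:
                    totals[(a, b)] += SAMPLE_INTERVAL_S
    return dict(totals)
```

The resulting pairwise durations form a weighted contact network, which is the kind of input a community detection program can partition into daily calf communities.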

RESULTS Probability of BRD was positively correlated with the number of seconds a calf spent in contact with calves presumably shedding BRD pathogens and number of calves with BRD within the community on the day being evaluated and the previous 2 days. Diagnostic performance of the random forest algorithm varied, with the positive and negative predictive values generally < 10% and > 90%, respectively.
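
The pattern of low positive and high negative predictive values is what Bayes' rule predicts for any classifier applied at low daily disease prevalence. The sketch below uses hypothetical sensitivity, specificity, and prevalence values (not figures from this study) to show the effect.

```python
def ppv_npv(sensitivity, specificity, prevalence):
    """Positive and negative predictive value via Bayes' rule."""
    tp = sensitivity * prevalence            # true positives per animal screened
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    fn = (1 - sensitivity) * prevalence        # false negatives
    tn = specificity * (1 - prevalence)        # true negatives
    return tp / (tp + fp), tn / (tn + fn)

# Hypothetical numbers: even a reasonably accurate daily classifier
# gives PPV < 10% and NPV > 90% when few calves develop BRD on a given day.
ppv, npv = ppv_npv(sensitivity=0.7, specificity=0.8, prevalence=0.02)
```

This is why predictive values, unlike sensitivity and specificity, cannot be interpreted without knowing the prevalence in the screened population.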

CONCLUSIONS AND CLINICAL RELEVANCE Results indicated that direct transmission of BRD pathogens likely occurs among feedlot cattle. The relative contribution of animal-to-animal contact to BRD risk remains unknown and warrants further investigation.

Full access in American Journal of Veterinary Research

Abstract

OBJECTIVE

To evaluate associations between weather conditions and management factors with the incidence of death attributable to bovine respiratory disease complex (BRDC) in high-risk auction-sourced beef calves.

ANIMALS

Cohorts (n = 3,339) of male beef calves (545,866) purchased by 1 large cattle feeding operation from 216 locations and transported to 1 of 89 feeding locations (backgrounding location or feedlot) with similar management protocols.

PROCEDURES

Associations between weather conditions and management factors on the day of purchase (day P) and during the first week at the feeding location and cumulative BRDC mortality incidence within the first 60 days on feed were estimated in a mixed-effects negative binomial regression model.
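
Negative binomial regression is the usual choice over Poisson regression for cohort mortality counts because such counts are typically overdispersed (variance exceeds the mean). A minimal sketch of the relationship, under the common NB2 parameterization (an assumption; the abstract does not state which parameterization was used):

```python
def nb_variance(mu, alpha):
    """Variance of a negative binomial count with mean mu and dispersion alpha
    (NB2 parameterization: Var = mu + alpha * mu**2).
    With alpha = 0 this reduces to the Poisson variance, Var = mu."""
    return mu + alpha * mu ** 2
```

The extra `alpha * mu**2` term lets between-cohort heterogeneity (here, absorbed partly by the model's random effects) inflate the variance without biasing the mean structure.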

RESULTS

Significant factors in the final model were weaning status; degree of commingling; body weight; transport distance; season; precipitation, mean wind speed, and maximum environmental temperature on day P; environmental temperature range in the first week after arrival at the feeding location; and interactions between distance and wind speed and between body weight and maximum environmental temperature. Precipitation and wind speed on day P were associated with lower cumulative BRDC mortality incidence, but wind speed was associated only among calves transported long distances (≥ 1,082.4 km). Higher mean maximum temperature on day P increased the incidence of cumulative mortality among calves with low body weights (< 275.5 kg).

CONCLUSIONS AND CLINICAL RELEVANCE

Several weather conditions on day P and during the first week after arrival were associated with incidence of BRDC mortality. The results may have implications for health- and economic-risk management, especially for high-risk calves and calves that are transported long distances.

Full access in American Journal of Veterinary Research

Abstract

Objective—To determine the biocontainment, biosecurity, and security practices at beef feedyards in the Central Plains of the United States.

Design—Survey.

Sample Population—Managers of feedyards in Colorado, Kansas, Nebraska, Oklahoma, and Texas that feed beef cattle for finish before slaughter; feedyards had to have an active concentrated animal feeding operation permit with a 1-time capacity of ≥ 1,000 cattle.

Procedures—A voluntary survey of feedyard personnel was conducted. Identified feedyard personnel were interviewed and responses regarding facility design, security, employees, disease preparedness, feedstuffs, hospital or treatment systems, sanitation, cattle sources, handling of sick cattle, and disposal of carcasses were collected in a database questionnaire.

Results—The survey was conducted for 106 feedyards with a 1-time capacity that ranged from 1,300 to 125,000 cattle. Feedyards in general did not have high implementation of biocontainment, biosecurity, or security practices. Smaller feedyards were, in general, less likely to use good practices than were larger feedyards.

Conclusions and Clinical Relevance—Results of the survey provided standard practices for biocontainment, biosecurity, and security in feedyards located in Central Plains states. Information gained from the survey results can be used by consulting veterinarians and feedyard managers as a basis for discussion and to target training efforts.

Full access in Journal of the American Veterinary Medical Association

Abstract

Objective—To evaluate biosecurity practices of cow-calf producers.

Design—Cross-sectional survey.

Sample Population—2,713 cow-calf operations were used in phase 1 of the study, and 1,190 cow-calf operations were used in phase 2.

Procedure—Producers were contacted for a personal interview between Dec 30, 1996 and Feb 3, 1997 regarding their management practices. Noninstitutional operations with 1 or more beef cows were eligible to participate in the study. Producers who participated in the first phase of the study and who had ≥ 5 beef cows were requested to continue in the study and were contacted by a veterinarian or animal health technician who administered further questionnaires. All contacts for the second phase of the study were made between Mar 3, 1997 and Apr 30, 1997. Additional data on use of various vaccines, testing of imported cattle for brucellosis, Mycobacterium paratuberculosis, bovine viral diarrhea, and tuberculosis as well as potential for feed contamination were collected during the second phase of the study.

Results—Producers commonly engaged in management practices that increased risk of introducing disease to their cattle such as importing cattle, failing to quarantine imported cattle, and communal grazing. Producers inconsistently adjusted for the increased risk of their management practices by increasing the types of vaccines given, increasing the quarantine time or proportion of imported animals quarantined, or increasing testing for various diseases in imported animals.

Conclusions and Clinical Relevance—Cow-calf herds are at risk for disease exposure from outside sources when cattle are introduced to the herd, and producers do not always adjust management practices such as vaccination schedules and quarantine procedures appropriately to minimize this risk. Veterinary involvement in education of producers regarding biosecurity risks and development of rational and economical biosecurity plans is needed. (J Am Vet Med Assoc 2000;217:185–189)

Full access in Journal of the American Veterinary Medical Association

Abstract

Objective—To compare the detection of pulmonary nodules by use of 3-view thoracic radiography and CT in dogs with confirmed neoplasia.

Design—Prospective case series.

Animals—33 dogs of various breeds.

Procedures—3 interpreters independently evaluated 3-view thoracic radiography images. The location and size of pulmonary nodules were recorded. Computed tomographic scans of the thorax were obtained and evaluated by a single interpreter. The location, size, margin, internal architecture, and density of pulmonary nodules were recorded. Sensitivity, specificity, positive predictive value, and negative predictive value were calculated for thoracic radiography (with CT as the gold standard).
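
The four metrics named above follow directly from a 2 × 2 table of radiographic results against the CT gold standard. A minimal sketch (the counts in the test are illustrative, not the study's per-interpreter data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic-test metrics for an index test
    (here, radiography) scored against a gold standard (here, CT)."""
    return {
        "sensitivity": tp / (tp + fn),  # positives correctly detected
        "specificity": tn / (tn + fp),  # negatives correctly ruled out
        "ppv": tp / (tp + fp),          # P(disease | positive test)
        "npv": tn / (tn + fn),          # P(no disease | negative test)
    }
```

Because each of the 3 interpreters produces a different 2 × 2 table, the metrics are computed per interpreter, which is why the Results report ranges rather than single values.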

Results—21 of 33 (64%) dogs had pulmonary nodules or masses detected on CT. Of the dogs that had positive CT findings, 17 of 21 (81%) had pulmonary nodules or masses detected on radiographs by at least 1 interpreter. Sensitivity of radiography ranged from 71% to 95%, and specificity ranged from 67% to 92%. Radiography had a positive predictive value of 83% to 94% and a negative predictive value of 65% to 89%. The 4 dogs that were negative for nodules on thoracic radiography but positive on CT were all large-breed to giant-breed dogs with osteosarcoma.

Conclusions and Clinical Relevance—CT was more sensitive than radiography for detection of pulmonary nodules. This was particularly evident in large-breed to giant-breed dogs. Thoracic CT is recommended in large-breed to giant-breed dogs with osteosarcoma if the detection of pulmonary nodules will change treatment.

Full access in Journal of the American Veterinary Medical Association

Abstract

Objective—To evaluate the use of dipstick, sulfosalicylic acid (SSA), and urine protein-to-creatinine ratio (UP:C) methods for use in detection of canine and feline albuminuria.

Design—Evaluation study.

Sample Population—599 canine and 347 feline urine samples.

Procedures—Urine was analyzed by use of dipstick, SSA, and UP:C methods; results were compared with those for a species-specific ELISA to determine sensitivity, specificity, positive predictive value (PPV), negative predictive value, and positive and negative likelihood ratios.

Results—Positive results for dipstick and SSA tests (trace reaction or greater) in canine urine had moderate specificity (dipstick, 81.2%; SSA, 73.3%) and poor PPV (dipstick, 34.0%; SSA, 41.8%). Values improved when stronger positive results (≥ 2+) for the dipstick and SSA tests were compared with ELISA results (specificity, 98.9% and 99.0% for the urine dipstick and SSA tests, respectively; PPV, 90.7% and 90.2% for the dipstick and SSA tests, respectively). Data obtained for cats revealed poor specificity (dipstick, 11.0%; SSA, 25.4%) and PPV (dipstick, 55.6%; SSA, 46.9%). Values improved slightly when stronger positive test results (≥ 2+) were used (specificity, 80.0% and 94.2% for the dipstick and SSA tests, respectively; PPV, 63.5% and 65.2% for the dipstick and SSA tests, respectively). The UP:C had high specificity for albuminuria in dogs and cats (99.7% and 99.2%, respectively) but low sensitivity (28.7% and 2.0%, respectively).
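
The positive and negative likelihood ratios mentioned in the Procedures derive directly from sensitivity and specificity. A minimal sketch, applied here to the UP:C values reported above for dogs (sensitivity 28.7%, specificity 99.7%):

```python
def likelihood_ratios(sensitivity, specificity):
    """LR+ = sens / (1 - spec): how much a positive result raises the odds
    of disease. LR- = (1 - sens) / spec: how much a negative result lowers them."""
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg

# Canine UP:C values from the Results above.
lr_pos, lr_neg = likelihood_ratios(0.287, 0.997)
```

The very high LR+ alongside an LR- near 1 mirrors the Results: a positive canine UP:C is strong evidence of albuminuria, but a negative result does little to rule it out.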

Conclusions and Clinical Relevance—Caution should be used when interpreting a positive test result of a dipstick or SSA test for canine or feline albuminuria.

Full access in Journal of the American Veterinary Medical Association

Abstract

Objective—To characterize direct and indirect contacts among livestock operations in Colorado and Kansas.

Design—Cross-sectional quarterly survey.

Sample—532 livestock producers.

Procedures—Livestock producers in Colorado and Kansas were recruited by various means to participate in the survey, which was sent out via email or postal mail once quarterly (in March, June, September, and December) throughout 2011. Data were entered into an electronic record, and descriptive statistics were summarized.

Results—Large swine operations moving animals to other large swine operations had the highest outgoing direct contact rates (range, 5.9 to 24.53/quarter), followed by dairy operations moving cattle to auction or other dairy operations (range, 2.6 to 10.34/quarter). Incoming direct contact rates for most quarters were highest for large feedlots (range, 0 to 11.56/quarter) and dairies (range, 3.90 to 5.78/quarter). For large feedlots, mean total indirect contacts through feed trucks, livestock haulers, and manure haulers each exceeded 725 for the year. Dairy operations had a mean of 434.25 indirect contacts from milk trucks and 282.25 from manure haulers for the year.

Conclusions and Clinical Relevance—High direct contact rates detected among large swine operations may suggest a risk for direct disease transmission within the integrated swine system. Indirect contacts as well as incoming direct contacts may put large feedlots at substantial risk for disease introduction. These data can be useful for establishing and evaluating policy and biosecurity guidelines for livestock producers in the central United States. The results may be used to inform efforts to model transmission and control of infectious diseases such as foot-and-mouth disease in this region.

Full access in Journal of the American Veterinary Medical Association

Summary

Four boars intranasally inoculated with porcine reproductive and respiratory syndrome (PRRS) virus were monitored for 56 days after exposure for changes in semen characteristics and for the presence of virus in the semen. Clinically, 2 of 4 boars had mild respiratory signs of 1 day's duration after infection. Changes in appetite, behavior, or libido were not detected. All boars seroconverted on the indirect fluorescent antibody and serum virus neutralization tests by day 14 after inoculation. Virus was isolated from serum between days 7 and 14 after inoculation. During the monitoring period, semen volume decreased and pH correspondingly increased; however, this change began 7 to 10 days prior to infection. Differences in sperm morphologic features, concentration, or motility between the preinfection and postinfection samples were not observed. The PRRS virus was detected in semen at the first collection in each of the 4 boars (ie, 3 or 5 days after challenge exposure). Virus was detected in nearly all semen samples collected from the 4 infected boars through days 13, 25, 27, and 43, respectively. Neither gross nor microscopic lesions attributable to PRRS virus were observed in tissues collected at the termination of the experiment (day 56), and virus isolation results from reproductive tissues were negative.

Free access in Journal of the American Veterinary Medical Association

Abstract

Objective—To examine the feasibility of depopulation of a large feedlot during a foot-and-mouth disease (FMD) outbreak in the United States.

Design—Delphi survey followed by facilitated discussion.

Sample—27 experts, including veterinary toxicologists and pharmacologists, animal welfare experts, feedlot managers, and consulting veterinarians.

Procedures—4 veterinary pharmacologists, 5 veterinary toxicologists, 4 animal welfare experts, 26 consulting veterinarians, and 8 feedlot managers were invited to participate in a Delphi survey to identify methods for depopulation of a large feedlot during an FMD outbreak. A facilitated discussion that included 1 pharmacologist, 1 toxicologist, 1 animal welfare expert, 2 consulting veterinarians, and 2 feedlot managers was held to review the survey results.

Results—27 of 47 invited experts participated in the Delphi survey. Survey consensus was that, although several toxic agents would effectively cause acute death in a large number of animals, all of them had substantial animal welfare concerns. Pentobarbital sodium administered IV was considered the most effective pharmacological agent for euthanasia, and xylazine was considered the most effective sedative. Animal welfare concerns following administration of a euthanasia solution IV or a penetrating captive bolt were minimal; however, both veterinarians and feedlot managers felt that use of a captive bolt would be inefficient for depopulation. Veterinarians were extremely concerned about public perception, human safety, and timely depopulation of a large feedlot during an FMD outbreak.

Conclusions and Clinical Relevance—Depopulation of a large feedlot during an FMD outbreak would be difficult to complete in a humane and timely fashion.

Full access in Journal of the American Veterinary Medical Association