Search Results

You are looking at 1 - 9 of 9 items for

  • Author or Editor: Charles Allen

Abstract

Objective

To evaluate the effect of alternate-day oral administration of prednisolone on endogenous plasma ACTH concentration and on the adrenocortical response to exogenous ACTH in dogs.

Animals

12 Beagles.

Procedure

Dogs were allotted to 2 groups (group 1, 8 dogs treated with 1 mg of prednisolone/kg of body weight; group 2, 4 dogs given excipient only). During a 30-day period, blood samples were collected for determination of plasma ACTH and cortisol concentrations before, during, and after treatment with prednisolone. From day 7 to 23, prednisolone or excipient was given on alternate days. Sample collection (48-hour period with 6-hour intervals) was performed on days 1, 7, 15, 21, and 28; on other days, sample collection was performed at 24-hour intervals. Pre- and post-ACTH plasma cortisol concentrations were determined on days 3, 9, 17, 23, and 30.

Results

A significant treatment-by-time effect was detected for group 1. Plasma ACTH concentrations decreased significantly for 18 to 24 hours after prednisolone treatment in group-1 dogs. At 24 to 48 hours, ACTH concentrations in group-1 dogs were numerically higher but not significantly different. Post-ACTH plasma cortisol concentration decreased significantly after a single dose of prednisolone, and the suppression became more profound during the treatment period. However, post-ACTH cortisol concentration returned to the reference range 1 week after prednisolone administration was discontinued.

Conclusions and Clinical Relevance

Single oral administration of 1 mg of prednisolone/kg significantly suppressed plasma ACTH concentration in dogs for 18 to 24 hours after treatment. Alternate-day treatment did not prevent suppression, as documented by the response to ACTH. (Am J Vet Res 1999;60:698–702)

Free access
in American Journal of Veterinary Research
in Journal of the American Veterinary Medical Association

Abstract

Objective—To define the vertical position of the patella in clinically normal large-breed dogs.

Sample Population—Cadavers of 13 clinically normal large-breed dogs.

Procedure—Both hind limbs were harvested with intact stifle joints and mounted on a positioning device that allowed full range of motion of the stifle joint. Lateral radiographic views were obtained with the stifle joints positioned at each of 5 angles (148°, 130°, 113°, 96°, and 75°). Vertical position of the patella through a range of motion was depicted on a graph of mean stifle angle versus corresponding mean proximal patellar position (PPP) and distal patellar position (DPP) relative to the femoral trochlea for each dog. Ratio of length of the patellar ligament to length of the patella (L:P) was determined for each dog. Overall mean, SD, and 95% confidence intervals for L:P were calculated for all dogs.

Results—Evaluation of vertical position of the patella through a range of motion revealed a nearly linear relationship between joint angle and PPP and joint angle and DPP. Evaluation of L:P results did not reveal significant differences between limbs (left or right) or among joint angles. Overall mean ± SD L:P for all dogs was 1.68 ± 0.18 (95% confidence interval, 1.33 to 2.03).
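The reported confidence interval is consistent with a simple normal-theory interval of mean ± 1.96 SD. A minimal sketch of that arithmetic (the 1.96 multiplier is our assumption; the abstract does not state how the interval was computed):

```python
# Reproducing the interval arithmetic for the reported L:P ratio
# (patellar ligament length to patella length) under a normal assumption.
mean_lp = 1.68   # overall mean L:P from the abstract
sd_lp = 0.18     # overall SD from the abstract
z = 1.96         # standard normal quantile for a central 95% interval (assumption)

lower = mean_lp - z * sd_lp
upper = mean_lp + z * sd_lp
print(f"95% interval: {lower:.2f} to {upper:.2f}")  # 1.33 to 2.03, matching the abstract
```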

Conclusion and Clinical Relevance—The L:P proved to be a repeatable measurement of vertical patellar position, which is independent of stifle angles from 75° to 148°. This measurement could be used as a quantitative method for diagnosing patella alta and patella baja in large-breed dogs. (Am J Vet Res 2002;63:42–46)

Full access
in American Journal of Veterinary Research

Abstract

Objective—To develop a better system for classification of herd infection status for paratuberculosis (Johne's disease [JD]) in US cattle herds on the basis of the risk of potential transmission of Mycobacterium avium subsp paratuberculosis.

Sample—Simulated data for herd size and within-herd prevalence; sensitivity and specificity for test methods obtained from consensus-based estimates.

Procedures—Interrelationships among variables influencing interpretation and classification of herd infection status for JD were evaluated by use of simulated data for various herd sizes, true within-herd prevalences, and sampling and testing methods. The probability of finding ≥ 1 infected animal in herds was estimated for various testing methods and sample sizes by use of hypergeometric random sampling.
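The hypergeometric sampling calculation mentioned above can be sketched as follows. This is an illustrative implementation assuming simple random sampling and a perfect test, not the authors' full model (which also incorporated test sensitivity and specificity):

```python
from math import comb

def prob_detect(herd_size: int, n_infected: int, sample_size: int) -> float:
    """P(sample contains >= 1 infected animal) under hypergeometric
    sampling without replacement; assumes a perfect test."""
    if n_infected == 0:
        return 0.0
    # Probability the sample contains no infected animals
    p_none = comb(herd_size - n_infected, sample_size) / comb(herd_size, sample_size)
    return 1.0 - p_none

# Hypothetical example: 200-cow herd, 5% true within-herd prevalence, 30 cows sampled
print(f"{prob_detect(200, 10, 30):.2f}")
```

Larger samples or higher true prevalence raise the detection probability, which is why the classification tables index on both sample size and assumed within-herd prevalence.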

Results—2 main components were required for the new herd JD classification system: the probability of detection of infection, determined on the basis of test results from a sample of animals, and the maximum detected number of animals with positive test results. Tables of the estimated probability of detection of infection were constructed, and the maximum numbers of cattle with positive test results or fecal pools with positive culture results that permitted classification of herd JD infection status with 95% confidence were plotted. Herd risk for JD was categorized on the basis of 95% confidence that the true within-herd prevalence was ≤ 15%, ≤ 10%, ≤ 5%, or ≤ 2%.

Conclusions and Clinical Relevance—Analysis of the findings indicated that a scientifically rigorous and transparent herd classification system for JD in cattle is feasible.

Full access
in American Journal of Veterinary Research

Abstract

Objective—To compare calf weaning weight and associated economic variables for beef cows with serum antibodies against Mycobacterium avium subsp paratuberculosis (MAP) or from which MAP was isolated from feces with those for cows that were seronegative for anti-MAP antibodies or culture negative for MAP.

Design—Retrospective study.

Animals—4,842 beef cows from 3 herds enrolled in the USDA National Johne's Disease Demonstration Herd Project.

Procedures—Individual cow ELISA and culture results were obtained from the project database. During each parity evaluated for each cow, the 205-day adjusted weaning weight (AWW) of its calf was calculated. The AWW was compared between test-positive and test-negative cows by use of multilevel mixed-effect models. The median value for feeder calves from 2007 to 2011 was used to estimate the economic losses associated with MAP test–positive cows.

Results—The AWW of calves from cows with strongly positive ELISA results was 21.48 kg (47.26 lb) less than that of calves from cows with negative ELISA results. The AWW of calves from cows classified as heavy or moderate MAP shedders was 58.51 kg (128.72 lb) and 40.81 kg (89.78 lb) less, respectively, than that of calves from MAP culture–negative cows. Associated economic losses were estimated as $57.49/calf for cows with strongly positive ELISA results and $156.60/calf and $109.23/calf for cows classified as heavy and moderate MAP shedders, respectively.
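The reported dollar figures are consistent with a single implied feeder-calf price applied to each weaning-weight deficit. A quick check of that arithmetic (the per-kilogram price is back-calculated by us, not stated in the abstract):

```python
# Back-calculate the implied feeder-calf price from the ELISA-positive figures,
# then apply it to the shedder categories to recover the reported losses.
price_per_kg = 57.49 / 21.48  # ~= $2.68/kg, implied (our inference)

deficits_kg = {
    "strong ELISA positive": 21.48,
    "heavy MAP shedder": 58.51,
    "moderate MAP shedder": 40.81,
}
for label, kg in deficits_kg.items():
    # Recovers $57.49, $156.60, and $109.23 per calf, matching the abstract
    print(f"{label}: ${kg * price_per_kg:.2f}/calf")
```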

Conclusions and Clinical Relevance—Calves from cows with MAP-positive test results had significantly lower AWWs than did calves from cows with MAP-negative test results, which translated into economic losses for MAP-infected beef herds.

Full access
in Journal of the American Veterinary Medical Association

SUMMARY

Casein has been used as a protein source in diets designed to dissolve canine ammonium urate uroliths and to prevent their recurrence, because it contains fewer purine precursors than do many other protein sources. However, an important question is whether reduced quantities of dietary casein have any benefit in modifying saturation of urine with urates. To answer this question, activity product ratios of uric acid, sodium urate, and ammonium urate were determined in 24-hour urine samples produced by 6 healthy Beagles during periods of consumption of a 10.4% protein, casein-based (10.4% casein) diet and a 20.8% protein, casein-based (20.8% casein) diet. Significantly lower activity product ratios of uric acid, sodium urate, and ammonium urate were observed when dogs consumed the 10.4% casein diet, as were significantly lower 24-hour urinary excretions of ammonia and phosphorus. Twenty-four-hour urinary excretions of magnesium and 24-hour urine pH values were significantly higher when dogs were fed the 10.4% casein diet. These results suggest that use of the 10.4% casein diet in protocols designed for dissolution and prevention of uric acid, sodium urate, and ammonium urate uroliths in dogs may be beneficial.

Free access
in American Journal of Veterinary Research

SUMMARY

Urine activity product ratios of uric acid, sodium urate, and ammonium urate and urinary excretion of metabolites were determined in 24-hour samples produced by 6 healthy Beagles during periods of consumption of a low-protein, casein-based diet (diet A) and a high-protein, meat-based diet (diet B). Comparison of effects of diet A with those of diet B revealed: significantly lower activity product ratios of uric acid (P = 0.025), sodium urate (P = 0.045), and ammonium urate (P = 0.0045); significantly lower 24-hour urinary excretion of uric acid (P = 0.002), ammonia (P = 0.0002), sodium (P = 0.01), calcium (P = 0.005), phosphorus (P = 0.0003), magnesium (P = 0.01), and oxalic acid (P = 0.004); significantly (P = 0.0001) higher 24-hour urine pH; and significantly (P = 0.01) lower endogenous creatinine clearance. These results suggest that consumption of diet A minimizes changes in urine that predispose dogs to uric acid, sodium urate, and ammonium urate urolithiasis.

Free access
in American Journal of Veterinary Research

SUMMARY

Urine activity product ratios of uric acid (aprua), sodium urate (aprna), and ammonium urate (aprau), and urinary excretion of 10 metabolites were determined in 24-hour urine samples produced by 6 healthy Beagles during periods of consumption of 4 diets containing approximately 11% protein (dry weight) and various protein sources: a 72% moisture, casein-based diet; a 10% moisture, egg-based diet; a 72% moisture, chicken-based diet; and a 71% moisture, chicken-based, liver-flavored diet. Significantly (P < 0.05) higher aprua, aprna, and aprau were observed when dogs consumed the egg-based diet, compared with the other 3 diets; there were no differences in these ratios among the other 3 diets.

Twenty-four-hour urinary excretions of chloride, potassium, phosphorus, and oxalic acid were significantly (P < 0.05) higher when dogs consumed the egg-based diet. Twenty-four-hour urinary excretions of sodium were significantly (P < 0.05) higher when dogs consumed the egg-based diet, compared with the casein-based diet and the chicken-based, liver-flavored diet, but were not significantly different between the egg-based diet and chicken-based diet. Twenty-four-hour urine volume was similar when dogs consumed the 4 diets. Twenty-four-hour endogenous creatinine clearance was significantly (P < 0.05) lower when dogs consumed the casein-based diet; there were no differences among the other 3 diets. Although consumption of all diets was associated with production of alkaline urine, the 24-hour urine pH was significantly (P < 0.05) higher when dogs consumed the egg-based diet.

These results suggest that use of diets containing approximately 10.5% protein (dry weight) and 70% moisture in protocols designed for dissolution and prevention of urate uroliths may be beneficial. The source of dietary protein in canned formulated diets does not appear to significantly influence the saturation of urine with uric acid, sodium urate, or ammonium urate.

Free access
in American Journal of Veterinary Research

Abstract

Objective—To examine the effect of various clinical tracks within the veterinary medical clinical curriculum at Texas A&M University on clinical diagnostic proficiency, as determined by pre- and post-training assessment. We expected that the clinical track chosen by a student would affect measured outcomes, with bias toward higher scores in the student's chosen field.

Design—Prospective cohort study.

Study Population—32 students from the College of Veterinary Medicine and Biomedical Sciences at Texas A&M University.

Procedures—By use of standardized, written case scenarios, clinical reasoning was assessed twice: once prior to the clinical (fourth) year of the curriculum and again at completion of the clinical year. Students demonstrated their abilities to collect and organize appropriate clinical data (history, physical examination, and laboratory findings), determine clinical diagnoses, and formulate and implement acceptable treatment modalities. Data from clinical assessments were compared for a given cohort and correlated with other measures (eg, grades, standardized test scores, and species-specific curricular track).

Results—Differences were detected in clinical diagnostic proficiency among students in different clinical tracks and for different species groups in the case scenarios. Tracking by species group in the clinical veterinary curriculum appeared to affect development of clinical reasoning and resulted in differential proficiency among cases for differing species groups.

Conclusions and Clinical Relevance—Differences in clinical experiences between small animal tracks and all other track opportunities (large animal, mixed animal, and alternative) influenced the development of clinical proficiency in fourth-year veterinary students during their clinical training period.

Full access
in Journal of the American Veterinary Medical Association