PROCEDURES Blood samples were obtained before and at completion of surgery. Serum cortisol and aldosterone and plasma cACTH concentrations were measured by use of validated radioimmunoassays. Changes in concentrations (postoperative concentration minus preoperative concentration) were calculated. Data were analyzed by use of the Wilcoxon signed rank test, Pearson correlation analysis, and Mann-Whitney rank sum test.
RESULTS Cortisol, aldosterone, and cACTH concentrations increased significantly from before to after surgery. Although cortisol and aldosterone concentrations increased in almost all dogs, cACTH concentrations decreased in 6 of 32 (19%) dogs. All dogs had preoperative cortisol concentrations within the reference range, but 24 of 39 (62%) dogs had postoperative concentrations above the reference range. A correlation between the change in cACTH concentration and the change in cortisol concentration was not detected.
CONCLUSIONS AND CLINICAL RELEVANCE Laparotomy caused a significant increase in serum cortisol and aldosterone concentrations. In most, but not all, dogs, plasma cACTH concentrations increased. Lack of correlation between the change in cACTH concentration and the change in cortisol concentration suggested that increased postoperative cortisol concentrations may have been attributable to ACTH-independent mechanisms, an early ACTH increase that caused a sustained cortisol release, or decreased cortisol clearance. Further studies are indicated to evaluate the effects of various anesthetic protocols and minimally invasive surgical techniques on the stress response.
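The change calculation and correlation analysis described above can be sketched in Python. This is an illustrative example only: the helper names and all numeric values are hypothetical, not the study's data, and the Pearson coefficient is computed directly rather than with validated statistical software.

```python
# Illustrative sketch (hypothetical data, not the study's measurements):
# per-dog changes are postoperative minus preoperative concentrations,
# and the association between changes is assessed with Pearson correlation.

def deltas(pre, post):
    """Postoperative concentration minus preoperative concentration."""
    return [b - a for a, b in zip(pre, post)]

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired measurements for 5 dogs (note one cACTH decrease)
d_cacth = deltas([20, 25, 18, 30, 22], [35, 24, 40, 28, 50])
d_cortisol = deltas([2.1, 3.0, 1.8, 2.5, 2.2], [6.0, 5.5, 7.1, 6.4, 5.9])
r = pearson_r(d_cacth, d_cortisol)
print(r)
```

A study would report this alongside a significance test; here only the coefficient itself is sketched.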
Objective—To determine whether trilostane or ketotrilostane is more potent in dogs and to determine the trilostane and ketotrilostane concentrations that inhibit adrenal gland cortisol, corticosterone, and aldosterone secretion by 50%.
Sample—24 adrenal glands from 18 mixed-breed dogs.
Procedures—Adrenal gland tissues were sliced, placed in tissue culture, and stimulated with 100 pg of ACTH/mL alone or with 5 concentrations of trilostane or ketotrilostane. Trials were performed independently 4 times. In each trial, 6 samples (1 for each time point) were collected for each of the 5 concentrations of trilostane and ketotrilostane tested as well as a single negative control sample. At the end of 0, 1, 2, 3, 5, and 7 hours, tubes were harvested and media and tissue slices were assayed for cortisol, corticosterone, aldosterone, and potassium concentrations. Data were analyzed via pharmacodynamic modeling. One adrenal slice exposed to each concentration of trilostane or ketotrilostane was submitted for histologic examination to assess tissue viability.
Results—Ketotrilostane was 4.9 and 2.4 times as potent as its parent compound trilostane in inhibiting cortisol and corticosterone secretion, respectively. The concentrations of trilostane and ketotrilostane that inhibited cortisol secretion by 50% were 480 and 98.4 ng/mL, respectively; the concentrations that inhibited corticosterone secretion by 50% were 95.0 and 39.6 ng/mL, respectively.
Conclusions and Clinical Relevance—Ketotrilostane was more potent than trilostane with respect to inhibition of cortisol and corticosterone secretion. The data should be useful in developing future studies to evaluate in vivo serum concentrations of trilostane and ketotrilostane for efficacy in the treatment of hyperadrenocorticism.
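The potency ratios reported above follow directly from the 50% inhibitory concentrations. A minimal sketch, using only the IC50 values stated in the abstract (the dictionary layout and function name are this sketch's own):

```python
# Potency ratio = IC50 of the parent compound / IC50 of the metabolite.
# IC50 values (ng/mL) are taken from the abstract; the data structure
# and helper below are illustrative, not from the study.

IC50 = {
    "cortisol":       {"trilostane": 480.0, "ketotrilostane": 98.4},
    "corticosterone": {"trilostane": 95.0,  "ketotrilostane": 39.6},
}

def potency_ratio(hormone):
    """How many times as potent ketotrilostane is as trilostane."""
    pair = IC50[hormone]
    return pair["trilostane"] / pair["ketotrilostane"]

print(round(potency_ratio("cortisol"), 1))        # 4.9
print(round(potency_ratio("corticosterone"), 1))  # 2.4
```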
Objective—To determine the lowest dose of cosyntropin on a per body weight basis that would produce maximal cortisol and aldosterone secretion and the ideal timing of blood sample collection after ACTH stimulation in healthy cats.
Design—Randomized crossover trial.
Animals—7 adult sexually intact male purpose-bred cats.
Procedures—Each cat received saline (0.9% NaCl) solution (control) and 5 doses (125 μg/cat and 10, 5, 2.5, and 1 μg/kg [4.54, 2.27, 1.14, and 0.45 μg/lb]) of cosyntropin IV with a 2-week washout period between treatments. Blood samples were obtained before (baseline) and at 15, 30, 45, 60, 75, and 90 minutes after administration of saline solution or cosyntropin.
Results—Serum cortisol and aldosterone concentrations increased significantly, compared with baseline values, after administration of all cosyntropin doses. Lower doses of cosyntropin resulted in an adrenocortical response equivalent to the traditional dose of 125 μg/cat. The lowest doses of cosyntropin that stimulated a maximal cortisol and aldosterone response were 5 and 2.5 μg/kg, respectively. Lower doses of cosyntropin resulted in a shorter interval between IV administration of cosyntropin and peak serum cortisol and aldosterone concentrations.
Conclusions and Clinical Relevance—Low-dose ACTH stimulation testing with IV administration of cosyntropin at 5 μg/kg followed by blood sample collection at 60 to 75 minutes resulted in concurrent peak serum cortisol and aldosterone concentrations that were equivalent to those achieved following administration of cosyntropin at 125 μg/cat, the standard dose currently used.
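For readers working in per-pound units, the per-kilogram doses above convert as sketched below. The helper name and conversion factor (1 kg is approximately 2.2046 lb) are assumptions of this sketch, and rounding may differ slightly from the bracketed values printed in the abstract.

```python
# Hypothetical helper (not from the study) converting per-kilogram
# cosyntropin doses to per-pound equivalents.

LB_PER_KG = 2.2046  # approximate conversion factor

def ug_per_kg_to_ug_per_lb(dose_ug_per_kg):
    """Convert a ug/kg dose to its ug/lb equivalent, rounded to 2 decimals."""
    return round(dose_ug_per_kg / LB_PER_KG, 2)

for dose in (10, 5, 2.5, 1):
    print(dose, "ug/kg =", ug_per_kg_to_ug_per_lb(dose), "ug/lb")
```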
Objective—To compare adrenal gland stimulation achieved following administration of cosyntropin (5 μg/kg [2.3 μg/lb]) IM versus IV in healthy dogs and dogs with hyperadrenocorticism.
Animals—9 healthy dogs and 9 dogs with hyperadrenocorticism.
Procedures—In both groups, ACTH stimulation was performed twice. Healthy dogs were randomly assigned to receive cosyntropin IM or IV first, but all dogs with hyperadrenocorticism received cosyntropin IV first. In healthy dogs, serum cortisol concentration was measured before (baseline) and 30, 60, 90, and 120 minutes after cosyntropin administration. In dogs with hyperadrenocorticism, serum cortisol concentration was measured before and 60 minutes after cosyntropin administration.
Results—In the healthy dogs, serum cortisol concentration increased significantly after administration of cosyntropin, regardless of route of administration, and serum cortisol concentrations after IM administration were not significantly different from concentrations after IV administration. For both routes of administration, serum cortisol concentration peaked 60 or 90 minutes after cosyntropin administration. In dogs with hyperadrenocorticism, serum cortisol concentration was significantly increased 60 minutes after cosyntropin administration, compared with baseline concentration, and concentrations after IM administration were not significantly different from concentrations after IV administration.
Conclusions and Clinical Relevance—Results suggest that in healthy dogs and dogs with hyperadrenocorticism, administration of cosyntropin at a dose of 5 μg/kg, IV or IM, resulted in equivalent adrenal gland stimulation.
OBJECTIVE To evaluate effects of the addition of glucose to dog and cat urine on urine specific gravity (USG) and determine whether glucosuria affects assessment of renal concentrating ability.
SAMPLE Urine samples from 102 dogs and 59 cats.
PROCEDURES Urine for each species was pooled to create samples with various USGs. Glucose was added to an aliquot of each USG pool (final concentration, 2,400 mg/dL), and serial dilutions of the glucose-containing aliquot were created for each pool. The USG then was measured in all samples. The difference in USG attributable to addition of glucose was calculated by subtracting the USG of the unaltered sample from the USG of the sample after the addition of glucose. The relationship between the difference in USG and the USG of the unaltered, undiluted sample was evaluated by the use of linear regression analysis.
RESULTS Addition of glucose to urine samples increased the USG. There was a significant relationship between USG of the undiluted sample and the difference in USG when glucose was added to obtain concentrations of 300, 600, 1,200, and 2,400 mg/dL in canine urine and concentrations of 600, 1,200, and 2,400 mg/dL in feline urine. The more concentrated the urine before the addition of glucose, the less change there was in the USG. Changes in USG attributable to addition of glucose were not clinically important.
CONCLUSIONS AND CLINICAL RELEVANCE Substantial glucosuria resulted in minimal alterations in specific gravity of canine and feline urine samples. Thus, USG can be used to assess renal concentrating ability even in samples with glucosuria.
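The difference calculation described in the procedures above amounts to a simple subtraction per pooled sample. A sketch with hypothetical USG values (not the study's data):

```python
# Illustrative sketch: the difference attributable to added glucose is the
# USG of the sample after glucose was added minus the USG of the unaltered
# sample. All USG values below are hypothetical.

def usg_difference(usg_unaltered, usg_with_glucose):
    """Change in urine specific gravity attributable to added glucose."""
    return round(usg_with_glucose - usg_unaltered, 3)

# Hypothetical pools: (unaltered USG, USG after glucose was added)
pools = [(1.010, 1.014), (1.025, 1.027), (1.045, 1.046)]
differences = [usg_difference(a, b) for a, b in pools]
print(differences)  # [0.004, 0.002, 0.001]
```

The hypothetical values mirror the reported pattern: the more concentrated the unaltered urine, the smaller the change.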
OBJECTIVE To assess the accuracy of automated readings of urine dipstick results for assessment of glucosuria in dogs and cats, compare visual versus automated readings of urine glucose concentration, and determine the utility of the urine glucose-to-creatinine ratio (UGCR) for quantification of glucosuria.
SAMPLE 310 canine and 279 feline urine samples.
PROCEDURES Glucose concentration was estimated in 271 canine and 254 feline urine samples by visual assessment of urine dipstick results and with an automated dipstick reader. Absolute urine glucose and creatinine concentrations were measured in 39 canine and 25 feline urine samples by colorimetric assay with a clinical chemistry analyzer (reference standard for detection of glucosuria), and UGCRs were determined.
RESULTS Automated assessment of the urine dipsticks yielded accurate results for 163 of 271 (60.1%) canine urine samples and 234 of 254 (92.1%) feline urine samples. Sensitivity of the automated dipstick reader for detection of glucosuria was 23% for canine samples and 68% for feline samples; specificity was 99% and 98%, respectively. Visual readings were more accurate than automated readings for both canine and feline urine. The UGCR was significantly correlated with absolute urine glucose concentration for both dogs and cats, yet UGCR ranges overlapped among the dipstick glucose concentration categories.
CONCLUSIONS AND CLINICAL RELEVANCE Urine dipstick readings for dogs and cats were useful for ruling in glucosuria when the result was positive but not for ruling it out when the result was negative. The evaluated dipsticks were more accurate for detection of glucosuria in cats than in dogs. Visual dipstick readings were more accurate than automated readings. The UGCR did not appear to provide additional useful information.
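The UGCR evaluated above is simply the urine glucose concentration divided by the urine creatinine concentration. A minimal sketch with hypothetical concentrations:

```python
# Illustrative UGCR calculation (hypothetical values, not the study's data).
# The ratio is unitless when both concentrations share the same units.

def ugcr(urine_glucose_mg_dl, urine_creatinine_mg_dl):
    """Urine glucose-to-creatinine ratio."""
    if urine_creatinine_mg_dl <= 0:
        raise ValueError("creatinine concentration must be positive")
    return urine_glucose_mg_dl / urine_creatinine_mg_dl

print(ugcr(300.0, 150.0))  # 2.0
```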
OBJECTIVE To evaluate effects of blood contamination on dipstick results, specific gravity (SG), and urine protein-to-urine creatinine ratio (UPCR) for urine samples from dogs and cats.
SAMPLE Urine samples collected from 279 dogs and 120 cats.
PROCEDURES Urine pools were made for each species (dogs [n = 60] and cats). Blood was added to an aliquot of a pool, and serial dilutions were prepared with the remaining urine. Color and dipstick variables were recorded, and SG and UPCR were measured. For cats, 1 set of pools was used; for dogs, 2 sets were used. Comparisons were made between undiluted urine and spiked urine samples for individual colors. Repeated-measures ANOVA on ranks was used to compare dipstick scores and UPCR results; χ2 tests were used to compare proteinuria categorizations (nonproteinuric, borderline, or proteinuric).
RESULTS Any blood in the urine resulted in significantly increased dipstick scores for blood. In both species, scores for bilirubin and ketones, pH, and SG were affected by visible blood contamination. No significant difference for the dipstick protein reagent results was evident until a sample was visibly hematuric. The UPCR was significantly increased in dark yellow samples of both species. Proteinuria categorizations differed significantly between undiluted urine and urine of all colors, except light yellow.
CONCLUSIONS AND CLINICAL RELEVANCE Any degree of blood contamination affected results of dipstick analysis. Effects depended on urine color and the variable measured. Microscopic blood contamination may affect the UPCR; thus, blood contamination may be a differential diagnosis for proteinuria in yellow urine samples.
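The proteinuria categories named above (nonproteinuric, borderline, proteinuric) are conventionally assigned from UPCR cutoffs. A hypothetical categorizer using the commonly cited IRIS guideline cutoffs; the abstract does not state which cutoffs the study actually applied:

```python
# Hypothetical categorizer, not the study's method. Cutoffs below are the
# commonly cited IRIS values (dogs: borderline 0.2-0.5; cats: 0.2-0.4);
# whether the study used these exact cutoffs is an assumption.

CUTOFFS = {"dog": (0.2, 0.5), "cat": (0.2, 0.4)}

def categorize_upcr(upcr, species):
    """Assign a proteinuria category from a UPCR value."""
    lower, upper = CUTOFFS[species]
    if upcr < lower:
        return "nonproteinuric"
    if upcr <= upper:
        return "borderline"
    return "proteinuric"

print(categorize_upcr(0.1, "dog"))  # nonproteinuric
print(categorize_upcr(0.3, "cat"))  # borderline
print(categorize_upcr(0.6, "dog"))  # proteinuric
```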