Renal Handling of Urea

July 22, 2017 · Laboratory Medicine, Nephrology, Physiology and Pathophysiology, Urology

Renal Handling of Urate

Urate, an anion that is the base form of uric acid, provides a fascinating example of the renal handling of organic anions that is particularly important for clinical medicine and illustrative of renal pathology. An increase in the plasma concentration of urate can cause gout and is thought to be involved in some forms of heart disease and renal disease; therefore, its removal from the blood is important. However, instead of excreting all the urate it can, the kidneys actually reabsorb most of the filtered urate. Urate is freely filterable. Almost all the filtered urate is reabsorbed early in the proximal tubule, primarily via antiporters (URAT1) that exchange urate for another organic anion. Farther along the proximal tubule, urate undergoes active tubular secretion. Then, in the straight portion, some of the urate is once again reabsorbed. Because the total rate of reabsorption is normally much greater than the rate of secretion, only a small fraction of the filtered load is excreted.

Although urate reabsorption is greater than secretion, the secretory process is controlled to maintain relative constancy of plasma urate. In other words, if plasma urate begins to increase because of increased urate production, the active proximal secretion of urate is stimulated, thereby increasing urate excretion.

Given these mechanisms of renal urate handling, the reader should be able to deduce the 3 ways by which altered renal function can lead to decreased urate excretion and hence increased plasma urate, as in gout: 1) decreased filtration of urate secondary to decreased GFR, 2) excessive reabsorption of urate, and 3) diminished secretion of urate.

Urate and some other organic solutes, although more membrane permeable in the neutral form, are less soluble in aqueous solution and tend to precipitate. The combination of excess plasma urate and low urinary pH, which converts urate to the neutral uric acid, often leads to the formation of uric acid kidney stones.

Renal Handling of Urea

Urea is a very special substance for the kidney. It is an end product of protein metabolism, a waste to be excreted, and also an important component in the regulation of water excretion. Urea differs from all the other organic solutes in several significant ways. 1) There are no urea transport mechanisms in the proximal tubule; instead, urea easily permeates the tight junctions of the proximal tubule and is reabsorbed paracellularly. 2) Tubular elements beyond the proximal tubule express urea transporters and handle urea in a complex, regulated manner.

Urea is derived from proteins, which form much of the functional and structural substance of body tissues. Proteins are also a source of metabolic fuel. Dietary protein is first digested into its constituent amino acids. These are then used as building blocks for tissue protein, converted to fat, or oxidized immediately. During fasting, the body breaks down proteins into amino acids that are used as fuel, in essence consuming itself. The metabolism of amino acids yields a nitrogen moiety (ammonium) and a carbohydrate moiety. The carbohydrate goes on to further metabolic processing, but the ammonium cannot be further oxidized and is a waste product. Ammonium per se is rather toxic to most tissues, so the liver immediately converts most ammonium to urea and a smaller, but crucial, amount to glutamine. While normal levels of urea are not toxic, the large amounts produced on a daily basis, particularly on a high-protein diet, represent a large osmotic load that must be excreted. Whether a person is well fed or fasting, urea production proceeds continuously and constitutes about half of the usual solute content of urine.

The normal level of urea in the blood is quite variable, reflecting variations in both protein intake and renal handling of urea. Over days to weeks, renal urea excretion must match hepatic production; otherwise plasma levels would increase into the pathological range, producing a condition called uremia. On a short-term basis (hours to days), urea excretion rate may not exactly match production rate because urea excretion is also regulated for purposes other than keeping a stable plasma level.

The gist of the renal handling of urea is the following: it is freely filtered. About half is reabsorbed passively in the proximal tubule. Then an amount equal to that reabsorbed is secreted back into the loop of Henle. Finally, about half is reabsorbed a second time in the medullary collecting duct. The net result is that about half the filtered load is excreted.
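The segment-by-segment account above can be traced as a simple mass balance. The sketch below uses the approximate fractions stated in the text (illustrative values, not measured data):

```python
# Illustrative mass balance for renal urea handling, using the
# approximate fractions from the text (not patient data).

filtered_load = 100.0  # arbitrary units of urea filtered per unit time

after_proximal = filtered_load * 0.5                # ~half reabsorbed passively in the proximal tubule
after_loop = after_proximal + filtered_load * 0.5   # an equal amount secreted back in the loop of Henle
excreted = after_loop * 0.5                         # ~half reabsorbed again in the medullary collecting duct

fraction_excreted = excreted / filtered_load
print(fraction_excreted)  # 0.5: about half the filtered load is excreted
```

The arithmetic makes the text's conclusion explicit: secretion in the loop restores the amount removed proximally, so the final collecting-duct reabsorption determines the net result of about half the filtered load excreted.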

pH Dependence of Passive Reabsorption or Secretion

Many of the organic solutes handled by the kidney are weak acids or bases and exist in both neutral and ionized forms. The state of ionization affects both the aqueous solubility and membrane permeability of the substance. Neutral solutes are more permeable than ionized solutes. As water is reabsorbed from the tubule, any substance remaining in the tubule becomes progressively more concentrated. And the luminal pH may change substantially during flow through the tubules. Therefore, both the progressive concentration of organic solutes and the change in pH strongly influence the degree to which they are reabsorbed by passive diffusion through regions of tubule beyond the proximal tubule.

At low pH weak acids are predominantly neutral, while at high pH they dissociate into an anion and a proton. Imagine the case in which the tubular fluid becomes acidified relative to the plasma, which it does on a typical Western diet. For a weak acid in the tubular fluid, acidification converts much of the acid to the neutral form and therefore increases its permeability. This favors diffusion out of the lumen (reabsorption). Highly acidic urine tends to increase passive reabsorption of weak acids (and thus reduce their excretion). For many weak bases, the pH dependence is just the opposite. At low pH they are protonated cations. As the urine becomes acidified, more is converted to the impermeable charged form and is trapped in the lumen. Less is reabsorbed passively, and more is excreted.
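The fraction of a weak acid in the neutral (permeable) form at a given luminal pH follows directly from the Henderson-Hasselbalch relationship. A minimal sketch, using a commonly cited pKa of about 5.5 for uric acid purely for illustration:

```python
def neutral_fraction_weak_acid(ph: float, pka: float) -> float:
    """Fraction of a weak acid HA remaining in the neutral form at a given pH.

    From Henderson-Hasselbalch: pH = pKa + log10([A-]/[HA]), so
    [A-]/[HA] = 10**(pH - pKa) and the HA fraction = 1 / (1 + 10**(pH - pKa)).
    """
    return 1.0 / (1.0 + 10 ** (ph - pka))

# Uric acid (pKa ~5.5) in acidic versus alkaline urine:
print(neutral_fraction_weak_acid(5.0, 5.5))  # ~0.76: mostly neutral uric acid, favors reabsorption/precipitation
print(neutral_fraction_weak_acid(7.0, 5.5))  # ~0.03: mostly urate anion, trapped in the lumen
```

The two printed values illustrate the clinical point made earlier: acidic urine shifts urate toward the neutral, poorly soluble uric acid form, favoring both passive reabsorption and stone formation.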

Some Critical Points to Know When Using Warfarin

June 30, 2017 · Anticoagulant Therapy, Hematology, Laboratory Medicine

PT/INR and Anticoagulation Status

For the vast majority of patients, monitoring is done using the prothrombin time with international normalized ratio (PT/INR), which reflects the degree of anticoagulation due to depletion of vitamin K-dependent coagulation factors. However, keep in mind that the PT/INR in a patient on warfarin may not reflect the total anticoagulation status of the patient in certain settings:

  • First few days of warfarin initiation

The initial prolongation of the PT/INR during the first one to three days of warfarin initiation does not reflect full anticoagulation, because only the factor with the shortest half-life (factor VII) is initially depleted; other functional vitamin K-dependent factors with longer half-lives (e.g., prothrombin) continue to circulate. The full anticoagulant effect of a VKA generally occurs within approximately one week after the initiation of therapy and results in equilibrium levels of functional factors II, IX, and X at approximately 10 to 35 percent of normal.
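The lag can be sketched with first-order decay of each factor pool once warfarin blocks new synthesis. The half-lives below are commonly cited approximate values (factor VII ~6 h, IX ~24 h, X ~36 h, prothrombin ~60 h), used here only to illustrate why the early PT/INR prolongation is misleading:

```python
import math

# Approximate, commonly cited half-lives (hours) of the vitamin K-dependent
# factors; illustrative values, not dosing guidance.
HALF_LIVES_H = {"VII": 6, "IX": 24, "X": 36, "II (prothrombin)": 60}

def remaining_fraction(half_life_h: float, t_h: float) -> float:
    """First-order decay: fraction of functional factor remaining at time t."""
    return math.exp(-math.log(2) * t_h / half_life_h)

for factor, hl in HALF_LIVES_H.items():
    frac = remaining_fraction(hl, 48)  # two days after synthesis of new factor stops
    print(f"factor {factor}: {frac:.0%} remaining at 48 h")
```

At 48 hours, factor VII is nearly gone (prolonging the PT/INR) while prothrombin is still above half of baseline, which is why the early INR rise overstates the true anticoagulant effect.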

  • Liver disease

Individuals with liver disease frequently have abnormalities in routine laboratory tests of coagulation, including prolongation of the PT, INR, and aPTT, along with mild thrombocytopenia and elevated D-dimer, especially when liver synthetic function is more significantly impaired and portal pressures are increased. However, these tests are very poor at predicting the risk of bleeding in individuals with liver disease because they only reflect changes in procoagulant factors.

  • Baseline prolonged PT/INR

Some patients with the antiphospholipid antibody syndrome (APS) have marked fluctuations in the INR that make monitoring of the degree of anticoagulation difficult.

Time in the Therapeutic Range (TTR)

For patients who are stably anticoagulated with a VKA, the percentage of time in the therapeutic range (TTR) is often used as a measure of the quality of anticoagulation control. TTR can be calculated using a variety of methods. The TTR reported depends on the method of calculation as well as the INR range considered “therapeutic.” A TTR of 65 to 70 percent is considered to be a reasonable and achievable degree of INR control in most settings.
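One widely used calculation is the Rosendaal linear interpolation method, which assumes the INR changes linearly between consecutive measurements. A simplified sketch (the function name and example values are my own):

```python
def ttr_rosendaal(days, inrs, low=2.0, high=3.0):
    """Percent of time in therapeutic range by linear interpolation between INR measurements.

    days: measurement days in ascending order; inrs: INR value on each day.
    Assumes the INR changes linearly between consecutive measurements.
    """
    in_range_days = 0.0
    total_days = days[-1] - days[0]
    for (d0, i0), (d1, i1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = d1 - d0
        if i0 == i1:
            in_range_days += span if low <= i0 <= high else 0.0
            continue
        # Fraction of this segment where the interpolated INR lies within [low, high]
        lo_t = (low - i0) / (i1 - i0)
        hi_t = (high - i0) / (i1 - i0)
        t_start, t_end = sorted((lo_t, hi_t))
        overlap = max(0.0, min(1.0, t_end) - max(0.0, t_start))
        in_range_days += overlap * span
    return 100.0 * in_range_days / total_days

# Example: INR drifts from subtherapeutic, through the 2.0-3.0 range, to supratherapeutic
print(ttr_rosendaal([0, 10, 20], [1.5, 2.5, 3.5]))  # 50.0
```

Note how the reported TTR depends on the assumed therapeutic range (`low`, `high`) as well as the interpolation method, which is the point made in the text.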

Factors Affecting the Dose-Response Relationship Between Warfarin and INR

  • Nutritional status, including vitamin K intake
  • Medication Adherence
  • Genetic variation
  • Drug interactions
  • Smoking and alcohol use
  • Renal, hepatic, and cardiac function
  • Hypermetabolic states

In addition, female sex, increased age, and previous INR instability or hemorrhage have been associated with a greater sensitivity to warfarin and/or an increased risk of bleeding.

Dietary Factors

Vitamin K intake – Individuals anticoagulated with warfarin generally are sensitive to fluctuations in vitamin K intake, and adequate INR control requires close attention to the amount of vitamin K ingested from dietary and other sources. The goal of monitoring vitamin K intake is to maintain a moderate, constant level of intake rather than to eliminate vitamin K from the diet. Specific guidance from anticoagulation clinics may vary, but a general principle is that maintaining a consistent level of vitamin K intake should not interfere with a nutritious diet. Patients taking warfarin may wish to be familiar with possible sources of vitamin K (in order to avoid inconsistency).

Of note, intestinal microflora produce vitamin K2, and one of the ways antibiotics contribute to variability in the prothrombin time/INR is by reducing intestinal vitamin K synthesis.

Cranberry juice and grapefruit juice have very low vitamin K content but have been reported to affect VKA anticoagulation in some studies, and some anticoagulation clinics advise patients to limit their intake to one or two servings (or less) per day.

Medication Adherence

Medication adherence for vitamin K antagonists can be challenging due to the need for frequent monitoring and dose adjustments, dietary restrictions, medication interactions, and, in some cases, use of different medication doses on different days to achieve the optimal weekly intake. Reducing the number of medications prescribed may be helpful, if this can be done safely.

Drug Interactions

A large number of drugs interact with vitamin K antagonists by a variety of mechanisms, and additional interacting drugs continue to be introduced. Determining clinically important drug interactions is challenging because the evidence substantiating claims for some drugs is very limited; in other cases, the evidence is strong but the magnitude of effect is small. Patients should be advised to discuss any new medication or over-the-counter supplement with the clinician managing their anticoagulation, and clinicians are advised to confirm whether a clinically important drug-drug interaction has been reported when introducing a new medication in a patient anticoagulated with a VKA.

Smoking and Excess Alcohol

The effect of chronic cigarette smoking on warfarin metabolism was evaluated in a systematic review that included 13 studies involving over 3000 patients. A meta-analysis of the studies that evaluated warfarin dose requirement found that smoking increased the dose requirement by 12 percent, corresponding to a requirement of 2.26 additional mg of warfarin per week. However, two studies that evaluated the effect of chronic smoking on INR control found equivalent control in smokers and non-smokers.

The mechanism by which cigarette smoking interacts with warfarin metabolism is enhanced drug clearance through induction of hepatic cytochrome P-450 activity by polycyclic aromatic hydrocarbons in cigarette smoke. Nicotine itself is not thought to alter warfarin metabolism.

The interaction between excess alcohol use and warfarin anticoagulation was evaluated in a case-control study that compared alcohol use in 265 individuals receiving warfarin who had major bleeding with 305 controls from the same cohort receiving warfarin who did not have major bleeding. The risk of major bleeding was increased with moderate to severe alcohol use and with heavy episodic drinking.

The mechanisms by which alcohol use interacts with warfarin anticoagulation are many, and the contribution of various factors depends greatly on the amount of intake and the severity of associated liver disease. Excess alcohol consumption may interfere with warfarin metabolism. Severe liver disease may also be associated with coagulopathy, thrombocytopenia, and/or gastrointestinal varices, all of which increase bleeding risk independent of effects on warfarin metabolism.


Comorbidities

The major comorbidities that affect anticoagulation control are hepatic disease, renal dysfunction, and heart failure. In addition, other comorbidities such as metastatic cancer, diabetes, or uncontrolled hyperthyroidism may also play a role.

The liver is the predominant site of warfarin metabolism. It is also the source of the majority of coagulation factors. Thus, liver disease can affect warfarin dosage, INR control, and coagulation in general. Importantly, individuals with severe liver disease are not “auto-anticoagulated,” because they often have a combination of abnormalities that both impair hemostasis and increase thrombotic risk.

Warfarin undergoes partial excretion in the kidney. Patients with kidney disease can receive warfarin, and management is generally similar to the population without renal impairment; however, dose requirement may be lower.

Heart failure has been shown to interfere with INR stabilization.

Acute illnesses, especially infections and gastrointestinal illnesses, may alter anticoagulation through effects on vitamin K intake, VKA metabolism, and medication interactions.

Genetic Factors

Genetic polymorphisms have been implicated in altered sensitivity to warfarin and other vitamin K antagonists.

Variability – Differences in Drug Response

April 13, 2017 · Adverse Drug Reactions, Pharmacodynamics, Pharmacogenetics, Pharmacokinetics, Therapeutics


Substantial differences in response to drugs commonly exist among patients. Such between or interindividual variability is often reflected by various marketed dose strengths of a drug. Because variability in response within a subject from one occasion to another (intraindividual variability) is generally smaller than interindividual variability, there is usually little need to subsequently adjust an individual’s dosage regimen, once well-established, unless the condition or treatment of the patient changes. Clearly, if intraindividual variability were large and unpredictable, finding and maintaining dosage for an individual would be an extremely difficult task, particularly for a drug with a low therapeutic index (e.g., warfarin).

Many patients stabilized on one medicine receive another for the treatment of the same or concurrent condition or disease. Sometimes, the second drug affects the response to the first. The change in response may be clinically insignificant for most of the patient population, with the recommendation that no adjustment in dosage be made. However, a few individuals may exhibit an exaggerated response, which could prove fatal unless the dosage of the first drug given to them is reduced. The lesson is clear: Average data are useful as a guide; but ultimately, information pertaining to the individual patient is all-important.

PS: Evidence for interindividual differences in drug response

  • Variability in the dosage required to produce a given response – daily dose of warfarin
  • Variability in pharmacokinetics – phenytoin’s wide scatter in plateau plasma concentration
  • Variability in pharmacodynamics – levels of endogenous agonists or antagonists

Clearly, variability exists in both pharmacokinetics and pharmacodynamics, and measurement of drug in plasma is a prerequisite for separating the two. The characterization of pharmacokinetic and pharmacodynamic variabilities within the population is called population pharmacokinetics and population pharmacodynamics, respectively.

The dependence on dose and time in the assignment of variability is minimized by expressing variability not in terms of observations but rather in terms of the parameter values defining pharmacokinetics and pharmacodynamics, that is, in F, ka, Cl, and V for pharmacokinetics, and in Emax, C50, and the factor defining the steepness of the concentration-response relationship for pharmacodynamics.
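For the pharmacodynamic parameters named above, the standard sigmoid Emax model ties them together. A minimal sketch (the parameter values are arbitrary, chosen only to illustrate interindividual variability in C50):

```python
def emax_response(c, emax, c50, gamma=1.0):
    """Sigmoid Emax model: drug effect as a function of plasma concentration.

    emax:  maximal effect
    c50:   concentration producing half-maximal effect
    gamma: steepness factor of the concentration-response relationship
    """
    return emax * c**gamma / (c50**gamma + c**gamma)

# Two hypothetical individuals differing only in C50 (pharmacodynamic variability):
print(emax_response(c=10.0, emax=100.0, c50=10.0))  # 50.0: half-maximal effect when C equals C50
print(emax_response(c=10.0, emax=100.0, c50=20.0))  # ~33.3: a less sensitive individual
```

Expressing variability in parameters such as C50, rather than in raw observations, means the same description of an individual applies at any dose or time, which is the point of the paragraph above.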

Why People Differ


The reasons why people differ in their responsiveness to a given dose of a drug are manifold and include genetics, disease, age, gender, body weight, drugs given concomitantly, and various behavioral and environmental factors. Age, body weight, disease, and concomitantly administered drugs are important because they are measurable sources of variability that can be taken into account. Gender-linked differences in hormonal balance, body composition, and activity of certain enzymes manifest themselves in differences in both pharmacokinetics and responsiveness, but overall, the effect of gender is small. Although inheritance accounts for a substantial part of the differences in response among individuals for many drugs, much of this variability is still largely unpredictable, particularly in regard to pharmacodynamics.
Pharmaceutical formulation and the process used to manufacture a product can be important because both can affect the rate and extent of release, and hence entry, into the body. A well-designed formulation diminishes the degree of variability in the release characteristics of a drug in vivo.
Heavy cigarette smoking tends to reduce the clinical and toxic effects of some drugs, including theophylline, caffeine, and olanzapine. The drugs affected are extensively metabolized by hepatic oxidation catalyzed by CYP1A2; induction of this enzyme is the likely cause.
Although on average the body maintains homeostasis, many biological functions and many endogenous substances undergo temporal rhythms. The period of the cycle is often circadian, approximately 24 hr, although there may be both shorter and longer cycles upon which the daily one is superimposed. The menstrual cycle and seasonal variations in the concentrations of some endogenous substances are examples of cycles with a long period. Drug responses and pharmacokinetics may therefore change with time of the day, day of the month, or season of the year.