
Element Testing: Why Sample Type Matters

Urine, serum, plasma, whole blood, red blood cells, feces, hair, fingernails … the list goes on. How do you decide what biological sample(s) to use for element analysis? Can results be compared to scientific literature or do they have clinical significance? Is it possible for values to be elevated or low in one sample type and normal in another? Do test results indicate recent intake, body burden, acute toxicity, chronic toxicity, deficiency, or homeostatic regulation? These are just some of the questions facing a testing laboratory that wants to develop and validate essential and toxic element profiles that will provide clinically meaningful results.

Most element panels commercially available today consist of 20-30 elements analyzed using a single sample type (most commonly urine or serum). It may seem like a reasonable one-stop shop for element analysis, but this is not the case! Each element is unique in the way it is excreted, when it is excreted, and how results should be interpreted. The problem with testing a single sample type is that results may be meaningful for one element and meaningless for another. ZRT Laboratory prides itself on producing results with meaning, so instead of creating large element panels using a single sample type, we broke our element profiles up to test key toxic and essential elements in what we believe is the most clinically significant sample type.

This blog post focuses on key differences in element testing in urine, serum, and whole blood. Hair and nail analyses will not be discussed other than to say that, while they may have clinical utility, they are prone to contamination from external sources such as nail polish, cosmetics, personal hygiene products, and shampoo, which introduces many more variables.

What we test and in which sample type

Urine (Dried Urine) – Iodine, Bromine, Selenium, Arsenic, Cadmium, and Mercury (plus Creatinine to correct for urine dilution; a sketch of this correction follows the list below)

Whole Blood (Blood Spot) – Zinc, Copper, Zinc/Copper Ratio, Magnesium, Selenium, Cadmium, Lead, and Mercury
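For readers curious about the creatinine correction noted in the urine panel above, the calculation is straightforward: the element concentration is divided by the urinary creatinine concentration so results can be reported per gram of creatinine, which compensates for how dilute or concentrated the spot urine sample is. Below is a minimal Python sketch; the function name, units, and example values are illustrative assumptions, not ZRT's reporting code.

    def creatinine_corrected(element_ug_per_l, creatinine_mg_per_dl):
        """Return an element result expressed in ug per gram of creatinine.

        element_ug_per_l     -- element concentration in ug/L of urine
        creatinine_mg_per_dl -- urinary creatinine in mg/dL

        Dividing by creatinine compensates for urine dilution, since a
        dilute sample lowers both values roughly in proportion.
        """
        creatinine_g_per_l = creatinine_mg_per_dl / 100.0  # mg/dL -> g/L
        return element_ug_per_l / creatinine_g_per_l

    # Hypothetical example: 150 ug/L iodine with 120 mg/dL creatinine
    print(round(creatinine_corrected(150, 120), 1))  # 125.0 ug/g creatinine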

Taking each element in turn, here’s the rationale for the choice of sample type.

Iodine – Urine is the best indicator of recent dietary iodine intake, as >90% is excreted in urine [1]. Nearly all iodine-related studies published by major health organizations and independent research groups have used urinary iodine to determine deficiency and excess in populations and recent intake in individuals. Serum iodine is sometimes used in hospitals as a quick screen to detect acute exposure, but this is not common.

Bromine – Urine is the best indicator of recent dietary bromine intake, as the majority is excreted in urine [2].

Selenium – Urine is the best indicator of recent dietary selenium intake, as 50-70% is excreted in urine [3]. Both whole blood and serum indicate current body selenium status, but whole blood is believed to reflect long-term intake better than serum [4][5]. The concentration of selenium in serum is about 80% of that found in whole blood [6].

Arsenic – Urinary arsenic is the best indicator of recent dietary intake, as 80% is excreted in urine within 3 days [7]. Serum and whole blood are poor indicators of recent dietary intake or body status for arsenic, as it is cleared from the bloodstream within a couple of hours [8]. Serum and whole blood should only be used to detect very recent or extremely high exposure [9].

Cadmium – Urinary cadmium is the best indicator of long-term exposure to this toxic element. Cadmium is concentrated in the kidneys, and urinary levels represent cumulative cadmium exposure over the long term (it has a 30-year half-life) [10]. Whole blood cadmium levels reflect recent exposure within the last 50 days [11][12]. Only about 0.01-0.02% of the total body cadmium burden is excreted every day because it accumulates primarily in the kidneys [13]. Serum is a poor indicator of exposure because cadmium in the bloodstream binds to red blood cells, with erythrocyte concentrations 20 times higher than serum [14].
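A quick back-of-the-envelope calculation using the 30-year half-life cited above shows why urinary cadmium behaves as a cumulative marker. The sketch below assumes simple first-order elimination and arbitrary time points; it is an illustration, not a formal toxicokinetic model.

    # Fraction of an absorbed cadmium dose still retained after t years,
    # assuming first-order elimination with the ~30-year half-life above.
    HALF_LIFE_YEARS = 30.0

    def fraction_retained(years):
        return 0.5 ** (years / HALF_LIFE_YEARS)

    for years in (1, 5, 10, 20):
        print(f"{years:>2} years: {fraction_retained(years):.0%} retained")
    # ~98%, 89%, 79%, 63% -- most of any past exposure is still in the
    # body decades later, so urinary cadmium tracks long-term burden.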

Lead – Whole blood is the best indicator of lead status and the most commonly used sample for population and individual monitoring [15]. Around 95% of lead is bound to red blood cells, with the rest complexed with intracellular proteins [15]. Lead in serum is only 1% of what is found in whole blood [16][17]. Lead is excreted very slowly in urine, and urinary lead is only of interest for long-term occupational monitoring programs and chelation therapy [18][19].

Mercury – Urinary mercury is the best indicator of inorganic and elemental mercury exposure and kidney burden [20][21]. Whole blood is the best indicator of organic (methyl or ethyl) mercury exposure, with 70-95% bound to hemoglobin in red blood cells and a half-life of around 50 days [22][23][24]. Serum should not be used for mercury analysis [25].

Zinc and Copper – Whole blood or serum can be used to assess zinc and copper. Zinc and copper are functional antagonists; therefore, the zinc/copper ratio should be determined, especially in cases where both values border the high and low ends of the normal range [26]. Urinary zinc levels reflect recent intake, but studies have not been able to correlate urinary zinc with tissue concentrations [27]. In healthy individuals, less than 3% of copper intake is excreted in urine [28]. Whole blood copper levels correlate better with symptoms of copper toxicity than serum, while whole blood zinc levels may better reflect intracellular and long-term zinc status than serum [29][30][31].
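Because the zinc/copper ratio is simply one result divided by the other, the calculation can be sketched in a few lines of Python. The function name and values below are hypothetical illustrations, not ZRT reference data.

    def zinc_copper_ratio(zinc, copper):
        """Return the zinc/copper ratio; both inputs must share the same units."""
        if copper <= 0:
            raise ValueError("copper concentration must be positive")
        return zinc / copper

    # Hypothetical whole blood values in ug/dL (illustrative only)
    zinc, copper = 620.0, 95.0
    print(f"Zn/Cu ratio: {zinc_copper_ratio(zinc, copper):.2f}")  # 6.53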

Magnesium – There is no simple laboratory test to indicate total body magnesium status in humans [32]. Less than 1% of body magnesium is found in blood, with approximately 0.3% in serum [33]. Urinary magnesium reflects recent dietary intake and intestinal absorption, but is not commonly measured [34]. Serum magnesium is commonly tested, but there is little correlation with total body magnesium or concentrations in specific tissues. Serum magnesium levels are kept under tight homeostatic control and are usually normal even when there is a nutritional magnesium deficiency, because serum levels are maintained at the expense of intracellular stores [35]. Whole blood contains a higher concentration of magnesium ions, which are essential for many metabolic processes, and better reflects long-term body status [36][37].

Examples

  • A person regularly eats mercury-contaminated fish. Testing would potentially show low urinary and serum mercury, while whole blood mercury would be high. This is because the majority of mercury in fish tissue is methylmercury, which can only be detected in whole blood samples.
  • A person continuously drinks water contaminated with arsenic from a well. Testing would potentially show low whole blood and serum arsenic and high urinary arsenic. This is because arsenic is cleared rapidly from blood but is excreted over multiple days in urine.
  • A person stopped smoking cigarettes (a major source of cadmium) 6 months ago, but was a habitual smoker for 20 years. Whole blood and serum would potentially show low cadmium levels, while urine would test high for cadmium. This is because whole blood represents recent cadmium intake and serum is a poor indicator of cadmium burden, while urine indicates long-term cadmium exposure.

As you can see, choosing the proper sample type matters when testing toxic and essential elements. In certain cases, testing two sample types will provide a better picture of total exposure.
