Essentially, this chapter is a short list of reasons why your ABG (or venous biochemistry) measurement may be wrong, i.e. not reflective of what is actually happening in the patient. In 99% of cases, the problem lies with the collection, storage and transport of the sample, because these are the steps subject to human input and thus human error. The self-calibrating blood gas analyser is a dutiful and dependable servant; a failure of its internal workings will only rarely contribute to the error (and usually because some idiot human has calibrated it improperly). Lastly, a tiny fraction of errors are begotten by the physicochemical limitations of the measurement method (eg. when an absurdly high bromide concentration interferes with chloride measurement and returns a spuriously raised chloride value). This whole topic enjoys a rich full discussion in the section dedicated to arterial blood gas analysis.
Sample contaminated with bubbles
Published information on the errors of ABG sampling arising from the presence of air bubbles comes from Biswas et al (1982). After two minutes, the presence of air bubbles or froth in the syringe resulted in a significant increase in PaO2 and a decrease in PaCO2.
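The direction of these errors is predictable: a bubble of room air has a much higher PO2 (around 150 mmHg at sea level) and a much lower PCO2 (essentially zero) than arterial blood, and the sample equilibrates towards it. A crude sketch of this reasoning (not a validated model of equilibration kinetics):

```python
# Crude illustration: a gas bubble drags the sample's gas tensions
# toward the partial pressures of room air.
BAROMETRIC = 760.0   # mmHg, sea level
SVP_H2O = 47.0       # mmHg, saturated vapour pressure of water at 37 C
FIO2_AIR = 0.21

bubble_po2 = FIO2_AIR * (BAROMETRIC - SVP_H2O)  # ~150 mmHg
bubble_pco2 = 0.3                               # mmHg, essentially zero

def drift_direction(sample_po2, sample_pco2):
    """Report which way bubble contamination will push each value."""
    po2 = "up" if sample_po2 < bubble_po2 else "down"
    pco2 = "up" if sample_pco2 < bubble_pco2 else "down"
    return po2, pco2

# A typical arterial sample (PaO2 90, PaCO2 40) drifts PaO2 up and
# PaCO2 down, matching the direction of error reported by Biswas et al.
print(drift_direction(90, 40))  # → ('up', 'down')
```

Note that in a patient on high-flow oxygen with a PaO2 well above 150 mmHg, the same bubble would instead drag the PaO2 down.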
Sample contaminated with venous blood
Sample clotted
Sample contains too much heparin (liquid heparin dilutes the sample, and causes pH changes)
Haemolysis en route to the ABG analyser
Inappropriate type of syringe used (gas-tight syringes are needed, rather than evacuated tubes)
The sample contains an excessive number of leukocytes, and they have consumed all the oxygen (eg. leukaemic blasts)
The sample took too long to transport, and blood cell metabolism has changed the gas concentration (see the section on delayed processing below)
The sample was transported to the lab by a pressurised pneumatic system - this has the tendency to amplify errors caused by gas bubbles (Woolley and Hickling, 2003).
Sample was chilled with ice (and then analysed at body temperature values)
Low temperatures also make the ABG syringe polymer more gas-permeable, allowing gases to exchange with the atmosphere.
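Temperature matters in another way, too: the analyser always measures at 37°C, so when the patient (or sample) is at some other temperature, the reported values require correction. A minimal sketch of this correction, using commonly quoted approximate coefficients (pH changes by about -0.0147 per °C; PCO2 by about 4.4% per °C) rather than any specific analyser's own algorithm:

```python
# A minimal sketch of patient-temperature correction, using commonly
# quoted approximate factors (pH: -0.0147 per degree C;
# PCO2: ~4.4% per degree C). These coefficients are approximations,
# not any particular analyser's built-in algorithm.

def correct_to_patient_temp(ph_37, pco2_37, patient_temp_c):
    """Convert values measured at 37 C to the patient's actual temperature."""
    dt = patient_temp_c - 37.0
    ph = ph_37 - 0.0147 * dt             # pH rises as temperature falls
    pco2 = pco2_37 * 10 ** (0.019 * dt)  # PCO2 falls as temperature falls
    return round(ph, 2), round(pco2, 1)

# A hypothermic patient at 30 C with pH 7.40 / PCO2 40 measured at 37 C:
# the corrected values show a higher pH and a lower PCO2.
print(correct_to_patient_temp(7.40, 40.0, 30.0))
```

The PO2 correction is deliberately omitted here, as it is nonlinear and depends on where the sample sits on the oxyhaemoglobin dissociation curve.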
Clotted sample
Haemolysed by inappropriately small needle
Haemolysed by syringe vacuum
Inappropriate collection tube
Question 3.2 from the second paper of 2015 presents the candidates with a venous blood sample demonstrating the characteristic features of delayed processing. A good reference is a 2008 paper by Tanner et al, which examined the delayed processing of samples collected in rural and remote areas. Over 4-24 hours of storage, various changes take place. These changes (and the reasons behind them) were as follows:
Potassium increases. After 24 hours, Tanner et al found the original K+ of 3.8 mmol/L had increased to 8.0 mmol/L.
Phosphate increases. In the same study, the phosphate rose from an average of 1.36 mmol/L to 4.36 mmol/L.
Total protein increases
LDH increases
Sodium decreases
Acidosis develops (bicarbonate drops)
Glucose decreases (consumed by RBCs)
Lactate increases (produced by RBCs)
An ideal reference for this exists in the Journal of Critical Care (Woolley and Hickling, 2003). It is unfortunately not available as free full text, but the author was able to get access to the article anyway, and summarise its contents.
Some of the changes listed above apply:
pH decreases
HCO3- decreases
SBE decreases
Lactate increases
Glucose decreases
PaO2 decreases and PaCO2 increases as a consequence of RBC metabolism (in the experiments by Biswas et al (1982), the PaO2 fell by up to 40% in samples stored at room temperature for twenty minutes. This was ameliorated by storing the sample at 4°C, but all sorts of other problems develop as a consequence of chilling).
The sample on the left was analysed first; the one on the right after a thirty-minute delay. The ambient temperature was probably close to 20 degrees. As one can see, the differences are trivial. During that half hour, this anaemic patient's few remaining RBCs managed to metabolise only 0.1 mmol/L of glucose, producing a barely measurable increase in lactate and CO2.
Inappropriate electrolyte temperature
Poorly calibrated ABG analyser
Inadequate quality control and maintenance of the electrodes (they have a measurable lifespan; for instance the Clark electrode needs to be changed every 3 years as its Teflon membrane becomes coated with proteinaceous filth).
Poorly compensated Clark electrode nonlinearity.
Heparin of any sort: Given its extreme anionic charge and near-total dissociation at physiologic pH, it would be surprising if heparin failed to interfere in some way with the strong ion difference in a blood gas sample. Indeed, it would appear that even a 'thin film' of it skews the results in the direction of a metabolic acidosis. The quoted study is from 1985, a grim and primitive age when intensive care staff would have to manually heparinise the ABG syringe prior to collecting the sample, sending it to a distant analyser with a porter.
Lithium heparin: Cations other than sodium could potentially exist in the sample in such concentrations as to fool the supposedly sodium-selective ceramic electrode membrane. Of these the most likely culprit is lithium from lithium heparin. These days the arterial blood gas syringe typically contains a very small amount of dry heparin, about 20-25 units per ml.
Halide ions: the measured Cl- increases in the presence of halide ions, such as bromide and iodide. This is a selectivity failure of the supposedly chloride-selective electrode membrane, which will accept any halide if its concentration is high enough.
Salicylates: in massive salicylate overdose there is enough salicylic acid around to compete with chloride for the ionophore in the chloride-selective electrode membrane.
Haemolysis: because the ABG machine shreds the red blood cells with ultrasound (at around 30 kHz) in order to get to their haemoglobin, it will not notice if there is already free haemoglobin in the bloodstream (eg. due to haemolysis).
Foetal haemoglobin and carboxyhaemoglobin have a very similar absorptivity spectrum. Thus, foetal haemoglobin interferes with the spectrophotometric measurement of carboxyhaemoglobin and falsely increases the measured FCOHb. Conversely, the presence of carbon monoxide poisoning in a pregnant woman might mask a foeto-maternal haemorrhage.
The amperometric measurement of lactate using the lactate-sensitive electrode relies on the use of lactate oxidase. This enzyme catalyses the reaction which converts lactate into pyruvate, producing hydrogen peroxide which is then detected at the measurement electrode. Glycolic acid, the metabolic byproduct of ethylene glycol metabolism, also acts as a substrate for this enzyme. Therefore, in ethylene glycol poisoning the lactate measurement by the blood gas analyser will be spuriously elevated. The formal lactate measurement by means of a lactate dehydrogenase enzyme assay will still yield a correct result. The difference between the 'formal' and the ABG lactate is described as the 'lactate gap', and is a well known phenomenon of ethylene glycol toxicity. This source of ABG error was made famous by its use in a classic ABG question (Question 1.12) from the 2003 edition of Data Interpretation in Intensive Care Medicine by Bala Venkatesh et al. It had also made an appearance in Question 20.2 from the second paper of 2017. Interestingly, the Reference Manual for the local ABG analyser lists a large number of molecules which can potentially cause interference with lactate measurement: notably ascorbic acid, bilirubin, citrate, EDTA, ethanol, heparin, glucose, paracetamol, salicylate and urea.
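The 'lactate gap' calculation itself is trivially simple, and could be sketched as follows (the function name and the 2 mmol/L suspicion threshold are arbitrary illustrative choices, not an established clinical cutoff):

```python
# A hypothetical helper to flag a 'lactate gap': the difference between
# the blood gas analyser's lactate (lactate oxidase method, which is
# fooled by glycolate) and the formal laboratory lactate (LDH assay,
# which is not). The 2.0 mmol/L threshold is an arbitrary illustration.

def lactate_gap(abg_lactate, lab_lactate, threshold=2.0):
    """Return the gap (mmol/L) and whether it is large enough to raise
    suspicion of ethylene glycol poisoning."""
    gap = abg_lactate - lab_lactate
    return gap, gap >= threshold

# Eg. an ABG lactate of 14 with a formal lactate of 2 gives a gap of 12,
# which in the right clinical context suggests circulating glycolate.
print(lactate_gap(14.0, 2.0))  # → (12.0, True)
```

The point, of course, is not the arithmetic but the need to send a formal laboratory lactate whenever ethylene glycol poisoning is on the differential.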