PCB Laboratory Analysis Note

[Figure: Soxhlet extraction schematic]

Thanks to a friend’s sharp eye, I recently learned something new about the analysis of PCB caulk samples. Because of its potential significance, I thought it deserved a special blog note.

First, a little background on how caulk samples get tested for PCBs. It’s basically a three-step process:

  1. First, a carefully measured amount of the caulk sample is extracted with an organic solvent. As a chemist would say, PCBs would rather be dissolved in a non-polar organic solvent than remain in the caulk, so they move from the caulk to the solvent. If you are in USEPA Region 1, this extraction must be conducted using “the Soxhlet method,” also known as EPA Method 3540C. The Soxhlet method is the gold standard of extraction methods, but it uses a lot of energy, water, solvent and glassware, so ecologically it is not a very “green” method. It also takes a long time: the method calls for the extraction to proceed for 16 to 24 hours. In other EPA Regions, other extraction methods (such as sonication) may still be acceptable.
  2. Once the PCBs have been extracted from the caulk to the solvent phase, the solvent needs to be cleared of the other potentially interfering chemical schmutz that got extracted out of the caulk along with the PCBs. These cleanup steps are fairly critical before you run any of the extract through the gas chromatograph (GC).  The GC is the instrument that will tell the analyst how much PCB is in the extract.
  3. Following the cleanup steps, you inject a very small portion of the solvent extract into the GC. At the end of the GC is a very, very sensitive detector that can measure the truly minuscule amounts of PCBs that may have been in the sample. The detector generates a signal that allows the analyst to back out the concentration of PCBs that were originally in the caulk, if any (a minimal back-calculation sketch follows this list).
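To make step 3 concrete, here is a minimal sketch of the back-calculation, assuming the GC reports the extract concentration in µg/mL. The function name and the example numbers are mine for illustration, not from any EPA method, and a real lab would also fold in surrogate recoveries, moisture corrections and the like:

```python
# Minimal sketch of the step-3 back-calculation (illustrative numbers only).
# Micrograms of PCB per gram of caulk is numerically equal to mg/kg,
# i.e. ppm by weight.

def caulk_concentration_ppm(extract_ug_per_ml: float,
                            extract_volume_ml: float,
                            dilution_factor: float,
                            sample_mass_g: float) -> float:
    """Back out the PCB concentration in the original caulk sample."""
    total_ug = extract_ug_per_ml * extract_volume_ml * dilution_factor
    return total_ug / sample_mass_g  # ug/g == ppm (w/w)

# Example: GC reads 5 ug/mL in a 10 mL extract (no dilution) from 5 g of caulk.
print(caulk_concentration_ppm(5.0, 10.0, 1.0, 5.0))  # -> 10.0 ppm
```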

Well, this probably seems simple enough, but think for a minute about what might happen when your GC is set to measure PCB concentrations between 1 and 10 ppm and all of a sudden a caulk sample comes through with 200,000 ppm of Aroclor 1260, a 20,000-fold overload of the top of the calibration range! Yikes! This is the equivalent of trying to weigh a full-grown African elephant on an office postage scale! You are not going to get an accurate weight, and your postage scale will never be the same.

And of course it’s not a happy day for the analyst who will now need to spend many hours or days getting the residual PCBs out of that very sensitive GC detector, not to mention all the grossly contaminated glassware and other lab equipment.  Obviously labs need to take steps to protect themselves from this possibility or they would very quickly be out of business.

How Labs Try to Reduce This Risk

One thing labs can do to reduce the risk of blowing out their GCs is to ask the people submitting samples if they know the approximate concentration of PCBs in the caulk. But usually they don’t know, and if it were your lab, would you necessarily take the word of the person submitting the sample? I’m not sure I would.

Another option is to pre-screen samples using a “quick and dirty” method to get a rough idea of the PCB concentration. Such a method might involve a very simple extraction, followed by a big dilution of the extract to reduce the PCB concentration (if any PCBs are actually there), and then injection into the GC. Something very close to this procedure is known to EPA as Method 3580A, but it is also known colloquially as the “swish and shoot” method.
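To illustrate why the big dilution protects the instrument, here is a hedged sketch of the arithmetic: given a calibration range and the worst-case concentration you want to survive, you can pick a dilution factor that keeps the injected extract on scale. The calibration limits and the function are hypothetical and are not part of Method 3580A:

```python
import math

# Illustrative sketch: choose a dilution factor so that even a very hot
# extract lands within the GC's calibrated range. The numbers below are
# hypothetical, not from Method 3580A.

CAL_HIGH_PPM = 10.0  # assumed top of the calibration range

def protective_dilution_factor(worst_case_ppm: float) -> float:
    """Smallest power-of-ten dilution that brings the worst-case
    concentration at or below the top of the calibration range."""
    raw = worst_case_ppm / CAL_HIGH_PPM
    return 10 ** math.ceil(math.log10(raw)) if raw > 1 else 1

# A 200,000 ppm Aroclor 1260 sample needs at least a 20,000x dilution;
# rounding up to a convenient power of ten gives 100,000x.
print(protective_dilution_factor(200_000))  # -> 100000
```

The trade-off, of course, is that a heavy protective dilution can push a modestly contaminated sample below the calibration range entirely, which is part of why screening results are only a rough read.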

Now this method is completely fine for getting a quick read on the relative PCB concentration in a sample. In fact, if the results from the swish and shoot screening show the analyst that the sample is hot (i.e., lots of PCBs well in excess of the regulatory limits), then there really is no need to conduct any further analysis, because the person submitting the sample in most cases just wants to know whether the concentration is greater or less than the regulatory thresholds for PCBs. So some labs stop the analysis at this point and report the results from the sample prepared with the Method 3580A extraction.

Situations Where Swish and Shoot Results Might Steer You Wrong

If a sample is analyzed following a relatively inefficient extraction and the resulting sample concentration still exceeds regulatory standards, then a more efficient extraction can only result in a concentration that exceeds standards by an even greater amount.  As long as the sole analytical objective is to identify whether or not samples exceed regulatory standards, then this objective can be satisfied by a less efficient extraction provided the result is greater than the regulatory standard.
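One way to see the logic in this paragraph is as a one-sided test: a screening extraction recovers at most what an exhaustive Soxhlet extraction would, so a screening result above the limit is conclusive, while a result below it is not. A minimal sketch of that decision rule, with hypothetical names and an assumed threshold:

```python
# Minimal sketch of the one-sided screening logic described above.
# A screening ("swish and shoot") extraction recovers <= 100% of what an
# exhaustive Soxhlet extraction would, so its result is a lower bound on
# the true concentration. Names and threshold are hypothetical.

REGULATORY_LIMIT_PPM = 50.0  # e.g., the common 50 ppm PCB regulatory threshold

def screening_verdict(screening_result_ppm: float) -> str:
    if screening_result_ppm >= REGULATORY_LIMIT_PPM:
        # True concentration >= screening result >= limit: exceedance proven.
        return "exceeds limit (conclusive)"
    # Below the limit, the result could reflect low recovery rather than a
    # clean sample, so a more efficient extraction is needed to decide.
    return "inconclusive; re-extract with a more efficient method"

print(screening_verdict(200_000))  # -> exceeds limit (conclusive)
print(screening_verdict(12.0))    # -> inconclusive; re-extract ...
```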

However, if your analytical objective is also to map PCB concentrations over a site area to achieve a clearer picture of how concentrations change spatially in the field, then you need an extraction and analysis protocol that is consistent, efficient and reproducible. Without these qualities you won’t be able to reliably tease out the forensic trends you want from the data.

The lesson to be learned from choosing the right extraction method for PCB analysis is the timeless quality assurance principle of identifying how you want to use data before you collect samples and analyze them. Some of the biggest problems with scientific studies arise when data are collected for one purpose but then used in a way that was not anticipated by the scientists who collected and analyzed the samples. Data that satisfied the original study’s objectives may not be suitable for a subsequent study with different objectives.

So-called “meta-studies,” and retrospective studies in which batches of pre-existing data are aggregated to increase the statistical power of a study’s conclusions, can be guilty of not considering whether the data quality objectives of the original studies meet the needs of the new study. What motivates the meta-study authors is creating as large a data set as possible to give their results statistical significance. But this quest for large data sets can cause the consideration of data quality objectives to fall by the wayside.

These “big data” studies can sometimes make for splashy headlines because the large number of samples makes the results look statistically significant. But too often these results need to be walked back because the authors did not adequately consider the data quality objectives of the original studies in assembling their meta-data sets.

Last Word

So to reiterate: think about how your PCB data will be used before you submit the samples to a lab, then make sure the extraction and analysis methods to be used will give you the data you need.