Hello everyone, welcome to today's live broadcast: Holding Data to a Higher Standard: A Guide to Data Accuracy and Reducing Error & Contamination. I'm Laura Bush, the editorial director of LCGC, and I'll be your moderator for today's event. We are pleased to bring you this web seminar presented by LCGC and Spectroscopy and sponsored by SPEX CertiPrep. SPEX CertiPrep has been serving the scientific community since 1954. They're a leading manufacturer of certified reference materials, or CRMs, and calibration standards for analytical spectroscopy and chromatography. They offer a full range of inorganic and organic CRMs. They are certified by DQS to ISO 9001:2015 and are proud to be accredited by A2LA to ISO/IEC 17025:2017 and ISO 17034:2016. The scope of their accreditation is the most comprehensive in the industry, and it encompasses all of their manufactured products. For more information, please visit SPEXCertiPrep.com.
We have a few housekeeping announcements before we get started. The webcast is designed to be interactive, so we encourage you to ask questions. You can submit questions at any time by typing them in the Q&A box, which you can find on the right-hand side of your screen. You can enlarge your slide window by clicking on the small green icon in the upper right-hand corner of the window or by hovering your mouse over the lower right-hand corner and dragging the window to the desired size. The slides will advance automatically during the webcast. We invite you to take advantage of the resources provided by our sponsor, which you can find in the green resource widget in the dock at the bottom of your screen. And if you have any technical problems viewing or hearing the presentation today, please click on the question mark "Help" widget in the dock at the bottom of your window.
I would now like to introduce today's speakers. We are very pleased to be joined by Susan Audino and Patricia Atkins. Susan Audino is the principal of S Audino & Associates LLC. She obtained her PhD in chemistry with an analytical chemistry major and physical and biochemistry minor areas. She currently owns and operates a consulting firm serving chemical and biological laboratories. She is also an A2LA lead assessor and instructor, a board member of the Center for Research on Environmental Medicine in Maryland, and serves on several advisory panels for the cannabis industry. Susan provides scientific and technical guidance to cannabis dispensaries, testing laboratories, and medical personnel. She serves on many national advisory panels for consensus standards. Susan also chairs the cannabis advisory panel and working group with AOAC International, is a member of the executive committee of the ASTM cannabis division, and has consulted for numerous laboratories and state regulatory bodies. Patricia Atkins is a senior applications scientist. She is a graduate of Rutgers University in New Jersey and was a laboratory supervisor for Ciba Specialty Chemicals in the water treatment division. Patricia then led an air pollution research group within Rutgers University's civil and environmental engineering department, conducting and managing its research. In 2008, Patricia joined SPEX CertiPrep as a senior applications scientist in the certified reference materials division. She spends her time researching industry trends and developing new reference materials. Thank you both for joining us today. Patricia, please get us started.
Thank you very much, and I appreciate everyone being on today. We are going to be talking about data accuracy and how to improve it by reducing error and contamination. First, when we talk about data accuracy, we are talking about looking for your true value. We are talking about accuracy and precision. Accuracy is how close you get to that true value; this can be affected by instrument bias, contamination issues, or error. Precision is how close your results are to one another and is often expressed as percent RSD; this can be affected by instrument variation, operator proficiency, or error. What your targets are can be very different depending on your instrumentation and what you're looking for. You can have very high accuracy, meaning you hit the target area every time, but low precision, meaning your results aren't repeatable, which gives you very bad RSD values. Or you might have some bias, either in your system, your process, or your injector, which could give you very high precision (all of your results cluster together) even though you missed the target. And you can also have low accuracy and low precision, where your results are all over the place, nowhere near the target, and of course that's nobody's goal. We want results with both high accuracy and high precision. In order to get that, you need to eliminate error and contamination, and you do that in three specific areas: your quality management system (your organizational system, your methodologies, and your standards), your lab components (all the processes of your lab), and finally your lab environment and human components. We will go through all three of these today. Susan is going to start us off with the quality management system. Susan, over to you.
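As a minimal sketch of how those two measures are usually computed (the replicate results and the 5.0 ppm target below are hypothetical, purely for illustration), precision as percent RSD and accuracy as percent bias from the true value could look like this:

```python
from statistics import mean, stdev

def percent_rsd(values):
    """Precision: percent relative standard deviation of replicate results."""
    return 100 * stdev(values) / mean(values)

def percent_bias(values, true_value):
    """Accuracy: percent deviation of the mean result from the true value."""
    return 100 * (mean(values) - true_value) / true_value

# Hypothetical replicate results for a 5.0 ppm target
replicates = [4.9, 5.1, 5.0, 4.8, 5.2]
rsd = percent_rsd(replicates)          # tight cluster -> roughly 3% RSD
bias = percent_bias(replicates, 5.0)   # mean near target -> near-zero bias
```

A set of results can score well on one measure and badly on the other, which is exactly the accurate-but-imprecise and precise-but-biased cases described above.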
Welcome everybody, and thank you for joining us. This is a subject matter that is near and dear to my heart. I believe that ISO is gaining in popularity in the cannabis industry, particularly 17025, and for those who are aware, we are currently in transition from the 2005 to the 2017 issue. When we look at the 17025:2005 revision, we see that section 5.4.2 directs laboratories to use methods published in international, regional, or national standards in preference to in-house methods. That is going to become more important to us as we move forward, especially in the cannabis industry.
So the quality management system, known as the QMS, is related to ISO, GMP, and GLP, as well as the laboratory's own processes. The QMS encompasses not just the ISO standards, not only the GMP and GLP requirements, but also the laboratory's own requirements. And the quality management system is preferably based on official methods, so these would be methods published by the FDA, EPA, and other federal bodies. We could also use consensus methods, or voluntary methods, such as those from ASTM, AOAC, USP, AOCS, and several others that are out there. The benefit of using consensus strategies is that they have already done the work. These are methods that have been well suited, well vetted, and well validated, and the American Journal of Public Health gives us a nice definition that encapsulates the benefit of using consensus strategies. There are times when consensus or voluntary methods can actually become required by official mandates such as the FDA's. Very often we will see that the FDA will, in some cases, refer to AOAC methods for the determination of X in Y matrix; this happens repeatedly. So you will very often see official methods that are AOAC based, ASTM based, or USP based. Not all consensus methods make it to that official mandate, but it happens more frequently than I think some people realize, and that is also going to become a very important distinction for the cannabis industry as we move forward. If we look at the 17025:2017 revision, section 7.2.1.5 says, "The laboratory shall verify that it can properly perform methods before introducing them by ensuring that it can achieve the required performance." This is known as method verification, and there is some confusion with method validation, but we're going to get to that. We would verify a method if we were going from one method to a different one.
For example, say we are using GC-FID and we want to move to GC-MS; we would verify that the method we're using is capable of achieving the required performance parameters that the former had, so we are using method verification here. Some of the challenges we need to address during the verification process include the matrix (are there any differences in the matrix?), precision, accuracy, and several other parameters. So what happens when official methods, such as those from AOAC, AOCS, the EPA, the FDA, and a whole host of others, are not available? This is very much the case in the cannabis industry; as we know, at the moment we don't really have any cannabis-specific test methods. And I'm going to correct myself here, because as of last night that has changed. As of last night, ASTM has issued two new standards. One is D8196, the standard practice for the determination of water activity in cannabis flower, and the other is D8197, the standard specification for maintaining an acceptable water activity range of 0.55 to 0.65 for dried cannabis flower. These two methods represent the first technical methods that are cannabis specific for the industry, and that's a monumental effort and achievement. Although we have those two methods on the books as standard methods, or voluntary consensus methods, that doesn't tell the whole story in a laboratory. As we know, we issue test reports that have to tell a story about the data we have acquired. That means we have to develop our own method, or borrow or implement somebody else's method. When we're doing that, we need to validate that method. If we look at 17025:2005, section 5.4.5.1, we see the definition of method validation: the confirmation by examination and provision of objective evidence that the particular requirements for a specific intended use are fulfilled.
Now, that definition is quite different from method verification, although there is some overlap, and those differences are critical to recognize as we move forward. So we are now talking about some of the parameters that go into method validation. The MDL, or method detection limit, is one parameter we need to define. The LOQ, or limit of quantitation, needs to be defined, and the linear or dynamic range needs to be defined. These are all elements that are necessary as we develop a method. We can see where these show up in a calibration setup: we see the LOD, the limit of quantitation, and the dynamic or linear range. And I would also like to point out that when we use calibration curves, there is error associated with the calibration curve itself; even though we may end up with an R-squared value of 0.9999999999, there is still error associated with each value on that line. Additional parameters in method validation include bias, accuracy, precision, and selectivity, and the definitions are up here on the slides. Bias is essentially the difference between the true value and the measured value, and we need to know the differences between bias, accuracy, and precision. Very often we see these three terms used interchangeably, and they are not interchangeable; they have unique definitions, and they reflect unique characteristics of our methods. ASTM has also published conditions for precision, noted in ASTM E177 and E456. So we can see up here in the lower right corner that the conditions for precision are specified by ASTM. That's a good definition to follow and a good chart to maintain. Additionally, we have repeatability, reproducibility, and ruggedness. Again, these are very often used interchangeably, and they are different: one reflects the short-term variation within a laboratory, and another is the precision between different laboratories.
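To illustrate the point about calibration-curve error, here is a quick least-squares sketch (the concentrations and responses are made up for illustration) showing that even a fit with R-squared very close to 1 still leaves a nonzero residual at every calibration point:

```python
# Minimal least-squares calibration fit, pure stdlib.
def linear_fit(x, y):
    """Slope and intercept of the ordinary least-squares line."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

def r_squared(x, y, slope, intercept):
    """Coefficient of determination for the fitted line."""
    y_mean = sum(y) / len(y)
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - y_mean) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Hypothetical calibration: concentration (ppm) vs instrument response
conc = [1.0, 2.0, 5.0, 10.0, 20.0]
resp = [10.2, 19.8, 50.5, 99.1, 201.0]
m, b = linear_fit(conc, resp)
r2 = r_squared(conc, resp, m, b)
residuals = [yi - (m * xi + b) for xi, yi in zip(conc, resp)]
# r2 comes out above 0.999, yet every point still carries a small residual
```

The residuals are the per-point error the speaker is referring to: a near-perfect R-squared summarizes the fit but does not make any individual point error-free.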
And ruggedness is a measure of the effect of small changes we make in the method conditions. Sometimes there's an overlap between repeatability and reproducibility; sometimes we get identical results, and that's really our ideal. It doesn't always happen that way, but it is ideal. These are factors that we need to take into consideration when we develop our method validation protocols.
As we move into error and uncertainty: error is a deviation or difference between the estimated or measured value and the true, specified, or theoretically correct value. So if we take a measured value, illustrated here by the black line, and a true value, illustrated by the green line, and look at the difference between them, we're actually looking at the error between those two values. If we want a range, if we want to buffer that error zone and have a greater understanding of what we are working with, we look at a lower limit and an upper limit. When we do that, we are actually defining a confidence interval, and that confidence interval, as we have come to accept over the years, is really the uncertainty. So error and uncertainty are very different. Uncertainty is a characterization of the range of values that occurs through the variation of a process. The error is the difference between the true and measured values, and the uncertainty is the range of values in which the true value actually occurs. So we can see pretty clearly that error is not the same as uncertainty. So, what about uncertainty? Identifying its components is always important. It's always very critical to understand what we are doing and to understand where error and uncertainty can occur, and they occur every day. If we were to take one ruler and have five people use that one ruler to measure a desk, we would get five different answers. One component of uncertainty is sample preparation; it might include improper sampling or subsampling, that is, sampling in the field versus subsampling in the laboratory. It could be environmental, such as temperature, humidity, vibrations, or any other environmental conditions that may affect a sample, and it also includes the test locations. Another component of uncertainty is the sample property, the properties of the sample itself.
So stability becomes an issue, along with complexity and the matrix. In the cannabis industry we know that all three of these are profoundly critical in analysis and in testing laboratories. Test methods also carry uncertainty. Operator skill: we may all have the same training, we may all do things according to a specified SOP, but we have different skill sets. Instrumentation: the functionality and the calibration of instruments also create sources of uncertainty, as does the applicability of the test method itself. If the test method was derived or developed for GC-FID and now we are using it for GC-MS, we may have introduced additional uncertainty, or we may have reduced some uncertainty. We also have reference materials; these are also sources of uncertainty, things like purity (is it a neat material?), mass, the technique involved with just measuring the material, and traceability. Traceability is a big issue in our industry. We tend to get source material from one or two or three sources, and how each manufacturer handles that may be a little bit different. We also have the purity of the solution preparation, not just of a neat material. So these are sources of uncertainty in reference materials. We like to think that reference materials, or CRMs, are perfect and ideal, and as the best possible solution they really are. But if we look at those CRM reports, we see that tolerances and other data are on there for a reason. So if we take all of these and put them into a funnel, we can see that together, or combined, they yield an uncertainty. The standard uncertainty is the uncertainty of a measurement result, expressed as a standard deviation, which occurs when that result is obtained from a host of other sources.
So we have expanded uncertainty. In the previous slide we saw total uncertainty: we put in all of the sources of uncertainty and got back a sort of global uncertainty. We need to expand that, so we look at expanded uncertainty, and the way we do that is to identify the individual components, for example sample properties, sample preparation, test methods, and reference materials. That is step 1. Next, we calculate or quantify the uncertainty components in each of those little bubbles and convert them into the appropriate standard deviations. From there, we calculate the combined uncertainty: we take the sample properties uncertainty and square it, add the square of the sample preparation uncertainty and the square of the test method uncertainty, and do the same for the reference materials. So we have just calculated the combined uncertainty. Then we need to calculate the expanded uncertainty, and to finish out this equation, the expanded uncertainty is k times the square root of the combined uncertainty. This represents the uncertainty interval we looked at earlier, which encompasses the distribution of values that could be attributed to the measurand. If we look at the normal curve, and I'm sure everyone has seen this a hundred times over, k = 1 represents this yellow center and gives us 68.27% coverage. k = 2 gives us 95.45% coverage, so our true value will lie somewhere in that blue range. And finally, k = 3 gives us a 99.73% confidence interval. For most of our laboratories we use k = 2, so if you are developing and calculating your own expanded uncertainties, my recommendation would be to use k = 2, because that is the most appropriate for our labs. Just as a reminder, method verification is a demonstration that a previously validated method can meet the analytical requirements of a new method in one laboratory.
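The combination steps just described can be sketched in a few lines of code. The component values below are hypothetical; the coverage percentages for k = 1, 2, and 3 fall directly out of the normal distribution:

```python
import math

def combined_uncertainty(components):
    """Root-sum-of-squares of the individual standard uncertainties."""
    return math.sqrt(sum(u ** 2 for u in components))

def expanded_uncertainty(components, k=2):
    """U = k * u_c; k = 2 gives roughly 95% coverage for a normal distribution."""
    return k * combined_uncertainty(components)

def coverage_probability(k):
    """Fraction of a normal distribution lying within +/- k standard deviations."""
    return math.erf(k / math.sqrt(2))

# Hypothetical standard uncertainties (same units) for the four bubbles:
# sample properties, sample preparation, test method, reference material
u = [0.10, 0.05, 0.20, 0.02]
u_c = combined_uncertainty(u)        # sqrt(0.0529) = 0.23
U = expanded_uncertainty(u, k=2)     # 0.46

print(round(coverage_probability(1), 4))  # 0.6827
print(round(coverage_probability(2), 4))  # 0.9545
print(round(coverage_probability(3), 4))  # 0.9973
```

Note how the largest component (0.20 here) dominates the combined value, which is why uncertainty budgets usually focus effort on the biggest contributor first.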
And now we have method validation which is the confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled. And again, that comes from ISO 17025.
Here's a little chart I developed that illustrates the differences between method validation and method verification, and this is where we can begin to see some of the overlap between verifying and validating a method. At your leisure, go through the list. Specificity is one that we may want to verify if the sample matrix is different. For precision, we need to look at the threshold concentration when we're verifying. The range, the bias, the limit of quantitation, and ruggedness & robustness may or may not be necessary for method verification, but they are always required for method validation. When in doubt, validate, but ask yourself a question first: is the previous validation available? If the answer is yes, then just verify. If it's no, you need to go through validation, and validation does take more time. Sometimes I hear that laboratories will validate a complex analytical method in one afternoon, and quite honestly I don't think that's plausible or even possible. Validations can run anywhere from days or weeks up to years, depending on the complexity and what the validation is intended to encompass.
In summary, up to this point we have covered the QMS and method development. You want to use consensus methods or official methods whenever possible; those folks have already done the work for us. Once we use those methods, we just need to verify that we can perform them. When necessary, which is more often the case than not, we have to develop in-house methods, and we need to develop those methods according to characterizations that are predetermined. So before you start validating your method, write up a nice verification and validation protocol: what are your acceptable limits and desired results? Include measurement uncertainty as part of method validation. This is highly, highly recommended if you're seeking accreditation, because your assessor will be looking for measurement uncertainty calculations.
So we have environmental issues and environmental programs that already exist. The cannabis industry is really not any different from the environmental labs, food labs, and seed labs that have already been running and operating. It's just a new industry, a new sector, and we need to apply what we already know without reinventing the wheel. I would like to add that we do have some challenges and issues because this is a complex matrix. Sometimes developing quality management systems is challenging for laboratories that are not using consensus methods or official methods.
Pesticides are becoming a big issue in the cannabis industry. No pesticides have been addressed by the EPA for cannabis, and this is the biggest challenge, I think, in the country right now, because as a whole we are not quite sure what to do with that. When we're talking about pesticides, however, what we are really talking about is integrated pest management. We're talking about insecticides, rodenticides, antimicrobial processes, herbicides, and fungicides. This represents integrated pest management, not just pesticides. As for the states and pesticides, there are no federal thresholds that the states are able to look to for direction and oversight, so we are individually trying to do that. Keep in mind that all pesticides carry a toxicity rating, and although we have no current methods on the books for cannabis, work is under way to develop a method, or a suite of methods, for multi-residue analysis of all the pesticides. I think we are up to about eighty, or approximately a hundred if we include all of the United States and Canada, as their pesticides are currently under review for the individual states. So we have standards companies working on developing the appropriate standards. All right, that's about all I have at the moment. Thank you, and I'll be on hold waiting for questions later. So back to you, Patty.
Thank you, Susan. So one part of the QMS program, from our point of view, is standards. Those are your CRMs and reference materials, and there are often a lot of questions about what exactly is meant by all of that. Simply put, the standard is "the known." There are two types of standards. There are primary standards: a primary standard needs no reference to another standard. For us in the United States, most of our standards are considered to come from NIST. It's very interesting: one of the ultimate standards is the Grand K in Europe, the grand kilo, and a funny piece of trivia is that the Grand K over the century has started to lose weight, so even the primary standard for the kilogram is suspect right now. Then you have secondary standards: a value assigned by comparison to an equivalent primary standard. This means that it's NIST traceable. What are CRMs, or certified standards? A certified reference material is accompanied by a certificate. It has a lot of information on it about homogeneity, stability, and traceability. It has certifying bodies, methods, and statistics, but the most important part is the certified values, meaning the calculated value and the uncertainty. If we look at a typical reference material, you see supplementary information on the certificate: part numbers, dates, descriptions, and certifications. Then the values: the calculated value and the uncertainty. This is the CRM uncertainty, not an instrumental uncertainty. As Susan said, uncertainty is associated with every part of the process, and this is the uncertainty associated with the reference material. Then there is information about traceability, stability, and homogeneity. Traceability is the ability to go from origin to manufacturer through final delivery; for a standard, it means it's traceable to NIST or some other primary standard. Stability means the material is not reactive under normal conditions.
It retains all of its properties in the expected time frame and conditions. That means if you receive a standard that has a year of stability and it says keep it at four degrees, and you keep it at four degrees, it is reasonably expected not to be reactive and to retain its properties for that year. However, if you violate the conditions of stability, either by not keeping it at four degrees or by keeping it past those twelve months, then it is no longer suitable to be considered stable. And then you have homogeneity: uniform in composition or character. You can have homogeneity within a bottle; in this case, if you see a particular standard that has settled, or you see some settling going on in the standard, and there are instructions on the certificate about how to reconstitute that standard and you follow those instructions, that is still considered a homogeneous standard. Then there is between-bottle or between-lot homogeneity: how does one bottle or one lot compare to the next one?
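As an illustrative sketch (the function name, dates, and four-degree limit are hypothetical examples, not any vendor's rule), the stability logic just described, inside the dated window and with storage conditions never violated, can be expressed as:

```python
from datetime import date

def standard_is_usable(received, expiry, storage_temps_c, max_temp_c=4.0,
                       today=None):
    """A standard stays usable only if it is within its stability window
    AND its storage conditions were never violated."""
    today = today or date.today()
    within_window = received <= today <= expiry
    conditions_kept = all(t <= max_temp_c for t in storage_temps_c)
    return within_window and conditions_kept

# Hypothetical 12-month stability at 4 degrees C
ok = standard_is_usable(date(2024, 1, 15), date(2025, 1, 15),
                        [3.8, 4.0, 3.9], today=date(2024, 6, 1))
bad = standard_is_usable(date(2024, 1, 15), date(2025, 1, 15),
                         [3.8, 7.5, 3.9], today=date(2024, 6, 1))  # excursion
```

A single temperature excursion fails the check even inside the dated window, mirroring the point that violating either condition voids the stability claim.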
Standards have several different roles. You can use standards to calculate uncertainty, actually tracing the uncertainty in your processes. You can use standards to correct for error or eliminate error in your process. And of course, we use standards for identification and data production, both quantitative and qualitative. An external standard is not added to a sample, so it's located outside the sample. You can, however, use a matrix-matched standard or an unmatched standard. A matched standard you can use for quantification: you can build your calibration curves and your response curves, and you can use it for identification. You can take and compare that response to an internal standard to judge how well you are doing your sample prep, extraction, or analysis. An unmatched standard you can use for identification, and in some cases it's a semi-quantitative process.
The best types of external standards duplicate your target analytes. So if you're looking for a particular pesticide, you want the external standard to match that, and you also want it to be in the range of the expected concentration of your target. If you are looking for five parts per billion, your standards should bracket that five parts per billion. An external standard will not compensate for volume variations or errors; that's why it's semi-quantitative. So if you are trying to track your injector's variation in injection volume, an external standard won't compensate for those little changes in an autosampler or in hand-injecting a sample into an instrument. When you are looking at external standards, you really want to be aware of those limits: your limit of detection, your limit of quantitation, and your limit of linearity. You want your target to be between the limit of quantitation and the limit of linearity, the dynamic range. That's the best working range for your target, and that's the best working range for your standards. Your external standards should also bracket your target range without getting too close to that limit of quantitation or that limit of linearity. You want to make sure that you are in the right range for your targets.
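Here is a small sketch of those range checks; the LOQ, limit of linearity, target, and calibration levels are hypothetical values chosen for illustration:

```python
def in_dynamic_range(target, loq, lol):
    """Best working range: above the limit of quantitation (LOQ),
    below the limit of linearity (LOL)."""
    return loq < target < lol

def standards_bracket(target, standards):
    """External standards should bracket the expected target concentration."""
    return min(standards) <= target <= max(standards)

# Hypothetical: 5 ppb target, LOQ 1 ppb, LOL 100 ppb,
# calibration standards at 2, 5, 10, and 25 ppb
target, loq, lol = 5.0, 1.0, 100.0
cal_levels = [2.0, 5.0, 10.0, 25.0]

assert in_dynamic_range(target, loq, lol)
assert standards_bracket(target, cal_levels)
# each standard should itself sit inside the dynamic range
assert all(in_dynamic_range(s, loq, lol) for s in cal_levels)
```

A calibration set like this both brackets the expected concentration and keeps every level away from the LOQ and LOL edges, which is the point the speaker is making.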
Internal standards are added to your samples, your blanks, and any external standards. These are sometimes called recoveries, surrogates, ISs, standard additions, spiked samples, or spiked standards, and by their nature they are matrix matched. They can be used to correct for instrumental variability and to monitor or correct a process. You can create response factors, recovery factors, and extraction efficiencies, and in some cases you can use them for quantification as well. The best type of internal standard is a labeled analog of your target. So if you're looking at naphthalene, deuterated naphthalene is a great internal standard. If you don't have that available, you want a compound that is very similar in character but not present in the sample; if you happen to know a particular pesticide is not in a sample, you can then use that pesticide as an internal standard. Again, you want it above the limit of detection and the limit of quantitation and in range of your analyte.
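One common way internal standards support quantification is through a relative response factor determined from a calibration standard; the peak areas and concentrations below are hypothetical numbers for illustration only:

```python
def response_factor(area_analyte, conc_analyte, area_is, conc_is):
    """Relative response factor of the analyte versus the internal standard,
    determined from a calibration standard of known concentrations."""
    return (area_analyte / conc_analyte) * (conc_is / area_is)

def quantify(area_analyte, area_is, conc_is, rf):
    """Back-calculate analyte concentration in a sample from the IS response."""
    return (area_analyte * conc_is) / (area_is * rf)

# Hypothetical calibration standard: 10 ppb analyte with 10 ppb deuterated IS
rf = response_factor(area_analyte=5000, conc_analyte=10.0,
                     area_is=4000, conc_is=10.0)   # rf = 1.25

# Sample spiked with the same 10 ppb of IS
conc = quantify(area_analyte=2500, area_is=4000, conc_is=10.0, rf=rf)  # 5.0 ppb
```

Because the analyte is ratioed to the IS in the same injection, run-to-run variation in injection volume largely cancels out, which is exactly why the IS can correct for the instrumental variability an external standard cannot.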
Some of the challenges come from looking at our well-established environmental field. Of course, the environmental field is very well established: there are hundreds of standards for it, and many labs now require certified standards. Unfortunately, the environmental industry and the EPA can be a little slow to adapt to some changes. One of the big changes of the past few years has been the requirement for speciation, and as we saw before, there are increasing challenges in the number of targets and the number of limits. We are seeing parts-per-billion contamination and different species actually becoming a challenge for the standards industry. In cannabis, the new industry, there are limitations in the standards available, some of them being very high cost. The levels and the actual lists are changing very often, and there are issues around legality and transportation. As Susan said before, we haven't quite all gotten on board with what the targets are and what their limits are.
So now we are going to look at some of the most common sources of error and contamination: the lab components. This includes things like sample prep, chemical components, and lab apparatus. So why do we really have to pin down this error? Well, the detection limits of modern instrumentation have gotten lower and lower, so now we are looking at ppb and lower with things like ICP-MS, LC-ICP-MS, GC-MS, and LC-MS. And the lower you get, the more difficult it gets to eliminate critical error. This is the Horwitz trumpet; it shows that the relative standard deviation for targets actually increases as the concentration decreases. So when you get into that ppb level, you are talking about a much larger standard deviation for your targets than if you are looking for something at the percent level, something like a major nutrient or a pharmaceutical.
So what do trace analysis concentrations look like? What does a ppb or a ppm look like? Well, 1 ppm is 1 second in almost twelve days, while a ppb is 1 second in about 32 years. A ppm in money is 1 cent in $10,000, while 1 ppb is 1 cent in $10,000,000. And a ppm in distance is 1 inch in 16 miles, while a ppb is 1 inch in 16,000 miles, which is roughly twice the diameter of the Earth. So we are looking at smaller and smaller targets.
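These analogies check out with plain arithmetic, since a ppm is just one part in a million and a ppb one part in a billion:

```python
# Sanity-checking the ppm/ppb analogies
SECONDS_PER_DAY = 86_400

days_per_million_seconds = 1e6 / SECONDS_PER_DAY              # ~11.6 days -> 1 ppm
years_per_billion_seconds = 1e9 / (SECONDS_PER_DAY * 365.25)  # ~31.7 years -> 1 ppb

cents_in_10k_dollars = 10_000 * 100   # 1,000,000 cents, so 1 cent is 1 ppm
inches_in_16_miles = 16 * 5280 * 12   # ~1.01 million inches, so 1 inch is ~1 ppm
```

The same factor-of-a-thousand gap between ppm and ppb is what separates a routine assay from a trace analysis.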
Let's look at some sampling issues. One of the biggest concerns is homogeneity. How do we know the sample we are picking is the right part? Is it the right location? Especially when you're talking about a nontraditional sample. Think of something like a pharmaceutical patch: which part of that sample do you analyze? Do you analyze the entire patch? Do you take a section of it? Then you have some difficult process matrices with a lot of interferences; some environmental water samples can be quite dirty and have a lot of interferences. Then there is the product form. Are you looking at a treated sample, something that might have been acidified? Are you looking at a material that's dangerous to transport? Is it a finished product? Is it a raw material? And how much do you really need for a single sample? How many sets of samples do you need for it to be validated? And what type of batch size are we looking at?
Well, it turns out that particle size has a lot to do with homogeneity and efficiency. Here are some common particle sizes: a 5 mm particle is about the size of a pencil eraser; a 2 mm particle is about the tip of a crayon; 1 mm is the tip of a sharpened pencil; and 0.5 mm is one of those really fine-tipped pens. So how much do you need at these different particle sizes? It also depends on your uncertainty. How much uncertainty is allowed? How exact do you have to be? If you have a very exacting method and you are only allowed 1 or 2 percent uncertainty, then you really need to grind that particle down. If you're using a 5 mm particle at 1% uncertainty, you need over 12,000 grams of material, and in something like the cannabis industry, or spices, or another commodity where materials are expensive, that's a lot to sacrifice for homogeneity. But if you go to a smaller particle size like 0.5 mm and still need that 1% uncertainty, now you're talking about 13 grams of sample, and that's definitely a lot less. Particle size also affects your extraction or digestion efficiency. We cut up some particles of plastic, and when we used bigger particles we did not get as efficient an extraction as when we ground down to a fine powder of sub-0.5 mm particles.
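One way to see why the required mass drops so steeply is that, for a fixed number of particles in the subsample, mass scales roughly with the cube of the particle diameter. This is only an illustrative scaling sketch, not the formal sampling-theory calculation behind the slide's figures, but it lands close to them:

```python
def scaled_mass(known_mass_g, known_diameter_mm, new_diameter_mm):
    """Illustrative cube-law scaling: required sample mass grows roughly
    with particle diameter cubed, because sampling uncertainty is driven
    by the number of particles taken."""
    return known_mass_g * (new_diameter_mm / known_diameter_mm) ** 3

# Slide's figure: ~12,000 g of 5 mm particles for 1% uncertainty.
# Shrinking to 0.5 mm (10x smaller) cuts the mass by ~1000x.
mass_05mm = scaled_mass(12_000, 5.0, 0.5)   # ~12 g, near the ~13 g quoted
```

The tenfold reduction in diameter gives a thousandfold reduction in mass, which is why grinding is such a powerful lever for expensive commodities like cannabis.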
So, sampling challenges for environmental samples. A lot of environmental samples will need some pretreatment, which can change your species. If you’re doing speciation, then you are going to worry that your acid is going to change your pH and change your species. Again, sometimes those samples are considered hazardous. Liquid samples can have stratification, especially for EPA samples. Solid samples can be hard to homogenize, but the good thing is you aren’t really limited in how much sample you can take; usually an EPA method doesn’t care whether you sample one gallon of water or two. The cannabis industry has somewhat different targets. There are questions about which portions should be sampled. Again, legal and transportation issues. It’s a very expensive commodity to sacrifice to sampling, and it can be very difficult to get homogeneous samples for testing.
Let’s look at where some of the error and contamination comes into our lab processes. The first one is solvents. Solvents can contain contamination: things like particles, gases, and preservatives, and they can leach compounds from the bottle. So you can have dust and rust in your samples. One of the very common contaminants in solvents is phthalates, because the caps are sometimes plastic, or the liners contain DEHP. You can leach elements like sodium, boron, and silica from your bottles into your solvents. But solvents can also be a source of contamination themselves. Some solvents are very persistent in the lab. If you have DMSO, or carbon disulfide, or even dichloromethane, those can persist in the lab throughout all of your processes and you will be seeing that solvent over and over again.
You also have to understand that there are different contamination levels in different grades of solvents. Something like an ACS grade is a general lab solvent; it meets ACS specifications, but it might not meet low impurity levels. Something like an LC-MS grade solvent has low impurities, less than 0.1 ppm. If you’re doing pesticide analysis, you want to look into something like a pesticide-residue grade, because that will meet the requirements for environmental and pesticide analysis.
If you’re doing analysis by GC or HPLC, you need to be aware of certain physical constants. For HPLC you want to know the UV cutoff: you want it below 205 nanometers, so the solvent is essentially invisible to your UV detection. If you’re doing reversed phase or normal phase, then you’re going to have to worry about the polarity index. You have your nonpolar solvents, your hexane, isohexane, and cyclohexane, with low polarity indices; those are more for your normal-phase processes. If you are doing reversed-phase HPLC, then you are looking for things with high polarity: solvents like water, acetonitrile, and THF.
You also have to be aware of what water you are using. ASTM has put out guidelines for laboratory water, and if you are doing critical processes you should only be using ASTM Type I water. That’s because it has low organic carbon, low silica, and low sodium, so it is the best type of water. The grades below that are general-purpose waters, meant to feed into other systems or for cleaning; they are not meant for critical lab use. So if you’re really doing critical analysis, you want ASTM Type I water. One of the problems with water from different sources is, again, phthalates. We looked at water from different sources in our own lab: some HPLC water, some LC-MS water, and our DI system, where we compared water that sat in the lines overnight with water flushed through many gallons before sampling. We found that an HPLC-grade water we purchased had almost 100 ppb phthalates in it, whereas our municipal tap water had about 3 ppb. Even with our DI-sourced water, we had to flush several gallons out of the system because of the piping. You want to get all of those phthalates out.
Then we look at chemical stock contamination. You can get contamination from plastic seals, from septa, from caps, and from manufacturing. Again, it can be very difficult to eliminate these contaminations, so you usually have to test before use. In some cases you can rinse with a solvent, bake it out, or clean it. We looked at some very common salt standards and solids and checked their phthalate concentrations. We sampled our sodium chloride and sodium sulfate and then did a series of washes with methylene chloride. You can see, especially with sodium chloride, that we had high-ppb to ppm levels of phthalates in our first two washes. Then we heated our sodium chloride, we baked it out, we did two more washes, and that eliminated our phthalates. In this case we were basically able to clean our solids and get rid of that contamination.
Now let’s look at some laboratory components, our labware. Of course, if you’re looking for sodium or silicates, you should be aware that you probably shouldn’t be using soda-lime or borosilicate glass. Borosilicate glass can have silicates and active sites, and if you’re doing LC-MS you will see a lot of sodium adducts from glassware. If you are using plasticware, you have to be aware of plastic additives. Plasticware will absorb chemicals and elements, and some acids or compounds can etch or dissolve the container itself.
Here at SPEX we separate our labware by use, into low and high use. For us, "low level" use is metals at less than 1 ppm or low part-per-million organics; "high level" is metals at 1 ppm or over, or organics at high ppm to percent levels. That’s because labware can have memory effects and chemical interactions. You also need to segregate labware for specific metals and specific uses. So if you are concerned about boron, silica, lead, or chromium, don’t use glass. If you are looking at low levels of mercury, at the ppb level, you want to be using glass or polypropylene; you do not want to use a fluoropolymer or polyethylene, because mercury can diffuse through the container. When you look at your volumetrics, you really need to understand the four "I"s of volumetrics, the four issues that can cause error: improper use, a lack of understanding of how the ware is used and the information it gives you; incorrect choice, meaning the wrong type for the job; inadequate cleaning, which can contribute to contamination and carryover; and infrequent calibration.
When you look at a volumetric, there is a lot of information on it; the uncertainties and tolerances are printed right on the glassware. You have your country of origin, your certifications, your manufacturer. You have the nominal value of that piece of glassware and its tolerance. Tolerance is the uncertainty associated with that glassware: where it says tolerance plus or minus 0.03 mL, that’s the uncertainty associated with that particular piece. Another interesting piece of information is whether it is meant to contain a sample or to deliver a sample. This means: are you going to keep your sample in the container and measure into it, or are you going to pour from that container into someplace else? If you are going to pour, then you want something marked "to deliver." In that case you also want to look for the additional instructions: the temperature the sample should be stabilized at, and how long it should be stabilized before pouring. There should also be a marking of the glassware’s class. Anyone using glassware for measurement purposes in the laboratory should be using Class A quantitative glassware; that is the highest-precision glassware. Class B is qualitative; these are things used for cleaning or holding, not really meant to be measured into or from. And then you have to be aware of what the etchings and graduations on your glassware mean.
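A printed tolerance like ±0.03 mL is a bound, not a standard uncertainty. A common convention from the GUM for building an uncertainty budget is to treat the tolerance as the half-width of a rectangular distribution, so u = a/√3. A small sketch, assuming that convention:

```python
import math

# Sketch: converting a printed tolerance (e.g. +/-0.03 mL on a
# Class A volumetric) into a standard uncertainty, treating the
# tolerance as the half-width of a rectangular distribution
# (a common GUM convention): u = a / sqrt(3).
def standard_uncertainty_from_tolerance(half_width):
    return half_width / math.sqrt(3)

u = standard_uncertainty_from_tolerance(0.03)
print(f"u = {u:.4f} mL")  # about 0.0173 mL
```

If there is reason to believe values cluster near the nominal mark, a triangular distribution (u = a/√6) is sometimes used instead; the rectangular form is the more conservative default.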
Let’s look at some syringe carryover contamination. If you use a syringe, the manufacturer’s pamphlet will say that you should rinse it two to three times to waste before you draw your sample. We did a little study here at SPEX and looked at carryover. We took an internal standard at 2000 ppm and filled different size syringes, everything from 1000 microliters down to 10 microliters. We filled each once and discarded the standard. Then we rinsed the syringe with DCM and collected the different rinses: the first 10 rinses, and then every fifth or so after. What we found is that if you are using a 1 mL (1000 microliter) syringe, you need those three or four rinses to get all of the concentration out of the syringe and have it clean. But with something like a 10 microliter syringe, that really small syringe, even through 10 and 15 washes we were still getting 1 ppm of the internal standard coming through, so you really need to wash those small syringes a lot more.
Another problem with syringes and pipettes is volume error. A manufacturer will say that if you are using a syringe you should measure no less than 10% of its volume; for a 1 mL syringe, that is 100 microliters. We did a study of syringe accuracy: we dispensed various amounts, from 10% to 100% of capacity, from different size syringes. We found that the true minimum is a little higher than 10%. With something like a 10 microliter syringe, you really need to fill it almost the full 10 microliters to reduce your error to the lowest point, because even at 100% volume we were seeing about 3% error. With the bigger syringes you can get down to maybe 20% or 50% fill for a lower amount of error. For a 25 microliter syringe we were looking at about 80% of the fill volume for 1% error; for a 100 microliter syringe, about 50%. So the larger the syringe, the less you need to fill it. But even with the 1000 microliter syringe, to get that 1% error you still have to fill it about 25%.
So, to reduce error and contamination for smaller syringe volumes, you definitely need to rinse more, and if your sample is viscous you are going to need more cleaning: you might need a vacuum system, or to take that syringe apart. If you are using a fixed-needle syringe, be aware that you do not want to apply heat or harsh solvents, because you can actually weaken the cement or glue that keeps the needle fixed; take it apart and let it air dry if possible. To increase your accuracy, use the correct syringe or pipette for the volume you are dispensing. If you are dispensing 90 microliters and you don’t have a 100 microliter syringe, don’t reach for a 1000 microliter syringe; go and try to find a 10 or 50 microliter syringe and dispense more than once if you have to. You want at least a minimum of 20% of the syringe’s volume, up to 50% for smaller syringes. And if possible, try to avoid serial dilutions. Everyone likes serial dilutions because they go 1, 2, 3 and they’re done, but if you make a mistake in your serial dilution, you basically carry that mistake throughout your process.
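The point about serial dilutions can be made quantitative: if each dilution step carries an independent relative uncertainty, the relative uncertainties combine in quadrature, so every extra step inflates the final uncertainty, and any bias in an early step is carried into every later level. A sketch, assuming independent 1% steps:

```python
import math

# Relative uncertainties of independent steps add in quadrature, so
# a three-step serial dilution at 1% per step is noisier than one
# direct dilution at 1% -- and a bias in step 1 propagates unchecked
# into every subsequent level.
def combined_rel_uncertainty(step_rel_uncs):
    return math.sqrt(sum(u ** 2 for u in step_rel_uncs))

print(f"three serial 1:10 steps: {combined_rel_uncertainty([0.01] * 3):.2%}")
print(f"one direct 1:1000 step:  {combined_rel_uncertainty([0.01]):.2%}")
```

Three 1% steps give about 1.73% combined uncertainty versus 1% for a single direct dilution, before even considering the bias-propagation problem.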
Let’s look at some of the storage components in the lab. We have bottles that can leach different components into their contents. When you buy standards, you should be aware that most standards companies leach their bottles at least once; we leach our bottles here at SPEX many times. You want to be aware of your own storage bottles. Are you leaching your storage bottles before you use them? And what contamination is present? You can see metals and phthalates. Some bottles have huge concentrations of different elements, different ppm levels of metals. Something like a high-density polyethylene bottle can have almost 700 ppm of major impurities: things like calcium, zinc, and silica. These contaminants will increase over time: the longer something stays in a storage container or bottle, the more contamination and leaching you are going to get. And the more you use it, the more headspace you create, promoting volatilization and transpiration; you are actually going to change the headspace in that bottle and you might lose some of your components.
Finally, we are going to look at the laboratory and the human components: how we and our lab environment contribute. We have cosmetics and perfumes, which have phthalates, colorants, preservatives, and metals; some mascara has mercury in it, and some lipstick has lead. You have hair products with metals, dyes, solvents, and fragrances, contributing VOCs and SVOCs along with perfumes and things like that. Medicated lotions like zinc oxide, pharmaceuticals, waxes, and solvents can all be brought into the lab by the human factor. Soaps can give you fragrances, antibacterials, and alcohols. If you are wearing jewelry, you can get reactions with whatever you are working on. One of the most unfortunate lessons I have ever learned in the lab is that methylene chloride does not pair well with the wrong gloves: it went right through my gloves, and I got burns underneath my wedding ring from working with methylene chloride with the wrong type of gloves. Jewelry can also contribute lead, paint, and dyes. We have cigarette smokers who bring VOCs in on their clothes, which can carry lead, chromium, and other metals. Of course, we all know a scientist whose lab coat could walk away by itself because they never put it in to be washed, so it collects absolutely everything in the lab. And finally, a huge component in the lab is dust and dirt. It can bring in lead, phthalates, pesticides, and basically whatever else is around the lab. Dust is pervasive. It has earth elements like sodium, calcium, potassium, and silica. Human activities kick up dust and generate dust, which will give you things like nickel, lead, pesticides, persistent organic pollutants, and different phthalates. This particle contamination can be transferred into your samples by friction, through what is called the triboelectric series. You open a Teflon bottle, and that charged surface begins to attract skin, hair, and other components.
The triboelectric series shows you all the different charges. You have something like air, which charges very positively, and human skin, also positive; but then you have Teflon, a Teflon bottle, which has an extreme negative charge. The same process you show your children, when you comb your hair, get that static electricity, and attract the hair with the comb, happens when you open that Teflon bottle. You create that bit of friction, the triboelectric series goes into effect, the air is pulled toward the negative surface of the Teflon bottle, and dust and other contaminants are pulled into your sample.
So how do you control some of these lab and human elements? Well, you can keep your work surfaces clean with solvent or reagent-water cleaning. You can try to avoid cross contamination; think about how many laboratories you walk into with piles of paper, ink, glue, tape, and all sorts of stuff next to the preparation area. You need to keep that clean. You need to think about your protective equipment: is it a source of contamination, or are you preventing contamination with it? You can prevent contamination by using sticky mats and putting on a lab coat; sweat can contain different ions and metals like lead. And if you are using gloves, are you wearing powdered gloves? Those can have zinc in them. Gloves themselves can also carry a lot of contamination, both organic and inorganic. There was a study I saw of all the different types of laboratory gloves, tested rinsed and unrinsed. An unrinsed nitrile glove, which is used very commonly in metals labs, can have up to 2,000 nanograms per square centimeter of calcium, or sodium, or a number of other metals. So gloves become a big source of contamination in the lab.
How do you determine if you have a clean lab? You run blanks. "Think blank." Your blanks have to be absolutely clean to avoid any false positive and false negative results. Carry blanks and standards through all steps of the analytical process so you can track that. Anything that touches your sample must be absolutely clean. I thank you, and now back to Laura for any questions we might have.
Thank you, Patricia and Susan. It is indeed time for questions. As a quick reminder, if any of you would like to ask a question, submit it using the Q&A box, which you can find on the right-hand side of your window. We don’t have a lot of time, so we will go a couple of minutes past the top of the hour to get a few questions in.
Alright, so let’s start. How do we get around the requirement for accreditation before opening for business? That’s an interesting question. It’s an unfortunate situation that many states are creating. Traditionally, when we audit a laboratory, we audit the method on a real sample, because part of the requirement is the demonstration of competence. That is not possible when the laboratory is still validating without legal access to the matrix. My advice for a laboratory caught in this conundrum is to find a surrogate material as close as possible to authentic cannabis material and develop a method. You are going to have to validate that method based on surrogates; however, once you are open for business and can legally access a real, authentic cannabis matrix, you need to verify your methods. Unfortunately, some laboratories are in the position where they think they validated a good method using a surrogate, but once they have access to authentic matrix material they need to revalidate their methods. The accreditation bodies are not able to grant accreditation certificates without that demonstration of competence; that is a defining feature of ISO/IEC 17025 accreditation. Thank you, Susan.
Is it possible to get references or links to some of the studies that were presented in the web seminar? Absolutely. We have a list of references for the different studies we cited here, as well as links to our own studies, and we will make that available for anyone interested in looking them up.
Can a lab use any published method, and what are the advantages and disadvantages of using them? Published methods are great in general, because if they are consensus methods they have been vetted by a group of experts in the field. For example, AOAC develops its consensus methods against an agreed set of parameters around selectivity, precision, and accuracy, and candidate methods that meet those requirements are vetted through that process. Other published methods may or may not have the robustness of those consensus methods. If you are using a published method that has not been externally validated, my advice would be to fully validate that method yourself.
Is it really possible to see tiny amounts of contamination from cosmetics or medications in your lab samples, and does it really change the results? That actually comes down to what you are looking for. If you are in a ppb environment, you can definitely see some of these small contaminations. In fact, we have a funny story. In our own lab, we were making someone’s standards quite a few years ago. We had a contamination issue and used root cause analysis to figure out where the contamination was coming from. We backed through our entire process and ended up at the bench of the manufacturing chemist. It turns out he had poison ivy, and his medication was contaminating what he was working on. So depending on what you are looking for, it can definitely be a problem. If you are working in a high-ppm environment, probably not; you’re not going to see ppb-level contamination. But in a critical environment you can definitely see some of those contaminants.
How does the uncertainty on a certificate of analysis for something like a standard relate to the actual results in an analysis? Well, the certificate we showed you has the uncertainty associated with the measurement printed on it. So if you have a standard that says 5 ppm plus or minus 0.01 ppm, that is the uncertainty for that measurement, which is traceable to a NIST standard; the standard’s value lies within that range of plus or minus 0.01 ppm, or whatever the stated uncertainty is. Now, when you run that standard on your instrumentation, as Susan said earlier, the instrument has its own uncertainties. How your instrument reads that 5 ppm can fall anywhere within your instrument’s uncertainty; if your instrument’s uncertainty is 1%, then you could be reading anywhere within plus or minus 1% of the value. So the result can change, because your own process has its own uncertainty. This is Susan; I would like to interject and mention that it is impossible for your own uncertainty calculation to arrive at a value that is less than that of the NIST standard. If you are calculating your own uncertainty and it comes out smaller than that stated for the NIST standard, then you have implemented the calculation incorrectly.
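Both points here can be sketched with the usual root-sum-square combination of relative uncertainties from the GUM. The figures below are assumptions for illustration: a 5 ppm standard with a ±0.01 ppm certificate uncertainty (0.2% relative) and a 1% instrument uncertainty. Note that the combined value can never fall below the certificate contribution, which is exactly Susan’s point:

```python
import math

# Root-sum-square combination of relative uncertainties (GUM-style):
# the reported result carries both the certificate uncertainty of
# the standard and the instrument/method uncertainty, and can never
# be tighter than the certificate term alone.
def combined_uncertainty(value, rel_uncs):
    return value * math.sqrt(sum(u ** 2 for u in rel_uncs))

value = 5.00             # ppm, nominal concentration (illustrative)
u_cert = 0.01 / value    # certificate +/-0.01 ppm -> 0.2% relative
u_instr = 0.01           # 1% instrument uncertainty (illustrative)
u_total = combined_uncertainty(value, [u_cert, u_instr])
print(f"{value} ppm +/- {u_total:.3f} ppm")
```

Here the combined uncertainty (about ±0.051 ppm) is dominated by the instrument term, and it is strictly larger than the certificate’s ±0.01 ppm, as it must be.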
How many internal standards should be added to a sample? That depends on how many targets you plan on running. If you have a small group of targets, then you should have enough internal standards in close proximity to your specific targets; that could be as few as one or two. But if you are doing something like a pesticide screening, then you should use multiple internal standards spanning the entire run time of whatever instrument you are using, GC-MS or LC-MS, so you always have an internal standard close to your targets and you don’t see any drift through the run of that sample.
I think we should wrap up, as we are a little past the hour. Thank you, Patricia and Susan, for your presentations today. I would like to thank all of you for attending, and our sponsor, SPEX CertiPrep, for making today’s webcast possible. Today’s webcast will be available for on-demand viewing through May of next year. You will receive an email from LCGC alerting you when the webcast is available for replay. We encourage you to forward it to colleagues. We will see you next time. Goodbye.