Illnesses caused by foodborne pathogenic microorganisms, and the control of those pathogens, are major worldwide public health issues. The prevention and/or reduction of foodborne disease has been, and continues to be, a major goal of societies, dating back to when food was first preserved by drying and salting.[1] Currently, there is much greater public concern about (and less tolerance toward) health risks associated with foods than risks from other manufactured products, such as cars and tobacco. Because food is biological in nature and potentially deadly microorganisms continue to emerge and evolve, foodborne illness persists, and its socioeconomic impact is well chronicled. Pressure for new and stricter enforcement of food safety regulations typically gains momentum with highly publicized incidents of food contamination, such as those associated with bovine spongiform encephalopathy, Salmonella, Escherichia coli and Listeria monocytogenes.
Control of food-related microbial risks involves procedures to eliminate or minimize the presence of specific groups of microorganisms, their by-products and/or their toxins. While the total elimination of foodborne disease remains an unattainable goal, both government public health managers and the food industry are committed to reducing the number of illnesses due to contaminated food.[1] Assessing the public health (safety) status of a food is a risk-based activity as well as a conundrum. That is, what is an acceptable level of risk and for whom is that level appropriate? With this as a backdrop, consider the following definition of food safety: “Food safety is the biological, chemical or physical status of a food that will permit its consumption without incurring excessive risk of injury, morbidity or mortality.”
Common Parameters
While a government often expresses public health goals relative to the incidence of disease, this does not provide food processors, producers, handlers, retailers or trade partners with specific information to help achieve these societal goals.[1] In order for these goals to be practically met by the food sector, food safety targets set by governing bodies need to be translated into parameters that can be used by food processors to manufacture food and that can be assessed by government agencies. Food Safety Objectives (FSOs) are intended to form the link between public health-based goals and suitable control measures and to allow for the equivalence of control measures to be determined. Good Manufacturing Practices, Good Agricultural Practices, Good Hygiene Practices (GHPs) and Hazard Analysis and Critical Control Points remain essential for food safety management systems to achieve FSOs or performance objectives (POs). The following new food safety management terms will be used throughout the remainder of this article.
Food Safety Objective (FSO): The maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the appropriate level of protection (ALOP). Deciding if and when to use an FSO is the responsibility of governments, and FSOs should be established only in situations where they will have an impact on public health. Therefore, it is unnecessary to establish FSOs for all foods.
Performance Objective (PO): The maximum frequency and/or concentration of a hazard in a food at a specified step in the food chain before the time of consumption that provides or contributes to an FSO or ALOP, as applicable. This concept is useful particularly when the FSO is likely to be very low or “absent in a serving of the food at the point of consumption.”
Performance, Process and Product Criteria: When designing and controlling food operations, one must consider likely pathogen contamination, destruction methods and factors governing microbial growth, survival and possible recontamination. Consideration must be given to the conditions to which the food is likely to be exposed, including further processing and potential abuse during storage, distribution and preparation for use. The ability of those in control of foods at each stage in the food chain to prevent, eliminate or reduce food safety hazards varies with the type of food and the effectiveness of available technology.
A performance criterion (PC) is defined by the Codex Committee on Food Hygiene as “The effect in frequency and/or concentration of a hazard in a food that must be achieved by the application of one or more control measures to provide or contribute to a PO or FSO.” When establishing PCs, the initial levels of the hazard and changes of the hazard during production, processing, distribution, storage, preparation and use must be taken into account.
Process criteria are the control parameters (e.g., time, temperature, pH and water activity) at a step, or combination of steps, that can be applied to achieve a PC.
Product criteria consist of parameters that are used to prevent unacceptable multiplication of microorganisms in foods. Microbial growth is dependent on the composition and environment of the food. Consequently, pH, water activity, temperature, gas atmosphere, packaging barrier properties, etc. have an influence on the safety of particular foods.
Once the FSO is set, several factors that determine how it will be achieved must be evaluated. When establishing PCs, consideration must be given to the initial level of a hazard and changes in the hazard during production, processing, distribution, storage, preparation and use. A PC can be defined by the simplified equation:
H0 – ∑R + ∑I ≤ FSO,
where FSO = Food Safety Objective
H0 = Initial level of the hazard
∑R = Total (cumulative) reduction of the hazard
∑I = Total (cumulative) increase in the hazard
FSO, H0, ∑R and ∑I are expressed in log10 units.
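To make the arithmetic concrete, the short Python sketch below evaluates the equation for one hypothetical scenario; the hazard level, reduction, increase and FSO values are all assumed for illustration and are not regulatory figures.

# Minimal sketch of the equation H0 - sum(R) + sum(I) <= FSO.
# All values are hypothetical and expressed in log10 units (e.g., log10 CFU/g).

def meets_fso(h0, reductions, increases, fso):
    """Return the hazard level at consumption and whether it satisfies the FSO."""
    final_level = h0 - sum(reductions) + sum(increases)
    return final_level, final_level <= fso

# Assumed scenario: initial level of 3 log10, a 6-log10 lethality step,
# 2 log10 of growth during distribution and storage, and an FSO of 2 log10.
level, ok = meets_fso(h0=3.0, reductions=[6.0], increases=[2.0], fso=2.0)
print(f"Level at consumption: {level:.1f} log10; FSO met: {ok}")
# Prints: Level at consumption: -1.0 log10; FSO met: True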
It should be recognized that the parameters used in the above equation are point estimates, whereas in practice they will have a distribution of values associated with them. If data exist for the variance associated with the different parameters, the underlying probability distributions may be established using an approach similar to that used in risk assessment (a minimal simulation sketch follows the list below). Control measures can be put into place to manage each part of the process and generally fall into three categories:[2]
• Controlling initial levels of a hazard (H0): for example, avoiding food with a history of contamination or toxicity (e.g., raw milk or raw molluscan shellfish harvested under certain conditions); selecting ingredients (e.g., pasteurized liquid eggs or milk); using microbiological testing and criteria to reject unacceptable ingredients or products.
• Reducing the level of a hazard (∑R): for example, destroying pathogens (e.g., freezing to kill certain parasites, using disinfectants, pasteurization and irradiation); removing pathogens (e.g., washing, using ultrafiltration and centrifugation).
• Preventing an increase in the hazard (∑I): for example, preventing contamination [e.g., adopting GHPs that minimize contamination during slaughter, separating raw from cooked ready-to-eat (RTE) foods, implementing employee practices that minimize product contamination and using aseptic filling techniques]; preventing growth of pathogens (e.g., chilling and holding temperatures, pH, aw and preservatives).
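Where variance data are available, the point-estimate equation above can be explored probabilistically, as noted earlier. The Python sketch below is a minimal Monte Carlo illustration; the normal distributions and their parameter values are assumptions chosen for illustration, whereas a formal risk assessment would use distributions fitted to actual data.

# Minimal Monte Carlo sketch of the same equation with uncertain inputs.
# The normal distributions and their parameters are illustrative assumptions.
import random

def fraction_meeting_fso(fso, trials=100_000):
    met = 0
    for _ in range(trials):
        h0 = random.gauss(3.0, 0.5)   # assumed initial level, log10 CFU/g
        red = random.gauss(6.0, 0.3)  # assumed total reduction (sum of R), log10
        inc = random.gauss(2.0, 0.8)  # assumed total increase (sum of I), log10
        if h0 - red + inc <= fso:
            met += 1
    return met / trials

print(f"Fraction of simulated lots meeting the FSO: {fraction_meeting_fso(fso=2.0):.3f}")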
Managing the Safety and Stability of RTE Meat
The remainder of this article will focus on the design and use of microbiological challenge and shelf-life studies to assess and manage the microbiological safety and stability of RTE meat products. This information allows for the appropriate determination of product criteria and assessment of ∑I for use in risk assessment and risk management exercises. The major pathogenic microorganism of concern for refrigerated RTE meats is L. monocytogenes.
L. monocytogenes is widespread in the environment and has been found in soil, water, sewage and decaying vegetation. It can be readily isolated from humans, domestic animals, raw agricultural commodities and food processing environments.[3] Control of L. monocytogenes in the food processing environment has been the subject of a number of scientific publications.[3] L. monocytogenes can grow slowly at refrigeration temperatures; therefore, refrigerated RTE foods that can support the growth of L. monocytogenes must be managed appropriately.
The U.S. Department of Agriculture (USDA) Food Safety and Inspection Service (FSIS) risk assessment for deli meats concluded that increased frequency of food-contact surface testing and sanitation can lead to a proportionally lower risk of listeriosis. Furthermore, combinations of interventions (i.e., microbiological testing and sanitation of food-contact surfaces, pre- and post-packaging treatments and the use of growth inhibitors/product reformulation) appear to be more effective than any single intervention step. The assessment estimated that the annual number of deaths due to listeriosis could drop from 250 to less than 100 if industry used a combination of growth inhibitors and post-packaging pasteurization of products.
Based on some of the information and scenarios from this risk assessment, the USDA FSIS, under an interim final rule released on June 6, 2003,[4] afforded RTE products a different regulatory treatment. The agency stated that products must be produced under one of three alternative control programs to reduce or eliminate L. monocytogenes, or suppress or limit its growth:
• Post-lethality treatment that reduces or eliminates L. monocytogenes and an antimicrobial agent or process that suppresses or limits its growth throughout shelf life (Alternative 1)
• Post-lethality treatment that reduces or eliminates L. monocytogenes OR an antimicrobial agent or process that suppresses or limits its growth during its shelf life (Alternative 2)
• Sanitation procedures only to prevent L. monocytogenes contamination (Alternative 3; manufacturing plants using Alternative 3 will get the most frequent verification testing attention from government regulators)
Under Alternatives 1 or 2, FSIS applies relatively less microbiological sampling to products that undergo a post-lethality treatment achieving at least a two-log10 reduction of L. monocytogenes and relatively more sampling to products whose post-lethality treatment achieves between a one- and two-log10 reduction. Products in which less than a one-log10 reduction is achieved are considered ineligible for either of these two Alternatives unless supporting documentation shows that the treatment provides an adequate safety margin.[5]
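The sampling logic described above amounts to a simple decision rule. The Python sketch below merely restates it for clarity; the function name and category labels are illustrative and are not regulatory language.

# Illustrative restatement of the FSIS sampling logic described above for
# post-lethality treatments under Alternatives 1 and 2 (not regulatory text).

def sampling_treatment(log_reduction, has_supporting_documentation=False):
    if log_reduction >= 2.0:
        return "relatively less FSIS verification sampling"
    if log_reduction >= 1.0:
        return "relatively more FSIS verification sampling"
    if has_supporting_documentation:
        return "may qualify if documentation shows an adequate safety margin"
    return "ineligible for Alternative 1 or 2"

print(sampling_treatment(2.5))  # at least a two-log10 reduction
print(sampling_treatment(1.3))  # between a one- and two-log10 reduction
print(sampling_treatment(0.5))  # less than a one-log10 reduction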
The U.S. Food and Drug Administration’s (FDA) risk assessment estimated that those foods with the highest risk of listeriosis support the growth of L. monocytogenes. In contrast, the foods that the risk assessment estimated to pose the lowest risk of listeriosis are foods that either have intrinsic or extrinsic factors to prevent the growth of L. monocytogenes or are processed to alter the normal characteristics of the food.[3] For example, it is well known that L. monocytogenes does not grow when: the pH of the food is less than or equal to 4.4; the water activity of the food is less than or equal to 0.92; or the food is frozen.
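As a minimal illustration, the Python sketch below encodes only the three growth-limiting conditions cited above and applies them to hypothetical product values; it is not a substitute for expert evaluation or a challenge study.

# Minimal sketch: a product is flagged as unable to support L. monocytogenes
# growth only if it meets one of the three conditions cited above
# (pH <= 4.4, aw <= 0.92, or frozen). Product values are hypothetical.

def supports_lm_growth(ph, aw, frozen=False):
    return not (frozen or ph <= 4.4 or aw <= 0.92)

print(supports_lm_growth(ph=6.2, aw=0.97))               # hypothetical deli meat: True
print(supports_lm_growth(ph=4.2, aw=0.95))               # acidified product: False
print(supports_lm_growth(ph=6.0, aw=0.97, frozen=True))  # frozen product: False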
To assess the safety of specific RTE meat formulations over the intended shelf life and/or to determine the Alternative status of these products, microbiological challenge studies should be conducted. Food safety-related challenge studies include the following categories. The use of each is dependent on the type of information desired and/or already collected:[6]
• Pathogen growth inhibition study – Evaluates the ability of a particular food product formulation with a specific type of processing and packaging to inhibit the growth of certain bacterial pathogens when held under specific storage conditions.
• Pathogen inactivation study – Evaluates the ability of a particular food product formulation, a specific food manufacturing practice or their combination to cause the inactivation of certain bacterial pathogens. These studies may also be impacted by food storage and packaging conditions and must account for these variables.
• Combined growth and inactivation study – These studies may be combined to evaluate the ability of a particular food or process to inactivate certain bacterial pathogens and to inhibit the growth of certain other pathogenic bacteria or to achieve a level of inactivation followed by inhibition of either the growth of survivors or contaminants introduced after processing.
It is unreasonable to expect that every individual refrigerated RTE meat product would need a microbiological challenge study. However, a safe history of a food product is only relevant if all conditions remain the same. Even seemingly minor changes to a food formulation, process or packaging method may have a large impact on the safety of the product. Additionally, changes in the ecology, physiology or genetic makeup of a pathogen may result in food safety issues in products with a history of safety.[6]
NACMCF Guidelines
To appropriately design and execute a microbiological challenge study, the following steps should be considered (detailed information on each step has been made available[6]):
1. Purpose of study determined: Microbiological challenge studies can include growth inhibition studies, microbial inactivation studies or a combination of the two. A challenge study is performed by inoculating selected microorganisms into a food to determine whether the microorganisms are either a potential health hazard risk or a spoilage risk.
2. Collect product characteristics information (i.e., pH, aw, salt, moisture, antimicrobials and processing): To design an appropriate challenge study, information about the product must be understood.
3. Determination of target microorganisms of concern: Selection of appropriate challenge microorganisms for use in a challenge study is based on a number of factors including product characteristics (pH, aw, packaging environment and storage conditions), how the product is processed and packaged as well as epidemiological data.
4. Number and selection of strains; is a surrogate qualified? For refrigerated RTE meats, the primary pathogen is L. monocytogenes. This organism is ubiquitous in nature, endemic in food processing plant environments and can multiply in refrigerated foods at temperatures as low as 0 °C (32 °F). Listeria is a facultative anaerobe and is, therefore, a significant pathogen under both vacuum-packaged and aerobic storage conditions.
5. Determination of inoculum level: When conducting a challenge study to determine the ability of a microorganism to grow in a food product, the inoculation level should reflect the expected level of contamination in the product. Typically, an inoculation level of 10²–10³ colony-forming units/g is used in these growth inhibition studies. In some cases, this level may exceed expected levels, but allows for direct plating to determine actual counts.
6. Inoculum preparation: Is cold or acid adaptation required? For refrigerated RTE meats, cold adaptation of L. monocytogenes prior to inoculation is recommended. Products should be inoculated such that both the intrinsic and extrinsic parameters of the food are maintained, the inoculation procedure mimics contamination that could realistically occur during production or storage and, if present, different interfaces receive the inoculum.
7. Packaging requirements: Product packaging used in microbiological challenge studies should be representative of typical packaging used in the commercial production of the food products evaluated.
8. Storage conditions: Storage temperatures should be representative of the temperature range to which the product is expected to be exposed during distribution and storage. The National Advisory Committee on Microbiological Criteria for Foods (NACMCF) recommends that challenge studies for refrigerated products be conducted at 7 °C (44.6 °F) to account for consumer storage temperatures in the United States.
9. Selection of microbiological tests for uninoculated controls: It is prudent to analyze the product, including uninoculated control samples, at either each or selected sampling points in the study to see how the background microflora is behaving over product shelf life.
10. Determination of methods for analysis of microorganisms and toxins: Methods chosen should be based on standard methods documented in reference manuals, including the Compendium of Methods for the Microbiological Examination of Foods, the FDA Bacteriological Analytical Manual and the USDA Microbiological Laboratory Guidebook or in other publications in peer-reviewed journals.
11. Determination of physical and chemical parameters to be tested: When designing microbiological challenge studies, it is important to include relevant physical and chemical testing in the design, as these factors can influence the behavior of microorganisms. Some of these factors include pH, titratable acidity, aw, salt, residual nitrite or other antimicrobials and proximate analysis (fat, moisture and protein).
12. Sampling considerations (number of samples and replicates to be tested): A minimum of two samples are analyzed at each interval; however, three or more samples are preferred. Replicate trials should be conducted using different batches/lots of product and inoculum to account for variability in product and inoculum procedures.
13. Study duration and sampling frequency: Ideally, products should be held for some period beyond the intended shelf life to account for consumers who may consume the product past the expiration date and to add an additional margin of safety. The length of this extended sampling period depends on the product and can range from 25% beyond the stated shelf life (for products with longer shelf lives, i.e., 3–6 months) to 50% (for products with shelf lives of 7–10 days).
14. Data analysis and acceptance criteria: If the level of the challenge microorganism does not increase during storage, the product formulation is resistant to microbial growth and is considered microbiologically stable. FDA has recommended pass/fail criteria for various challenge microorganisms; a minimal data analysis sketch follows this list.
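As referenced in step 14, the Python sketch below illustrates one simple way to summarize growth inhibition data: it computes the log10 change of the challenge organism at each sampling point relative to the inoculated level at day 0. The counts and the 0.5-log10 threshold are illustrative assumptions only; actual acceptance criteria should follow current FDA and NACMCF recommendations.

# Minimal sketch of data analysis for a growth inhibition study: computes the
# log10 change of the challenge organism at each sampling point relative to
# the inoculated level. Counts and the 0.5-log10 threshold are illustrative;
# actual acceptance criteria should follow FDA/NACMCF recommendations.
import math

def log10_changes(initial_cfu_per_g, counts_cfu_per_g):
    return [math.log10(c) - math.log10(initial_cfu_per_g) for c in counts_cfu_per_g]

# Hypothetical counts (CFU/g) at days 0, 14, 28 and 42 for an inoculum of ~3 x 10^2 CFU/g.
changes = log10_changes(3.0e2, [3.2e2, 2.9e2, 4.1e2, 5.0e2])
max_increase = max(changes)
threshold = 0.5  # example threshold in log10 units; verify against current guidance
print(f"Maximum increase: {max_increase:.2f} log10; considered stable: {max_increase <= threshold}")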
For Further Consideration
According to the NACMCF guidelines,[6] a number of considerations should be taken into account when conducting challenge studies internally or when selecting a contract laboratory to conduct the studies for your company. Challenge studies must be designed and evaluated by an expert food microbiologist. Potential sources of expertise include in-house experts, university faculty, testing laboratories and independent consultants. Choosing a laboratory requires careful consideration as not all laboratories have the expertise to design challenge studies and the quality control procedures necessary to produce valid results that will be accepted by either the regulatory authority or another reviewer. Laboratories may be certified by various organizations and state or federal agencies for various types of testing, for example, water and waste water testing, ISO 17025 and Grade A dairy testing. However, these certifications do not necessarily qualify a laboratory to design and conduct microbiological challenge studies.
A laboratory selected for challenge testing must be able to demonstrate prior experience in conducting challenge studies. It is necessary to ensure that personnel are experienced and qualified to conduct the types of analyses needed for the challenge studies and will follow generally accepted Good Laboratory Practices. Laboratories conducting microbial challenge studies should use test methods validated for their intended use. In situations where approved methods are not available or applicable, laboratories may consider using other widely accepted methods, such as those that have been cited in peer-reviewed journals.
Failure to properly design studies and use valid methods and appropriate controls may render the challenge study unacceptable and require additional time and resources to repeat the study.
Cynthia Stewart, Ph.D. is general manager of the Silliker Food Science Center, a contract research laboratory in the U.S.
John Williams Jr. is a Silliker, Inc. senior communications specialist. The authors can be reached at info@silliker.com.
References
1. ICMSF (International Commission on Microbiological Specifications for Foods). 2005. A simplified guide to understanding and using Food Safety Objectives and Performance Objectives. Available at: http://www.icmsf.iit.edu/main/articles_papers.html.
2. ICMSF. 2002. Microorganisms in Foods 7: Microbiological Testing in Food Safety Management. New York: Kluwer Academic/Plenum Publishers.
3. U.S. Food and Drug Administration. Bad Bug Book (Foodborne Pathogenic Microorganisms and Natural Toxins Handbook): Listeria monocytogenes. Available at: www.fda.gov/Food/FoodSafety/FoodborneIllness/FoodborneIllnessFoodbornePathogensNaturalToxins/BadBugBook/ucm070064.htm.
4. USDA Food Safety and Inspection Service. 2003. Control of Listeria monocytogenes in ready-to-eat meat and poultry products; interim final rule. Available at: www.fsis.usda.gov/OPPDE/rdad/FRPubs/97-013F.pdf.
5. USDA Food Safety and Inspection Service. Compliance guidelines for the Listeria monocytogenes interim final rule. 2004 version: http://www.fsis.usda.gov/OPPDE/rdad/FRPubs/97-013F/Lm_Rule_Compliance_Guidelines_2004.pdf; updated May 2006: http://www.fsis.usda.gov/oppde/rdad/FRPubs/97-013F/LM_Rule_Compliance_Guidelines_May_2006.pdf.
6. NACMCF (National Advisory Committee on Microbiological Criteria for Foods) Executive Secretariat. 2009. Parameters for Determining Inoculated Pack/Challenge Study Protocols. Washington, D.C.