Predictive microbiology in the food industry
In application of the general principles of food hygiene, it is the responsibility of Food Business Operators (FBOs) to control microbial risks in foods (Regulation (EC) No 2073/2005). FBOs shall therefore conduct studies to investigate compliance with the criteria and to assess the ability of microorganisms of concern to grow or survive in the product during its shelf-life under different reasonably foreseeable storage conditions. For that purpose, FBOs may use literature data, mathematical models and challenge-test studies. It follows that:
- Challenge tests shall comply with internationally recognized guidance documents endorsed by regulatory bodies (under draft: ISO SC 9 WG19, NF V009);
- The cardinal values used to predict strain behaviour shall be determined according to good laboratory practices and quality-assurance procedures, providing accurate values with their associated standard deviations;
- Predictive modelling shall be carried out with calculators operated under quality assurance, using state-of-the-art approaches.
To optimize the use of industrial data, Sym’Previus offers model-fitting tools and a growth/no-growth interface, as well as:
- A growth simulation tool based on the gamma concept, enabling simulation of different storage and formulation scenarios (Zwietering, 1992);
- An inactivation simulation tool based on the lambda concept (Mafart’s group, 2002, 2005, 2012).
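Inactivation tools of this kind typically build on the Weibull-type model of Mafart et al. (2002), log10 N(t) = log10 N0 − (t/δ)^p, where δ is the time to the first decimal reduction and p is a shape parameter. A minimal sketch, with illustrative parameter values rather than Sym’Previus outputs:

```python
def weibull_survivors(t: float, log10_n0: float, delta: float, p: float) -> float:
    """Weibull-type inactivation (Mafart et al., 2002):
    log10 N(t) = log10 N0 - (t / delta)^p,
    where delta is the time to the first decimal reduction and
    p is the shape parameter (p = 1 gives classical log-linear kinetics).
    """
    return log10_n0 - (t / delta) ** p

# Illustrative values: 10^6 CFU/g initial load, delta = 2 min, p = 1.5
for t in (0.0, 2.0, 4.0, 6.0):
    print(f"t = {t:4.1f} min -> log10 N = {weibull_survivors(t, 6.0, 2.0, 1.5):.2f}")
```

With p = 1 the model reduces to the classical first-order survival curve, which is why δ plays the role of a generalized D-value.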
Sym’Previus quality assurance is guaranteed by a scientific committee in charge of validating experimental data, and by the publication of all predictive models in peer-reviewed journals.
Users can enter challenge-test results and own-check data (microbiological counts, pH, aw, …) and combine this information with a microorganism database (for the growth cardinal model).
Besides Sym’Previus, it can be useful to consult complementary international databases, such as ComBase.
Predictive microbiology
Predictive microbiology is a research area in which microbiological and mathematical knowledge are combined to develop mathematical models that describe microbial evolution in foods. The basic underlying idea of predictive microbiology is that the behaviour of microorganisms is deterministic and can be predicted from knowledge of the microorganism itself and of its immediate environment. For most food products, a combination of the principal environmental factors (temperature, aw, pH, atmosphere and antimicrobials) prevents or slows down microbial growth. There are two main methods for the quantitative study of combined hurdles: through models describing the combined effect of individual factors, and through modelling composite factors using response surfaces or a polynomial approach.
The Gamma hypothesis (Zwietering et al., 1996) states that inhibitory processes affect microbial growth independently and combine in a multiplicative manner, except where synergy or antagonism between factors occurs. The Gamma concept thus suggests that once the effect of each individual factor is understood, for example through a cardinal model, the factors can be combined as “gamma factors”, i.e. ratios of the observed growth rate to the uninhibited growth rate. Additionally, the interaction between environmental factors is described by an independent term which does not require any additional parameter. The gamma concept is also the only method that allows challenge-test results to be reused in subsequent growth simulations. Finally, the lambda concept (Mafart’s group, 2002, 2005, 2012) uses the same approach as the gamma concept, but applied to inactivation.
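The multiplicative structure of the gamma concept can be sketched by combining Rosso-type cardinal models for temperature and pH with a linear aw term; the cardinal values below are illustrative assumptions for a Listeria-like organism, not values taken from the Sym’Previus database:

```python
def gamma_T(T, Tmin, Topt, Tmax):
    """Cardinal temperature model with inflection (Rosso et al., 1993)."""
    if T <= Tmin or T >= Tmax:
        return 0.0
    num = (T - Tmax) * (T - Tmin) ** 2
    den = (Topt - Tmin) * ((Topt - Tmin) * (T - Topt)
                           - (Topt - Tmax) * (Topt + Tmin - 2 * T))
    return num / den

def gamma_pH(pH, pHmin, pHopt, pHmax):
    """Cardinal pH model (Rosso et al., 1995)."""
    if pH <= pHmin or pH >= pHmax:
        return 0.0
    num = (pH - pHmin) * (pH - pHmax)
    return num / (num - (pH - pHopt) ** 2)

def gamma_aw(aw, aw_min):
    """Linear aw term: no growth below aw_min, uninhibited at aw = 1."""
    return max(0.0, (aw - aw_min) / (1.0 - aw_min))

def mu_max(mu_opt, T, pH, aw):
    """Multiplicative gamma combination: mu = mu_opt * product of gamma terms."""
    # Illustrative cardinal values (assumption, Listeria-like organism)
    return (mu_opt
            * gamma_T(T, Tmin=-1.7, Topt=37.0, Tmax=45.5)
            * gamma_pH(pH, pHmin=4.3, pHopt=7.0, pHmax=9.6)
            * gamma_aw(aw, aw_min=0.913))

# Predicted growth rate under chilled storage, as a fraction of mu_opt
print(mu_max(1.0, T=10.0, pH=6.0, aw=0.97))
```

Each gamma term equals 1 at the optimum and 0 outside the growth range, so the predicted rate is always a fraction of the uninhibited rate μ_opt.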
Response surface models describe the response variable by multiple regression. They have the advantage of being easy to produce, but lack any mechanistic insight. One peculiarity of response surface models is that the cross-terms of the polynomial functions are often used to suggest the presence of interactions between the factors, e.g. terms signifying interactions between temperature and salt concentration or between temperature and nitrite levels. Indeed, this is one of the problems with such models: they are purely empirical fitting functions with no mechanistic justification, yet cross-terms are often interpreted as interactive effects. Additionally, these models provide deterministic predictions with little or no indication of variability or uncertainty.
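A quadratic response surface with a cross-term can be fitted by ordinary least squares; the sketch below uses synthetic data whose coefficients and factor ranges are invented for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "observations": ln(mu) over temperature (deg C) and NaCl (%)
T = rng.uniform(5, 35, 60)
S = rng.uniform(0.5, 8, 60)
true_b = np.array([-3.0, 0.25, -0.10, -0.004, -0.02, -0.005])
X = np.column_stack([np.ones_like(T), T, S, T**2, S**2, T * S])
y = X @ true_b + rng.normal(0, 0.05, 60)

# Ordinary least squares fit of the polynomial model
b_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# The T*S cross-term coefficient is often (over-)interpreted as an
# interaction effect, although the model is purely empirical.
print(dict(zip(["b0", "bT", "bS", "bTT", "bSS", "bTS"], b_hat.round(3))))
```

The fit is purely empirical: the cross-term T·S improves the fit within the experimental region but says nothing mechanistic about how temperature and salt interact, and extrapolation outside that region is unreliable.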