Efficacy of electron beam irradiation in reduction of mycotoxin-producing fungi, aflatoxin, and fumonisin in naturally contaminated maize slurry


Maize is a staple food in Kenya. However, maize is prone to fungal infestation, which may result in the production of harmful aflatoxins and fumonisins. Electron beam (eBeam) food processing is a proven post-harvest technology, but little has been published on the ability of eBeam to reduce mycotoxins in naturally contaminated maize samples. This study evaluated the efficacy of eBeam doses in reducing viable fungal populations and degrading aflatoxins and fumonisins in naturally and highly contaminated maize samples from eastern Kenya. Ninety-seven maize samples were analyzed for total aflatoxins and fumonisins using commercial ELISA kits. Twenty-four samples with >100 ng/g of total aflatoxins and >1,000 ng/g of total fumonisins were then chosen for eBeam toxin degradation studies. Prior to eBeam exposure, the samples were made into a slurry using sterile de-ionized water. The slurry samples were exposed to target doses of 5 kGy, 10 kGy, and 20 kGy, with 0 kGy (untreated) samples as controls. Samples were analyzed for total fungal load using culture methods, for total aflatoxins and fumonisins using ELISA, and (control samples only) for the presence of Aspergillus and Fusarium spp. nucleic acids using qPCR. In the control samples, there was a significant positive correlation between total Aspergillus and aflatoxin levels (r = 0.54; p = 0.007) and between total Fusarium and fumonisin levels (r = 0.68; p < 0.001). Exposure to eBeam doses of 5 kGy and greater reduced fungal loads to below the limit of detection by plating (<1.9 log(CFU/g)). There was also a significant (p = 0.03) average reduction of 0.3 log(ng/g) in aflatoxin at 20 kGy (range, −0.9 to 1.4 log(ng/g)). There was no significant reduction in fumonisin even at 20 kGy, and eBeam doses below 20 kGy did not reduce either mycotoxin.
These results confirm the sensitivity of fungi to eBeam doses in a naturally contaminated maize slurry and show that 20 kGy is effective at degrading some pre-formed aflatoxin in such maize preparations.
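The two quantitative analyses reported above (correlation between fungal counts and toxin levels, and log-scale toxin reduction after treatment) can be sketched as follows. This is a minimal illustration on hypothetical numbers, not the study's data or statistical pipeline:

```python
# Sketch of (i) a Pearson correlation between log fungal counts and log toxin
# levels, and (ii) a mean log10(ng/g) toxin reduction between untreated (0 kGy)
# and treated (20 kGy) samples. All values below are hypothetical.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired measurements: log(CFU/g) Aspergillus vs. log(ng/g) aflatoxin
aspergillus = [3.1, 4.0, 2.5, 5.2, 4.4]
aflatoxin = [2.0, 2.4, 1.8, 2.9, 2.6]
r = pearson_r(aspergillus, aflatoxin)

# Hypothetical aflatoxin levels (ng/g) in paired samples before/after a 20 kGy dose;
# reduction is expressed on the log10 scale, as in the abstract
before = [150.0, 420.0, 980.0]
after = [80.0, 260.0, 410.0]
reductions = [math.log10(b) - math.log10(a) for b, a in zip(before, after)]
mean_reduction = sum(reductions) / len(reductions)
```

A positive `mean_reduction` corresponds to a net decrease in toxin; individual samples can still show negative reductions, which is why the abstract reports a range spanning −0.9 to 1.4 log(ng/g).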

DOI: https://doi.org/10.1016/j.toxcx.2022.100141

A Validated Preharvest Sampling Simulation Shows that Sampling Plans with a Larger Number of Randomly Located Samples Perform Better than Typical Sampling Plans in Detecting Representative Point-Source and Widespread Hazards in Leafy Green Fields


Commercial leafy greens customers often require a negative preharvest pathogen test, typically by compositing 60 produce sample grabs of 150 to 375 g total mass from lots of various acreages. This study developed a preharvest sampling Monte Carlo simulation, validated it against literature and experimental trials, and used it to suggest improvements to sampling plans. The simulation was validated by outputting six simulated ranges of positive samples that contained the experimental number of positive samples (range, 2 to 139 positives) recovered from six field trials with point-source, systematic, and sporadic contamination. We then evaluated the relative performance of simple random, stratified random, and systematic sampling in a 1-acre field to detect point sources of contamination present at 0.3% to 1.7% prevalence. Randomized sampling was optimal because of lower variability in probability of acceptance. Optimized sampling was applied to detect an industry-relevant point source [3 log(CFU/g) over 0.3% of the field] and widespread contamination [−1 to −4 log(CFU/g) over the whole field] by taking 60 to 1,200 sample grabs of 3 g. Taking more samples increased the power to detect point-source contamination, as the median probability of acceptance decreased from 85% with 60 samples to 5% with 1,200 samples. Sampling plans with larger total composite sample mass increased power to detect low-level, widespread contamination, as the median probability of acceptance with −3 log(CFU/g) contamination decreased from 85% with a 150-g total mass to 30% with a 1,200-g total mass. Therefore, preharvest sampling power increases by taking more, smaller samples with randomization, up to the constraints of total grabs and mass feasible or required for a food safety objective.
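The core Monte Carlo idea above (probability of acceptance falling as grab count rises) can be sketched in a few lines. This is a minimal illustration under simplifying assumptions (unit-square field, circular point source, one grab = one point sample), not the authors' validated simulation:

```python
# Sketch of a probability-of-acceptance estimate for random preharvest sampling:
# a 1-acre field is idealized as a unit square, a point source covers a given
# area fraction ("prevalence"), and a lot is "accepted" when no random grab
# lands inside the contaminated patch. All parameters are illustrative.
import math
import random

def prob_acceptance(n_grabs, prevalence, trials=2000, seed=1):
    """Estimate P(all n_grabs miss a circular point source of the given area fraction)."""
    rng = random.Random(seed)
    radius = math.sqrt(prevalence / math.pi)  # circle whose area equals the prevalence
    accepted = 0
    for _ in range(trials):
        # Random point-source location, kept fully inside the field
        cx = rng.uniform(radius, 1 - radius)
        cy = rng.uniform(radius, 1 - radius)
        hit = any(
            (rng.random() - cx) ** 2 + (rng.random() - cy) ** 2 <= radius ** 2
            for _ in range(n_grabs)
        )
        accepted += 0 if hit else 1
    return accepted / trials

# More grabs -> lower probability of accepting a contaminated lot
p_60 = prob_acceptance(60, prevalence=0.003)
p_1200 = prob_acceptance(1200, prevalence=0.003)
```

Even this toy model reproduces the qualitative trend in the abstract: with a 0.3% point source, acceptance probability is high at 60 grabs and drops sharply by 1,200 grabs.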

DOI: https://doi.org/10.1128/aem.01015-22

Enabling cost-effective screening for antimicrobials against Listeria monocytogenes in ham


Ready-to-eat (RTE) meat products, such as deli ham, can support the growth of Listeria monocytogenes (LM), which can cause severe illness in immunocompromised individuals. The objectives of this study were to validate a miniature ham model (MHM) against the ham slice method and to screen antimicrobial combinations to control LM on ham using response surface methodology (RSM) as a time- and cost-effective high-throughput screening tool. The effect of nisin (Ni), potassium lactate–sodium diacetate (PLSDA), lauric arginate (LAG), lytic bacteriophage (P100), and ε-polylysine (EPL), added alone or in combination, was determined on the MHM over 12 days of storage. Results showed that the MHM accurately mimics the ham slice method, since no statistical difference (p = 0.526) was found between the changes in LM cell counts on the MHM and on ham slices after 12 days of storage at 4°C for treated and untreated hams. The MHM was then used to screen antimicrobial combinations using a face-centered central composite design with three center points. The RSM was tested using a cocktail of five LM strains isolated from foodborne disease outbreaks. Three levels of each of the above-mentioned antimicrobials were used in combination, for a total of 28 runs performed in triplicate. The change in LM cell counts was determined after 12 days of storage at 4°C. All tested antimicrobials were effective at reducing LM cell counts on ham when added alone. A significant antagonistic interaction (p = 0.002) was identified by the RSM between LAG and P100, where this antimicrobial combination caused a 2.2 log CFU/g change in LM cell counts after 12 days of storage. Two interactions, between Ni and EPL (p = 0.058) and between Ni and P100 (p = 0.068), showed possible synergistic effects against LM on the MHM. Other interactions were clearly non-significant, suggesting additive effects. The developed MHM, in combination with RSM, can be used in future work as a high-throughput method to analyze novel antimicrobial treatments against LM.
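The interaction effects reported above come from fitting a response-surface model with cross-product terms. A minimal two-factor sketch of that fitting step, using ordinary least squares on hypothetical coded data (not the study's five-factor design or measurements):

```python
# Sketch of estimating an interaction coefficient in a response-surface fit:
# y = b0 + b1*x1 + b2*x2 + b12*x1*x2, fit by ordinary least squares.
# Factor levels are coded (-1, 0, +1); the response is a hypothetical
# change in log CFU/g after storage. All numbers are illustrative.
import numpy as np

x1 = np.array([-1, -1, 1, 1, 0, 0, 0], dtype=float)  # antimicrobial A level
x2 = np.array([-1, 1, -1, 1, 0, 0, 0], dtype=float)  # antimicrobial B level
y = np.array([-0.5, -2.0, -2.2, -1.0, -1.6, -1.5, -1.7])  # response

# Design matrix: intercept, main effects, and the x1*x2 interaction
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b12 = coefs
# A positive b12 on this hypothetical data means the combination reduces LM
# less than the main effects alone predict, i.e. an antagonistic interaction,
# analogous to the LAG/P100 result described in the abstract.
```

In a real central composite design the matrix would also carry squared terms and all pairwise interactions among the five antimicrobials, but the estimation step is the same least-squares fit.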

DOI: https://doi.org/10.4315/JFP-20-435

CRISPR-Based Subtyping Using Whole Genome Sequence Data Does Not Improve Differentiation of Persistent and Sporadic Listeria monocytogenes Strains


The foodborne pathogen Listeria monocytogenes can persist in food-associated environments for long periods. To identify persistent strains, the subtyping method pulsed-field gel electrophoresis (PFGE) is being replaced by whole genome sequence (WGS)-based subtyping. It was hypothesized that analyzing specific mobile genetic elements, CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) spacer arrays, extracted from WGS data could differentiate persistent and sporadic isolates within WGS-based clades. To test this hypothesis, 175 L. monocytogenes isolates, previously recovered from retail delis, were analyzed for CRISPR spacers using CRISPRFinder. These isolates represent 23 phylogenetic clades defined by WGS-based single nucleotide polymorphisms, plus closely related sporadic isolates. In 174 of 175 isolates (99.4%), at least one array with at least one spacer was identified. The number of spacers in a single array ranged from 1 to 28. Isolates were grouped into 13 spacer patterns (SPs) based on observed variability in the presence or absence of whole spacers. SP variation was consistent with WGS-based clades, forming patterns of (i) one SP to one clade, (ii) one SP across many clades, (iii) many SPs within one clade, and (iv) many SPs across many clades. Unfortunately, SPs did not appear to differentiate persistent from sporadic isolates within any WGS-based clade. Overall, these data show that (i) CRISPR arrays are common in WGS data for these food-associated L. monocytogenes and (ii) CRISPR subtyping cannot improve the identification of persistent or sporadic isolates from retail delis. Practical Application: CRISPR spacer arrays are present in L. monocytogenes isolates, and CRISPR spacer patterns are consistent with previous subtyping methods. These mobile genetic elements cannot improve the differentiation between the persistent and sporadic L. monocytogenes isolates used in this study.
While CRISPR-based subtyping has been useful for other pathogens, it is not useful for understanding persistence in L. monocytogenes. Thus, although the food safety community may be able to use CRISPRs in other areas, CRISPRs do not seem likely to improve the differentiation of persistent and sporadic L. monocytogenes isolates from retail delis.
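The spacer-pattern (SP) grouping described above amounts to binning isolates by the presence/absence set of detected spacers. A minimal sketch, with hypothetical isolate names and spacer identifiers (not the study's data):

```python
# Sketch of grouping isolates into spacer patterns (SPs): two isolates share
# an SP when they carry exactly the same set of CRISPR spacers. Isolate names
# and spacer IDs below are hypothetical placeholders.
from collections import defaultdict

# isolate -> spacers detected in its arrays (e.g., as reported by CRISPRFinder)
isolates = {
    "deli-01": frozenset({"sp1", "sp2", "sp3"}),
    "deli-02": frozenset({"sp1", "sp2", "sp3"}),
    "deli-03": frozenset({"sp1", "sp4"}),
    "deli-04": frozenset({"sp5"}),
}

patterns = defaultdict(list)
for isolate, spacers in isolates.items():
    patterns[spacers].append(isolate)  # identical spacer sets share one SP

n_patterns = len(patterns)  # 3 distinct SPs among these 4 hypothetical isolates
```

Comparing SP membership against WGS-based clades, as the study did, then reduces to checking whether persistent and sporadic isolates ever fall into different SPs within the same clade.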


Persistent and sporadic Listeria monocytogenes strains do not differ when grown at 37°C, in the planktonic state, under different food-associated stresses or energy sources


Background: The foodborne pathogen Listeria monocytogenes causes the potentially lethal disease listeriosis. Within food-associated environments, L. monocytogenes can persist for long periods and increase the risk of contamination through its continued presence in processing facilities or other food-associated environments. Most research on phenotyping of persistent L. monocytogenes has explored biofilm formation and sanitizer resistance, with less data examining persistent L. monocytogenes' phenotypic responses to extrinsic factors, such as variations in osmotic pressure, pH, and energy source availability. It was hypothesized that isolates of persistent strains are able to grow, and grow faster, under a broader range of intrinsic and extrinsic factors compared to closely related isolates of sporadic strains. Results: To test this hypothesis, 95 isolates (74 isolates of 20 persistent strains and 21 isolates of sporadic strains) from a series of previous studies in retail delis were grown at 37 °C under (i) stress conditions: salt (0, 5, and 10% NaCl), pH (5.2, 7.2, and 9.2), and sanitizer (benzalkonium chloride; 0, 2, and 5 μg/mL), and (ii) energy sources: 25 mM glucose, cellobiose, glycogen, fructose, lactose, and sucrose; the original goal was to follow up with low-temperature experiments for treatments where significant differences were observed. Growth rate and the ability to grow were determined for all 95 isolates using high-throughput OD600 growth curves. All stress conditions reduced growth rates compared to the control (p < 0.05). In addition, growth varied by the tested energy source. In chemically defined minimal media, there was a trend toward more isolates showing growth in all replicates with cellobiose (p = 0.052) compared to the control (glucose), and fewer isolates were able to grow with glycogen (p = 0.02), lactose (p = 2.2 × 10⁻¹⁶), and sucrose (p = 2.2 × 10⁻¹⁶).
Still, at least one isolate was able to grow consistently in every replicate for each energy source. Conclusions: The central hypothesis was rejected, as there was no significant difference in growth rate or ability to grow between retail deli isolates of persistent strains and those of sporadic strains for any treatment at 37 °C. Therefore, these data suggest that persistence is likely not determined by a phenotype unique to persistent strains grown at 37 °C and exposed to extrinsic stresses or variation in energy sources.
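The growth rates compared above are typically extracted from OD600 time series as the steepest slope of log-transformed optical density. A minimal sketch of that extraction step, on a hypothetical growth curve (not the study's data or its exact curve-fitting method):

```python
# Sketch of estimating a maximum specific growth rate (per hour) from an
# OD600 time series: take the largest least-squares slope of ln(OD600)
# over a sliding window of consecutive readings. Readings are hypothetical.
import math

def max_growth_rate(times_h, od600, window=3):
    """Largest least-squares slope of ln(OD600) over consecutive windows (1/h)."""
    logs = [math.log(od) for od in od600]
    best = float("-inf")
    for i in range(len(times_h) - window + 1):
        t = times_h[i:i + window]
        y = logs[i:i + window]
        mt, my = sum(t) / window, sum(y) / window
        slope = (sum((a - mt) * (b - my) for a, b in zip(t, y))
                 / sum((a - mt) ** 2 for a in t))
        best = max(best, slope)
    return best

# Hypothetical curve: lag phase, exponential doubling, then plateau
times = [0, 1, 2, 3, 4, 5, 6]          # hours
ods = [0.05, 0.05, 0.08, 0.16, 0.32, 0.45, 0.50]
mu_max = max_growth_rate(times, ods)    # ln(2) per hour during the doubling phase
```

Running this estimator per isolate and condition yields the growth-rate values that can then be compared between persistent and sporadic strains under each stress or energy source.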