Efficacy of electron beam irradiation in reduction of mycotoxin-producing fungi, aflatoxin, and fumonisin in naturally contaminated maize slurry

Abstract

Maize is a staple food in Kenya. However, maize is prone to fungal infestation, which may result in the production of harmful aflatoxins and fumonisins. Electron beam (eBeam) food processing is a proven post-harvest technology, but little published literature addresses the ability of eBeam to reduce mycotoxins in naturally contaminated maize samples. This study evaluated the efficacy of eBeam doses in reducing viable fungal populations and destroying aflatoxins and fumonisins in naturally contaminated maize samples with high toxin levels from eastern Kenya. Ninety-seven maize samples were analyzed for total aflatoxins and fumonisins using commercial ELISA kits. Twenty-four samples with >100 ng/g of total aflatoxins and >1000 ng/g of total fumonisins were then chosen for eBeam toxin degradation studies. Prior to eBeam exposure, the samples were made into a slurry using sterile de-ionized water. The slurry samples were exposed to target doses of 5 kGy, 10 kGy, and 20 kGy, with 0 kGy (untreated) samples as controls. Samples were analyzed for total fungal load using culture methods, for total aflatoxins and fumonisins using ELISA, and, for control samples only, for the presence of Aspergillus and Fusarium spp. nucleic acids using qPCR. There was a significant positive correlation in the control samples between total Aspergillus and aflatoxin levels (r = 0.54; p = 0.007) and between total Fusarium and fumonisin levels (r = 0.68; p < 0.001). Exposure to eBeam doses of 5 kGy and greater reduced fungal loads to below the limit of detection by plating (<1.9 log(CFU/g)). There was also a significant (p = 0.03) average reduction of 0.3 log(ng/g) in aflatoxin at 20 kGy (range −0.9 to 1.4 log(ng/g)). There was no significant reduction in fumonisin, even at 20 kGy, and eBeam doses below 20 kGy did not reduce either mycotoxin. These results confirm the sensitivity of fungi to eBeam doses in a naturally contaminated maize slurry and show that 20 kGy can degrade some pre-formed aflatoxin in such maize preparations.

DOI: https://doi.org/10.1016/j.toxcx.2022.100141
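
For readers who want to reproduce the style of analysis described above, the following minimal Python sketch shows how a Pearson correlation between log-transformed fungal counts and toxin levels, and a log reduction against a plating detection limit, could be computed. All values are synthetic placeholders, not data from the study, and this is not the authors' analysis code.

```python
# Sketch: correlation and log-reduction calculations on synthetic data.
# None of these numbers come from the study.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Synthetic control samples: log10 Aspergillus load (CFU/g) and
# log10 total aflatoxin (ng/g), loosely correlated.
log_aspergillus = rng.normal(5.0, 1.0, 24)
log_aflatoxin = 2.0 + 0.4 * log_aspergillus + rng.normal(0, 0.5, 24)

r, p = pearsonr(log_aspergillus, log_aflatoxin)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")

# Log reduction after treatment: counts below the plating limit of
# detection are censored at that limit, as in "<1.9 log(CFU/g)",
# so the computed reduction is a lower bound.
LOD_LOG = 1.9                        # plating detection limit, log10(CFU/g)
treated_log = np.full(24, LOD_LOG)   # all treated samples below detection
log_reduction = log_aspergillus - treated_log
print(f"mean log reduction >= {log_reduction.mean():.1f} log(CFU/g)")
```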

A Validated Preharvest Sampling Simulation Shows that Sampling Plans with a Larger Number of Randomly Located Samples Perform Better than Typical Sampling Plans in Detecting Representative Point-Source and Widespread Hazards in Leafy Green Fields

Abstract

Commercial leafy greens customers often require a negative preharvest pathogen test, typically by compositing 60 produce sample grabs of 150 to 375 g total mass from lots of various acreages. This study developed a preharvest sampling Monte Carlo simulation, validated it against literature and experimental trials, and used it to suggest improvements to sampling plans. The simulation was validated by outputting six simulated ranges of positive samples that contained the experimental numbers of positive samples (range, 2 to 139 positives) recovered from six field trials with point-source, systematic, and sporadic contamination. We then evaluated the relative performance of simple random, stratified random, and systematic sampling in a 1-acre field for detecting point sources of contamination present at 0.3% to 1.7% prevalence. Randomized sampling was optimal because of its lower variability in probability of acceptance. Optimized sampling was then applied to detect an industry-relevant point source [3 log(CFU/g) over 0.3% of the field] and widespread contamination [−1 to −4 log(CFU/g) over the whole field] by taking 60 to 1,200 sample grabs of 3 g each. Taking more samples increased the power to detect point-source contamination, as the median probability of acceptance decreased from 85% with 60 samples to 5% with 1,200 samples. Sampling plans with a larger total composite sample mass had greater power to detect low-level, widespread contamination, as the median probability of acceptance with −3 log(CFU/g) contamination decreased from 85% with a 150-g total mass to 30% with a 1,200-g total mass. Therefore, preharvest sampling power increases by taking more, smaller samples with randomization, up to the constraints on total grabs and mass that are feasible or required for a food safety objective.

DOI: https://doi.org/10.1128/aem.01015-22
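
The acceptance-probability behavior reported above follows from a simple sampling argument: under simple random sampling, a composite misses a point source covering a fraction p of the field with probability roughly (1 − p)^n. The minimal Monte Carlo sketch below, which assumes a uniform field and independent grabs (a simplification of the published simulation), reproduces the reported trend from about 85% acceptance at 60 grabs down to a few percent at 1,200.

```python
# Sketch: probability that a composite of n simple-random grabs misses a
# point source covering 0.3% of a field (uniform field, independent grabs).
import numpy as np

rng = np.random.default_rng(1)

def prob_acceptance(n_grabs, prevalence=0.003, n_iter=10_000):
    # Each grab lands in the contaminated zone with probability equal to
    # the zone's share of field area; the lot is accepted when no grab hits.
    hits = rng.random((n_iter, n_grabs)) < prevalence
    return (~hits.any(axis=1)).mean()

for n in (60, 150, 300, 600, 1200):
    print(f"{n:>5} grabs: P(accept) ~ {prob_acceptance(n):.2f}")
# Matches the closed form (1 - 0.003) ** n: ~0.84 at 60 grabs, ~0.03 at 1,200.
```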

Single kernel aflatoxin and fumonisin contamination distribution and spectral classification in commercial corn

Abstract

Aflatoxin and fumonisin contamination distribution in corn is non-homogeneous. Therefore, bulk sample testing may not accurately represent the levels of contamination. Single kernel analysis could provide a solution to these problems and lead to remediation strategies such as sorting. Our study uses extensive single kernel aflatoxin (AF) and fumonisin (FM) measurements to (i) demonstrate skewness, calculate weighted sums of toxin contamination for a sample, and compare those values to bulk measurements, and (ii) improve single kernel classification algorithm performance. Corn kernels with natural aflatoxin and fumonisin contamination (n = 864, from 9 bulk samples) were each scanned twice for reflectance across the ultraviolet–visible–near-infrared spectrum (304–1086 nm), then ground and measured for aflatoxin and fumonisin using ELISA. The single kernel contamination distribution was non-homogeneous: 1.0% of kernels (n = 7) had ≥20 ppb aflatoxin (range 0 – 4.2×10^5 ppb), and 5.0% (n = 45) had ≥2 ppm fumonisin (range 0 – 7.0×10^2 ppm). A single kernel weighted sum was calculated and compared to bulk measurements. The average difference in mycotoxin levels (weighted sum − measured bulk level; AF = 0.0 log(ppb), FM = 0.0 log(ppm)) indicated no systematic bias between the two methods, though with a considerable range of −1.4 to 0.7 log(ppb) for AF and −0.6 to 0.8 log(ppm) for FM. Algorithms were trained on 70% of the kernels to classify aflatoxin (≥20 ppb) and fumonisin (≥2 ppm), with the remaining 30% of kernels used for testing. For aflatoxin, the best-performing algorithm was a stochastic gradient boosting model with an accuracy of 0.83 (sensitivity (Sn) = 0.75, specificity (Sp) = 0.83) for both the training and testing sets. For fumonisin, penalized discriminant analysis outperformed the other algorithms, with a training accuracy of 0.89 (Sn = 0.87, Sp = 0.88) and a testing accuracy of 0.86 (Sn = 0.78, Sp = 0.87). The present study strengthens the foundations for single kernel classification of aflatoxin and fumonisin in corn and can be applied to high-throughput screening. It demonstrates the heterogeneous distribution of aflatoxin and fumonisin contamination at the single kernel level, compares bulk levels calculated from those data to traditional bulk tests, and uses a UV–Vis–NIR spectroscopy system to classify single corn kernels by aflatoxin and fumonisin level.

DOI: https://doi.org/10.1016/j.foodcont.2021.108393
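
The weighted-sum comparison described above can be illustrated with a short sketch: the mass-weighted mean of single-kernel toxin levels estimates what the same kernels would read if ground and tested in bulk. The kernel masses, contamination distribution, and bulk value below are hypothetical placeholders, not study data.

```python
# Sketch: mass-weighted single-kernel sum vs. a bulk measurement, in log
# units. All values are hypothetical.
import numpy as np

rng = np.random.default_rng(2)

n_kernels = 96
mass_g = rng.normal(0.30, 0.05, n_kernels).clip(0.15)  # kernel masses (g)
# Skewed single-kernel aflatoxin (ppb): most near zero, a few very high.
aflatoxin_ppb = rng.lognormal(mean=-2.0, sigma=2.5, size=n_kernels)

# Total toxin mass divided by total kernel mass estimates what the same
# kernels would read if ground together and tested as a bulk sample.
weighted_sum_ppb = (aflatoxin_ppb * mass_g).sum() / mass_g.sum()

bulk_measured_ppb = 1.2   # hypothetical ELISA result on the bulk sample
diff_log = np.log10(weighted_sum_ppb) - np.log10(bulk_measured_ppb)
print(f"weighted sum = {weighted_sum_ppb:.2f} ppb, "
      f"difference = {diff_log:+.2f} log(ppb)")
```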

Evaluation of the Impact of Skewness, Clustering, and Probe Sampling Plan on Aflatoxin Detection in Corn

Abstract

Probe sampling plans for aflatoxin in corn attempt to reliably estimate concentrations in bulk corn despite complications like skewed contamination distributions and hotspots. To evaluate and improve sampling plans, three sampling strategies (simple random sampling, stratified random sampling, and systematic sampling under U.S. GIPSA sampling schemes), three numbers of probes (5, 10, and 100, the last a proxy for autosampling), four clustering levels (1, 10, 100, and 1,000 kernels/cluster source), and six aflatoxin concentrations (5, 10, 20, 40, 80, and 100 ppb) were assessed by Monte Carlo simulation. The aflatoxin distribution was approximated by PERT and gamma distributions fitted to experimental aflatoxin data for uncontaminated and naturally contaminated single kernels. The model was validated against published data from repeated sampling of 18 grain lots contaminated with 5.8 to 680 ppb aflatoxin: all empirical acceptance probabilities fell within the range of simulated acceptance probabilities. Sensitivity analysis with partial rank correlation coefficients found acceptance probability to be more sensitive to aflatoxin concentration (−0.87) and clustering level (0.28) than to number of probes (−0.09) and sampling strategy (0.04). Comparison of operating characteristic curves indicates that all sampling strategies have similar average performance at the 20 ppb threshold (0.8 to 3.5% absolute marginal change), but systematic sampling has larger variability at clustering levels above 100. Taking extra probes improves detection (1.8% increase in absolute marginal change) when aflatoxin is spatially clustered at 1,000 kernels/cluster, but not when contaminated grains are homogeneously distributed. Therefore, taking many small samples, for example by autosampling, may increase sampling plan reliability. The simulation is provided as an R Shiny web app for stakeholders to evaluate grain sampling plans.

DOI: https://doi.org/10.1111/risa.13721
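
As a rough illustration of the simulation logic described above, the sketch below draws contaminated-kernel concentrations from a gamma distribution, groups contaminated kernels into clusters, and estimates the probability of accepting a 40-ppb lot at a 20-ppb threshold. All parameter values are illustrative assumptions, not the fitted PERT/gamma parameters from the paper; the point is only to show how clustering inflates acceptance probability.

```python
# Sketch: acceptance probability for probe sampling when contaminated
# kernels are gamma-distributed and spatially clustered. Illustrative
# parameters only.
import numpy as np

rng = np.random.default_rng(3)

def sample_probe(lot_ppb=40.0, kernels_per_probe=2000, cluster=1000,
                 mean_contaminated_ppb=20000.0):
    # Fraction of kernels that must be contaminated to give the lot mean.
    prev = lot_ppb / mean_contaminated_ppb
    # Approximate clustering: a probe draws whole "units" of `cluster`
    # kernels that are contaminated or clean together.
    n_units = max(kernels_per_probe // cluster, 1)
    contaminated = rng.random(n_units) < prev
    conc = rng.gamma(shape=2.0, scale=mean_contaminated_ppb / 2.0,
                     size=n_units) * contaminated
    # Probe reading = average concentration over all kernels in the probe.
    return conc.sum() * cluster / kernels_per_probe

def prob_acceptance(threshold_ppb=20.0, n_probes=5, n_iter=2_000, **kw):
    means = [np.mean([sample_probe(**kw) for _ in range(n_probes)])
             for _ in range(n_iter)]
    return np.mean(np.asarray(means) < threshold_ppb)

for cluster in (1, 100, 1000):
    pa = prob_acceptance(cluster=cluster)
    print(f"cluster = {cluster:>4} kernels: P(accept 40-ppb lot) ~ {pa:.2f}")
# Clustering makes probes miss hotspots entirely, so a contaminated lot
# is accepted far more often at 1,000 kernels/cluster than at 1.
```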

Enabling cost-effective screening for antimicrobials against Listeria monocytogenes in ham

Abstract

Ready-to-eat (RTE) meat products, such as deli ham, can support the growth of Listeria monocytogenes (LM), which can cause severe illness in immunocompromised individuals. The objectives of this study were to validate a miniature ham model (MHM) against the ham slice method and to screen antimicrobial combinations to control LM on ham using response surface methodology (RSM) as a time- and cost-effective high-throughput screening tool. The effect of nisin (Ni), potassium lactate-sodium diacetate (PLSDA), lauric arginate (LAG), lytic bacteriophage (P100), and ε-polylysine (EPL), added alone or in combination, was determined on the MHM over 12 days of storage. Results showed that the MHM accurately mimics the ham slice method: no statistical difference was found (p = 0.526) between the changes in LM cell counts on the MHM and on ham slices after 12 days of storage at 4°C for treated and untreated hams. The MHM was then used to screen antimicrobial combinations using a face-centered central composite design with three center points. The RSM was tested using a cocktail of five LM strains isolated from foodborne disease outbreaks. Three levels of the above-mentioned antimicrobials were used in combination for a total of 28 runs performed in triplicate, and the change in LM cell counts was determined after 12 days of storage at 4°C. All tested antimicrobials were effective in reducing LM cell counts on ham when added alone. A significant antagonistic interaction (p = 0.002) was identified by the RSM between LAG and P100, where this antimicrobial combination caused a 2.2 log CFU/g change in LM cell counts after 12 days of storage. Two interactions, Ni with EPL (p = 0.058) and Ni with P100 (p = 0.068), showed possible synergistic effects against LM on the MHM. Other interactions were clearly non-significant, suggesting additive effects. In future work, the developed MHM in combination with RSM can be used as a high-throughput method to analyze novel antimicrobial treatments against LM.

DOI: https://doi.org/10.4315/JFP-20-435
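
The interaction screening described above rests on fitting a second-order polynomial to the design runs. The sketch below fits such a model by least squares for two hypothetical coded factors (labeled LAG and P100 for concreteness); the design points and response values are synthetic, not the study's data, and a real analysis would use the full five-factor central composite design.

```python
# Sketch: least-squares fit of a second-order response surface with an
# interaction term, for two hypothetical coded factors. Synthetic data.
import numpy as np

rng = np.random.default_rng(4)

# Coded levels (-1, 0, +1) over 28 runs, mimicking a screening design,
# and a synthetic response: change in LM counts (log CFU/g).
x1 = rng.choice([-1.0, 0.0, 1.0], size=28)   # "LAG" level (coded)
x2 = rng.choice([-1.0, 0.0, 1.0], size=28)   # "P100" level (coded)
y = -1.0 - 0.8 * x1 - 0.6 * x2 + 0.5 * x1 * x2 + rng.normal(0, 0.2, 28)

# Quadratic model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

for name, b in zip(["b0", "LAG", "P100", "LAG:P100", "LAG^2", "P100^2"], coef):
    print(f"{name:>9}: {b:+.2f}")
# A positive interaction coefficient (less reduction than the main effects
# jointly predict) is what would flag antagonism, as reported for LAG x P100.
```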

Literature Review Investigating Intersections between US Foodservice Food Recovery and Safety

Abstract

Food waste is increasingly scrutinized due to the projected need to feed nine billion people in 2050. Food waste squanders many natural resources and occurs at all stages of the food supply chain, but its economic and environmental costs are highest at later stages because of the value and resources added throughout the supply chain. Food recovery is the practice of preventing surplus food from being disposed of in landfills. It provides new opportunities to use food that would otherwise be wasted, such as providing it to food-insecure populations. Previous research suggests that consumers are more willing to waste food when they perceive a food safety risk. Yet segments of the population act in contrast to conservative food safety risk management advice when food is free or heavily discounted. Therefore, food recovery and food safety may be competing priorities. This narrative review identifies the relationships between food recovery and food safety in their technical, regulatory, and social contexts, with a focus on US foodservice settings. The review identifies the additional steps in the foodservice process that stem from food recovery – increased potential for cross-contamination and hazard amplification due to temperature abuse – as well as the potential risk factors, transmission routes, and major hazards involved. This hazard identification step, the first step in formal risk assessment, could inform strategies to manage food safety hazards in recovery in foodservice settings. More research is needed to address the insufficient data and unclear regulatory guidelines that are barriers to implementing innovative food recovery practices in US foodservice settings.

DOI: https://doi.org/10.1016/j.resconrec.2020.105304

When to use one-dimensional, two-dimensional, and Shifted Transversal Design pooling in mycotoxin screening

Abstract

While complex sample pooling strategies have been developed for large-scale experiments with robotic liquid handling, many medium-scale experiments like mycotoxin screening by enzyme-linked immunosorbent assay (ELISA) are still conducted manually in 48- and 96-well plates. At this scale, the opportunity to save on reagent costs is offset by the increased costs of labor and materials and the risk of error introduced by increasingly complex pooling strategies. This paper compares one-dimensional (1D), two-dimensional (2D), and Shifted Transversal Design (STD) pooling to study whether pooling affects assay accuracy and experimental cost and to provide guidance on when a human experimentalist might benefit from pooling. We approximated mycotoxin contamination in single corn kernels by fitting statistical distributions to experimental data (432 kernels for aflatoxin and 528 kernels for fumonisin) and used experimentally validated Monte Carlo simulation (10,000 iterations) to evaluate assay sensitivity, specificity, reagent cost, and pipetting cost. Based on the validated simulation results, assay sensitivity remains 100% for all four pooling strategies, while specificity decreases as the prevalence level rises. Reagent cost could be reduced by 70% and 80% in 48- and 96-well plates, with 1D and STD pooling being the most reagent-saving, respectively. This reagent-saving effect holds only when the prevalence level is below 21% for 48-well plates and below 13% to 21% for 96-well plates. Pipetting cost rises by 1.3- to 3.3-fold for 48-well plates and 1.2- to 4.3-fold for 96-well plates, with 1D pooling by row requiring the least pipetting. Thus, it is advisable to employ pooling when the expected prevalence level is below 21% and when the likely savings of up to 80% on reagent cost outweigh the increased materials and labor costs of up to 4-fold more pipetting.

DOI: https://doi.org/10.1371/journal.pone.0236668
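
A minimal simulation of the 1D (by-row) pooling strategy discussed above is sketched below: each plate row is pooled and read once, and only rows whose pool exceeds a dilution-adjusted threshold are retested well by well. The contamination distribution and trigger rule are illustrative assumptions, not the validated parameters from the paper, but the sketch reproduces the qualitative break-even behavior: pooling saves assays at low prevalence and stops paying off as prevalence approaches roughly 21%.

```python
# Sketch: 1D (by-row) pooling on a 96-well plate with retesting of
# flagged rows. Contamination parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(5)

THRESHOLD_PPB = 20.0
ROWS, COLS = 8, 12          # 96 single-kernel extracts per plate

def simulate_plate(prevalence):
    # Mixture model: most kernels clean, a few lognormally contaminated.
    contaminated = rng.random((ROWS, COLS)) < prevalence
    conc = np.where(contaminated,
                    rng.lognormal(mean=5.0, sigma=1.5, size=(ROWS, COLS)),
                    0.0)
    # Pooling a row dilutes each kernel 12-fold, so the pool reads as the
    # row mean; flag a row if even one threshold-level kernel could be there.
    pool_reads = conc.mean(axis=1)
    flagged = pool_reads >= THRESHOLD_PPB / COLS
    assays = ROWS + flagged.sum() * COLS        # pool tests + retests
    # Sensitivity: positive kernels that sit in a flagged (retested) row.
    truly_pos = conc >= THRESHOLD_PPB
    sens = (truly_pos[flagged].sum() / truly_pos.sum()
            if truly_pos.sum() else 1.0)
    return assays, sens

for prev in (0.05, 0.21):
    runs = np.array([simulate_plate(prev) for _ in range(2_000)])
    print(f"prevalence {prev:.0%}: mean assays {runs[:, 0].mean():.0f} "
          f"(vs 96 individual), sensitivity {runs[:, 1].mean():.2f}")
```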

CRISPR-Based Subtyping Using Whole Genome Sequence Data Does Not Improve Differentiation of Persistent and Sporadic Listeria monocytogenes Strains

Abstract

The foodborne pathogen Listeria monocytogenes can persist in food-associated environments for long periods. To identify persistent strains, the subtyping method pulsed-field gel electrophoresis (PFGE) is being replaced by whole genome sequence (WGS)-based subtyping. It was hypothesized that analyzing specific mobile genetic elements, CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) spacer arrays, extracted from WGS data could differentiate persistent and sporadic isolates within WGS-based clades. To test this hypothesis, 175 L. monocytogenes isolates, previously recovered from retail delis, were analyzed for CRISPR spacers using CRISPRFinder. These isolates represent 23 phylogenetic clades defined by WGS-based single nucleotide polymorphisms, plus closely related sporadic isolates. In 174 of 175 isolates (99.4%), at least one array with at least one spacer was identified. The number of spacers in a single array ranged from 1 to 28. Isolates were grouped into 13 spacer patterns (SPs) based on observed variability in the presence or absence of whole spacers. SP variation was consistent with WGS-based clades, forming patterns of (i) one SP to one clade, (ii) one SP across many clades, (iii) many SPs within one clade, and (iv) many SPs across many clades. However, SPs did not differentiate persistent from sporadic isolates within any WGS-based clade. Overall, these data show that (i) CRISPR arrays are common in WGS data for these food-associated L. monocytogenes and (ii) CRISPR subtyping cannot improve the identification of persistent or sporadic isolates from retail delis. Practical Application: CRISPR spacer arrays are present in L. monocytogenes isolates, and CRISPR spacer patterns are consistent with previous subtyping methods. These mobile genetic elements cannot improve the differentiation between the persistent and sporadic L. monocytogenes isolates used in this study. While CRISPR-based subtyping has been useful for other pathogens, it is not useful for understanding persistence in L. monocytogenes. Thus, the food safety community may find uses for CRISPRs in other areas, but CRISPRs do not seem likely to improve the differentiation of persistent L. monocytogenes isolates from retail delis.

DOI: https://doi.org/10.1111/1750-3841.14426
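
The spacer-pattern grouping described above amounts to hashing each isolate's spacer presence/absence profile and checking whether any resulting group is purely persistent or purely sporadic. A minimal sketch of that bookkeeping step follows; the isolate names, spacers, and labels are hypothetical, and this is not the CRISPRFinder pipeline itself.

```python
# Sketch: grouping isolates into spacer patterns (SPs) by spacer
# presence/absence, then checking whether any SP separates persistent
# from sporadic isolates. All names and labels are hypothetical.
from collections import defaultdict

# (spacer set, persistence label) per isolate, e.g. parsed upstream
# from CRISPRFinder output.
isolates = {
    "deli01": ({"sp1", "sp2", "sp3"}, "persistent"),
    "deli02": ({"sp1", "sp2", "sp3"}, "sporadic"),
    "deli03": ({"sp1", "sp4"}, "persistent"),
    "deli04": ({"sp1", "sp4"}, "sporadic"),
}

patterns = defaultdict(list)
for name, (spacers, label) in isolates.items():
    patterns[frozenset(spacers)].append((name, label))

for i, (sp, members) in enumerate(
        sorted(patterns.items(), key=lambda kv: sorted(kv[0])), start=1):
    labels = {label for _, label in members}
    tag = "mixed" if len(labels) > 1 else next(iter(labels))
    print(f"SP{i}: {sorted(sp)} -> {[name for name, _ in members]} ({tag})")
# If every SP is "mixed" (as here), spacer patterns cannot distinguish
# persistent from sporadic isolates, mirroring the study's conclusion.
```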

Persistent and sporadic Listeria monocytogenes strains do not differ when growing at 37°C, in planktonic state, under different food associated stresses or energy sources

Abstract

Background: The foodborne pathogen Listeria monocytogenes causes the potentially lethal disease listeriosis. Within food-associated environments, L. monocytogenes can persist for long periods and increase the risk of contamination through its continued presence in processing facilities or other food-associated environments. Most research on phenotyping of persistent L. monocytogenes has explored biofilm formation and sanitizer resistance, with less data examining persistent L. monocytogenes' phenotypic responses to extrinsic factors, such as variations in osmotic pressure, pH, and energy source availability. It was hypothesized that isolates of persistent strains are able to grow, and to grow faster, under a broader range of intrinsic and extrinsic factors than closely related isolates of sporadic strains. Results: To test this hypothesis, 95 isolates (74 isolates of 20 persistent strains and 21 isolates of sporadic strains) from a series of previous studies in retail delis were grown at 37°C in (i) stress conditions: salt (0, 5, and 10% NaCl), pH (5.2, 7.2, and 9.2), and sanitizer (benzalkonium chloride; 0, 2, and 5 μg/mL), and (ii) energy sources: 25 mM glucose, cellobiose, glycogen, fructose, lactose, and sucrose; the original goal was to follow up with low-temperature experiments for treatments where significant differences were observed. Growth rate and ability to grow were determined for all 95 isolates using high-throughput OD600 growth curves. All stress conditions reduced isolate growth rates compared to the control (p < 0.05). In addition, growth varied by energy source. In chemically defined minimal media, there was a trend toward more isolates showing growth in all replicates with cellobiose (p = 0.052) compared to the glucose control, and fewer isolates were able to grow in glycogen (p = 0.02), lactose (p = 2.2 × 10^-16), and sucrose (p = 2.2 × 10^-16). Still, at least one isolate grew consistently in every replicate for each energy source. Conclusions: The central hypothesis was rejected, as there was no significant difference in growth rate or ability to grow between retail deli isolates of persistent and sporadic strains for any treatment at 37°C. Therefore, these data suggest that persistence is likely not determined by a phenotype unique to persistent strains grown at 37°C and exposed to extrinsic stresses or variation in energy sources.

DOI: https://doi.org/10.1186/s12866-019-1631-3
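
The growth-rate estimation referred to above (high-throughput OD600 growth curves) is commonly done by taking the steepest slope of ln(OD) over a sliding window. The sketch below applies that approach to a synthetic logistic growth curve; it is a generic illustration under stated assumptions, not the study's curve-fitting code.

```python
# Sketch: maximum specific growth rate as the steepest sliding-window
# slope of ln(OD600). The curve is synthetic logistic growth, not data.
import numpy as np

rng = np.random.default_rng(6)

t = np.arange(0, 24, 0.5)                 # hours
K, od0, mu_true = 1.2, 0.02, 0.6          # logistic parameters (assumed)
od = K / (1 + (K / od0 - 1) * np.exp(-mu_true * t))
od += rng.normal(0, 0.005, t.size)        # plate-reader noise

def max_growth_rate(t, od, window=8):
    # Fit a line to ln(OD) in each sliding window; keep the largest slope.
    ln_od = np.log(np.clip(od, 1e-4, None))
    slopes = [np.polyfit(t[i:i + window], ln_od[i:i + window], 1)[0]
              for i in range(t.size - window)]
    return max(slopes)

print(f"estimated mu_max = {max_growth_rate(t, od):.2f} /h "
      f"(true exponential-phase rate ~ {mu_true} /h)")
```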

Twenty-two years of U.S. meat and poultry product recalls: Implications for food safety and food waste

Abstract

The U.S. Department of Agriculture, Food Safety and Inspection Service maintains a recall case archive of meat and poultry product recalls from 1994 to the present. In this study, we collected all recall records from 1994 to 2015 and extracted the recall date, meat or poultry species implicated, reason for recall, recall class, and pounds of product recalled and recovered. Of the 1,515 records analyzed, the top three reasons for recall were contamination with Listeria, undeclared allergens, and Shiga toxin-producing Escherichia coli. Class I recalls (due to a hazard with a reasonable probability of causing adverse health consequences or death) represented 71% (1,075 of 1,515) of the total recalls. The amounts of product recalled and recovered per event were approximately lognormally distributed. The mean amounts of product recalled and recovered were 6,800 and 1,000 lb (3,087 and 454 kg), respectively (standard deviations, 1.23 and 1.56 log lb, respectively). The total amount of product recalled over the 22-year evaluation period was 690 million lb (313 million kg), and the largest single recall involved 140 million lb (64 million kg), 21% of the total. In every data category subset, the largest recall represented >10% of the total product recalled in that set. The amount of product recovered was known for only 944 recalls. In 12% of those recalls (110 of 944), no product was recovered; in the remaining recalls, the median recovery was 29% of the product. The number of recalls per year ranged from 24 to 150. Recall counts and amounts of product recalled did not increase regularly by year over the 22-year evaluation period, in contrast to the regular increase in U.S. meat and poultry production over the same period. Overall, these data suggest that (i) meat and poultry recalls were heavily skewed toward class I recalls, indicating a focus on food safety, (ii) the number of products and the amount of each product recalled were highly variable and did not increase over time, and (iii) the direct contribution of recalls to the food waste stream was driven by the largest recalls.

DOI: https://doi.org/10.4315/0362-028X.JFP-16-388
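
Because the abstract above reports recall amounts as approximately lognormal, a short sketch can illustrate how skewed such a distribution is. Treating the reported 6,800 lb as the log-scale central tendency with the reported 1.23 log lb spread is an assumption made here for illustration; the draws are simulated, not the archive data.

```python
# Sketch: how skewed an approximately lognormal recall-size distribution
# is. Central tendency and spread are taken from the abstract; treating
# them as log10-scale parameters is an assumption for illustration.
import numpy as np

rng = np.random.default_rng(7)

center_log_lb = np.log10(6800)   # reported mean amount recalled, log10(lb)
sd_log_lb = 1.23                 # reported standard deviation, log10(lb)

draws_lb = 10 ** rng.normal(center_log_lb, sd_log_lb, 100_000)

print(f"median recall: {np.median(draws_lb):>12,.0f} lb")
print(f"mean recall:   {draws_lb.mean():>12,.0f} lb  (heavy right tail)")
print(f"99th pct:      {np.percentile(draws_lb, 99):>12,.0f} lb")
# The arithmetic mean far exceeds the median: a handful of very large
# recalls dominate the total poundage, consistent with the abstract's
# observation that the largest recalls drive the waste-stream contribution.
```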