Single kernel aflatoxin and fumonisin contamination distribution and spectral classification in commercial corn

Abstract

Aflatoxin and fumonisin contamination distribution in corn is non-homogeneous, so bulk sample testing may not accurately represent contamination levels. Single kernel analysis could address this problem and support remediation strategies such as sorting. Our study uses extensive single kernel aflatoxin (AF) and fumonisin (FM) measurements to (i) demonstrate skewness, calculate weighted sums of toxin contamination for a sample, and compare those values to bulk measurements, and (ii) improve single kernel classification algorithm performance. Corn kernels with natural aflatoxin and fumonisin contamination (n = 864, from 9 bulk samples) were each scanned twice for reflectance across the ultraviolet–visible–near infrared spectrum (304 nm–1086 nm), then ground and measured for aflatoxin and fumonisin using ELISA. Single kernel contamination distribution was non-homogeneous, with 1.0% (n = 7) of kernels at ≥20 ppb aflatoxin (range 0–4.2×10^5 ppb) and 5.0% (n = 45) of kernels at ≥2 ppm fumonisin (range 0–7.0×10^2 ppm). A single kernel weighted sum was calculated and compared to bulk measurements. The average difference in mycotoxin levels (weighted sum − measured bulk levels; AF = 0.0 log(ppb), FM = 0.0 log(ppm)) indicated no systematic bias between the two methods, though with a considerable range of −1.4 to 0.7 log(ppb) for AF and −0.6 to 0.8 log(ppm) for FM. Algorithms were trained on 70% of the kernels to classify aflatoxin (≥20 ppb) and fumonisin (≥2 ppm), while the remaining 30% of kernels were used for testing. For aflatoxin, the best performing algorithm was a stochastic gradient boosting model with an accuracy of 0.83 (Sensitivity (Sn) = 0.75, Specificity (Sp) = 0.83) for both the training and testing sets. For fumonisin, penalized discriminant analysis outperformed the other algorithms, with a training accuracy of 0.89 (Sn = 0.87, Sp = 0.88) and a testing accuracy of 0.86 (Sn = 0.78, Sp = 0.87).
The present study improves the foundations for single kernel classification of aflatoxin and fumonisin in corn and can be applied to high-throughput screening. It demonstrates the heterogeneous distribution of aflatoxin and fumonisin contamination at the single kernel level, compares bulk levels calculated from those data to traditional bulk tests, and utilizes a UV–Vis–NIR spectroscopy system to classify single corn kernels by aflatoxin and fumonisin level.
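The weighted-sum idea above can be sketched in a few lines. This is a minimal illustration with invented toxin values and kernel masses (not the study's data or exact procedure), assuming the bulk estimate is simply the kernel-mass-weighted mean of single kernel levels:

```python
import numpy as np

def weighted_bulk_estimate(kernel_toxin_ppb, kernel_mass_g):
    """Mass-weighted mean toxin level across single kernels,
    used as an estimate of the bulk sample concentration."""
    kernel_toxin_ppb = np.asarray(kernel_toxin_ppb, dtype=float)
    kernel_mass_g = np.asarray(kernel_mass_g, dtype=float)
    return float(np.sum(kernel_toxin_ppb * kernel_mass_g) / np.sum(kernel_mass_g))

# One highly contaminated kernel dominates a mostly clean sample
toxin = [0.0] * 95 + [1.0, 2.0, 5.0, 50.0, 4.2e5]  # ppb per kernel (invented)
mass = [0.3] * 100                                  # g, equal kernel masses
print(weighted_bulk_estimate(toxin, mass))          # ≈ 4200.6 ppb
```

Because the distribution is heavily right-skewed, a single hot kernel can set the bulk estimate almost by itself, which is why bulk tests of subsamples can either miss or overstate contamination.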

DOI: https://doi.org/10.1016/j.foodcont.2021.108393

Evaluation of the Impact of Skewness, Clustering, and Probe Sampling Plan on Aflatoxin Detection in Corn

Abstract

Probe sampling plans for aflatoxin in corn attempt to reliably estimate concentrations in bulk corn despite complications like skewed contamination distributions and hotspots. To evaluate and improve sampling plans, three sampling strategies (simple random sampling, stratified random sampling, and systematic sampling with U.S. GIPSA sampling schemes), three numbers of probes (5, 10, 100, the last a proxy for autosampling), four clustering levels (1, 10, 100, 1,000 kernels/cluster source), and six aflatoxin concentrations (5, 10, 20, 40, 80, 100 ppb) were assessed by Monte Carlo simulation. Aflatoxin distribution was approximated by PERT and Gamma distributions fitted to experimental aflatoxin data for uncontaminated and naturally contaminated single kernels. The model was validated against published data from repeated sampling of 18 grain lots contaminated with 5.8–680 ppb aflatoxin. All empirical acceptance probabilities fell within the range of simulated acceptance probabilities. Sensitivity analysis with partial rank correlation coefficients found acceptance probability more sensitive to aflatoxin concentration (−0.87) and clustering level (0.28) than to number of probes (−0.09) and sampling strategy (0.04). Comparison of operating characteristic curves indicates all sampling strategies have similar average performance at the 20 ppb threshold (0.8–3.5% absolute marginal change), but systematic sampling has larger variability at clustering levels above 100. Taking extra probes improves detection (1.8% increase in absolute marginal change) when aflatoxin is spatially clustered at 1,000 kernels/cluster, but not when contaminated grains are homogeneously distributed. Therefore, taking many small samples, for example by autosampling, may increase sampling plan reliability. The simulation is provided as an R Shiny web app for stakeholders to use in evaluating grain sampling plans.
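The acceptance-probability calculation at the heart of such a simulation can be sketched as a minimal Monte Carlo in Python. The Gamma shape, prevalence, probe size, and function names here are illustrative assumptions, not the PERT/Gamma parameters or GIPSA schemes fitted and modeled in the study:

```python
import numpy as np

rng = np.random.default_rng(42)

def acceptance_probability(lot_mean_ppb, n_probes, kernels_per_probe=50,
                           threshold_ppb=20.0, prevalence=0.01, n_iter=2000):
    """Monte Carlo estimate of the probability a lot passes the threshold.
    Contaminated kernels draw from a right-skewed Gamma distribution scaled
    so the lot mean matches lot_mean_ppb; clean kernels are 0 ppb.
    (Illustrative parameters, not those fitted in the study.)"""
    accepted = 0
    n_kernels = n_probes * kernels_per_probe
    contaminated_mean = lot_mean_ppb / prevalence
    for _ in range(n_iter):
        contaminated = rng.random(n_kernels) < prevalence
        levels = np.zeros(n_kernels)
        # Gamma(shape=0.5) gives a heavily right-skewed single-kernel distribution
        levels[contaminated] = rng.gamma(0.5, contaminated_mean / 0.5,
                                         contaminated.sum())
        if levels.mean() < threshold_ppb:
            accepted += 1
    return accepted / n_iter

# More probes concentrate the sample mean around the true lot mean
print(acceptance_probability(5.0, n_probes=5))    # high for a 5 ppb lot
print(acceptance_probability(100.0, n_probes=5))  # low for a 100 ppb lot
```

Sweeping `lot_mean_ppb` over a grid of concentrations traces an operating characteristic curve for a given plan, which is how the strategies above were compared.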

DOI: https://doi.org/10.1111/risa.13721

Enabling cost-effective screening for antimicrobials against Listeria monocytogenes in ham

Abstract

Ready-to-eat (RTE) meat products, such as deli ham, can support the growth of Listeria monocytogenes (LM), which can cause severe illness in immunocompromised individuals. The objectives of this study were to validate a miniature ham model (MHM) against the ham slice method and to screen antimicrobial combinations to control LM on ham using response surface methodology (RSM) as a time- and cost-effective high-throughput screening tool. The effect of nisin (Ni), potassium lactate–sodium diacetate (PLSDA), lauric arginate (LAG), lytic bacteriophage (P100), and ε-polylysine (EPL), added alone or in combination, was determined on the MHM over 12 days of storage. Results showed the MHM accurately mimics the ham slice method, since no statistical difference was found (p = 0.526) in the change of LM cell counts between the MHM and ham slices after 12 days of storage at 4°C for treated and untreated hams. The MHM was then used to screen antimicrobial combinations using an on-face central composite design with three center points. The RSM was tested using a cocktail of five LM strains isolated from foodborne disease outbreaks. Three levels of each of the above-mentioned antimicrobials were used in combination, for a total of 28 runs performed in triplicate. The change in LM cell counts was determined after 12 days of storage at 4°C. All tested antimicrobials were effective at reducing LM cell counts on ham when added alone. A significant antagonistic interaction (p = 0.002) was identified by the RSM between LAG and P100, where this antimicrobial combination caused a 2.2 log CFU/g change in LM cell counts after 12 days of storage. Two interactions, between Ni and EPL (p = 0.058) and between Ni and P100 (p = 0.068), showed possible synergistic effects against LM on the MHM. Other interactions were clearly non-significant, suggesting additive effects. In future work, the developed MHM in combination with RSM can be used as a high-throughput method to analyze novel antimicrobial treatments against LM.
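The interaction screening that RSM performs can be illustrated with a toy second-order fit. The coded design points and response values below are fabricated for illustration (they are not the study's measurements), and only the two-factor interaction term is examined:

```python
import numpy as np

# Hypothetical coded design points (-1, 0, +1) for two antimicrobials
# and a fabricated response (change in log CFU/g) for illustration only.
x1 = np.array([-1, -1,  1,  1, 0, 0, 0])   # e.g. antimicrobial A level
x2 = np.array([-1,  1, -1,  1, 0, 0, 0])   # e.g. antimicrobial B level
y  = np.array([0.1, -1.9, -2.1, -1.8, -1.5, -1.4, -1.6])

# First-order model with an interaction term:
#   y = b0 + b1*x1 + b2*x2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2]).astype(float)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b12 = coef
print(f"interaction coefficient b12 = {b12:.2f}")
```

With negative main effects (each agent alone reduces counts), a positive interaction coefficient indicates antagonism: the combination reduces counts less than the sum of the individual effects would predict, which is the pattern the study reports for LAG and P100.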

DOI: https://doi.org/10.4315/JFP-20-435

Literature Review Investigating Intersections between US Foodservice Food Recovery and Safety

Abstract

Food waste is increasingly scrutinized due to the projected need to feed nine billion people by 2050. Food waste squanders many natural resources and occurs at all stages of the food supply chain, but its economic and environmental costs are highest at later stages due to the value and resources added throughout the supply chain. Food recovery is the practice of preventing surplus food from being disposed of in landfills. It provides new opportunities to utilize food that would otherwise be wasted, such as providing it to food-insecure populations. Previous research suggests that consumers are more willing to waste food when they perceive a food safety risk. Yet segments of the population act contrary to conservative food safety risk management advice when food is free or deeply discounted. Therefore, food recovery and food safety may be competing priorities. This narrative review identifies the technical, regulatory, and social relationships between food recovery and food safety, with a focus on US foodservice settings. The review identifies the additional steps in the foodservice process that stem from food recovery – increased potential for cross-contamination and hazard amplification due to temperature abuse – as well as the potential risk factors, transmission routes, and major hazards involved. This hazard identification step, the initial step in formal risk assessment, could inform strategies to best manage food safety hazards during recovery in foodservice settings. More research is needed to address the insufficient data and unclear regulatory guidelines that are barriers to implementing innovative food recovery practices in US foodservice settings.

https://doi.org/10.1016/j.resconrec.2020.105304

 

When to use one-dimensional, two-dimensional, and Shifted Transversal Design pooling in mycotoxin screening

Abstract

While complex sample pooling strategies have been developed for large-scale experiments with robotic liquid handling, many medium-scale experiments like mycotoxin screening by Enzyme-Linked Immunosorbent Assay (ELISA) are still conducted manually in 48- and 96-well plates. At this scale, the opportunity to save on reagent costs is offset by the increased costs of labor, materials, and risk of error caused by increasingly complex pooling strategies. This paper compares one-dimensional (1D), two-dimensional (2D), and Shifted Transversal Design (STD) pooling to study whether pooling affects assay accuracy and experimental cost and to provide guidance for when a human experimentalist might benefit from pooling. We approximated mycotoxin contamination in single corn kernels by fitting statistical distributions to experimental data (432 kernels for aflatoxin and 528 kernels for fumonisin) and used experimentally validated Monte Carlo simulation (10,000 iterations) to evaluate assay sensitivity, specificity, reagent cost, and pipetting cost. Based on the validated simulation results, assay sensitivity remains 100% for all four pooling strategies, while specificity decreases as the prevalence level rises. Reagent cost could be reduced by 70% and 80% in 48- and 96-well plates, with 1D and STD pooling being the most reagent-saving, respectively. This reagent-saving effect holds only when the prevalence level is < 21% for 48-well plates and < 13–21% for 96-well plates. Pipetting cost rises by 1.3–3.3-fold for 48-well plates and 1.2–4.3-fold for 96-well plates, with 1D pooling by row requiring the least pipetting. Thus, it is advisable to employ pooling when the expected prevalence level is below 21% and when the likely savings of up to 80% on reagent cost outweigh the increased materials and labor costs of up to 4-fold more pipetting.
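The trade-off between reagent savings and extra retesting in 1D (row) pooling can be sketched under a simplifying perfect-assay, no-dilution assumption; the study's simulation additionally models assay error, which is what drives the specificity loss at higher prevalence. The function name and grid layout below are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(7)

def one_d_pooling(sample_positive, n_cols):
    """1D (row) pooling: test each row pool once, then individually retest
    samples only in rows whose pool is positive. Returns the predicted
    positives and the number of tests used. Assumes a perfect assay with
    no dilution below the detection limit (simplifying assumptions)."""
    grid = sample_positive.reshape(-1, n_cols)
    row_pool_positive = grid.any(axis=1)
    n_tests = grid.shape[0] + int(row_pool_positive.sum()) * n_cols
    predicted = np.zeros_like(grid, dtype=bool)
    predicted[row_pool_positive] = grid[row_pool_positive]  # retests resolve each well
    return predicted.ravel(), n_tests

# 96 samples at ~5% prevalence
truth = rng.random(96) < 0.05
pred, n_tests = one_d_pooling(truth, n_cols=12)
print(f"tests used: {n_tests} vs 96 individual tests")
print(f"all positives found: {np.array_equal(pred, truth)}")
```

At low prevalence most row pools test negative, so far fewer than 96 tests are needed; as prevalence rises, more rows require individual retesting and the savings vanish, which mirrors the prevalence thresholds reported above.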

https://doi.org/10.1371/journal.pone.0236668

CRISPR-Based Subtyping Using Whole Genome Sequence Data Does Not Improve Differentiation of Persistent and Sporadic Listeria monocytogenes Strains

Abstract

The foodborne pathogen Listeria monocytogenes can persist in food-associated environments for long periods. To identify persistent strains, the subtyping method pulsed-field gel electrophoresis (PFGE) is being replaced by whole genome sequence (WGS)-based subtyping. It was hypothesized that analyzing specific mobile genetic elements, CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) spacer arrays, extracted from WGS data could differentiate persistent and sporadic isolates within WGS-based clades. To test this hypothesis, 175 L. monocytogenes isolates, previously recovered from retail delis, were analyzed for CRISPR spacers using CRISPRFinder. These isolates represent 23 phylogenetic clades defined by WGS-based single nucleotide polymorphisms, as well as closely related sporadic isolates. In 174/175 (99.4%) of isolates, at least one array with at least one spacer was identified. The number of spacers in a single array ranged from 1 to 28. Isolates were grouped into 13 spacer patterns (SPs) based on observed variability in the presence or absence of whole spacers. SP variation was consistent with WGS-based clades, forming patterns of (i) one SP to one clade, (ii) one SP across many clades, (iii) many SPs within one clade, and (iv) many SPs across many clades. However, SPs did not differentiate persistent from sporadic isolates within any WGS-based clade. Overall, these data show that (i) CRISPR arrays are common in WGS data for these food-associated L. monocytogenes and (ii) CRISPR subtyping cannot improve the identification of persistent or sporadic isolates from retail delis. Practical Application: CRISPR spacer arrays are present in L. monocytogenes isolates, and CRISPR spacer patterns are consistent with previous subtyping methods. These mobile genetic elements cannot improve the differentiation between the persistent and sporadic L. monocytogenes isolates used in this study.
While CRISPR-based subtyping has been useful for other pathogens, it is not useful for understanding persistence in L. monocytogenes. Thus, the food safety community may be able to use CRISPRs in other areas, but CRISPRs do not seem likely to improve the differentiation of persistent and sporadic L. monocytogenes isolates from retail delis.

https://doi.org/10.1111/1750-3841.14426

Persistent and sporadic Listeria monocytogenes strains do not differ when growing at 37°C, in planktonic state, under different food associated stresses or energy sources

Abstract

Background: The foodborne pathogen Listeria monocytogenes causes the potentially lethal disease listeriosis. Within food-associated environments, L. monocytogenes can persist for long periods and increase the risk of contamination through its continued presence in processing facilities or other food-associated environments. Most research on phenotyping of persistent L. monocytogenes has explored biofilm formation and sanitizer resistance, with less data examining persistent L. monocytogenes’ phenotypic responses to extrinsic factors, such as variations in osmotic pressure, pH, and energy source availability. It was hypothesized that isolates of persistent strains are able to grow, and to grow faster, under a broader range of intrinsic and extrinsic factors than closely related isolates of sporadic strains. Results: To test this hypothesis, 95 isolates (74 isolates of 20 persistent strains and 21 isolates of sporadic strains) from a series of previous studies in retail delis were grown at 37°C in (i) stress conditions: salt (0, 5, and 10% NaCl), pH (5.2, 7.2, and 9.2), and sanitizer (benzalkonium chloride; 0, 2, and 5 μg/mL) and (ii) energy sources: 25 mM glucose, cellobiose, glycogen, fructose, lactose, and sucrose; the original goal was to follow up with low-temperature experiments for treatments where significant differences were observed. Growth rate and ability to grow were determined for all 95 isolates using high-throughput OD600 growth curves. All stress conditions reduced isolate growth rates compared to the control (p < 0.05). In addition, growth varied by energy source. In chemically defined minimal media, there was a trend toward more isolates showing growth in all replicates using cellobiose (p = 0.052) compared to the control (glucose), and fewer isolates were able to grow in glycogen (p = 0.02), lactose (p < 2.2 × 10^−16), and sucrose (p < 2.2 × 10^−16). Still, at least one isolate was able to grow consistently in every replicate for each energy source. Conclusions: The central hypothesis was rejected, as there was no significant difference in growth rate or ability to grow between retail deli isolates of persistent and sporadic strains for any treatment at 37°C. Therefore, these data suggest that persistence is likely not determined by a phenotype unique to persistent strains grown at 37°C and exposed to extrinsic stresses or variation in energy sources.
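A common way to extract a maximum specific growth rate from high-throughput OD600 curves, and a reasonable stand-in for the kind of analysis described above, is a sliding-window linear fit to log-transformed OD. The window size, logistic curve, and sampling interval below are illustrative assumptions, not the study's exact fitting procedure:

```python
import numpy as np

def max_growth_rate(times_h, od600, window=4):
    """Maximum specific growth rate (1/h), estimated as the steepest slope
    of ln(OD600) over a sliding window of consecutive time points."""
    t = np.asarray(times_h, dtype=float)
    ln_od = np.log(np.asarray(od600, dtype=float))
    slopes = [np.polyfit(t[i:i + window], ln_od[i:i + window], 1)[0]
              for i in range(len(t) - window + 1)]
    return max(slopes)

# Simulated logistic growth curve read every 30 min (illustrative data)
t = np.arange(0, 12, 0.5)
od = 0.05 + 0.95 / (1 + np.exp(-(t - 6)))
print(f"mu_max ≈ {max_growth_rate(t, od):.2f} 1/h")
```

Applying this per well turns a plate of growth curves into one growth rate per isolate-condition pair, which can then be compared between persistent and sporadic strains.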

https://doi.org/10.1186/s12866-019-1631-3 

Twenty-two years of U.S. meat and poultry product recalls: Implications for food safety and food waste

Abstract

The U.S. Department of Agriculture, Food Safety and Inspection Service maintains a recall case archive of meat and poultry product recalls from 1994 to the present. In this study, we collected all recall records from 1994 to 2015 and extracted the recall date, meat or poultry species implicated, reason for recall, recall class, and pounds of product recalled and recovered. Of the 1,515 records analyzed, the top three reasons for recall were contamination with Listeria, undeclared allergens, and Shiga toxin-producing Escherichia coli. Class I recalls (due to a hazard with a reasonable probability of causing adverse health consequences or death) represented 71% (1,075 of 1,515) of the total recalls. The amounts of product recalled and recovered per event were approximately lognormally distributed. The mean amounts of product recalled and recovered were 6,800 and 1,000 lb (3,087 and 454 kg), respectively (standard deviations, 1.23 and 1.56 log lb, respectively). The total amount of product recalled in the 22-year evaluation period was 690 million lb (313 million kg), and the largest single recall involved 140 million lb (64 million kg), 21% of the total. In every data category subset, the largest recall represented >10% of the total product recalled in the set. The amount of product recovered was known for only 944 recalls. In 12% of those recalls (110 of 944), no product was recovered. In the remaining recalls, the median recovery was 29% of the product. The number of recalls per year ranged from 24 to 150. Recall counts and amounts of product recalled over the 22-year evaluation period did not regularly increase by year, in contrast to the regular increase in U.S. meat and poultry production over the same period.
Overall, these data suggest that (i) meat and poultry recalls were heavily skewed toward class I recalls, suggesting recalls were focused on improving food safety, (ii) the number of recalls and the amount of product recalled per event were highly variable but did not increase over time, and (iii) the direct contribution of recalls to the food waste stream was dominated by the largest recalls.
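The lognormal summary reported above (a central value in lb plus a standard deviation in log lb) can be reproduced from raw recall sizes as follows. The recall amounts below are invented for illustration, not the archive data, and the helper name is an assumption:

```python
import numpy as np

def lognormal_summary(pounds):
    """Summarize recall sizes on a log10 scale: report the geometric mean
    in lb and the sample standard deviation in log lb."""
    logs = np.log10(np.asarray(pounds, dtype=float))
    return 10 ** logs.mean(), logs.std(ddof=1)

# Hypothetical recall sizes in pounds (illustrative only)
recalls = [120, 800, 2_500, 6_000, 15_000, 40_000, 350_000, 140_000_000]
geo_mean_lb, sd_log_lb = lognormal_summary(recalls)
print(f"geometric mean ≈ {geo_mean_lb:,.0f} lb, SD = {sd_log_lb:.2f} log lb")
```

On a log10 scale the geometric mean, not the arithmetic mean, is the natural center; that is why a single 140-million-lb event can account for a fifth of the total poundage while barely shifting the typical recall size.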

https://doi.org/10.4315/0362-028X.JFP-16-388

Classification of aflatoxin contaminated single corn kernels by ultraviolet to near infrared spectroscopy

Highlights

Novel UV–Vis–NIR spectroscopy system built to scan single corn kernels in motion.
Random forest model classifies single kernels by aflatoxin level with high accuracy.
BGYF, discoloration, brokenness associated with aflatoxin contamination.

Abstract

Aflatoxin contamination in corn poses threats to consumer food safety and grower economic stability. Current industrial methods for aflatoxin management in corn focus on the bulk aflatoxin level, which can lead either to acceptance of lots containing contaminated corn kernels (a consumer food safety risk) or to rejection of lots of mostly harmless corn kernels (a grower economic loss). This dilemma may be resolved by using spectroscopy to classify single corn kernels. Hence, our research aims to investigate the potential of a custom-built UltraViolet–Visible–Near InfraRed (UV–Vis–NIR) spectroscopy system to classify single corn kernels by aflatoxin level. Single kernels from cobs inoculated with aflatoxin-producing Aspergillus flavus (240 kernels) and uninoculated cobs (240 kernels) were (i) scanned individually for reflectance from 304 nm to 1086 nm in increments of 0.5 nm, (ii) ground, and (iii) measured for aflatoxin by ELISA. Using the spectra and the aflatoxin concentrations, a random forest model was trained on 80% of the kernels to classify single corn kernels as above or below 20 ppb of aflatoxin and was tested on the remaining 20% of the kernels. Among the 480 kernels, 374 had <20 ppb of aflatoxin and 106 had ≥20 ppb of aflatoxin. The random forest model had a sensitivity of 87.1% and specificity of 97.7% in the training set and a sensitivity of 85.7% and specificity of 97.3% in the test set, higher than previous models where kernels were in motion and comparable to models where kernels were stationary. Spectral regions around 390, 540, and 1050 nm were found to be important for classification. This study demonstrated that the custom-built UV–Vis–NIR spectroscopy system has considerable potential for classifying single corn kernels by aflatoxin level while the kernels are in motion.
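The train/test workflow described above can be sketched with scikit-learn. The spectra below are synthetic stand-ins with a planted spectral signature; the wavelength grid, class prevalence, and model settings are assumptions for illustration, not the study's data or tuned model:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for single-kernel reflectance spectra (304-1086 nm at
# 0.5 nm would give 1565 points; a coarser grid keeps this sketch fast).
n_kernels, n_wavelengths = 480, 200
X = rng.normal(size=(n_kernels, n_wavelengths))
y = rng.random(n_kernels) < 0.22            # True = >=20 ppb aflatoxin
# Plant a weak spectral signature for contaminated kernels at a few bands
X[y, 30:33] += 1.5

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          stratify=y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
sensitivity = (pred & y_te).sum() / y_te.sum()
specificity = (~pred & ~y_te).sum() / (~y_te).sum()
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```

With real spectra, feature importances from the fitted forest are one way to surface influential regions such as the 390, 540, and 1050 nm bands reported above.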

https://doi.org/10.1016/j.foodcont.2018.11.037

A Review of the Methodology of Analyzing Aflatoxin and Fumonisin in Single Corn Kernels and the Potential Impacts of These Methods on Food Security

Abstract

Current detection methods for aflatoxin and fumonisin contamination used in the corn industry operate at the bulk level. However, the literature demonstrates that contamination with these mycotoxins is highly skewed, and bulk samples do not always accurately represent the overall contamination in a batch of corn. Single kernel analysis can provide an insightful level of analysis of aflatoxin and fumonisin contamination, as well as a possible remedy for the skewness present in bulk detection. Current literature describes analytical methods capable of detecting aflatoxin and fumonisin at the single kernel level, such as liquid chromatography, fluorescence imaging, and reflectance imaging. These methods could provide tools to classify mycotoxin-contaminated kernels and to study the potential co-occurrence of aflatoxin and fumonisin. Analysis at the single kernel level could address the skewness present in mycotoxin contamination detection and offer improved remediation through sorting, with potential impacts on food security and the management of food waste.

https://doi.org/10.3390/foods9030297

(This article belongs to the Special Issue Safeguarding the Global Food Supply: Advances in Mycotoxin Prevention, Surveillance and Mitigation)