Rethinking Food Storage – Part 1, by Anita Bailey

Most of us who read SurvivalBlog likely have some food – maybe a lot of food – stored up for the proverbial “rainy day” or other possible situations. Public interest in food storage has grown over the last few years. An internet search on “food storage” produces 1.6 billion results at this writing, and the term is being searched roughly once a minute, day and night. That’s a lot of interest. It’s not slowing down, either.

I’ve been storing up for well over five decades. When I started, I lived in a major city. Now I live rurally, on a farm. I continue the process because having food set back has been a life-and-stress-saver multiple times over the years. My storage process has changed significantly over time, too, as I explored and adapted to changes in my personal economy, the world situation, and my dietary preferences.

Now, food storage is changing again – for reasons that I never anticipated, and that I still can hardly believe are happening.

It started in 2020, with a little incident at home: a relative who had spent 40 years consuming and enjoying fresh onions suddenly had a horrific reaction after eating some – a severe burning sensation in his throat and stomach, as if he had chowed down on a plate of habanero peppers. It took a week of severe gastric effects for him to recover. His symptoms reappear whenever he tries anything with onion in it, but only with commercially-grown onions. Homegrown and wild onions don’t produce the same effect.

It’s unlikely that onions themselves changed substantially in a single year, especially when homegrown and wild ones haven’t. But the way commercial onions are grown, handled, stored, and sprayed has changed. The same is true of other produce. The use of particularly insidious preservative coatings by large corporations, funded by billionaires, is part of this picture.

But there’s something else: general conversation – among friends, with people in line at the supermarket, and in online chat forums – registers the sense that “food just doesn’t taste the same,” or “food doesn’t taste right,” or “I used to like it, but now it makes me feel awful.” Sometimes people can pinpoint a texture difference (more or less crunch than remembered, a too-thick or chalky consistency, a sticky or gummy feeling in the mouth), or perhaps a flavor that seems odd (slight bitterness, cloying sweetness, a lack of remembered flavor, unexpected blandness). Sometimes it’s hard to be specific; the food just seems to have changed.

One reason for that may be an FDA ruling issued in May 2020, during the height of the virus-response food shortage crisis. The ruling allowed food manufacturers to use “substitute” ingredients in their products when they could not obtain what was actually listed on the label, as long as the substitution made up 2% or less of the ingredients and the substitute “should” not be a common allergen (gluten or sulfites, for example). This was done in a claimed effort to ease food shortages, since relabeling products could have caused further food shipment slowdowns.

The rule remained in effect until November 8, 2023. In other words, from 2020 until then, manufacturers were selling products labeled as containing one thing even though they incorporated something else, with the substituted ingredients unknown to the consumer. That’s you and me. Spices, flavorings, emulsifiers, and other components of packaged food might not be what is on the label. Perhaps that “pure” olive oil contained canola. That “pure” honey may have included corn syrup. We just don’t know.

Ideally, the substituted ingredient “should” not be a common cause of health issues – but the ruling didn’t say it “must.” That’s a big difference. Might up to 2% of the wheat flour in a cracker be “substituted” with rice flour? Or cellulose fiber? Or cricket powder? That’s unknown. Since the rule continued until just a few months ago, it is likely that most of the products currently on supermarket shelves were manufactured under the “substitution” ruling.

The New “Salt”

Also in 2020, the FDA ruled that the populace was consuming too much salt in prepared foods, which it said might contribute to high blood pressure. While that concept can be questioned from multiple angles, the ruling allowed manufacturers to substitute potassium chloride for common table salt (sodium chloride) in such items as canned tuna and peas, cereals, and cheese. One allowance was that producers could call the chemical “potassium salt” rather than the more accurate “potassium chloride.” I suppose “salt” sounded friendlier than “chloride.”

My palate is not gourmet-level-sensitive, but I can certainly taste the difference between sodium chloride and potassium chloride. Potassium chloride, to me, tastes sharply bitter and sometimes metallic. Table salt does not. Substituting potassium chloride for table salt dramatically changes how foods taste to me.

Anyone who has made cheese, butter, or sauerkraut at home knows that sodium chloride doesn’t only contribute flavor. It also has some preservative action, slowing molds and helping to draw excess fluid out of the product, and it is used in fermenting other vegetables as well. Potassium chloride does not have the same action, though – so are manufacturers using something else to accomplish salt’s work? That is also unknown.

Antibiotics

Stranger still, the use of antibiotics in the food supply has been on an uptick. Sixty-nine percent of antibiotic use in the US is credited to the livestock industry, specifically hogs and cattle – not because of disease, but because the animals tend to grow faster when routinely dosed. There was a significant decline in agricultural antibiotic use from 2020 to 2022, but since then it has trended upward. Interestingly, national livestock numbers have declined at the same time (cattle numbers are near their lowest since the 1950s), which means more antibiotics are being spread across fewer animals – the dose per individual animal is actually higher than when herd numbers were elevated.

There is also routine use of natamycin, an antifungal antibiotic, in various products to prevent fungal growth. This includes cheese products such as shredded cheese, cream cheese and mixes, cottage cheese, sour cream, yogurt, and cheese slices. Natamycin is also used in packaged salad mixes, wines, juices, sausages, and other items. As with any drug, some people may be allergic to it. This makes me wonder how many people think they can’t eat cheese but are actually reacting to this particular antibiotic.

The main problem with antibiotics in the food supply is that they contribute to creating antibiotic-resistant diseases. The end result is that an infection may take more antibiotics or a longer dosing time to treat, or an antibiotic may no longer be effective against a specific disease at all. And taking more antibiotics, or taking them for longer, can contribute to multiple issues, including irritable bowel disease. About 35,000 people in the US die annually from drug-resistant infections, and an estimated 5 million around the world.

Few consumers know that imported foods are treated with ionizing radiation. Until a couple of years ago, I didn’t know either. That morning avocado toast, with the avocado imported from outside US borders, has likely been irradiated. The FDA indicates that irradiation is done to prevent the accidental importation of insects, fungi, and other plant diseases into the country – that’s why a US-grown avocado probably isn’t irradiated. However, irradiation is also used to preserve foods for long-term shelf life. In the US, radiation may be used on products including beef, pork, lobster, shrimp, crab, lettuce, spinach, poultry, potatoes, sprouting seeds, eggs, oysters, mussels, scallops, and spices of all kinds. On the plus side, foods labeled as “organic” cannot be irradiated.

Of course, the FDA also states that irradiated foods are safe to consume. Odds are good that we’ve all been unknowingly eating them in one form or another for quite some time. Those irradiated supermarket eggs, for example, may have been sitting unchanging in cold storage for weeks to months. Interestingly, ionizing radiation not only prevents potatoes from sprouting, it also effectively sterilizes the seeds in fruits.

This explains why some avocado or bean seeds used for kids’ school projects simply never sprout, even though they are babied along exactly as they should be. It used to be possible to grow your own blueberry plants from supermarket fruit, but now seeds from domestic organic blueberries have the best chance of actually sprouting. It also explains why a supermarket green pepper can sit on your countertop, unchanged, for weeks – even as a fresh one picked from your garden starts to dry out and wrinkle up almost immediately.

Adding to the list of ways our food supply has been manipulated, substituted, medicated, and irradiated is the presence of pesticides and herbicides in grain supplies. Many people are unhappy with genetically modified organisms (GMOs) used for food. It might surprise folks how heavily GMO corn is – some strains have had pesticide genes from the bacterium Bacillus thuringiensis (Bt) added to the plant’s genetic code. This must be done in a laboratory; it cannot be naturally bred into a plant. Once the Bt genes are in the genetic code, they will be passed on to future generations. Bt is used to prevent corn earworm and cutworm infestations, among other things. It’s also found in other stored grains.

Whenever you see a label stating that something in the product has been bioengineered, it means that it is GMO. The GMO ingredient may be in the oils, in corn products (corn syrup, for example), or even in the sweetener, such as sugar from GMO sugar beets.

One company found that over 90% of US corn is contaminated with GMO genetics, even varieties listed as organic and heirloom. Corn is wind-pollinated, so pollen can travel miles from a commercial GMO field to non-GMO plants in neighboring fields. The resulting corn may then carry GMO genes and pass them on to future generations, without the heirloom grower even knowing it’s happening.

(To be concluded tomorrow, in Part 2.)