How Bugs Turn Your Food Waste into Fuel

They are far from appetising, and out-of-date meat and dairy products may even trigger the gag reflex, but the food trimmings, leftovers and beyond-best-befores that go into your garbage are still high in energy. Even though they might be in the process of breaking down, your mouldy oranges, crusts of bread and rancid walnuts are still chock full of calorie-giving molecules such as sugars, starches, fats, proteins and organic acids. In the context of global warming we all want to be carbon neutral (or even good little carbon sinks), so the responsible thing to do is to capture and re-use all that trapped energy.

In the old days when we all lived on farms and when even urban dwellers had pigs and chickens to deal with slops, very little food waste was sent to landfill. At the very least, back in our grandparents’ time potato skins, gristle and the like were put on the compost heap to eventually be recycled back into the soil. At present we are throwing out worrying amounts of food, to the extent that even swivel-eyed Brexit politicians and the Pope are expressing concern. Most of the food waste we dump goes to landfill, meaning that the carbon that cost so much in terms of energy, water and other resources to incorporate into our munchables returns slowly to the atmosphere with nobody benefiting except the dumpsite decomposers (beetles, worms, fungi and bacteria) whose job it is to gobble up macromolecules and spit out water, carbon dioxide and a bit of methane and hydrogen sulphide.

What can be done to recapture the energy we are losing through discarding so much food? One excellent option is to convert it to biogas. And guess who are the stars in the story of the conversion of old soggy spinach and crusty peanut butter to methane? You guessed it, our old friends, bacteria. And not just one bacterium or one group of bacteria, but a whole bunch of them working in co-operative layers (called “trophic” layers). Here’s a helicopter view of the process, which takes place in a huge tank known as an anaerobic digester. Your green waste management company collects food waste from you and thousands of others and transports it to its site where, after some pre-processing, it is packed into the temperature-controlled digester. The bugs get to work on it immediately. The first group of bugs, hydrolysers, chop up the big molecules in the waste (e.g. starches) into smaller ones (e.g. sugars). These simpler molecules then become the foodstuff for acidogenic bacteria, who produce acidy compounds like those found in pickles. A third group of bugs works on the acids to form primarily acetic acid (found in vinegar), carbon dioxide and hydrogen. Finally, the methanogens (strictly speaking archaea rather than true bacteria) mop up the waste products of the other three groups and produce — you guessed it — methane. This gas is our biofuel and can be burned to produce heat or drive turbines that generate electricity. It’s not just waste disposal companies who are getting into anaerobic digestion: many factories now have their own on-site anaerobic digesters to convert their own waste into usable energy.
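For glucose, the simplest of the sugars liberated by the hydrolysers, the overall balance sheet of the digester is tidy: C6H12O6 → 3 CH4 + 3 CO2. A back-of-the-envelope Python sketch (the 50 MJ/kg heating value for methane is a round-number assumption) gives a feel for the energy locked up in food waste:

```python
# Overall anaerobic digestion of glucose: C6H12O6 -> 3 CH4 + 3 CO2
M_GLUCOSE = 180.16   # molar mass of glucose, g/mol
M_CH4 = 16.04        # molar mass of methane, g/mol
CH4_PER_GLUCOSE = 3  # mol methane per mol glucose

LHV_CH4 = 50.0       # assumed lower heating value of methane, MJ/kg (round figure)

def methane_yield(kg_glucose: float) -> float:
    """kg of methane produced per kg of glucose fully digested."""
    return kg_glucose * CH4_PER_GLUCOSE * M_CH4 / M_GLUCOSE

def energy_yield(kg_glucose: float) -> float:
    """MJ of heat available from burning that methane."""
    return methane_yield(kg_glucose) * LHV_CH4

print(f"{methane_yield(1.0):.3f} kg CH4 per kg glucose")
print(f"{energy_yield(1.0):.1f} MJ per kg glucose")
```

Real digesters work on a messy mix of fats, proteins and carbohydrates and never achieve complete conversion, so treat these figures as an upper bound.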

You may be wondering: why “anaerobic” digestion? Anaerobic means “in the absence of air”. More correctly, we are talking about the absence of oxygen. Aerobic digestion, carried out by aerobic bacteria, is basically like slow-burning your waste: you oxidise (burn) your macromolecules and generate heat, carbon dioxide and water, but no usable high-energy compounds such as methane are left behind by the process.

Finally, a special form of recycling food waste is the conversion of cooking oil to biodiesel. Companies such as Olleco collect used cooking oil from chippers, Chineses and caffs and, using a chemical process (no bacteria involved, unfortunately), turn what would be a potent building block for sewer fatbergs into fuel.

Waste not want not!


Some Like it Hot

Milk nowadays is a heck of a lot safer than it used to be. Before pasteurisation became widespread in the period following World War Two, you could have been taking your life into your hands by drinking a glass of raw milk. As well as a whole list of food-poisoning nasties (E. coli, Staphylococcus aureus, Salmonella and Campylobacter), raw milk also harboured the bacteria behind deadly diseases such as tuberculosis, brucellosis, diphtheria and scarlet fever. As the son of a couple both of whom were affected by tuberculosis as children, I have little time for the irresponsible quackery spouted by those who advocate the benefits of drinking raw milk (see this pseudoscientific article by Darina Allen, for example). Pasteurisation has saved and continues to save lives. Full stop. End of argument.

How does pasteurisation work? Simple: you apply enough heat to a liquid food product to bring about a reduction in the number of bugs it harbours. Pasteurisation is carried out for two reasons: to make a product safe; and to give it a longer shelf life. Beer is pasteurised, not to make it safe (beer is intrinsically safe because no food-poisoning organism studied to date has been able to grow in its high-alcohol, low pH and anaerobic environment) but to kill bacteria that would otherwise shorten its shelf life through the production of off-flavours, “wrong” mouthfeels, gushing or cloudiness. Milk is pasteurised mainly to render the product safe: none of the bugs referred to above make it through the pasteurisation process. But by no means does pasteurisation kill all the bacteria in milk. Milk is not a sterile product. That’s why it has a best-before date and why that distended carton you find in your fridge after you come back from your two weeks in Marbella reeks to high heaven when you gingerly take a whiff.
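Microbiologists put numbers on that “reduction” using decimal reduction times, or D-values: every D minutes at a given temperature cuts the population tenfold. A toy Python sketch, with invented illustrative figures rather than real regulatory values:

```python
def survivors(n0: float, minutes: float, d_value: float) -> float:
    """First-order thermal death: each D minutes cuts numbers tenfold."""
    return n0 * 10 ** (-minutes / d_value)

def log_reduction(minutes: float, d_value: float) -> float:
    """How many factors of ten a heat treatment removes."""
    return minutes / d_value

# Illustrative only: suppose a bug has D = 0.1 min at 72 degrees C and raw
# milk carries 1e5 cells/ml. An HTST hold lasts ~15 s (0.25 min).
n = survivors(1e5, 0.25, 0.1)   # 2.5-log reduction, ~316 cells/ml left
print(f"{log_reduction(0.25, 0.1):.1f}-log reduction, {n:.0f} cells/ml left")
```

Under these made-up numbers a 15-second hold delivers only a 2.5-log kill, which is exactly why pasteurised milk is safe but not sterile: a few logs of reduction still leaves survivors.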

The microbial lads and lassies who survive the standard pasteurisation applied to milk are referred to as “thermodurics”, from the Greek thermo (heat) and Latin durare (to last). There are six main genera of thermodurics found in milk: Bacillus, Micrococcus, Enterococcus, Lactobacillus, Corynebacterium and Streptococcus. The vast majority of the bugs belonging to these genera are not harmful to your health when present in low numbers. It is generally only when they multiply to such levels that their toxins reach threatening concentrations, or the number of cells itself crosses that threshold referred to as the “infectious dose”, that milk becomes dangerous from a food safety outlook. But nine hundred and ninety-nine thousand nine hundred and ninety-nine times out of a million your milk will smell so bad by then and look so strange (ropey, clotted, yellow, orange) that even one of The Walking Dead‘s finest would turn their nose up at it. Each of the organisms produces a distinctive off quality in milk as a function of its metabolism and the enzymes it secretes. Bacillus species cause “bitty” or “broken” cream and “sweet curdling”, while Lactobacillus will acidify your milk (these are, after all, the gals behind the fermentation that produces yoghurt).

While it is very, very rare these days for in-date pasteurised milk to be the cause of serious food poisoning, cases have been documented. Rigorous and all as dairies’ quality systems are, there is the odd one-in-a-million batch that escapes all the testing performed on milk. But this is nothing like the level of disease raw milk would be causing were it still widely consumed. To finish up with a quote from Wendie L. Claeys et al., whose paper “Raw or heated cow milk consumption: Review of risks and benefits” makes for sobering reading:

raw milk poses a realistic health threat due to a possible contamination with human pathogens. It is therefore strongly recommended that milk should be heated before consumption. With the exception of an altered organoleptic profile, heating (in particularly ultra high temperature and similar treatments) will not substantially change the nutritional value of raw milk or other benefits associated with raw milk consumption.

Microbiological Miasmata

A miasma as defined by the Oxford English Dictionary is “an unpleasant or unhealthy smell or vapour”. The word is of Greek origin and refers to a widely held belief that originated in the Middle Ages that noxious vapours or “night airs” were responsible for spreading disease. Miasmata (the plural of miasma) rose from rotting vegetation, soil and brackish water, causing every sort of disease from the common cold to bunions to bubonic plague. The zymotic theory behind miasmata reflected the best medieval medical thinking could offer to explain contagious disease in an era prior to the discovery of bacteria and viruses. Even a century and a half after Pasteur and his fellow pioneering microbiologists put miasmata to bed with germ theory there are still vestiges of belief in disease-causing vapours and airs. If I’d a penny for every time my mother complained of catching a cold from a draught or I heard an aul’ one wittering on about catching their death from the few raindrops landing on their sleeves I’d be able to afford a Gibson Chet Atkins guitar signed by Robert Smith himself!

While it is scientific heresy of an order deserving a visit from a modern-day lab-coat-wearing Torquemada to suggest that air is a disease-causing agent, a recent review by Govindaraj Dev Kumar and co-workers gave me pause for thought. In this excellent review the authors offer evidence for air as a means of carriage of disease-causing bacteria and viruses — or rather, the dust particles carried by the air act as agents of dispersal of microbes.

Imagine this scenario: you run a small cheese-making operation. Your production facility is state-of-the-art, with food-grade stainless steel surfaces and process equipment. Your staff are highly trained and possess all the necessary safe handling certificates. Their hygiene is impeccable. Your HACCP (Hazard Analysis and Critical Control Points) plan is excellent. Your raw material, milk, is of the highest standard — low total bacterial counts, low somatic cell counts. Your pasteurisers yield milk that meets the most stringent regulatory standards. But, inexplicably, from time to time you produce batches of cheese that fail your quality control. The cheese looks off, tastes off and has the wrong texture. It cannot be sold. You look for reasons why these batches fail. You find a wild Lactococcus strain has outgrown and displaced your starter strains and you scratch your head and wonder where the “bad” Lactococcus comes from. How has it entered your gleaming, space-age factory? Could it have come in on a stray draught? You examine the environment surrounding your factory. Et voilà — five hundred yards from your facility is a barn in which silage is stored. On certain mornings an easterly wind carries the aroma of ripening silage right to the factory door. As well as the smell, you deduce that the breeze is also bringing dust particles laden with bacteria. After a bit of classic investigative microbiology you find your wild Lactococcus growing in the silage and you prove that the dust that arrives at your factory also harbours the little mite. Your problem is solved by HEPA-filtering your factory’s air.

Govindaraj Dev Kumar et al‘s review stresses the importance of wind-driven (technically called “aeolian”) dust-mediated contamination of foodstuffs. While currently there is much focus on water quality for the irrigation and washing of fruits and vegetables, on the microbiological quality of the soil in which they are grown, and on the water used to wash down animal carcasses, the authors highlight the risks of dust-borne contamination of foods of plant and animal origin. They cite cases of tree-grown fruit being contaminated with enteric bacteria associated with ruminant faeces. How does cow doo-doo make its way on to trees? Cows can’t fly! Faeces dry out, break down into dust and are blown up into the branches. It’s probably not a good idea to have a cherry orchard beside a cattle ranch!

The authors break down the sources of dust-borne contamination into natural (soil, decaying vegetation, feral animal droppings) and man-made (manure-amended soils, silage, municipal sewage-based biosolids, composting and animal production facilities). The most prominent species found to be involved in dust-borne contamination of food to date include: Escherichia coli O157:H7, Salmonella, Listeria monocytogenes, Campylobacter and methicillin-resistant Staphylococcus aureus (MRSA). The speed and distance of wind-driven carriage of microbes are impressive: the authors cite one instance of Xanthomonas malvacearum being scattered across 40 hectares in 20 minutes by a single whirlwind. The most alarming phenomenon described in the review is the internal contamination of fruits by dust-borne bacteria. Bacteria made their way onto the fruit through dust landing on blossoms and were internalised as the fertilised flower’s ovary developed into a fruit. No washing will eliminate bacteria harboured on the inside of a fruit, and since much fruit goes uncooked the ingestion of a single contaminated orange or apple could introduce millions of pathogens into an unfortunate consumer’s gastrointestinal system.

In conclusion, there is cause for us microbiologists to start taking an interest in miasmata!

FISH-ing for Bacteria

Just as there are many ways to skin a cat, there are dozens of methods for looking for bacteria. Methods can be divided into a number of categories. There are slow methods, the foremost of which is the traditional “plate and wait” Petri dish-based plate count. There are rapid methods such as PCR or flow cytometry. There are bulk methods, which detect the metabolism or metabolic by-products of large (and undefined) numbers of bugs, the ATP bioluminescence assay being one of the best known. There are methods which allow you to count the numbers of individual cells present, including any number of microscopic methods. Then there are molecular methods, involving the detection of sequences of DNA or RNA. Most DNA-based methods would also be classified as rapid as they do not necessitate the growth of the bugs. If you are looking for fellas who like to take their time dividing (notorious examples include Mycobacterium tuberculosis, which causes tuberculosis, and Helicobacter pylori, which causes stomach ulcers), rapid methods for their detection are the holy grail.

One such rapid, molecular method is fluorescent in-situ hybridisation (FISH). Like all DNA or RNA detection methods, FISH is based on the chemistry of these macromolecules: the complementarity of pairs of the nucleotides that make up DNA or RNA. In the case of DNA, A (adenine) sticks to T (thymine), and G (guanine) sticks to C (cytosine). In RNA U (uracil) replaces T. During the great molecular biology revolution of the late 1970s and early 1980s scientists figured out that if they could synthesise short strands of DNA which were complementary to sequences in the chromosomes of cells, if they stuck a fluorescent tag onto the synthetic molecules and if they could get these tagged sequences into cells, then they could light up very specific spots on chromosomes. So, you have a sequence A-T-T-A-G-G-G-C on chromosome 22 that you want to detect. Perhaps it’s a mutation, perhaps a risk factor for a disease. You design a probe which runs T-A-A-T-C-C-C-G, stick a fluorescent molecule such as fluorescein isothiocyanate onto it, et voilà, if you can get your probe into the cell and focus on it using a fluorescence microscope, you are detecting your sequence of interest using FISH. You’re directly detecting a mutation or risk factor. Initially, FISH was used to detect mutations in human chromosomes and proved a great advance for the diagnosis of genetic conditions and certain types of cancers. It wasn’t long, though, before microbiologists got in on the act. They saw the benefits of a rapid, non-growth-based method. But microbiologists weren’t primarily interested in looking for mutations in their bugs — they were just looking for them.
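Those base-pairing rules translate directly into code. A toy Python sketch (the function names are my own, and real probe design also has to worry about melting temperature, secondary structure and strand orientation, none of which is handled here):

```python
# Watson-Crick base pairing for DNA
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(seq: str) -> str:
    """Base-by-base complement of a DNA sequence."""
    return "".join(COMPLEMENT[base] for base in seq)

def probe_for(target: str) -> str:
    """Probes are conventionally written 5'->3', i.e. the reverse complement."""
    return complement(target)[::-1]

target = "ATTAGGGC"
print(complement(target))  # pairs base-for-base with the target
print(probe_for(target))   # the same probe written 5'->3'
```

Real FISH probes are typically 15 to 30 nucleotides long and are checked against sequence databases for specificity; this sketch captures only the base-pairing logic.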

As the DNA code of more and more bacteria came to be known, microbiologists began using this data to design FISH probes specific to their species (or group of species) of interest. A whole new world was opened up to us. Because FISH does not require the cultivation of bacteria, bugs we had never succeeded in getting to grow on plates before lit up our microscopes and flow cytometers like the Fourth of July. Where no evidence had previously existed of the presence of certain strains in water, soil, desert sands, marine sludges, biofilms in food-processing environments, et cetera, FISH was showing the little critters in all their fluorescence-tagged glory. Bacteria were found in the most unexpected niches — weird and wonderful bacteria, for which we didn’t even have names. It is fair to say that FISH turned microbiology upside down, especially environmental microbiology. It is because of FISH and other molecular methods that we know that we microbiologists have managed to grow and study in the lab far less than 1% of all the species that are out there.

FISH is not, however, a panacea for slow and tedious growth-based microbiology. It is a difficult technique. Probe design must be done thoughtfully and meticulously. If your probe is too specific it may only stick to its target under very favourable conditions. If it’s too loose, you’ll get lots of false positives. Probes must enter the cell to stick to their DNA or RNA target. This means permeabilising the bacteria’s not inconsiderable outer layers. Some Gram-positive bugs are a dream to permeabilise. Other lads, with mucous layers and thick walls and extra membranes, are a nightmare. You also have to fix your cells (not in the sense of fixing your car, but in the sense of using a chemical such as formaldehyde to dehydrate them and cross-link their proteins so they maintain their contents and structure). Fixing kills cells, which means you have no idea whether the FISH-ed cell you are seeing on your microscope or cytometer was alive or dead at the time of sampling. Not good for food or water quality analysis. There are ways around this, but it adds an extra level of complexity to your analysis.

FISH, while not being a routine method for looking for bugs, has found a niche in environmental microbiology. It is a technique which has shown us that there are many more species in our surroundings than the lassies we were able to grow in the lab. In combination with other techniques such as cell sorting and single-cell sequencing, FISH will allow the capture, growth and genetic sequencing of specific bugs from any environment one might wish. The way things are going, FISH may be used some day to show that there is bacterial life on Mars!

Some Good Sporeformers

Sporeforming bacteria have a bad rep. The food industry hates them; there are entire conferences dedicated to the elimination of Bacillus and Clostridium from milk powders and infant formula. Doctors and hospital managers quake at reports of high levels of C. difficile in their hospitals. There’s nothing to instil more fear in the general public than talk of an anthrax outbreak, be it related to farm animals or bioterrorism. It is spores’ extreme resistance to all the usual treatments that we use to eliminate bacteria, combined with their ability to survive for years in a dormant state, that causes our fear and loathing of them. But we must not forget that, in the case of certain sporeformers, these properties can be harnessed by biotechnologists. Not all spores are villains.

The first sporeforming good ‘un that I came across in my study of microbiology all those years ago was Bacillus thuringiensis. Many readers will have encountered the terms “Bt” or “Bt proteins” or “Bt crystals”, and perhaps some of you will have applied a product containing these terms in its list of ingredients to a cabbage patch or a flower bed. B. thuringiensis spores contain protein crystals called δ-endotoxins which have insecticidal action and have been used for a hundred years as biocontrol agents against the caterpillars of moths and butterflies. You spray your field with a preparation of Bt spores. When the caterpillars get a-munching they ingest the spores, whose protein crystals then proceed to turn the poor wigglers’ insides into mush. The great thing about applying Bt spores to your land is that they persist in the soil for years. They don’t wash into the groundwater like conventional chemical pesticides, and they do not cause any damage to the environment or to the man or woman charged with applying them. Bt preparations are the insecticide of choice for organic farmers. Genetically modified crops containing the gene sequence for a Bt protein, and which therefore produce endogenous Bt protein, were a cause célèbre in the GM wars of the 1990s. I’m not going there!

Another triplet of sporeformers is being looked at as natural biocontrol agents in the same vein as Bt. Lysinibacillus sphaericus (formerly Bacillus sphaericus — these crazy taxonomists can’t leave any poor bug in peace!) is a mosquitocidal bacterium which harbours its insecticidal crystal in a mesh-like parasporal envelope and could be applied to swamplands close to towns and villages to reduce the impact of malaria-carrying mosquitoes. Imagine the lives that could be saved by such a simple and safe intervention. Pasteuria penetrans is a parasite of root-knot nematodes — pests to which are attributed 5% of annual global crop loss. Its spores have been commercialised as a biological control agent. A cousin of P. penetrans, P. ramosa, infects water fleas of the Daphnia genus; infected hosts are completely sterilised and have a reduced life span.

Now, it’s time to describe a more esoteric and technical use for bacterial spores. Spores of the genus Bacillus all display a protein, called BclA, on their outer surface, the exosporium. Genetic engineering of this protein — splicing a gene from another species onto the anchor region of BclA — allows scientists to stick any protein they want on the outside of these spores. What use is this, you may ask. Because of spores’ unique stability in the environment and inside living organisms, as well as their amenability to mass-production processes such as freeze drying and inclusion in “wet” formulations, bioengineered spores offer huge possibilities. Recombinant protein-decorated spores can be purified and used as microparticles for industrial, bioremediation or vaccine applications. Imagine a Bacillus spore genetically modified to present an antigen from the malaria parasite on its surface. You have a candidate for a malaria vaccine there. What’s more, an orally delivered vaccine (spores will survive passage through the low-pH environment of the stomach). Because it has been shown that enzymes retain their activity when tethered to the exosporium, GM Bacillus spores are ideal for a range of biocatalytic processes: the clean-up of oil slicks, waste water treatment or the manufacture of bioethanol.

Anthrax and Argi(nase)-bargy

I’ve previously referred to the evolutionary toing and froing between microbes and their hosts as an “arms race”. A microbe develops a sneaky new way to attack or evade its host. A couple of millennia later the host species evolves a way to overcome its foe. A new strain of the dastardly bug crops up, replete with a mutation that allows it to slink past its host’s beefed-up and shiny new defences. The host species finds a way to block this latest ruse. Et cetera, et cetera. And so the deadly dance escalates.

The more you learn about disease-causing microbes (or, to be fancy, pathogens) the more you come to appreciate the extent to which evolution has honed these suckers into becoming laser-guided killers. It would seem as if every single gene in their genome, every single structure from wall to flagellum — heck, every single molecule they possess — is purposed with doing damage to their host in order to win the ageless battle between pathogens and us decent folk.

Let’s take one example — that of Bacillus anthracis and one (just one of many!) of the weapons in its disease-causing arsenal (technically known as virulence factors): arginase. Everybody knows that B. anthracis is the causative agent of the deadly disease anthrax. It forms resistant spores that can persist in the environment for centuries and which can be freeze-dried and stored as a powder, making it the bug of choice for back room bio-terrorists. In a nutshell, B. anthracis infects the body by worming its way into cells, hiding out there, dividing, spreading to uninfected cells, and of course producing its lethal toxins. It (or evolution) has developed a myriad of ways to hide from or counteract the immune system. Arginase is just one of these.

One of the main cells charged with the elimination of B. anthracis and other invaders from the body is the macrophage. In Greek, macrophage means “big eater”, which pretty much encapsulates what these immune cells do — they travel around the body gobbling up bugs. But they don’t use teeth to masticate the microbial malfeasants — they use enzymes and chemicals to digest and obliterate them. One of the most effective chemicals employed by macrophages to scuttle invading bugs is nitric oxide, which acts on them in the same manner as a solar wind on a spaceship without deflector shields. Fizzle, fizzle, fizzle! Not very many bugs can withstand a blast of the old NO, and not very many bugs have developed a defence against it. Except, that is, B. anthracis.

Which is where arginase comes in. B. anthracis spores have significant amounts of the enzyme in their outer layer (the exosporium). The arginase competes with the macrophage’s nitric oxide synthase (the enzyme responsible for making NO) for its L-arginine substrate, snatching the cell’s raw materials before it gets the chance to stock up on the gas. No NO means the spores and the vegetative cells that germinate from them are spared a blast of solar wind and so have a higher chance of survival and reproduction. And killing us.
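The tug-of-war over L-arginine can be caricatured with a couple of lines of Michaelis-Menten kinetics. A toy Python sketch (every constant and function name below is invented purely for illustration; the real kinetic parameters of the two enzymes differ):

```python
def mm_rate(vmax: float, km: float, s: float) -> float:
    """Michaelis-Menten rate of a single enzyme at substrate concentration s."""
    return vmax * s / (km + s)

def nos_share(arginine: float, vmax_nos: float = 1.0, vmax_arg: float = 0.0,
              km: float = 5.0) -> float:
    """Fraction of the arginine flux captured by NO synthase when an
    arginase with the same (made-up) Km competes for the substrate pool."""
    nos = mm_rate(vmax_nos, km, arginine)
    arg = mm_rate(vmax_arg, km, arginine)
    return nos / (nos + arg)

# No arginase about: the macrophage gets every molecule of arginine...
print(nos_share(2.0))                # 1.0
# ...but a competitor with 5x the capacity grabs five-sixths of the flux,
# starving NO synthase of its raw material.
print(nos_share(2.0, vmax_arg=5.0))  # ~0.167
```

With equal Km values the arginine term cancels and the split of the flux comes down to the relative capacities (Vmax) of the two enzymes: a crude picture of how the spore’s arginase starves NO synthase of raw material.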

What better example of the immunological arms race could there be? One bug, one host, one defence mechanism (NO) and one way around it (arginase).

Staph aureus — An Undercover Bug

Bacteria have devised all kinds of means to evade detection, capture and destruction by the immune system. Some cover themselves in gloop to be able to fend off the phagocytes that would otherwise gobble them up (e.g. Mycobacterium tuberculosis), others hide out inside the very cells who are responsible for their elimination and use these cells to ferry them around the body (Listeria monocytogenes), while the subject of today’s blog, Staphylococcus aureus, prefers to disguise itself. Yes folks, you heard right, as well as being the perp behind a whole swathe of diseases from the common pimple to fatal bacteraemia, Staph aureus is partial to a bit of fancy dress.

We all know the basis of how the immune system works — the discrimination of self from non-self. Using a complex system of signal and detector molecules, the immune system can tell if a molecule or a cell (or an entire organ in the case of transplants) is of the body or not, a native or a foreign invader. Every second of every day, every square inch of our bodies is being monitored for the presence of a host of nasties that could see us six feet under faster than you could say immunoglobulin. So, the bacteria or viruses that make their way into the bloodstream via a cut or lesion are literally marked men: they are whooshing around frantically trying to evade the millions of molecules (antibodies mostly) and cells (T-cells, neutrophils, macrophages, among others) that recognise them as foreign and are capable of perforating, exploding, scoffing, splodging, splidging and generally eliminating them.

But what if a bacterium could disguise itself as a human cell? And not just any old human cell, but a cell of the very person it has just infected?

Well, enter Perp A, Mr Staph Aureus.

As part of the arms race between host and parasite, infectee and infecter, Mr Aureus has evolved a devilishly ingenious method of hiding from the body’s forces of law and order. Harnessing the very molecules whose job it is to specifically latch on to foreign invaders and mark them for elimination by phagocytes, Staph coats itself in antibodies (a group of molecular heat-seeking missiles) from our own bodies, so that to the cells responsible for telling friend from foe the bacterium looks as meek and harmless as one of the Flanders boys in Sunday school. And as a wolf dressed in sheep’s clothing Staph is at liberty to travel around the body replicating, spewing out toxins, and nibbling away at our tissues.

Staphylococcus aureus‘ strategy is highly effective. Scientists have shown that strains of the bug lacking the protein responsible for its disguise — the creatively named Protein A — do not persist in the bloodstream as long as their antibody-clad brethren. Among the hundreds of bacteria that prey on us humans, Staph is one of the most virulent and difficult to treat. We all know about the dangers of MRSA — methicillin-resistant Staphylococcus aureus — but there are many more iterations of Staph-associated disease: pneumonia, meningitis, osteomyelitis, endocarditis, toxic shock syndrome and sepsis. It is the most common cause of hospital-acquired infections and causes hundreds of thousands of deaths worldwide each year.

There is nothing in nature that man has not turned to good use. So potent a molecular magnet for antibodies is Staph aureus Protein A that biotechnologists use it to purify antibodies for commercial or research purposes.

The Complex Mixture of Bugs Behind Cider Making

Cider is an alcoholic beverage made from apples and, depending on country or region of origin, ranges in colour from clear amber to turbid brown-green. It is usually highly carbonated. The powerhouses of cider production in Europe are Ireland and Britain, France, Slovenia and Spain. While many might regard cider as the poor man of the fermentation world, lacking the subtleties associated with beer formulation such as the choice of malts and of which hops to use, when and in what quantities, the microbiology of the traditional cider fermentation is rivalled only by that of wine in its complexity. Indeed, many of the yeasts involved in the conversion of apple must (a fancy word for juice) into cider are also present in the context of winemaking, the source for these yeasts being the skins of apple and grape respectively.

Some of the yeasts in question are Brettanomyces anomalus, B. bruxellensis, Dekkera polymorphus, Hanseniaspora uvarum, Metschnikowia pulcherrima, Hansenula anomala, H. fermentas, H. guilliermondii, Saccharomycodes ludwigii and Saccharomyces cerevisiae (the yeast behind beer and wine’s primary fermentation). As with many fermentations where non-Saccharomyces yeasts form an important component of the microflora, the fermentation of a traditionally brewed cider is sequential, with one trophic group of organisms kicking things off and altering the must chemically until their own survival becomes compromised. Another layer then takes over the fermentation, using the metabolites produced by the first group as energy and nutrient sources, until they themselves are replaced.

In a study of the role of indigenous yeasts in traditional Irish cider, Morrissey et al. (2004) divided the fermentation into three broad phases. During the first phase, the fruit yeast phase, species that were shown to arise from the apples’ surface dominated the fermentation. Among these was H. uvarum, which accounted for 80% of the yeast cells during the fermentation’s short lag phase and which reached counts of six million per millilitre. As fermentation proceeded and other species took over, H. uvarum was no longer detectable after twelve days. S. cerevisiae became the dominant yeast during the second fermentation phase. This phase witnessed a rise in ethanol levels, with S. cerevisiae numbers reaching eight million cells per millilitre. Among the possible reasons for S. cerevisiae‘s dominance at the expense of other organisms were: its excretion of ethanol; oxygen depletion in the must; increasing levels of carbon dioxide in the must; its faster growth rate and outcompeting of other yeasts for nutrients and sugars; and the flocculation (clumping together) of non-Saccharomyces species. During the third, maturation phase, Dekkera/Brettanomyces species came to replace the others, accounting for 90% of the yeast by day twenty-two of the fermentation. These species were shown to enter the must from installations in the press house (termed “resident yeast” by the authors), and to a lesser extent from the surfaces of the apples, and to contribute greatly to the organoleptic quality (taste and flavour) of the finished product. This is why there are commonalities in the flavour profiles of traditional ciders and Belgian lambic beer, where Brettanomyces also plays an important role. Dekkera/Brettanomyces species are also encountered at this stage in French cider.

This traditional form of cider making incorporates neither temperature control nor deliberate inoculation. The organisms entering the must all come from the apples and from what could be considered the “conditioned” environment of the cider house. Temperatures vary widely during the fermentation, determining as well as being determined by the particular yeasts dominating the fermentation at any point in time. Because of the loose control the traditional cider brewer exerts over the process, quality control is compromised, and batch-to-batch variation is large. Additionally, seasonal variations in the microflora present on apples (as well as seasonal variations in ambient temperature) lend an extra layer of chance to the process. So, just as in winemaking, we could talk about a good vintage of a particular brand of traditionally made cider.


Morrissey, W.F., Davenport, B., Querol, A. and Dobson, A.D.W. (2004) ‘The role of indigenous yeasts in traditional Irish cider fermentations’, Journal of Applied Microbiology, 97, 647–655.


We often think that it is only we animals that have sophisticated defence mechanisms against the microbes that would like to make a meal of us. The mammalian immune system is highly complex, functioning as a co-ordinated network of different cell types and the molecules they produce, with the ability to learn and a memory function to boot, but the defences plants possess are no less complex and effective. Plants do not have a circulatory system as we mammals have — they have no blood or lymph to move soldier cells to zones under attack — but they do possess an array of molecules which could only be described as agents of chemical warfare to protect them from viruses, bacteria, moulds and insects.

One category of molecule the plant uses to defend itself is the lectin. Plant seeds, especially those of legumes, are packed with these. Your lectin is like a cross between a molecular limpet and superglue. Lectins recognise the sugar molecules out of which the capsids of viruses, the walls of bacteria and the exoskeletons of insects are made, and stick to them like barnacles. Many thousands of lectins working together form a glue around invading bugs, immobilising them and preventing their movement and growth. Some lectins will also enter the cells to which they have attached and work their toxic magic on them. Each lectin recognises a specific sugar group: wheat germ agglutinin sticks to N-acetyl-D-glucosamine and sialic acid; concanavalin A binds to α-D-mannosyl and α-D-glucosyl residues; peanut agglutinin has a liking for Galβ1-3GalNAcα1-Ser/Thr.
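The specificities listed above can be captured in a small lookup table. This is a sketch only: real lectins typically bind families of related glycans rather than one sugar, and the table contains just the three pairings mentioned in the text:

```python
# Mini lookup table of the lectin-ligand pairings quoted in the text.
# Sketch only: each lectin actually recognises a family of related glycans.

LECTIN_LIGANDS = {
    "wheat germ agglutinin": "N-acetyl-D-glucosamine and sialic acid",
    "concanavalin a": "α-D-mannosyl and α-D-glucosyl residues",
    "peanut agglutinin": "Galβ1-3GalNAcα1-Ser/Thr",
}

def ligand_for(lectin: str) -> str:
    """Look up a lectin's preferred sugar group, case-insensitively."""
    return LECTIN_LIGANDS.get(lectin.lower(), "not in this mini-table")

print(ligand_for("Concanavalin A"))
```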

As has been the case with other immunity molecules such as antibodies, biotechnologists have harnessed the specificity of lectins for their experimental systems. There is an array of bioassays based on the ability of lectins to bind to a specific pattern of carbohydrate (known as a ligand). In a complex system a researcher might wish to light up specific cells or molecules with a fluorescent lectin and so gain an insight into structure or function. I myself have used concanavalin A tagged with a fluorescent marker to bind with reasonable specificity to enteric bacteria, and so light them up for cytometry and microscopy. Many blood group antigens are recognised by lectins, a property that forms the basis of typing assays. It is very common for lectins to be used as tools for imaging proteins for microscopy. Just check out Thermo Fisher’s lectin catalogue if you want an idea of the extent to which lectins are used by research scientists.

Of course, human beings being what they are, some lectins have been put to nefarious uses. We are all familiar with ricin — a lectin from the seed of the castor oil plant and highly toxic. Less than 2 milligrams can see off the average person. Many governments over the years have developed weapons systems for delivering ricin to the battlefield or to unsuspecting civilian populations. During the Cold War, both blocs had stores of weaponised ricin. Thankfully, it has never been used in a wartime setting. Given that it is relatively easy to produce (compared to, say, botulinum toxin or synthetic nerve agents) and that castor beans are easy to acquire or grow (in fact, the castor oil plant is a common ornamental species), ricin is something of the agent of choice for terrorist organisations, death cults and disturbed would-be mass murderers.

Colony Forming Units

Microbiologists are always interested in knowing how many bacteria are present in a sample. You could say they’re obsessed with establishing how many invisible little bugs are growing in whatever system they happen to be studying. My job is to know how many nasties are in milk, its derivatives — whey powder, cheese, skim milk, et cetera — and the equipment used to process it. The number of bacteria per millilitre or gram or square centimetre is not a particularly easy thing to establish, and sometimes working out how to do so gives me sleepless nights. But that’s for another blog.

When microbiologists speak about levels of bacteria in food or water or soil we do not refer to X number of bacteria per gram: we use colony forming units (CFUs). The CFU has been used since the very start of analytical microbiology, and when you learn how the vast majority of quantitative assays are carried out by microbiologists, as well as something of the nature of microbial growth, you will understand just how rational it is to use CFUs. From the late nineteenth century, when agar-based solid media began to be used for growing bugs, and with the Petri dish becoming established as the format of choice for holding the media, microbiologists have observed that bacteria tend to form colonies on such nutritive solid surfaces.

A colony is simply the visible glob made up of billions of bacteria that arises from the rapid, ceaseless exponential growth of a single founding cell (or cluster of cells — more anon). If you put any* live microbial cell on a solid growth medium which contains the correct balance of salts, proteins, energy sources and vitamins, and incubate it at the bug’s optimal temperature with the right mix of gases (some bugs don’t like oxygen — the strict anaerobes), you will have a visible colony in a day or two. Now, some bugs are incredibly fussy, and it has taken the best part of a century to work out what media suit their growth (species of Mycobacterium are notoriously hard to grow), but for the majority of the bugs of interest to your average food microbiologist there are media which will have your lads forming colonies in double quick time.
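The arithmetic behind that overnight glob is worth seeing. A quick back-of-envelope calculation, assuming a 20-minute doubling time (typical of a fast grower like E. coli in rich media — many species divide far more slowly):

```python
# Back-of-envelope: how one cell becomes a visible colony overnight.
# The 20-minute doubling time is an assumption for a fast-growing species.

doubling_time_min = 20
hours = 12
doublings = hours * 60 // doubling_time_min   # 36 doublings in 12 hours
cells = 2 ** doublings                        # exponential growth from 1 cell

print(f"After {hours} hours: {cells:.2e} cells")  # tens of billions
```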

The discovery of colonial growth was a huge step forward for microbiology. Prior to this, the only way to count bugs was by eye. “But they’re microscopic!” I hear you say. Indeed and they are! Prior to the days of the Petri dish, you put your sample on a special gridded slide, stained the bugs as best you could, hunkered down over your microscope and got counting until your eyeball dried up! Not pleasant work. And not very accurate. We humans are subjective and tend to make mistakes. Was that a rod or a coccus or just a salt crystal? It is far, far easier to take a sample, say a swab from the surface of a meat counter, get the bacteria into suspension in water or a solution such as Ringer’s which has salts that will help keep the bugs alive, and put that suspension to grow on a dish. You come back the next day and count the colonies and you know how many bugs were in the sample. Simple, isn’t it?

Yes, but it comes with a proviso. While bacteria are single-celled organisms, not all bacteria exist as single cells. Some bugs grow in chains, others in clusters (Staphylococcus aureus is characterised by “bunch of grapes” clusters under microscopic examination), and others exist as pairs or triplets. And then there are biofilms: huge, sometimes macroscopic amalgamations of bacteria trapped in their extracellular secretions. So, when you get your sample of suspended bacteria and plate it out on a Petri dish you are generally not dealing with a uniform suspension of single cells. And when these lads land on the surface of the agar and begin to grow it can be anything from one cell to dozens that gives rise to your colony. Hence, to be strictly correct, each colony that has grown up is not evidence of the outgrowth of a single cell — it is evidence of the outgrowth of a colony forming unit. And the number that your microbiologist comes up with is an estimate, because they can never know how many bugs make up a colony forming unit.
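The plate-count arithmetic itself is simple. A minimal sketch of the standard calculation: CFU per millilitre of the original sample equals the colonies counted, multiplied by the dilution factor, divided by the volume plated. The numbers below are made up for illustration:

```python
# Standard plate-count arithmetic. Example figures are hypothetical.

def cfu_per_ml(colonies: int, dilution_factor: int, volume_plated_ml: float) -> float:
    """Estimate CFU per mL of the undiluted sample from one plate's count."""
    return colonies * dilution_factor / volume_plated_ml

# e.g. 42 colonies on a plate spread with 0.1 mL of a 1:10,000 dilution
# gives roughly 4.2 million CFU/mL in the original sample.
print(cfu_per_ml(42, 10_000, 0.1))
```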


*Not really any. Molecular studies have shown that perhaps less than 1% of microbial species have ever been grown on artificial media. It is really only the bacteria that are capable of growth on artificial media that we microbiologists know much about. But luckily, the ones of interest to us medically, environmentally and biotechnologically have tended to be amenable to our attempts to grow them. The rest of the microbial world is something resembling dark matter: we know it is there, but not very much else.