One Flash of Light, One Vision: Carrots, Colour and Chemistry

“White” light is made up of all the colours of the rainbow.

Sometimes you have one of those weeks when the universe seems to be determined to yell at you about a certain thing. That’s happened to me this week, and the shouting has been all about light and vision (earworm, anyone?).

I started the week writing about conjugated molecules and UV spectrometry for one project, was asked a couple of days ago if I’d support a piece of work on indicators for the RSC Twitter Poster Conference that’s happening on the 2nd–3rd of March, and then practically fell over a tweet by Dr Adam Rutherford about bacteria that photosynthesise from infrared light in a hydrothermal vent*.

Oh well, who am I to fight the universe?

Light is awesome. The fact that we can detect it is even awesome-er. The fact that we’ve evolved brains clever enough to build all sorts of machines to measure other kinds of light that our puny human eyes cannot detect is, frankly, astonishing.

The electromagnetic spectrum covers all the different kinds of light. (Image source)

Let’s start with some basics. You probably met the electromagnetic (EM) spectrum at some point in school. Possibly a particularly enthusiastic physics teacher encouraged you to come up with some sort of mnemonic to help you remember it. Personally I like Rich Men In Vegas Use eXpensive Gadgets, but maybe that’s just me.

The relevant thing here is that the EM spectrum covers all the different wavelengths of light. Visible light, the stuff that’s, well, visible (to our eyes), runs from about 400 to 700 nanometres.

A colour wheel: when light is absorbed, we see the colour opposite the absorbed wavelengths. (Image source)

Now, we need another bit of basic physics (and biology): we see light when it enters our eyes and strikes our retinas. We see colours when only certain wavelengths of light make it into our eyes.

So-called “white” light is made up of all the colours of the rainbow. Take one or more of those colours away, and we see what’s left.

For example, if something looks red, it means that red light made it to our eyes, which in turn means that, somewhere along the way, blue and green were filtered out.
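To make the absorbed-versus-perceived idea concrete, here’s a minimal Python sketch of the colour wheel as a lookup table. The wavelength bands and their complements below are approximate textbook values, not precise colour science:

```python
# A minimal sketch of the colour-wheel idea: given the wavelength a pigment
# absorbs, look up the (roughly) complementary colour we actually see.
# Band edges and complements are approximate textbook values.

ABSORBED_TO_PERCEIVED = [
    ((400, 430), "violet", "yellow-green"),
    ((430, 490), "blue", "orange"),
    ((490, 560), "green", "red"),
    ((560, 580), "yellow", "violet-blue"),
    ((580, 620), "orange", "blue"),
    ((620, 700), "red", "green"),
]

def perceived_colour(absorbed_nm):
    """Return (absorbed colour, perceived colour) for a wavelength in nm."""
    for (lo, hi), absorbed, perceived in ABSORBED_TO_PERCEIVED:
        if lo <= absorbed_nm < hi:
            return absorbed, perceived
    return None  # outside the visible range

# beta-carotene absorbs strongly around 450 nm (blue light)...
print(perceived_colour(450))  # -> ('blue', 'orange')
```

Feeding in 450 nm (roughly where β-carotene absorbs most strongly) gives “orange”, which is exactly the carrot story below.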

(Before I go any further, there are actually several causes of colour, but I’m about to focus on one in particular. If you really want to know more, there’s this book, although it is a tad expensive…)

Back to chemistry. Certain substances absorb coloured light. We know them as pigments. Carrots are orange, for example, largely because they contain a pigment called beta-carotene (or β-carotene). This stuff appears, to our eyes, as red-orange, and the reason for that is that it absorbs green-blue light, the wavelengths around 400-500 nm.

β-Carotene is a long molecule with lots of C=C double bonds. (Image source.)

Why does it absorb light at all? Well, β-carotene is a really long molecule, with lots of C=C double bonds. These bonds form what’s called a conjugated system. Without getting into the complexities of molecular orbital theory, that means the double bonds alternate with single bonds along the chain, and they basically overlap and… smoosh into one long thing. (Look, as the saying goes, “all models are wrong, but some are useful” – it’ll do for now.)

When molecules with conjugated systems are exposed to electromagnetic radiation, they absorb some of it. Specifically, they absorb in the ultraviolet region – the wavelengths between about 200 and 400 nanometres. Here’s the thing, though: those wavelengths are right next to the violet end of the visible spectrum – that’s why it’s called ultraviolet, after all.

Molecules with really long conjugated systems start to absorb in the coloured light region, as well. And because they’re absorbing violet and blue, possibly a smidge of green, they look… yup! Orangey, drifting into red.

So now you know why carrots are orange. Most brightly coloured fruit, of course, is that way to attract animals and birds to eat it, and thus spread its seeds. As fruit ripens, it usually changes colour, making it stand out better against green foliage and easier to find. This is the link with indicators that I mentioned at the start: many fruits contain anthocyanin pigments, and these often have purple-red colours in neutral-acidic environments, and yellow-green at the more alkaline end. In other words, the colour change is quite literally an indicator of ripeness.

But the bit of the carrot that we usually eat is underground, right? Not particularly easy to spot, and they don’t contain seeds anyway. Why are carrots bright orange?

Modern carrots are mostly orange, but purple and yellow varieties also exist.

Well, they weren’t. The edible roots of wild plants almost certainly started out as white or cream-coloured, as you might expect for something growing underground, but the carrots which were first domesticated and farmed by humans in around 900 CE were, most probably, purple and yellow.

As carrot cultivation became popular, orange roots began to appear in Spain and Germany in the 15th/16th centuries. Very orange carrots, with high levels of β-carotene, appeared from the 16th/17th centuries and were probably first cultivated in the Netherlands. Some have theorised that they were specially selected to honour William of Orange, but the evidence for this seems to be a bit slight. Either way, most modern European carrots do descend from a variety that was originally grown in the Dutch town of Hoorn.

In other words, brightly-coloured carrots are a mutation which human plant breeders selected for, probably largely for appearances.

But wait! There was an advantage for humans, too – even if we didn’t realise it straight away. β-carotene (which, by the way, has the E number E160a – many natural substances have E numbers, they’re nothing to be frightened of) is broken up in our intestines to form vitamin A.

Vitamin A is essential for good eye health.

Vitamin A, like most vitamins, is actually a group of compounds, but the important thing is that it’s essential for growth, a healthy immune system and – this is the really clever bit – good vision.

We knew that. Carrots help you see in the dark, right?

Hah. Well. The idea that carrot consumption actually improves eyesight seems to be the result of a World War II propaganda campaign. During the Blitz, the Royal Air Force had (at that time) new, secret radar technology. They didn’t want anyone to know that, of course, so they spread the rumour that British pilots could see exceptionally well in the dark because they ate a lot of carrots, when the truth was that those pilots were actually using radar.

But! It’s not all a lie – there is some truth to it! Our retinas, at the back of our eyes, have two types of light-sensitive cells. Cone cells help us distinguish colours, while rod cells help us detect light in general.

In those rod cells, a molecule called 11-cis-retinal combines with a protein called opsin to form rhodopsin. This is really light-sensitive. When it’s exposed to light it photobleaches (loses its colour and, temporarily, its ability to respond to light), but then regenerates. The regeneration takes about thirty minutes, and is a large part of the reason it takes a while for your eyes to “get used to the dark.”

Guess where 11-cis-retinal comes from? Yep! From vitamin A. Which is why one of the symptoms of vitamin A deficiency is night blindness. So although eating loads of carrots won’t give you super-powered night vision, it does help to maintain vision in low light.

Our brain interprets electrical signals as vision.

How do these molecules actually help us to see? Well, when rhodopsin is exposed to light, the molecule changes, which ultimately results in an electrical signal being transmitted along the optic nerve to the brain, which interprets it as vision!

In summary, not only is colour all about molecules, but our whole visual system depends on some clever chemistry. I told you chemistry was cool!

Just gimme fried chicken 😉

*Ah. I sort of ran out of space for the weird hydrothermal bacteria thing. At least one of the relevant molecules seems to be another carotenoid, probably chlorobactene. The really freaking amazing thing is that there seems to be an absorption at 775 nm, which is beyond red visible light and into the infrared region of the EM spectrum. Maybe more on this another day…

If you’re studying chemistry, have you got your Pocket Chemist yet? Why not grab one? It’s a hugely useful tool, and by buying one you’ll be supporting this site – it’s win-win! If you happen to know a chemist, it would make a brilliant stocking-filler! As would a set of chemistry word magnets!

Like the Chronicle Flask’s Facebook page for regular updates, or follow @chronicleflask on Twitter. Content is © Kat Day 2021. You may share or link to anything here, but you must reference this site if you do. If you enjoy reading my blog, and especially if you’re using information you’ve found here to write a piece for which you will be paid, please consider buying me a coffee through Ko-fi using the button below.

Want something non-sciency to distract you from, well, everything? Why not check out my fiction blog: the fiction phial.


Onerous ovens: why is cleaning the cooker such a chore?

As I write this, Thanksgiving – when most Americans traditionally cook a very large meal based around roast turkey – was a few days ago. Most Brits, and lots of other countries of course, have something similar coming up soon in the form of Christmas, and there are plenty of other celebrations around this time of year that feature cooking and food quite heavily.

Whatever your traditions, then, it’s a time when many of us frown critically at the dark, sticky depths of our oven and wonder if, perhaps, we should attempt to give it a clean. Or at least pay someone else to come and clean it.

Why is oven cleaning such a difficult and unpleasant job, anyway? It’s not that hard to clean other surfaces, is it? Why are ovens so particularly awful?

Well, to explain this, we first need to understand fats.

Fats vaporise during cooking.

Most of the grime in your oven is fat, combined with the carbonised remains of… something or other. The sorts of fats that are common in animal and plant products have boiling points around the 300 °C mark (animal fats typically having higher values than plant oils), but they start to form vapours at much lower temperatures, and certainly at typical cooking temperatures there’s plenty of vaporised oil around. Besides, under typical conditions most oils will “smoke” – i.e. start to burn – long before they get close to boiling.

We’re all familiar with the idea that fats don’t mix well with water, and herein lies the problem: all that fatty gloop that’s stuck to the inside of your oven just doesn’t want to come off with standard cleaning methods, particularly when it’s built up over time.

Can chemistry help us here? What are fats, chemically? Well, they’re esters. Which may or may not mean anything to you, depending on how much chemistry you can remember from school. But even if you don’t remember the name, trust me, you know the smell. In particular, nail polishes and nail polish removers contain the simple ester known as ethyl acetate, otherwise known as ethyl ethanoate. (Some people say this chemical smells like pear drops which… only really helps if you know what pear drops smell like. Look, it smells of nail polish, okay?)

Fats are esters (image source)

Anyway, the point is that esters have a particular sequence of atoms that has a carbon bonded to an oxygen, which is bonded to another carbon, which is in turn double-bonded to oxygen. This is a bit of a mouthful, so chemists often write it as COOC. In the diagram here, oxygen atoms are red while carbon atoms are black.

There are actually three ester groups in fat molecules – which explains why fats are also known as triglycerides.

In terms of general chemistry, esters form when a carboxylic acid (a molecule which contains a COOH group) reacts with an alcohol (a molecule that contains an OH group). And this is where it all starts to come together – honest – because you’ve probably heard of fatty acids, right? If nothing else, the words turn up in certain food additive names, in particular E471 mono- and diglycerides of fatty acids, which is really common in lots of foods, from ice cream to bread rolls.

Glycerol is a polyol — a molecule that contains several alcohol groups (image source)

Well, this reaction is reversible, and as a result fats (which are esters, remember) break up into fatty acids and glycerol – which is a polyol, that is, a molecule with several alcohol groups. Or, to look at it the other way around: fats are made by combining fatty acids with glycerol.

And the reason it’s useful to understand all this is that the way you break up esters, and therefore fat, is with alkalis. (Well, you can do it with acid, too, but let’s not worry about that for now.)

Strong alkalis break up fats in a chemical reaction called hydrolysis — the word comes from the Greek for water (hydro) and unbind (lysis) and so literally means “split up with water”. Humans have known about this particular bit of chemistry for a long time, because it’s fundamental to making soap. As I said a few months ago when I was banging on about hand-washing, the ancient Babylonians were making soap some 4800 years ago, by boiling fats with ashes – which works because alkaline compounds of calcium and potassium form when wood is burnt at high temperatures.
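As a rough illustration of the arithmetic involved: one triglyceride molecule reacts with three molecules of alkali, giving glycerol plus three soap molecules, and the “saponification value” of a fat is the number of milligrams of KOH needed to hydrolyse one gram of it. Here’s a quick sketch using tristearin, a typical animal-fat triglyceride, purely as an example:

```python
# Soap-making stoichiometry: one fat (triglyceride) molecule is
# hydrolysed by three molecules of alkali. The "saponification value"
# of a fat is the mg of KOH needed to hydrolyse 1 g of it. Tristearin
# is used here purely as an illustrative example.

M_TRISTEARIN = 891.5   # g/mol, C57H110O6
M_KOH = 56.1           # g/mol

def saponification_value(fat_molar_mass, moles_alkali_per_fat=3):
    """mg of KOH needed to fully hydrolyse 1 g of fat."""
    moles_fat = 1.0 / fat_molar_mass
    return moles_fat * moles_alkali_per_fat * M_KOH * 1000  # g -> mg

print(f"{saponification_value(M_TRISTEARIN):.0f} mg KOH per g of tristearin")
```

This lands close to the tabulated saponification value for tristearin (around 189 mg KOH/g), which is reassuring given how simple the sum is.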

The grime in ovens is mostly fat.

The really clever thing about all this is that two things are happening when we mix alkali with fat: not only are we breaking up the fat molecules, but also the substances they break up into are water-soluble (whereas fats, as I said at the start, aren’t). Which makes them much easier to clean away with water. Obviously this is the very point of soap, but it’s also handy when trying to get all that baked-on gunk off your oven walls.

Now, in theory, this means you could get some lye (aka sodium hydroxide, probably), smear it all over your oven and voilà. But I don’t recommend it, for a few reasons. Firstly, it’s going to be difficult to apply, since sodium hydroxide is mostly sold as pellets or flakes (it’s pretty easy to buy, because people use it to make soap).

Sodium hydroxide, sometimes called lye, is often sold in the form of pellets.

But, you say, couldn’t I just dissolve it in water and spray or spread it on? Yes, yes you could. But it gets really, really hot when you mix it with water. So you need to be incredibly careful. Because, and this is my next point, chemically your skin is basically fat and protein, and this reaction we’re trying to do on oven sludge works equally well on your skin. Only, you know, more painfully, and with scarring and stuff. In short, if you’re handling lye, wear good nitrile or vinyl gloves and eye protection.

Actually, regardless of how you’re cleaning your oven you should wear gloves and eye protection, because the chemicals are still designed to break down fats and so… all of the above applies. It’s just that specially-designed oven cleaners tend to come with easier (and safer) ways to apply them. For example, they might come as a gel which you can paint on, and/or with bags that you can put the racks into, and may also be sold with gloves and arm protectors (but rarely goggles – get some separately). They might also have an extra surfactant, such as sodium laureth sulfate, added to help with breaking down grease. The main ingredient is still either potassium hydroxide or sodium hydroxide, though.

So, are these cleaning chemicals dangerous? Well, possibly, but also not really, if you’re sensible.

As an aside, it makes me smile when I come across an article like this which talks about the “serious” chemicals in oven cleaners and more “natural” ways to clean your oven. The “natural” ways are invariably weak acids or alkalis such as lemon juice or baking soda, respectively. They’re essentially ineffective ways of trying to do exactly the same chemistry.

And okay, sure, the gel and the bag and so on in the modern kits are newer tech, but the strong alkali? Nothing more natural than that. As I said at the start, humans have literally been using it for thousands of years.

A point which really cannot be repeated enough: natural does not mean safe.

Fumes can be irritating to skin, eyes and lungs.

Speaking of which, you will get fumes during oven cleaning. Depending on the exact cleaning mixture involved, these will probably be an alkaline vapour, basically (haha) forming as everything gets hot. Such vapour is potentially irritating to skin, eyes and lungs, but not actually deadly toxic. Not that I recommend you stick your head in your freshly-scrubbed oven and inhale deeply, but you take my point. It might give food a soapy, possibly bitter, taste if you really overdo it (contrary to what’s stated in some textbooks, not all alkalis taste bitter – but do not experiment with this).

In short, if you’re cleaning your oven yourself: follow the manufacturer’s instructions, make sure your kitchen is well-ventilated, leave the oven door open for a while after you’ve finished and, to be really sure, give all the surfaces an extra wash down with plenty of water.

Put the cleaning off until January – after all, the oven’s only going to get dirty again.

And that’s… it, really. Whether you’re cleaning your own oven or getting someone else to do it for you, the chemistry involved is really, really old. And yes, the chemicals involved are hazardous, but not because they’re not “natural”. Quite the opposite.

Or you could just leave it. I mean, it’s only going to get dirty again when you cook Christmas dinner, right?


Sunshine, skin chemistry, and vitamin D

The UK is on the same latitude as Northern Canada (Image Source: Wiki Commons)

As I write this it’s the last day of September in the U.K., which means we’re well into meteorological autumn and summer is, at least here, a distant memory. The weather is cooler and the days are getting shorter. Soon, the clocks will go back an hour, and we’ll shift from BST (British Summer Time) to GMT (Greenwich Mean Time).

Seasons in the U.K. are particularly marked because of our northerly latitude. British weather tends to be fairly mild (thanks, Gulf Stream), and it’s easy to forget just how far north we are – but a quick look at a globe makes it clear: London is actually further north than most of the major Canadian cities, while the Polar Bear Provincial Park in Ontario is roughly on the same latitude as Scotland’s capital city, Edinburgh.

Yes, I hear you say, but what on Earth (hoho) does this have to do with chemistry?

Well, a clever little piece of chemistry happens in human skin, and, if you live in the U.K., it’s about to stop. At least, until next spring.

Some clever chemistry happens in human skin.

There’s a substance in your skin called 7-dehydrocholesterol (7-DHC). It is, as the name suggests, something to do with cholesterol (which, despite its bad press, is an essential component of animal cell membranes). In fact, 7-DHC is converted to cholesterol in the body, but it’s also converted to something else.

You will have heard of vitamin D. It helps us to absorb calcium and other minerals, and if children, in particular, don’t get enough it can lead to rickets – which leads to weak bones, bowed legs and stunted growth. Vitamin D deficiency has also been linked to lots of other health problems, including increased risk of certain cancers, heart disease, arthritis and even type one diabetes.

More recently, vitamin D has been linked to COVID-19. It’s estimated that around 80-85% of people who contract COVID-19 experience mild or no symptoms, while the rest develop severe symptoms and, even if they recover, may suffer life-altering after-effects for many months. Early data suggest that patients with low vitamin D levels are much more likely to experience those severe symptoms. There’s a plausible mechanism for this: vitamin D helps to regulate the immune system and, in particular, helps to reduce the production of cytokines.

It’s possible that having inadequate levels of vitamin D may increase your chances of a severe response to COVID-19.

Cytokines are small proteins which are important in cell signalling, but if the body starts to produce too many in response to a virus it can cause something called a cytokine storm, which can lead to organ failure and death.

It’s proposed that having the right levels of vitamin D might help to prevent such cytokine storms, and therefore help to prevent a severe COVID-19 response. This is all early stages, because everyone is still learning about COVID-19, and it may turn out to be correlation without causation, but so far it looks promising.

One thing you may not know is that vitamin D is, technically, misnamed. Vitamins are, by definition, substances which are required in small quantities in the diet, because they can’t be synthesised in the body.

But vitamin D, which is actually a group of fat-soluble molecules rather than a single substance, can be synthesised in the body, in our skin. The two most important members of the group are ergocalciferol (vitamin D2) and cholecalciferol (vitamin D3), sometimes known collectively as calciferol.

Shiitake mushrooms are a good source of vitamin D2.

Vitamin D2 is found in fungi, but it’s cleared more quickly from the body than D3, so needs to be consumed in some form daily. Mushrooms are a good source (especially if they’ve been exposed to UV light), so if you like mushrooms, that’s one way to go. Vitamin D3 is hard to obtain from diet – the only really good source is oily fish, although other foods are fortified – but that’s okay because, most of the time, we don’t need to eat it.

Which brings us back to 7-DHC. It’s found in large quantities in the skin, although exactly how it gets there has been the subject of some debate. It used to be thought it was formed from cholesterol via an enzymatic reaction in the intestine wall and then transported to the skin via the bloodstream. But the trouble with this idea is that the blood would pass through the liver, and 7-DHC would be reconverted to cholesterol, never having a chance to build up in skin. A more robust theory is it’s actually synthesised in the skin in the first place, particularly since higher levels are found in a layer closer to the surface (the stratum spinosum) than in the deeper dermis.

We make vitamin D in our skin when we’re exposed to UVB light from the sun.

Anyway, the important thing is that 7-DHC absorbs UV light, particularly wavelengths between 290 and 320 nm, that is, in the UVB range, sometimes called “intermediate” UV (in contrast with “soft” UVA and “hard” UVC). When exposed to UVB light, one of the rings in the 7-DHC molecule breaks apart, forming something known as pre-vitamin D3, which then converts (isomerises) to vitamin D3 in a heat-sensitive process.

In short, we make vitamin D3 in our skin when we’re in the sunshine. Obviously we need to avoid skin damage from UV light, but the process doesn’t take long: 10-15 minutes of midday sunlight three times a week, in the U.K. in the summer, is enough to keep our levels up.

Sun exposure is by far the quickest, and certainly the cheapest, way to get your vitamin D. If you live somewhere where that’s possible.

Here’s the thing, though: if you live in the U.K., for a chunk of the year, it’s just not possible. I’ve pinched the graph here from my husband, whose work involves solar panels, because it makes a nice visual point.

The amount of sunlight we’re exposed to in the U.K. drops sharply in autumn and winter.

From April to September, there’s plenty of energy available from sunlight. But look at what happens from October to March. The numbers drop drastically. And here’s the thing: it turns out that vitamin D production in human skin only occurs when UV radiation exceeds a certain level. Below this threshold? Well, no photoconversion takes place.

In short: if you live in the U.K. you can’t make vitamin D in your skin for a few months of the year. And those few months are starting… round about now.

The NILU has a web page where you can calculate how much vitamin D you can synthesise in your skin on a given day.

If you want to experiment, there’s a website here, published by the Norwegian Institute for Air Research (NILU), where you can enter various parameters – month, longitude, cloudiness etc – and it will tell you how many hours during a given day it’s possible to synthesise vitamin D in your skin.

Have a play and you’ll see that, for London, vitamin D synthesis drops off to zero somewhere around the end of November, and doesn’t restart until sometime after the 20th of January. In Edinburgh, the difference is even more marked, running from the first week or so of November to the first week of February.

It’s important to realise that it tails off, too, so during the days either side of these periods there’s only a brief period during midday when you can synthesise vitamin D. And all this assumes a cloudless sky which in this country… is unlikely.

The skin pigment, melanin, absorbs UVB. (Image Source: Wiki Commons)

The situation is worse still if you have darker skin because the skin pigment, melanin, absorbs UVB. On the one hand, this is a good thing, since it protects skin cells from sun-related damage. But it also reduces the ability to synthesise vitamin D. In short, wimpy autumn and winter sunshine just isn’t going to cut it.

Likewise, to state the obvious, anyone who covers their skin (with clothing or sunblock), also won’t be able to synthesise vitamin D in their skin.

Fortunately, there’s a simple answer: supplements. The evidence is fairly solid that vitamin D supplements increase blood serum levels as well as, if not better than, sunshine does – which, for the reasons mentioned above, can be difficult to obtain consistently.

Now, as I’ve said many times before, I’m not a medical doctor. However, I’m on fairly safe ground here, because Public Health England do actually recommend everyone take a vitamin D supplement from October to May. That is, from now. Yes, now.

I do need to stress one point here: DO NOT OVERDO IT. There always seems to be someone whose reasoning goes along the lines of, “if one tablet is good, then ten will be even better!” and, no. No. Excessive doses of vitamin D can cause vomiting and digestive problems, and can lead to hypercalcemia, which results in weakness, joint pain, confusion and other unpleasant symptoms.

If you live in the U.K. you should be taking a vitamin D supplement from October-May.

Public Health England recommend everyone in the U.K. take 10 micrograms per day in autumn and winter. Babies under one year should also be given 8.5–10 micrograms of vitamin D in the form of vitamin drops, unless they’re drinking more than 500 ml of infant formula a day (because that’s already fortified).

Amounts can get a little confusing, because there are different ways to measure vitamin D doses, and in particular you may see IU, or “international units“. However, if you buy a simple D3 supplement, like this one that I picked up at the supermarket, and follow the dose instructions on the label, you won’t go far wrong.
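For vitamin D the microgram/IU arithmetic is fixed (1 microgram = 40 IU), so a tiny converter is easy to sketch; the 10 microgram figure is the Public Health England dose mentioned above:

```python
# Vitamin D doses appear in both micrograms and IU ("international
# units"). For vitamin D the conversion is fixed: 1 microgram = 40 IU.

def micrograms_to_iu(micrograms):
    return micrograms * 40

def iu_to_micrograms(iu):
    return iu / 40

# The 10 microgram/day autumn-winter dose works out as:
print(micrograms_to_iu(10))  # -> 400 IU
```

So if a label says “400 IU”, that’s the same 10 microgram dose under a different name.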

So, should you (and everyone else in your family) be taking a simple vitamin D supplement right around now? If you live in the U.K., or somewhere else very northerly, then yes. Well, unless you’re really keen to eat mushrooms pretty much every day. At worst, it won’t make much difference, and at best, well, there’s a chance it might help you to avoid a really unpleasant time with COVID-19, and that’s got to be a good thing.

But, look, it’s not toilet roll. Don’t go and bulk buy vitamin D, for goodness sake.

Until next time, take care, and stay safe.


Lovely lollipops: the chemistry of sugary things

20th July is National Lollipop Day!

Today, July 20th, is apparently national lollipop day in the United States, and general news is… *waves hands* so it seems like a good excuse to write something with lots of pictures of brightly coloured sweets, right? Plus, sugar!

The idea of putting something sugary on a stick to hold and eat is an ancient one. The very earliest humans probably used sticks to collect honey from beehives. Later, the Chinese, Egyptians and people from the Middle East dipped fruits and nuts in honey and used sticks to make them easier to eat.

In the 17th century, boiled sugar sweets were made in England and, again, sticks inserted to make eating easier. This may be where the name “lollipop” originates, since “lolly” is a dialect word for tongue. Later, in the American Civil War era (early 1860s), some sources say hard candy was put on the tips of pencils for children. In the early 20th century an American named George Smith started making hard candies on sticks and, in 1931, trademarked the name lollipop — but he reportedly took the name from a racehorse named “Lolly Pop”.

Table sugar is sucrose

Enough history, let’s get to the chemistry! Lollipops are made of sugar, with added colours and flavours. I’ve talked about sugar before, and it’s always worth remembering that we tend to use the word rather loosely in everyday speech.

There’s more than one type of sugar: in particular, the three that are probably most familiar are glucose, fructose and sucrose. Glucose is a simple sugar, and the one you might remember from photosynthesis and respiration equations. It’s essential for life, and you quickly run into serious trouble if your blood glucose levels drop too low (just ask a diabetic).

Like glucose, fructose is a monosaccharide (the simplest form of sugar), and is often called “fruit sugar” because, guess what, it’s common in fruits. Sucrose is what we know as “table sugar” and is a disaccharide, made up of a unit of glucose joined to a unit of fructose. In the body, sucrose is broken up into glucose and fructose.

Rock candy is made from sucrose but, unlike in most lollipops and hard candy, the sugar is allowed to form large crystals

The primary ingredient in lollipops is usually sucrose, which can be persuaded (more on this in a minute) to set nicely to produce a hard, shiny surface. However, commercial lollipops often also include corn syrup, or glucose syrup, which contains oligosaccharides: larger sugar molecules made from a number of simple sugar molecules joined together. Typically, as the name “glucose syrup” might suggest, these molecules contain units of glucose.

It’s worth mentioning here that corn syrup/glucose syrup isn’t the same as “high fructose corn syrup” or HFCS, in which some of the glucose molecules have been converted into fructose. This product is cheap, sweet and commercially easy to use, but it’s also controversial. Excessive consumption has been linked to obesity and non-alcoholic fatty liver disease, although the actual evidence is weak: a systematic review in 2014 concluded that there was little evidence it was worse than other forms of sugar. It’s really a problem of quantity: it’s easy and cheap for food manufacturers to throw HFCS into foods and drinks, and of course it tastes delicious, so consumers end up eating too much of the stuff. In short: more water and fruit, less cake and fizzy drinks.

But having done the obligatory “eat healthily” thing, one lollipop isn’t going to hurt, is it? So back to that…

Fudge, perhaps surprisingly, contains the crystalline form of sugar

When it cools, sugar forms two different types of solid: crystalline and glassy (the latter sometimes described as an ‘amorphous solid’). Now, you might imagine that sugar as a crystalline solid is found in hard sweets/candies, but, no – it mostly turns up in soft things like fudge and fondant, which contain lots of very tiny crystals, giving an ever-so slightly granular texture. (An exception is rock candy, where the sugar is encouraged to form large crystals.)

The glassy amorphous form of sugar, on the other hand, can be literally like glass: hard, brittle, and transparent. In fact, “sugar glass” has in the past been used to make windows, bottles and so on for special effects in film and television, because it’s much less likely to cause injury than “real” glass. However, it’s very fragile and hygroscopic (meaning it absorbs water, causing it to soften over time) so these days it’s largely been replaced by synthetic resins.

Honey can be used as an inhibitor, to prevent crystallisation

The glassy amorphous form of sugar is achieved by starting with a 50% sugar solution which also contains an inhibitor, to prevent crystals forming spontaneously. Common inhibitors include the corn syrup I mentioned earlier, cream of tartar (potassium bitartrate), honey and butter.

Exactly which you use depends on the recipe, but they all do essentially the same thing, namely get in the way of the sucrose molecules and prevent them ordering themselves into a regular (crystalline) structure. The mixture is heated to a high temperature (about 155 °C) until almost all the water evaporates — the final candy will only have about 1-2% water — and then cooled until glass transition occurs.

At the glass transition point, the sugar mixture becomes solid.

This is the clever bit, and only happens if crystallisation is inhibited (else crystals form instead). Glass transition happens around 100-150 °C below the melting point of the pure substance. For example, the melting point of pure sucrose is 186 °C, but it undergoes glass transition at around 60 °C.
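As a quick back-of-the-envelope check, that rule of thumb is easy to sketch in a few lines of Python. (The 100-150 °C window is the rough figure quoted above, not a precise physical law, so treat this as an illustration only.)

```python
# Rule of thumb from the text: glass transition occurs roughly
# 100-150 degrees C below the pure substance's melting point.
def glass_transition_window(melting_point_c):
    """Return a (low, high) estimate for the glass transition, in deg C."""
    return melting_point_c - 150, melting_point_c - 100

low, high = glass_transition_window(186)  # pure sucrose melts at ~186 deg C
print(low, high)  # 36 86 -- and the observed ~60 deg C sits inside this window
```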

Glass transition is a reversible change, which we might (if I didn’t generally dislike the concept) call a physical change. It’s a change of phase, where the sugar mixture changes from liquid to solid, but it’s different from crystallisation, because instead of the molecules becoming more ordered, they simply ‘freeze’ in their random, liquid positions. (It is, for the record, annoyingly difficult to show this in diagram form.)

Amorphous solid structures are sometimes called “supercooled liquids”. This isn’t wrong, but personally I think it’s unhelpful (and can lead to nonsense about glass flowing very slowly over time). Once cooled and set, glass, whether window glass or sugar glass, is absolutely not a liquid; it’s a solid.

Of course, to make lollipops, all sorts of colours and flavours are added to the mixture as well, and sometimes more than one mixture is used to create intricate, layered effects. There are even medicinal lollipops which contain, for example, the powerful painkiller fentanyl — the idea being that the patient can administer the dose gradually as needed.

Which brings me to the end. Happy National Lollipop Day! My favourites are Chupa Chups — if you’ve enjoyed this, how about popping over to Ko-fi so I can stock up? And if you’ve been eating sweets, do remember to clean your teeth!

If you’re studying from home, have you got your Pocket Chemist yet? Why not grab one? It’s a hugely useful tool, and by buying one you’ll be supporting this site – it’s win-win!

Like the Chronicle Flask’s Facebook page for regular updates, or follow @chronicleflask on Twitter. Content is © Kat Day 2020. You may share or link to anything here, but you must reference this site if you do. If you enjoy reading my blog, and especially if you’re using information you’ve found here to write a piece for which you will be paid, please consider buying me a coffee through Ko-fi using the button below.

Want something non-sciency to distract you from, well, everything? Why not check out my fiction blog: the fiction phial.

Easy Indicators

Indicator rainbow, reproduced with kind permission of Isobel Everest, @CrocodileChemi1

Recently on Twitter CrocodileChemist (aka Isobel Everest), a senior school science technician (shout out to science technicians, you’re all amazing), shared a fabulous video and photo of a “pH rainbow”.

The effect was achieved by combining various substances with different pH indicators, that is, substances that change colour when mixed with acids or alkalis.

Now, this is completely awesome, but, not something most people could easily reproduce at home, on account of their not having methyl orange or bromothymol blue, or a few other things (that said, if you did want to try, Isobel’s full method, and other indicator art, can be found here).

But fear not, I’ve got this. Well, I’ve got a really, really simple version. Well, actually, I’ve got more of an experiment, but you could make it into more of a rainbow if you wanted. Anyway…

This is what you need:

  • some red cabbage (one leaf is enough)
  • boiling water
  • mug
  • white plate, or laminated piece of white card, or white paper in a punched pocket
  • cling film/clear plastic wrap (if you’re using a plate)
  • mixture of household substances (see below)
  • board marker (optional) or pen
  • plastic pipettes (optional, but do make it easier – easily bought online)

First, make the indicator. There are recipes online, but some of them are over-complicated. All you really need to do is finely chop the red cabbage leaf, put it in a mug, and pour boiling water over it. Leave it to steep and cool down. Don’t accidentally drink it thinking it’s your coffee. Pour off the liquid. Done.

If you use a plate, cover it with cling film

Next, if you’re using a plate, cover it with cling film. There are two reasons for this: firstly, cling film is more hydrophobic (water-repelling) than most well-washed ceramic plates, so you’ll get better droplets. Secondly, if you write on a china plate with a board marker it doesn’t always wash off. Ask me how I know.

Next step: hunt down some household chemicals. I managed to track down oven cleaner, plughole sanitiser, washing up liquid, lemon juice, vinegar, limescale remover and toilet cleaner (note: not bleach – don’t confuse these two substances, one is acid, one is alkali, and they must never be mixed).

Label your plate/laminated card/paper in punched pocket with the names of the household substances.

Place a drop of cabbage indicator by each label. Keep them well spaced so they don’t run into each other. Also, at this stage, keep them fairly small. Leave one alone as a ‘control’. On my plate, it’s in the middle.

Add a drop of each of your household substances and observe the colours!

Red cabbage indicator with various household substances

IMPORTANT SAFETY NOTE: some of these substances are corrosive. The risk is small because you’re only using drops, but if working with children, make sure an adult keeps control of the bottles, and they only have access to a tiny amount. Drip the more caustic substances yourself. Take the opportunity to point out and explain hazard warning labels. Use the same precautions you would use when handling the substance normally, i.e. if you’d usually wear gloves to pick up the bottle, wear gloves. Some of these substances absolutely must not be mixed with each other: keep them all separate.

Here’s a quick summary of what I used:

A useful point to make here is that pH depends on the concentration of hydrogen ions (H+) in the solution. The more hydrogen ions, the more acidic the solution is. In fact, pH is a log scale, which means a change of ×10 in hydrogen ion concentration corresponds to a change of one pH point. In short, the pH of a substance changes with dilution.

Compound Interest’s Cabbage Indicator page (click image for more info)

Which means that if you add enough water to an acid, the pH goes up. So, for example, although a concentrated ethanoic acid solution has a pH more like 2.4, dilute vinegar is probably closer to 3, or even a bit higher.
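The tenfold rule is simple to verify with a couple of lines of Python. This is a sketch for a strong, fully dissociated acid; for a weak acid like ethanoic acid, a tenfold dilution raises the pH by somewhat less than a full point, because the dissociation equilibrium shifts.

```python
import math

def pH(hydrogen_ion_conc):
    """pH from hydrogen ion concentration in mol/L."""
    return -math.log10(hydrogen_ion_conc)

print(pH(0.01))   # ~2.0
print(pH(0.001))  # ~3.0 -- a tenfold dilution raises the pH by one point
```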

Compound Interest, as is usually the case, has a lovely graphic featuring red cabbage indicator. You can see that the colours correspond fairly well, although it does look like my oven cleaner is less alkaline (closer to green) than the plughole sanitiser (closer to yellow).

As the Compound Interest graphic mentions, the colour changes are due to anthocyanin pigments. These are red/blue/purple pigments that occur naturally in plants, and give them a few advantages, one of which is to act as a visual ripeness indicator. For example, the riper a blackberry is, the darker it becomes. That makes it stand out against green foliage, so it’s easier for birds and animals to find it, eat it and go on to spread the seeds. Note that “unripe” colours, yellow-green, are at the alkaline end, which corresponds to bitter flavours. “Ripe” colours, purple-red, are neutral to acidic, corresponding with much more appealing sweet and tart flavours. Isn’t nature clever?

You can make a whole mug full of indicator from a single cabbage leaf (don’t drink it by mistake).

Which brings me to my final point – what if you can’t get red cabbage? Supermarkets are a bit… tricky at the moment, after all. Well, try with some other things! Any dark-coloured plant/fruit should work. Blueberries are good (and easy to find frozen). The skins of black grapes or the very dark red bit of a rhubarb stalk are worth a try. Blackberries grow wild in lots of places later in the year. Tomatoes, strawberries and other red fruits will also give colour changes (I’ve talked about strawberries before), although they’re less dramatic.

For those (rightly) concerned about wasting food – you don’t need a lot. I made a whole mug full of cabbage indicator from a single cabbage leaf, and it was the manky brown-around-the-edges one on the outside that was probably destined for compost anyway.

So, off you go, have fun! Stay indoors, learn about indicators, and stay safe.

EDIT: after I posted this, a few people tried some more experiments with fruits, vegetables and plants! Beaulieu Biology posted the amazing grid below, which includes everything from turmeric to radishes:

Image reproduced with kind permission of Beaulieu Biology (click for larger version)

And Compound Interest took some beautiful photos of indicator solutions extracted from a tulip flower, while CrocodileChemist did something similar and used the solutions to make a gorgeous picture of a tree. Check them out!


Want something non-sciency to distract you? Why not check out my fiction blog: the fiction phial. There are loads of short stories, and even (recently) a couple of poems. Enjoy!

Like the Chronicle Flask’s Facebook page for regular updates, or follow @chronicleflask on Twitter. Content is © Kat Day 2020. You may share or link to anything here, but you must reference this site if you do. If you enjoy reading my blog, please consider buying me a coffee through Ko-fi using the button below.

Marvellous Mushroom Science

Glistening ink caps produce a dark, inky substance.

Yesterday I had the fantastic experience of a “fungi forage” with Dave Winnard from Discover the Wild, organised by Incredible Edible Oxford. There are few nicer things than wandering around beautiful Oxfordshire park- and woodland on a sunny October day, but Dave is also an incredibly knowledgeable guide. I’ve always thought mushrooms and fungi were interesting – living organisms that are neither plants nor animals and which we rely on for everything from antibiotics to soy sauce – but I had lots to learn.

Did you know, for example, that fungi form some of the largest living organisms on our planet? And that without them most of our green plants wouldn’t have evolved and probably wouldn’t be here today?

And from a practical point of view, what about the fact that people once used certain fungi to light fires? I’ve always imagined fungi as being quite wet things with a high water content (unless they’re deliberately dried, of course), but some are naturally very dry. Ötzi, the mummified man thought to have lived between 3400 and 3100 BCE, was found with two types of fungus on him: birch fungus, which has antiparasitic properties, and a type of tinder fungus which can be ignited with a single spark and will smoulder for days.

Coprine causes unpleasant symptoms, including nausea and vomiting, when consumed with alcohol.

Then, of course, there’s all the interesting chemistry. Early on in the day, we came across some glistening ink caps. The gills of these disintegrate to produce a black, inky liquid which contains a form of melanin and can be used as ink. And there’s more to this story: as I’ve already mentioned, fungi are not plants and they can’t photosynthesise, but it seems that some fungi do use melanin to harness gamma rays as energy for growth. Extra mushrooms for the Hulk’s breakfast, then?

Moving away from pigments for a moment, a species related to the glistening ink cap, the common ink cap, contains a chemical called coprine. This causes lots of unpleasant symptoms if it’s consumed with alcohol, similar to disulfiram, the drug used to treat alcoholism. For this reason one of this mushroom’s other names is tippler’s bane. The coprine in the mushrooms effectively causes an instant hangover by accelerating the formation of acetaldehyde (also known as ethanal) from alcohol. Definitely don’t pair that mushroom omelette with a nice bottle of red. Worse, you’ll need to stay off the booze for a while afterwards: apparently the effects can linger for a full three days.

Yellow stainer mushrooms look like field mushrooms, but are poisonous.

We also came across some yellow stainer mushrooms. These look a lot like field mushrooms, but be careful – they aren’t edible. They cause nasty gastric symptoms and are reportedly responsible for most cases of mushroom poisoning in this country, although some people seem to be able to eat them without ill effect. They had a slightly chemical scent that reminded me of “new trainer” smell – sort of rubbery and plasticky. It’s often described as phenolic, but I have to say I didn’t detect that myself – although yellow stainers have been shown to contain phenol, and this could account for their poisonous nature. Anyway, it was an aroma that wouldn’t be entirely unpleasant if I were opening a new shoebox, but it wasn’t something I’d really want to eat. Apparently the smell gets stronger as you cook them, so don’t ignore what your nose is telling you if you think you have a nice pan of field mushrooms.

4,4′-Dimethoxyazobenzene is an azo dye.

The real giveaway with yellow stainers, though, is their tendency to turn yellow when bruised or scratched, hence the name. This, it seems, is due to 4,4′-dimethoxyazobenzene. The name might not be familiar, but A-level Chemistry students will recognise the structure: it’s an azo dye. Quite apart from being a very useful word in Scrabble, azo compounds are well-known for their characteristic orange/yellow colours. It’s not really clear whether it forms in the mushroom due to some sort of oxidation reaction, or whether it’s in the cells anyway but only becomes visible when the cells are damaged. Either way, it’s something to look out for if you spot a patch of what look like field mushrooms.

The blushing wood mushroom.

We also came across several species which are safe to eat. One I might look out for in future is the blushing wood mushroom. As is often the way with fungi, the name is literal rather than merely poetic. These mushrooms have a light brown cap, beige gills, and a pale stem, but they turn bright red when cut or scratched due to the formation of an ortho-quinone. It’s quite a dramatic colour-change, and makes them pretty easy to identify. Apparently they’re normally uncommon here, but we found quite a lot of them, which might be something to do with this year’s unusually hot and dry summer.

Red ortho-quinone causes blushing wood mushrooms to literally blush.

I tried to find out the reasons for these colour-changes. In the plant and animal kingdoms pigments are usually there for good reason: camouflage, signalling and communication or, as with chlorophyll, as a way of making other substances. Fruits, for example, often turn bright red as they ripen because it makes them stand out from the green foliage and encourages animals to eat them so that the seeds can be spread. Likewise, they’re green when they’re unripe because it makes them less obvious and less appealing. But what’s the advantage for the mushroom to change colour once it’s already damaged? Perhaps there isn’t one, and it’s just an accident of their biology, but if so it seems strange that it’s a feature of several species. I couldn’t find the answer; if any mycologists are reading this and know, get in touch!

Velvet shank mushrooms.

Other edible species we met were fairy ring champignons, field blewits and jelly ear fungus – which literally looks like a sort of transparent ear. I’ll definitely be looking out for all of these in the future, but it’s important to watch out for dangerous lookalikes. Funeral bell mushrooms, for example, look like the velvet shank mushrooms we found but, once again, the name is quite literal – funeral bells contain amatoxins and eating them can cause kidney and liver failure. As Dave was keen to remind us: never eat anything you can’t confidently name!

Like the Chronicle Flask’s Facebook page for regular updates, or follow @chronicleflask on Twitter. Content is © Kat Day 2018. You may share or link to anything here, but you must reference this site if you do.

If you enjoy reading my blog, please consider buying me a coffee through Ko-fi using the button below.

Spectacular Strawberry Science!

Garden strawberries

Yay! It’s June! Do you know what that means, Chronicle Flask readers? Football? What do you mean, football? Who cares about that? (I jest – check out this excellent post from Compound Interest).

No, I mean it’s strawberry season in the U.K.! That means there will be much strawberry eating, because the supermarkets are full of very reasonably-priced punnets. There will also be strawberry picking, as we tramp along rows selecting the very juiciest fruits (and eating… well, just a few – it’s part of the fun, right?).

Is there any nicer fruit than these little bundles of red deliciousness? Surely not. (Although I do also appreciate a ripe blackberry.)

And as if their lovely taste weren’t enough, there’s loads of brilliant strawberry science, too!

This is mainly (well, sort of, mostly, some of the time) a chemistry blog, but the botany and history aspects of strawberries are really interesting too. The woodland strawberry (Fragaria vesca) was the first to be cultivated in the early 17th century, although strawberries have of course been around a lot longer than that. The word strawberry is thought to come from ‘streabariye’ – a term used by the Benedictine monk Aelfric in CE 995.

Woodland strawberries

Woodland strawberries, though, are small and round: very different from the large, tapering fruits we tend to see in shops today (their botanical name is Fragaria × ananassa – the ‘ananassa’ bit meaning pineapple, referring to their sweet scent and flavour).

The strawberries we’re most familiar with were actually bred from two other varieties. That means that modern strawberries are, technically, a genetically modified organism. But no need to worry: practically every plant we eat today is.

Of course, almost everyone’s heard that strawberries are not, strictly, a berry. It’s true; technically strawberries are what’s known as an “aggregate accessory” fruit, which means that they’re formed from the receptacle (the thick bit of the stem where flowers emerge) that holds the ovaries, rather than from the ovaries themselves. But it gets weirder. Those things on the outside that look like seeds? Not seeds. No, each one is actually an ovary, with a seed inside it. Basically strawberries are plant genitalia. There’s something to share with Grandma over a nice cup of tea and a scone.

Anyway, that’s enough botany. Bring on the chemistry! Let’s start with the bright red colour. As with most fruits, that colour comes from anthocyanins – water-soluble molecules which are odourless, moderately astringent, and brightly-coloured. They’re formed from the reaction of similar-sounding molecules called anthocyanidins with sugars. The main anthocyanin in strawberries is callistephin, otherwise known as pelargonidin-3-O-glucoside. It’s also found in the skin of certain grapes.

Anthocyanins are fun for chemists because they change colour with pH. It’s these molecules which are behind the famous red-cabbage indicator. Which means, yes, you can make strawberry indicator! I had a go myself, the results are below…

Strawberry juice acts as an indicator: pinky-purplish in an alkaline solution, bright orange in an acid.

As you can see, the strawberry juice is pinky-purplish in the alkaline solution (sodium hydrogen carbonate, aka baking soda, about pH 9), and bright orange in the acid (vinegar, aka acetic acid, about pH 3). Next time you find a couple of mushy strawberries that don’t look so tasty, don’t throw them away – try some kitchen chemistry instead!

Pelargonidin-3-O-glucoside (callistephin) is the anthocyanin which gives strawberries their red colour. This is the form found at acidic pHs

The reason we see this colour-changing behaviour is that the anthocyanin pigment gains an -OH group at alkaline pHs, and loses it at acidic pHs (as in the diagram here).

This small change is enough to alter the wavelengths of light absorbed by the compound, so we see different colours. The more green light that’s absorbed, the more pink/purple the solution appears. The more blue light that’s absorbed, the more orange/yellow we see.
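That relationship can be sketched as a rough lookup table. The wavelength bands below are approximate illustrative values, not precise boundaries, and real solutions absorb over broad, overlapping ranges, so treat this as a toy model of the colour wheel:

```python
# Very rough visible-spectrum bands (nm) and the complementary colour we
# perceive when that band is absorbed. Approximate values, for illustration.
BANDS = [
    ((400, 430), "violet", "yellow"),
    ((430, 490), "blue",   "orange/yellow"),
    ((490, 560), "green",  "pink/purple"),
    ((560, 580), "yellow", "violet"),
    ((580, 620), "orange", "blue"),
    ((620, 700), "red",    "green"),
]

def perceived(absorbed_nm):
    """Return (colour absorbed, colour we see) for an absorbed wavelength."""
    for (low, high), absorbed, seen in BANDS:
        if low <= absorbed_nm < high:
            return absorbed, seen
    raise ValueError("outside the visible range (400-700 nm)")

print(perceived(530))  # ('green', 'pink/purple')
print(perceived(460))  # ('blue', 'orange/yellow')
```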

Interestingly, anthocyanins behave slightly differently to most other pH indicators, which usually acquire a proton (H+) at low pH, and lose one at high pH.

Moving on from colour, what about the famous strawberry smell and flavour? That comes from furaneol, which is sometimes called strawberry furanone or, less romantically, DMHF. It’s the same compound which gives pineapples their scent (hence that whole Latin ananassa thing I mentioned earlier). The concentration of furaneol increases as the strawberry ripens, which is why they smell stronger.

Along with menthol and vanillin, furaneol is one of the most widely-used compounds in the flavour industry. Pure furaneol is added to strawberry-scented beauty products to give them their scent, but only in small amounts – at high concentrations it has a strong caramel-like odour which, I’m told, can actually smell quite unpleasant.

As strawberries ripen their sugar content increases, they get redder, and they produce more scent

As strawberries ripen their sugar content (a mixture of fructose, glucose and sucrose) also changes, increasing from about 5% to 9% by weight. This change is driven by auxin hormones such as indole-3-acetic acid. At the same time, acidity – largely from citric acid – decreases.

Those who’ve been paying attention might be putting a few things together at this point: as the strawberry ripens, it becomes less acidic, which helps to shift its colour from more green-yellow-orange towards those delicious-looking purpleish-reds. It’s also producing more furaneol, making it smell yummy, and its sugar content is increasing, making it lovely and sweet. Why is all this happening? Because the strawberry wants (as much as a plant can want) to be eaten, but only once it’s ripe – because that’s how its seeds get dispersed. Ripening is all about making the fruit more appealing – redder, sweeter, and nicer-smelling – to things that will eat it. Nature’s clever, eh?

There we have it: some spectacular strawberry science! As a final note, as soon as I started writing this I (naturally) found lots of other blogs about strawberries and summer berries in general. They’re all fascinating. If you want to read more, check out…

Like the Chronicle Flask’s Facebook page for regular updates, or follow @chronicleflask on Twitter. All content is © Kat Day 2018. You may share or link to anything here, but you must reference this site if you do.

If you enjoy reading my blog, please consider buying me a coffee (I might spend it on an extra punnet of strawberries, mind you) through Ko-fi using the button below.


Where did our love of dairy come from?

The popularity of the soya latte seems to be on the rise.

A little while ago botanist James Wong tweeted about the myriad types of plant ‘milk’ that are increasingly being offered in coffee shops, none of which are truly milk (in the biological sense).

This generated a huge response, probably rather larger than he was expecting from an off-hand tweet. Now, I’m not going to get into the ethics of milk production because it’s beyond the scope of this blog (and let’s keep it out of the comments? — kthxbye) but I do want to consider one fairly long thread of responses which ran the gamut from ‘humans are the only species to drink the milk of another animal’ (actually, no) to ‘there’s no benefit to dairy’ (bear with me) and ending with, in essence, ‘dairy is slowly killing us‘ (complicated, but essentially there’s very little evidence of any harm).

Humans have been consuming dairy products for thousands of years.

But wait. If dairy is so terrible for humans, and if there are no advantages to it, why do we consume it at all? Dairy is not a new thing. Humans have been consuming foods made from one type of animal milk or another for 10,000 years, give or take. That’s really quite a long time. More to the point (I don’t want to be accused of appealing to antiquity, after all), keeping animals and milking them is quite resource intensive. You have to feed them, look after them and ensure they don’t wander off or get eaten by predators, not to mention actually milk them on a daily basis. All that takes time, energy and probably currency of some sort. Why would anyone bother, if dairy were truly detrimental to our well-being?

In fact, some cultures don’t bother. The ability to digest lactose (the main sugar in milk) beyond infancy is quite low in some parts of the world, specifically Asia and most of Africa. In those areas dairy is, or at least has been historically, not a significant part of people’s diet.

But it is in European diets. Particularly northern European diets. Northern Europeans are, generally, extremely tolerant of lactose into adulthood and beyond.

Which is interesting because it suggests, if you weren’t suspicious already, that there IS some advantage to consuming dairy. The ability to digest lactose seems to be a genetic trait. And it seems it’s something to do, really quite specifically, with your geographic location.

Which brings us to vitamin D. This vitamin, which is more accurately described as a hormone, is a crucial nutrient for humans. It increases absorption of calcium, magnesium and phosphate, which are all necessary for healthy bones (not to mention lots of other processes in the body). It’s well-known that a lack of vitamin D leads to weakened bones, and specifically causes rickets in children. More recently we’ve come to understand that vitamin D also supports our immune system; deficiency has been meaningfully linked to increased risk of certain viral infections.

What’s the connection between vitamin D and geographic location? Well, humans can make vitamin D in their skin, but we need a bit of help. In particular, and this is where the chemistry comes in, we need ultraviolet light. Specifically, UVB – light with wavelengths between 280 nm and 315 nm. When our skin is exposed to UVB, a substance called 7-dehydrocholesterol (7-DHC to its friends) is converted into previtamin D3, which is then changed by our body heat to vitamin D3, or cholecalciferol – which is the really good stuff. (There’s another form, vitamin D2, but this is slightly less biologically active.) At this point the liver and kidneys take over and activate the cholecalciferol via the magic of enzymes.

We make vitamin D in our skin when we’re exposed to UVB light.

How much UVB you’re exposed to depends on where you live. If you live anywhere near the equator, no problem. You get UVB all year round. Possibly too much, in fact – it’s also linked with skin cancers. But if you live in northerly latitudes (or very southerly), you might have a problem. In the summer months, a few minutes in the sun without sunscreen (literally a few minutes, not hours!) will produce more than enough vitamin D. But people living in the UK, for example, get no UVB exposure for 6 months of the year. Icelanders go without for 7, and inhabitants of Tromsø, in Norway, have to get by for a full 8 months. Since we can only store vitamin D in our bodies for something like 2-4 months (I’ve struggled to find a consistent number for this, but everyone seems to agree it’s in this ballpark), that potentially means several months with no vitamin D at all, which could lead to deficiency.

In the winter northern Europeans don’t receive enough UVB light from the sun to produce vitamin D in their skin.

In the winter, northern Europeans simply can’t make vitamin D3 in their skin (and for anyone thinking about sunbeds, that’s a bad idea for several reasons). In 2018, this is easily fixed – you just take a supplement. For example, Public Health England recommends that Brits take a daily dose of 10 mcg (400 IU) of vitamin D in autumn and winter, i.e. between about October and March. It’s worth pointing out at this point that a lot of supplements you can buy contain much more than this, and more isn’t necessarily better. Vitamin D is fat-soluble and so it will build up in the body, potentially reaching toxic levels if you really overdo things. Check your labels.
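As an aside, the 10 mcg and 400 IU figures on the labels are linked by a fixed conversion: for vitamin D, 1 microgram corresponds to 40 IU. A tiny label-checking sketch (the 2500 IU example below is just an illustrative number, not a recommendation):

```python
IU_PER_MCG = 40  # fixed conversion factor for vitamin D: 1 mcg = 40 IU

def mcg_to_iu(mcg):
    """Convert a vitamin D dose from micrograms to International Units."""
    return mcg * IU_PER_MCG

def iu_to_mcg(iu):
    """Convert a vitamin D dose from International Units to micrograms."""
    return iu / IU_PER_MCG

print(mcg_to_iu(10))    # 400 -- the recommended autumn/winter daily dose
print(iu_to_mcg(2500))  # 62.5 -- an illustrative high-dose supplement, in mcg
```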

Oily fish is an excellent source of vitamin D.

But what about a few thousand years ago, before you could just pop to the supermarket and buy a bottle of small tablets? What did northern Europeans do then? The answer is simple: they had to get vitamin D from their food. Even if it’s not particularly well-absorbed, it’s better than nothing.

Of course it helps if you have access to lots of foods which are sources of vitamin D. Which would be… fatty fish (tuna, mackerel, salmon, etc) – suddenly that northern European love of herring makes so much more sense – red meat, certain types of liver, egg yolks and, yep, dairy products. Dairy products, in truth, contain relatively low levels of vitamin D (cheese and butter are better than plain milk), but every little helps. Plus, they’re also a good source of calcium, which works alongside vitamin D and is, of course, really important for good bone health.

A side note for vegans and vegetarians: most dietary sources of vitamin D come from animals. Certain mushrooms grown under UV can be a good source of vitamin D2, but unless you’re super-careful a plant-based diet won’t provide enough of this nutrient. So if you live somewhere northerly, or you don’t, or can’t, expose your skin to the sun very often, you need a supplement (vegan supplements are available).

Fair skin likely emerged because it allows for better vitamin D production when UVB levels are lower.

One thing I haven’t mentioned, of course, is skin colour. Northern Europeans are generally fair-skinned, and this is vitamin D-related, too. The paler your skin, the better UVB penetrates it. Fair-skinned people living in the north had an advantage over those with darker skin in the winter, spring and autumn months: they could produce more vitamin D. In fact, this was probably a significant factor in the evolution of fair skin (although, as Ed Yong explains in this excellent article, it’s complicated).

In summary, consuming dairy does have advantages, at least historically. There’s a good reason Europeans love their cheeses. But these days, if you want to eat a vegan or vegetarian diet for any reason (once again, let’s not get into those reasons in comments, kay?) you really should take a vitamin D supplement. In fact, Public Health England recommends that everyone in the UK take a vitamin D supplement in the autumn and winter, but only a small amount – check your dose.

By the way, if you spot any ‘diary’s let me know. I really had to battle to keep them from sneaking in…

Like the Chronicle Flask’s Facebook page for regular updates, or follow @chronicleflask on Twitter. All content is © Kat Day 2018. You may share or link to anything here, but you must reference this site if you do.

If you enjoy reading my blog, please consider buying me a coffee through Ko-fi using the button below. Black, though – I like dairy, just not in my coffee!


Chemical du jour: how bad is BPA, really?

BPA is an additive in many plastics

When I was writing my summary of 2017 I said that there would, very probably, be some sort of food health scare at the start of 2018. It’s the natural order of things: first we eat and drink the calorie requirement of a small blue whale over Christmas and New Year, and then, lo, we must be made to suffer the guilt in January. By Easter, of course, it’s all forgotten and we can cheerfully stuff ourselves with chocolate eggs.

Last year it was crispy potatoes, and the year before that it was something ridiculous about sugar in ketchup causing cancer (it’s the same sugar that’s in everything, why ketchup? Why?). This year, though, it seems that the nasty chemical of the day is not something that’s in our food so much as around it.

Because this year the villain of the piece appears to be BPA, otherwise known as Bisphenol A or, to give it its IUPAC name, 4,4′-(propane-2,2-diyl)diphenol.

BPA is an additive in plastics. At the end of last year an excellent documentary aired on the BBC called Blue Planet II, all about our planet’s oceans. It featured amazing, jaw-dropping footage of wildlife. It also featured some extremely shocking images of plastic waste, and the harm it causes.

Plastic waste is a serious problem

Plastic waste, particularly plastic waste which is improperly disposed of and consequently ends up in the wrong place, is indisputably something that needs to be addressed. But this highlighting of the plastic waste problem had an unintended consequence: where was the story going to go? Everyone is writing about how plastic is bad, went (I imagine) editorial meetings in offices around the country – find me a story showing that plastic is even WORSE than we thought!

Really, it was inevitable that a ‘not only is plastic bad for the environment, but it’s bad for you, too!’ theme was going to emerge. It started, sort of, with a headline in The Sun newspaper: “Shopping receipts could ‘increase your cancer risk’ – as 93% contain dangerous chemicals also linked to infertility”. Shopping receipts are, of course, not made of plastic – but the article’s sub-heading stated that “BPA is used to make plastics”, so the implication was clear enough.

Then the rather confusing: “Plastic chemical linked to male infertility in majority of teenagers, study suggests” appeared in The Telegraph (more on this in a bit), and the whole thing exploded. Search for BPA in Google News now and there is everything from “5 Ways to Reduce Your Exposure to Toxic BPA” to “gender-bending chemicals found in plastic and linked to breast and prostate cancer are found in 86% of teenagers”.

Yikes. It’s all quite scary. It’s true that right now you can’t really avoid plastic. Look around you and it’s likely that you’ll immediately see lots of plastic objects, and that’s before you even try to consider all the everyday things which have plastic coatings that aren’t immediately obvious. If you have young children, you’re probably drowning in plastic toys, cups, plates and bottles. We’re pretty much touching plastic continually throughout our day. How concerned should we be?

As the Hitchhiker’s Guide to the Galaxy says, Don’t Panic. Plastic (like planet Earth in the Guide) can probably be summed up as mostly harmless, at least from a BPA point of view if not an environmental one.

BPA is a rather pleasingly symmetrical molecule with two phenol groups. (A big model of this would make a wonderfully ironic pair of sunglasses, wouldn’t it?) It was first synthesized by the Russian chemist Alexander Dianin in the late 19th century. It’s made by reacting acetone – which is where the “A” in the name comes from – with two phenol molecules. It’s actually a very simple reaction, although the product does need to be carefully purified, since large amounts of phenol are used to ensure a good yield.
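You can sanity-check that condensation with a quick mass balance: two phenol molecules plus one acetone give one BPA plus one water. A little Python sketch, using standard atomic masses (nothing here needs a chemistry library):

```python
# Mass balance for the BPA synthesis: 2 phenol + acetone -> BPA + water.
# Atomic masses in g/mol (standard values).
MASS = {"C": 12.011, "H": 1.008, "O": 15.999}

def molar_mass(formula):
    """Molar mass from a dict of element counts."""
    return sum(MASS[el] * n for el, n in formula.items())

phenol = {"C": 6, "H": 6, "O": 1}   # C6H5OH
acetone = {"C": 3, "H": 6, "O": 1}  # (CH3)2CO
water = {"H": 2, "O": 1}
bpa = {"C": 15, "H": 16, "O": 2}    # C15H16O2

lhs = 2 * molar_mass(phenol) + molar_mass(acetone)  # reactants
rhs = molar_mass(bpa) + molar_mass(water)           # products

print(round(molar_mass(bpa), 2))  # ~228.29 g/mol
print(abs(lhs - rhs) < 1e-9)      # mass is conserved: True
```

Which is a long-winded way of saying the equation balances: every atom in those two phenols and one acetone ends up in either the BPA or the water.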

It’s been used commercially since the fifties, and millions of tonnes of BPA are now produced worldwide each year. BPA is used to make plastics which are clear and tough – two characteristics which are often valued, especially for things like waterproof coatings, bottles and food containers.

The concern is that BPA is an endocrine disruptor, meaning that it interferes with hormone systems. In particular, it’s a known xenoestrogen, in other words it mimics the female hormone estrogen. Animal studies have suggested possible links to certain cancers, infertility, neurological problems and other diseases. A lot of the work is fairly small-scale and, as I’ve mentioned, focused on animal studies (rather than looking directly at effects in humans). Where humans have been studied it’s usually been populations that are exposed to especially high BPA levels (epoxy resin painters, for example). Still, it builds up into quite a damning picture.

BPA has been banned from baby bottles in many countries, including the USA and Europe

Of course, we don’t normally eat plastic, but BPA can leach from the plastic into the food or drink that’s in the plastic, and much more so if the plastic is heated. Because of these concerns, BPA has been banned from baby bottles (which tend to be heated, both for sterilisation and to warm the milk) in several countries, including the whole of Europe, for some years now. “BPA free” labels are a fairly common sight on baby products these days. BPA might also get onto our skin from, for example, those thermal paper receipts The Sun article mentioned, and then into our mouths when we eat. Our bodies break down and excrete the chemical fairly quickly, in as little as 6 hours, but because it’s so common in our environment most of us are continually meeting new sources of it.

How much are we getting, though? This is a critical question, because as I’m forever saying, the dose makes the poison. Arsenic is a deadly poison at high levels, but most of us – were we to undergo some sort of very sensitive test – would probably find we have traces of it in our systems, because it’s a naturally-occurring mineral. It’s nothing to worry about, unless for some reason the levels become too high.

When it comes to BPA, different countries have different guidelines. The European Food Safety Authority recommended in January 2015 that the TDI (tolerable daily intake) should be reduced from 50 to 4 µg/kg body weight/day (there are plans for a new assessment in 2018, so it might change again). For a 75 kg adult, that translates to about 0.0003 g per day. A US Food and Drug Administration document from 2014 suggests a NOAEL (no-observed-adverse-effect level) of 5 mg/kg bw/day, which translates to 0.375 g per day for the same 75 kg adult. NOAEL values are usually much higher than TDIs, so these two figures aren’t as incompatible as they might appear. Tolerable daily intake values tend to have a lot of additional “just in case” built into them – they’re rather more guidance than hard science.
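If you want to check that arithmetic yourself, here’s the unit conversion spelled out as a Python sketch, using the same 75 kg adult:

```python
# Per-day totals for a 75 kg adult, from the per-kilogram guideline values.
BODY_WEIGHT_KG = 75

tdi_ug_per_kg = 4    # EFSA tolerable daily intake, ug/kg bw/day
noael_mg_per_kg = 5  # FDA no-observed-adverse-effect level, mg/kg bw/day

tdi_g = tdi_ug_per_kg * BODY_WEIGHT_KG / 1e6      # micrograms -> grams
noael_g = noael_mg_per_kg * BODY_WEIGHT_KG / 1e3  # milligrams -> grams

print(tdi_g)    # 0.0003 g per day
print(noael_g)  # 0.375 g per day
```

Note the three-orders-of-magnitude gap between the two: that’s the “just in case” margin in action.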

The European Food Safety Authority published a detailed review of the evidence in 2015 (click for a summary)

So, how much BPA are we exposed to? I’m going to stick to Europe, because that’s where I’m based (for now…), and trying to look at all the different countries is horribly complicated. Besides, EFSA produced a really helpful executive summary of their findings in 2015, which makes it much easier to find the pertinent information.

The key points are these: most of our exposure comes from food. Infants, children and adolescents have the highest dietary exposures to BPA, probably because they eat and drink more per kilogram of body weight. The estimated average was 0.375 µg/kg bw per day.  For adult women the estimated average was 0.132 µg/kg bw per day, and for men it was 0.126 µg/kg bw per day.

When it came to thermal paper and other non-dietary exposure (mostly from dust, toys and cosmetics), the numbers were smaller, but the panel admitted there was a fair bit of uncertainty here. The total exposure from all sources was somewhere in the region of 1 µg/kg bw per day for all the age groups, with adolescents and young children edging more toward values of 1.5 µg/kg bw per day (this will be important in a minute).

Note that all of these numbers are significantly less than the (conservative) tolerable daily intake value of 4 µg/kg bw per day recommended by EFSA.
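Putting EFSA’s estimates and the TDI side by side makes the margin obvious. A quick sketch (the group labels are just my shorthand for the figures above):

```python
# EFSA's estimated BPA exposures (ug/kg bw/day) against the 4 ug/kg TDI.
TDI = 4.0

exposures = {
    "dietary, infants/children/adolescents": 0.375,
    "dietary, adult women": 0.132,
    "dietary, adult men": 0.126,
    "total (all sources), typical": 1.0,
    "total (all sources), adolescents": 1.5,
}

for group, dose in exposures.items():
    print(f"{group}: {dose} ug/kg/day = {dose / TDI:.0%} of the TDI")

print(all(dose < TDI for dose in exposures.values()))  # True
```

Even the worst case – an adolescent’s total exposure – sits at well under half of the tolerable daily intake.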

Here’s the important bit: the panel concluded that there is “no health concern for BPA at the estimated levels of exposure” as far as diet goes. They also said that this applied “to prenatally exposed children” (in other words, one less thing for pregnant women to worry about).

When it came to total exposure, i.e. diet and exposure from other sources such as thermal paper they concluded that “the health concern for BPA is low at the estimated levels of exposure”.

The factsheet that was published alongside the full document summarises the results as follows: “BPA poses no health risk to consumers because current exposure to the chemical is too low to cause harm.”

Like I said: Don’t Panic.

What about those frankly quite terrifying headlines? Well, firstly The Sun article was based on some work conducted on a grand total of 208 receipts collected in Southeast Michigan in the USA from only 39 unique business locations. That’s a pretty small sample and not, I’d suggest, perhaps terribly relevant to the readership of a British newspaper. Worse, the actual levels of BPA weren’t measured in the large majority of samples – they only tested to see if it was there, not how much was there. There was nothing conclusive at all to suggest that the levels in the receipts might be enough to “increase your cancer risk”. All in all, it was pretty meaningless. We already knew there was BPA in thermal receipt paper – no one was hiding that information (it’s literally in the second paragraph of the Wikipedia page on BPA).

The Telegraph article, and the many others it appeared to spawn, also weren’t based on especially rigorous work and, worse, totally misrepresented the findings in any case. Firstly, let’s consider that headline: “Plastic chemical linked to male infertility in majority of teenagers, study suggests”. What does that mean? Are they suggesting that teenagers are displaying infertility? No, of course not. They didn’t want to put “BPA” in the headline because that, apparently, would be too confusing for their readers. So instead they’ve replaced “BPA” with “plastic chemical linked to male infertility”, which is so much more straightforward, isn’t it?

And they don’t mean it’s linked to infertility in the majority of teenagers, they mean it’s linked to infertility and it’s in the majority of teenagers’ bodies. I do appreciate that journalists rarely write headlines – this isn’t a criticism of the poor writer who turned in perfectly good copy – but that is confusing and misleading headline-writing of the highest order. Ugh.

Plus, as I commented back there, that wasn’t even the conclusion of the study, which was actually an experiment carried out by students under the supervision of a local university. The key finding was not that, horror, teenagers have BPA in their bodies. The researchers assumed that almost all of the teenagers would have BPA in their bodies – as the EFSA report showed, most people do. No, the conclusion was actually that the teenagers – 94 of them – had been unable to significantly reduce their levels of BPA by changing their diet and lifestyle, although the paper admits the conditions weren’t well-controlled. Basically, they asked a group of 17-19 year-olds to avoid plastic, and worked on the basis that their account of doing so was accurate.

And how much did the teenagers have in their samples? The average was 1.22 ng/ml, in urine samples (ng = nanogram). Now, even if we assume that these levels apply to all human tissue (which they almost certainly don’t) and that therefore the students had roughly 1.22 ng per gram of body weight, that only translates to, very approximately, 1.22 micrograms (µg) per kilogram of body weight.

Wait a second… what did EFSA say again…. ah yes, they estimated total exposures of 1.449 µg/kg bw per day for adolescents.

Sooooo basically a very similar value, then? And the EFSA, after looking at multiple studies in painstaking detail, concluded that “BPA poses no health risk to consumers”.
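For anyone who wants to see that back-of-the-envelope step written out – and it really is back-of-the-envelope, as the comments below make clear – here it is in Python:

```python
# The article's rough estimate: treat 1.22 ng/ml in urine as if it were
# 1.22 ng per gram of body tissue (a big assumption -- urine levels almost
# certainly don't match whole-body levels), then scale to a kilogram.
urine_ng_per_ml = 1.22

ng_per_g = urine_ng_per_ml   # assume ~1 g/ml and a uniform distribution
ng_per_kg = ng_per_g * 1000  # grams -> kilograms
ug_per_kg = ng_per_kg / 1000 # nanograms -> micrograms

efsa_adolescent_estimate = 1.449  # ug/kg bw/day, EFSA total exposure

print(ug_per_kg)  # 1.22 ug per kg of body weight
print(ug_per_kg / efsa_adolescent_estimate)  # same order of magnitude
```

The multiply-by-1000, divide-by-1000 dance is why the number comes out unchanged: ng/g and µg/kg are actually the same unit, just written differently.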

Is this grounds for multiple hysterical, fear-mongering headlines? I really don’t think it is.

It is interesting that the teenagers were unable to reduce their BPA levels. Because it’s broken down and excreted quite quickly by the body, you might expect that reducing exposure would have a bigger effect – but really all we can say here is that this needs to be repeated with far more tightly-controlled conditions. Who knows what the students did, and didn’t, actually handle and eat. Perhaps their school environment contains high levels of BPA in dust for some reason (new buildings or equipment, maybe?), and so it was virtually impossible to avoid. Who knows.

In summary, despite the scary headlines there really is no need to worry too much about BPA from plastics or receipts. It may be worth avoiding heating plastic, since we know that increases the amount of BPA that makes its way into food – although it’s important to stress that there’s no evidence that microwaving plastic containers causes levels to be above safe limits. Still, if you wanted to be cautious you could choose to put food into a ceramic or glass bowl, covered with a plate rather than clingfilm. It’ll save you money on your clingfilm bills anyway, and it means less plastic waste, which is no bad thing.

Roll on Easter…

All comments are moderated. Abusive comments will be deleted, as will any comments referring to posts on this site which have had comments disabled.

Just what is blk water, and should you drink it?

Christmas is almost here! Are you ready yet? Are you fed up with people asking if you’re ready yet? Have you worked out what to buy for Great-uncle Nigel, who says he neither needs nor wants anything? Always a tricky scenario, that. Consumables are often a safe fallback position. They don’t clutter up the house, and who doesn’t enjoy a nice box of luxury biscuits, or chocolates, or a bottle of champagne, or spirits, or a case of blk water?

Wait, what?

Yes, this mysterious product turned up in my feed a few weeks ago. It’s water (well, so they say), but it’s black. Actually black. Not just black because the bottle’s black, black because the liquid inside it is… black.

It’s black water.

A bit like… cola. Only blacker, and not fizzy, or sweet, or with any discernible flavour other than water.

It raises many questions, doesn’t it? Let’s start with why. Obviously it’s a great marketing gimmick. It definitely looks different. It also comes with a number of interesting claims. The suppliers claim it contains “no nasties” and “only 2 ingredients”, namely spring water and “Fulvic Minerals” (sic). (Hang on, I hear you say, if it’s minerals, plural, surely that’s already more than two ingredients? Oh, but that’s only the start. Stay with me.)

It claims to “balance pH levels” and help “to regulate our highly acidic diets”. Yes, well, I think I’ve covered that before. Absolutely nothing you drink, or eat, does anything to the pH in any part of your body except, possibly, your urine – where you might see a small difference under some circumstances (but even if you do it doesn’t tell you anything significant about the impact of your diet on your long-term health). And bear in mind that a few minutes after you drink any kind of alkaline water it mixes with stomach acid which has a pH of around 2. Honestly, none of that alkaline “goodness” makes it past your pyloric sphincter.

Finally, blk water apparently “replenishes electrolytes”. Hm. Electrolytes are important in the body. They’re ionic species, which means they can conduct electricity. Your muscles and neurons rely on electrical activity, so they are quite important. Like, life or death important. But because of that our bodies are quite good at regulating them, most of the time. If you run marathons in deserts, or get struck down with a nasty case of food poisoning, or have some kind of serious health condition (you’d know about it) you might need to think about electrolytes, but otherwise most of us get what we need from the food and drink we consume normally every day.

Besides which, didn’t they say “only 2 ingredients”? The most common electrolytes in the body are sodium, potassium, magnesium, chloride, hydrogen phosphate and hydrogen carbonate. Most spring waters do contain some, if not all, of these, in greater or smaller amounts, but it’s not going to be enough to effectively “replenish” any of them. If, say, you are running marathons in the desert, the advice is actually to keep a careful eye on your water intake because drinking too much water can dangerously lower your sodium levels. Yes, there are sports drinks that are specifically designed to help with this, but they taste of salt and sugar and/or flavourings which have been added in a desperate attempt to cover up the salty taste. This is apparently not the case with blk water which, to repeat myself, contains “only 2 ingredients”.

And, according to the blk website the drink contains “0 mg of sodium per 500ml” so… yeah.

Speaking of ingredients, what about those so-called fulvic minerals? Maybe they’re the source of those all-important electrolytes (but not sodium)? And maybe they’re magically tasteless, too?

And perhaps, like other magical objects and substances, they don’t actually exist – as geologist @geolizzy told me on Twitter when I asked.

It’s not looking good for blk water (£47.99 for a case of 24 bottles) at this point. But hang on. Perhaps when they said fulvic minerals, what they meant was fulvic acid – which is a thing, or possibly several things – in the presence of oh, say, some bicarbonate (*cough* 2 ingredients *cough*).

That could push the pH up to the stated 8-9, and didn’t we learn in school that:
acid + alkali → salt + water
and maybe, if we’re being generous, we could call the salts of fulvic acids minerals? It’s a bit shaky but… all right.

So what are fulvic acids?

That’s an interesting question. I had never heard of fulvic acids. They do, as it turns out, have a Wikipedia page (Wikipedia is usually very reliable for chemical information, since no one has yet been very interested in spoofing chemical pages to claim things like hydrochloric acid is extracted from the urine of pregnant unicorns) but the information wasn’t particularly enlightening. The page did inform me that fulvic acids are “components of the humus” (in soil) and are “similar to humic acids, with differences being the carbon and oxygen contents, acidity, degree of polymerization, molecular weight, and color.” The Twitter hive-mind, as you can see, was sending me down the same path…

A typical example of a humic acid.

Next stop, humic acids. Now we’re getting somewhere. These are big molecules with several functional groups. The chemists out there will observe that, yes, they contain several carboxylic acid groups (the COOH / HOOC ones you can see in the example) so, yes, it makes sense they’d behave as acids.

“No nasties”, blk said. “Pure” they said. When you hear those sorts of things, do you imagine something like this is in your drink? Especially one that, let’s be clear, is a component of soil?

Oh, hang on, I should’ve checked the “blk explained” page on the blk water website. There’s a heading which actually says “what are Fulvic Minerals”, let’s see now…

“Fulvic minerals are plant matter derived from millions of years ago that have combined with fulvic acid forming rare fulvic mineral deposits. They deliver some of the most powerful electrolytes in the world.”

“Fulvic minerals contain 77 other trace minerals, most of which have an influence on the healthiness of our body. They are very high in alkaline and when sourced from the ground contain a pH of 9.”

I don’t know about you, but I’m not totally convinced. I mean, as @geolizzy says in her tweet here (excuse the minor typo, she means humic, not humid),  it sounds a bit like… water contaminated with hydrocarbon deposits?


And, by the way, the phrase “very high in alkaline” is utterly meaningless. Substances are alkaline, or they contain substances which are alkaline. “Alkaline” is not a thing in itself. This is like saying my tea is high in hot when sourced from the teapot.

There’s one more thing to add. So far this might sound a bit weird but… probably safe, right? What could be more wholesome than a bit of soil? Didn’t your granny tell you to eat a pinch of soil to boost your immune system, or something? At worst it’s harmless, right?

Tap water is chlorine-treated to keep it free of nasty bacteria.

Maybe. But then again… water is often treated with chlorine compounds to keep it bacteria-free. Now, blk water is supposedly spring water, which isn’t usually treated. But hypothetically, let’s consider what happens when humic acids, or fulvic acids, or whatever we’re calling them, come into contact with chlorine-treated water.

Oh dear. It seems that dihaloacetonitriles are formed. (See also this paper.) This is a group of substances (possibly the best known one is dichloroacetonitrile) which are variously toxic and mutagenic. Let’s hope that spring water is totally unchlorinated, 100% “we really got it from out of a rock” spring water, then.

To sum up: it is black, and that’s kind of weird and a fun talking point – although if you like the idea of a black drink you can always drink cola. It doesn’t balance your pH levels – nothing does. I don’t believe it replenishes electrolyte levels either – how can it when it doesn’t contain sodium? – and I’m dubious about the “2 ingredients” claim (could you tell?). And the oh-so-healthy-sounding fulvic minerals are most likely due to contamination from coal deposits.

All in all, whilst it might not be quite such a conversation piece, I think it would be better to get Great-uncle Nigel a nice box of chocolates this year.

Like the Chronicle Flask’s Facebook page for regular updates, or follow @chronicleflask on Twitter. All content is © Kat Day 2017. You may share or link to anything here, but you must reference this site if you do.

