Feet of clay? The science of statues

Concept art for the Terry Pratchett statue (c) Paul Kidby

Yesterday we received the exciting news that a statue to commemorate Sir Terry Pratchett and his work has been approved by Salisbury City Council. Hurrah! So, even if we don’t quite manage to get octarine into the periodic table (and thus into every science textbook for ever more), it’s looking very likely that there will still be something permanent to help keep his memory alive.

But this got me thinking about everyday chemistry (who am I kidding, I’m always thinking about everyday chemistry!) and, in particular, bronze – the material from which the statue will be made.

Bronze, I hear you say, what’s that good for apart from, well, statues? And maybe bells? Is it really that interesting?

Well, let’s see. Bronze is an alloy. Alloys are mixtures that contain at least one metal, but they’re stranger than the word ‘mixture’ might perhaps suggest. Imagine combining, say, sand and stones. You’d still be able to see the sand. You could still see the stones. You could, if you could be bothered to do it, separate them out again. And you’d expect the mixture to behave like, well, stony sand.

Alloys aren’t like this. Alloys (other well-known examples include steel, brass and that silver-coloured stuff dentists use for filling teeth) look, on all but the atomic level, like pure metals. They’re bendy and shiny, they make pleasing ringing sounds when you hit them and they’re good electrical conductors. And unlike simpler mixtures, they’re difficult (though not impossible) to separate back into their constituents.

Perhaps the most interesting thing about alloys is that their properties are often very different from those of the elements that went into making them. Bronze, in particular, is harder than either tin or copper, which is why the Bronze Age is so historically significant. Copper is one of the few metals that can (just about) be found in its pure form, and so is one of the oldest elements known to humans, its use going back at least as far as 9000 BC. But while quite pretty to look at, copper isn’t ideal for making tools, being fairly soft and not great at keeping an edge. Bronze, on the other hand, is much more durable, and was therefore a much better choice for building materials, armour and, of course, weapons. (War, what is it good for? Er, the development of new materials?)

Hephaestus was the God of fire and metalworking. According to legend he was lame – could it have been because of exposure to arsenic fumes?

Today we (well, chemists anyway) think of bronze as an alloy of tin and copper, but the earliest bronzes were made with arsenic, copper ores often being naturally contaminated with this element. Arsenical bronzes can be work-hardened, and the arsenic could, if the quantities were right, also produce a pleasing silvery sheen on the finished object. Unfortunately, arsenic vaporises below the melting point of bronze, producing poisonous fumes which attacked the eyes, lungs and skin. We know now that it also causes peripheral neuropathy, which might be behind the historical legends of lame smiths, for example Hephaestus, the Greek God of smiths. Interestingly, the Greeks frequently placed small dwarf-like statues of Hephaestus near their hearths, and this might be where the idea of dwarves as blacksmiths and metalworkers originates.

Tin bronze required a little more know-how (not to mention trade negotiations) than arsenical bronze, since tin very rarely turns up mixed with copper in nature. But it had several advantages. The tin fumes weren’t toxic and, if you knew what you were doing, the alloying process could be more easily controlled. The resulting alloy was also stronger and easier to cast.

Of course, as we all know, bronze ultimately gave way to iron. Bronze is actually harder than wrought iron, but iron was considerably easier to find and simpler to process into useful metal. Steel, which came later, ultimately combined superior strength with a relatively lower cost and, in the early 20th century, corrosion resistance. And that’s why the teaspoon sitting in my mug is made of stainless steel and not some other metal.

Bronze has a relatively limited number of uses today, being a heavy and expensive metal, but it is still used to make statues, where heaviness and costliness aren’t necessarily bad things (unless, of course, someone pinches the statue and melts it down – an unfortunately common occurrence with ancient works). It has the advantages of being ductile and extremely corrosion resistant – ideal for something that’s going to sit outside in all weathers. A little black copper oxide will form on its surface over time, and eventually green copper carbonate, but this is superficial, and it takes a really long time before any fine details are lost. In addition, bronze’s hardness and ductility mean that any pointy bits probably won’t snap off under the weight of the two-millionth pigeon.

So how are bronze statues made? For this I asked Paul Kidby, who designed the concept art for the statue. He told me that he sculpts in Chavant, which is an oil-based clay. It’s lighter than normal clay and, crucially, resists shrinking and cracking. He then sends his finished work away to be cast in bronze at a UK foundry, where they make a mould of his sculpture and from that, ultimately (skipping over multiple steps), a bronze copy. Bronze has another nifty property, in that it expands slightly just before it sets. This means it fills the finest details of the mould, which produces a very precise finish. Conveniently, the metal shrinks again as it cools, making the mould easy to remove.

And just for completeness, Paul also told me that the base of the statue will most likely be polished granite, water jet cut with the design of the Discworld sitting on the back of Great A’Tuin. I can just imagine it – it’s going to be beautiful.

Follow The Chronicle Flask on Facebook and Twitter for regular updates.

Are artificial preservatives really that bad?

But are preservatives really such a bad thing?

As something of a skeptic, I am fond of myth- and hoax-busting type things.  I find them reassuring.  If I had to accept that absolutely everyone swallowed stories about poisonous bottled water, free Disneyland tickets, and the Pope coming out as gay without a second thought, I really would lose all faith in humanity.  But occasionally, just occasionally, a bizarre story pops up that actually turns out to be true.

And so it was a few days ago, when the Hoax Slayer feed on Facebook threw up a story about the luminous, foil-packed beverage Capri-Sun.  It would appear that mould (or, indeed mold – never mind fungi growing in children’s drinks, the story generated far more upset over American versus British spelling) has actually been found growing in Capri-Sun containers, in some cases in some really rather spectacular shapes and sizes.  This was no hoax.  It wasn’t even, unlike the story of the giant snake hanging around a mechanical digger, a twisted misrepresentation of the facts.  No, mould really has been found growing in more than one Capri-Sun container.

In a statement, Kraft, who make Capri-Sun, said:

Among the many, many millions of pouches we sell each year, it does happen from time to time because the product is preservative free. A statement is included on all cartons telling consumers to discard any leaking or damaged packages. If mold does occur, we completely agree that it can be unsightly and gross, but it is not harmful and is more of a quality issue rather than a safety issue.

This got me thinking, and funnily enough my thoughts were less “never, ever buy Capri-Sun” and more “why is ‘preservative free’ such a good thing?”

Humans have been preserving food for a very long time.  In fact, arguably since we first learned that holding bits of dead mammoth over that new-fangled fiery stuff makes it taste nicer and a bit less chewy.  The earliest preservatives are, of course, those oh-so-healthy staples of salt, sugar and fat.  And they’re still in use today.

Salt, otherwise known as sodium chloride, is found in rocks and seawater.  We all like our salty foods, but how often do you stop and wonder why that delicious slice of ham is traditionally so salty?  It’s not just for flavour.  Salt is an excellent preservative, and humans have been using it for that reason for at least eight thousand years.  It’s a drying agent, drawing moisture from cells by osmosis, and since bacteria and fungi need moisture to grow, salting food keeps them at bay.  Adding salt to food allowed people to travel over long distances and reduced the problem of seasonal availability.  As such it was an important commodity, even being used as a form of currency.  These days, of course, it’s far less valuable – until Britain suffers a dusting of snow, that is.

Salt may help to keep our food fresh, but it’s not great when it comes to keeping us healthy.  In recent years too much salt has been increasingly associated with certain health problems.  Salt appears to raise blood pressure, and raised blood pressure puts you at increased risk of heart disease and stroke.  There is some controversy over exactly how causal this link is, but most health professionals agree that we could do with eating a bit less NaCl.

Next on the list, sugar.  Again, it’s been used since ancient times.  Preserves aren’t called preserves for nothing.  Jam (for our American cousins, jelly) wasn’t invented purely because it was delicious on toast.  No, jam, marmalade and the like are a handy way of making the summer fruit glut last all through the year.  Sugar works in a similar, although sweeter, way to salt: drawing water from cells by osmosis and producing an environment that’s hostile to bacteria.  Of course, as we all know, too much sugar isn’t great for our waistlines and it’s really bad for our teeth.  And tooth decay is far more than a cosmetic problem: in extreme cases infection can spread from the tooth to the surrounding tissues and lead to potentially fatal (really) complications such as cavernous sinus thrombosis and Ludwig’s angina.

What about fat?  Traditionally used as a layer on top of foods such as shrimp, chicken liver and pâté, it produces an air- and water-tight seal that makes a very effective barrier to bacteria.  Very high-fat foods, such as butter and cream, aren’t bacteria-friendly because, again, they have a low water content and bacteria need water to grow and reproduce.  Such foods also have fewer of the sugars, in particular lactose, that provide bacteria with their lunch.  This is why the use-by date on the cream is longer than the one on the milk, and why you can safely store the butter out of the fridge (you can, honestly).  Funnily enough, fat is probably the most controversial ‘additive’ from a health point of view.  Increasingly, various groups are questioning the conventional wisdom that a high intake of saturated fat leads to cardiovascular disease, and of course there are essential fatty acids that are, well, essential.  We definitely need fat, at least certain kinds of fat, in our diet.  But there’s no doubt it’s high in calories, and it’s clear that being overweight is bad for your health, so moderation is key.

So, salt, sugar and fat are all natural preservatives, and all are associated with genuine health concerns.  What about artificial preservatives?  Well, there are quite a few, and it would take a while to list them all (I’m not going to).  Some of them are definitely controversial.  Nitrates and nitrites, for example, form nitrosamines when foods are cooked, and these have been linked to an increased cancer risk.  But on the other hand, nitrates and nitrites prevent the growth of Clostridium botulinum, the bacterium that produces botulinum toxin, and if you ingest that, we’re not talking about a small increased risk, we’re talking about dead.  Plus, unlike fat, sugar and salt, their addition to foods is strictly regulated, so you’re unlikely to consume dangerously high quantities unless you’re practically living off processed meat.  In which case… well, we’re back to salt and saturated fat again.

Sulfites, such as potassium and sodium sulfite, are common food additives which are known to be problematic for certain individuals, particularly if they have asthma or aspirin sensitivity.  But then, some people are allergic to peanuts and they haven’t been banned, yet.  There’s no evidence that sulfites are dangerous to everyone.

Sodium benzoate is another preservative that’s been linked with health problems, in particular hyperactivity in children.  But, and it’s quite a big but, only in combination with certain artificial colours.  And the effects observed weren’t consistent.  The Food Standards Agency concluded that, if real, the observed increases in hyperactive behaviour were more likely to be linked to the colours rather than the preservative.  Professor Jim Stevenson, author of the report, commented that “parents should not think that simply taking these additives out of food will prevent hyperactive disorders”.

And this brings us back to soft drinks, because sodium benzoate is, or at least was, a fairly common ingredient in flavoured beverages.  Although not Capri-Sun, as we’ve already established.

But Capri-Sun does contain sugar.  Admittedly, its main purpose isn’t preservation – there’s not quite enough for that – but it’s still an ingredient, and a significant one.  A quick glance at the nutritional information reveals that Capri-Sun contains 10.5 g of sugar per 100 g.  That’s 21 g in one of the foil packs, or roughly 5 teaspoons.  Some of this comes from the fruit juices the drink is made of, but not all of it – sugar is clearly listed as an added ingredient.  NHS guidelines suggest we shouldn’t be eating more than about 50 g (for women) or 70 g (for men) of sugar a day, so that one, really quite small, packet of Capri-Sun contains about half of a woman’s recommended daily sugar intake.
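For anyone who wants to check the arithmetic, here’s a quick back-of-the-envelope sketch in Python.  It assumes a roughly 200 g pouch (which is what the quoted 21 g implies) and about 4 g of sugar per level teaspoon – rounded kitchen figures, not anything from Kraft’s own data:

```python
# Rough sugar arithmetic for one foil pouch of Capri-Sun.
# Assumed figures (not from the manufacturer): ~200 g per pouch,
# ~4 g of sugar per teaspoon, NHS guideline of ~50 g/day for women.

sugar_per_100g = 10.5        # g of sugar per 100 g of drink (from the label)
pouch_mass = 200             # g - assumed size of one foil pouch
grams_per_teaspoon = 4       # rough figure for granulated sugar
daily_guideline_women = 50   # g per day, NHS figure quoted above

sugar_per_pouch = sugar_per_100g * pouch_mass / 100           # about 21 g
teaspoons = sugar_per_pouch / grams_per_teaspoon              # about 5 teaspoons
share_of_guideline = sugar_per_pouch / daily_guideline_women  # about 0.42

print(f"{sugar_per_pouch:.0f} g of sugar per pouch")
print(f"about {teaspoons:.0f} teaspoons")
print(f"roughly {share_of_guideline:.0%} of a woman's daily guideline")
```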

Make no mistake, sugar is bad.  It’s really bad.  Quite apart from dental decay and obesity, excessive sugar exposure has been firmly linked to type 2 diabetes.  And, guess what, eating less sugar cuts the risk of developing this potentially life-threatening illness.  Want to look after your family’s health?  You could do a lot worse than cutting back on sugar.

Let’s briefly consider some other favourite sticky beverages.  The Coca Cola Company is in the process of phasing sodium benzoate out of its products — including Coke, Sprite, Fanta, and Oasis — as soon as a “satisfactory alternative” is developed, and a quick look at some cans in my fridge (oh the shame) suggests they’ve already done it, in this country anyway.  Sugar, not so much (diet alternatives aside, obviously).  Sprite contains 6.6 g of sugar per 100 g (less than Capri-Sun, hmmm) and Coke contains 10.6 g per 100 g.

Now, I find this very interesting.  We have a small risk from sodium benzoate, when it’s combined with other additives, maybe.  And suddenly food companies are desperate to get it out of their products, and to prominently label everything as “free from artificial preservatives”.  It’s a real sales point.  Sugar, on the other hand, is definitively bad.  No argument.  Over-consumption of sugar is definitely associated with a number of negative health outcomes.  But we don’t seem to see quite so much enthusiasm for lowering the sugar content of foods or drinks, unless they’re being marketed as diet options.

Why so keen to get rid of one but not the other?  Sugar is cheap and tasty, and consumers like sweetness.  Artificial preservatives, on the other hand, cost money, don’t add anything to the taste (until the product goes off, that is) and make products last longer.  Preservative-free products have shorter use-by dates, and so people throw more away with the result that… they end up buying more.

A cynical person might wonder who really benefits from these “free from artificial preservatives” policies…. especially when the result is freaky lumps of mould in your sugary orange drink.