How many scientists does it take to discover five elements? More than you might think…

My last post chronicled (see what I did there?) a meandering stroll through all 118 elements in the periodic table. As I read through all the pieces of that thread, I kept wanting to find out more about some of the stories. This is the international year of the periodic table, after all — what better time to go exploring?

But, here’s the thing: 118 is a lot. It took ages even just to collect all the (mostly less than) 280-character tweets together. Elemental stories span the whole of human existence and are endlessly fascinating, but telling all of them in any kind of detail would take a whole book (not a small one, either) and would be a project years in the making. So, how about instead having a look at some notable landmarks? A sort of time-lapse version of elemental history and discovery, if you will…


The word “carbon” comes from the Latin “carbo”, meaning coal and charcoal.

Let’s begin the story with carbon: fourth most abundant element in the universe and tenth most abundant in the Earth’s crust (give or take). When the Earth first formed, about 4.54 billion years ago, volcanic activity resulted in an atmosphere that was mostly carbon dioxide. The very earliest forms of life evolved to use carbon dioxide through photosynthesis. Carbon-based compounds make up the bulk of all life on this planet today, and carbon is the second most abundant element in the human body (after oxygen).

When we talk about discovering elements, our minds often leap to “who”. But, as we’ll see throughout this journey, that’s never an entirely straightforward question. The word “carbon” comes from the Latin carbo, meaning coal and charcoal. Humans have known about charcoal for many thousands of years — after all, if you can make a fire, it’s not long before you start to wonder if you can do something with this leftover black stuff. We’ll never know who first “discovered” carbon. But we can be sure of one thing: it definitely wasn’t an 18th century European scientist.

Diamond is a form of carbon used by humans for over 6000 years.

Then there are diamonds, although of course it took people a bit longer to understand how diamonds and other forms of carbon were connected. Human use of diamonds may go back further than we imagine, too. There’s evidence that the Chinese used diamonds to grind and polish ceremonial tools as long as 6,000 years ago.

Even the question of who first identified carbon as an element isn’t entirely straightforward. In 1722, René Antoine Ferchault de Réaumur demonstrated that iron was turned into steel by absorbing some substance. In 1772, Lavoisier showed for the first time that diamonds could burn (contrary to a key plot point in a 1998 episode of Columbo).

In 1779, Scheele demonstrated that graphite wasn’t lead, but rather was a form of charcoal that formed aerial acid (today known as carbonic acid) when it was burned and the products dissolved in water. In 1786 Claude Louis Berthollet, Gaspard Monge and C. A. Vandermonde again confirmed that graphite was mostly carbon, and in 1796, Smithson Tennant showed that burning diamond turned limewater milky — the established test for carbon dioxide gas — and argued that diamond and charcoal were chemically identical.

Even that isn’t quite the end of the story: fullerenes were discovered in 1985, and Harry Kroto, Robert Curl, and Richard Smalley were awarded the 1996 Nobel Prize in Chemistry for “the discovery of carbon atoms bound in the form of a ball”.

Type “who discovered carbon” into a search engine and Lavoisier generally appears, but really? He was just one of many, most of whose names we’ll never know.


Brass, an alloy of zinc, has been used for thousands of years.

Now for the other end of the alphabet: zinc. It’s another old one, although not quite as old as carbon. Zinc’s history is inextricably linked with copper, because zinc ores have been used to make brass alloys for thousands of years. Bowls made of alloyed tin, copper and zinc have been discovered which date back to at least the 9th century BCE, and many ornaments have been discovered which are around 2,500 years old.

It’s also been used in medicine for a very long time. Zinc carbonate pills, thought to have been used to treat eye conditions, have been found on a cargo ship wrecked off the Italian coast around 140 BCE, and zinc is mentioned in Indian and Greek medical texts as early as the 1st century CE. Alchemists burned zinc in air in 13th century India and collected the white, woolly tufts that formed. They called it philosopher’s wool, or nix alba (“white snow”). Today, we know the same thing as zinc oxide.

The name zinc, or something like it, was first documented by Paracelsus in the 16th century — who called it “zincum” or “zinken” in his book, Liber Mineralium II. The name might be derived from the German zinke, meaning “tooth-like” — because crystals of zinc have a jagged, tooth-like appearance. But it could also suggest “tin-like”, since the German word zinn means tin. It might even be from the Persian word سنگ, “seng”, meaning stone.

These days, zinc is often used as a coating on other metals, to prevent corrosion.

P. M. de Respour formally reported that he had extracted metallic zinc from zinc oxide in 1668, although as I mentioned above, in truth it had been extracted centuries before then. In 1738, William Champion patented a process to extract zinc from calamine (a zinc carbonate ore) in a vertical retort smelter, and Anton von Swab also distilled zinc from calamine in 1742.

Despite all that, credit for discovery of zinc usually goes to Andreas Marggraf, who’s generally considered the first to recognise zinc as a metal in its own right, in 1746.


Evidence of helium was first discovered during a solar eclipse.

Ironically for an element which is (controversially) used to fill balloons, helium’s discovery is easier to pin down. In fact, we can name a specific day: August 18, 1868. The astronomer Jules Janssen was studying the chromosphere of the sun during a total solar eclipse in Guntur, India, and found a bright, yellow line with a wavelength of 587.49 nm.

In case you thought this was going to be simple, though, he didn’t recognise the significance of the line immediately, thinking it was caused by sodium. But then, later the same year, Norman Lockyer also observed a yellow line in the solar spectrum — which he concluded was caused by an element in the Sun unknown on Earth. Lockyer and Edward Frankland named the element from the Greek word for the Sun, ἥλιος (helios).

Janssen and Lockyer may have identified helium, but they didn’t find it on Earth. That discovery was first made by Luigi Palmieri, analysing volcanic material from Mount Vesuvius in 1881. And it wasn’t until 1895 that William Ramsay first isolated helium by treating the mineral cleveite (an impure form of uraninite, UO2) with acid whilst looking for argon.

Mendeleev’s early versions of the periodic table, such as this one from 1871, did not include any of the noble gases (click for image source).

Interestingly, Mendeleev’s 1869 periodic table had no noble gases as there was very little evidence for them at the time. When Ramsay discovered argon, Mendeleev assumed it wasn’t an element because of its unreactivity, and it was several years before he was convinced that any of what we now call the noble gases should be included. As a result, helium didn’t appear in the periodic table until 1902.

Who shall we say discovered helium? The astronomers, who first identified it in our sun? Or the chemists, who managed to collect actual samples on Earth? Is an element truly “discovered” if you can’t prove you had actual atoms of it — even for a brief moment?


So far you may have noticed that all of these discoveries have been male-dominated. This is almost certainly not because women were never involved in science, as there are plenty of records suggesting that women often worked in laboratories in various capacities — it’s just that their male counterparts usually reported the work. As a result, the men got the fame, while the women’s stories were, a lot of the time, lost.

Marguerite Perey discovered francium (click for image source).

Of course, the name that jumps to mind at this point is Marie Curie, who famously discovered polonium and radium and had a third element, curium, named in honour of her and her husband’s work. But she’s famous enough. Let’s instead head over to the far left of the periodic table and have a look at francium.

Mendeleev predicted there ought to be an element here, following the trend of the alkali metals. He gave it the placeholder name of eka-caesium, but its existence wasn’t to be confirmed for some seventy years. A number of scientists claimed to have found it, but its discovery is formally recorded as having been made in January 1939 by Marguerite Perey. After all the previous failures, Perey was incredibly meticulous and thorough, carefully eliminating all possibility that the unknown element might be thorium, radium, lead, bismuth, or thallium.

Perey temporarily named the new alkali metal actinium-K (since it’s the result of alpha decay of 227Ac), and proposed the official name of catium (with the symbol Cm), since she believed it to be the most electropositive cation of the elements.

But the symbol Cm was assigned to curium, and Irène Joliot-Curie, one of Perey’s supervisors, argued against the name “catium”, feeling it suggested the element was something to do with cats. Perey then suggested francium, after her home country of France, and this was officially adopted in 1949.

A sample of uraninite containing perhaps 100,000 atoms of francium-223 (click for image source).

Francium was the last element to be discovered in nature. Trace amounts occur in uranium minerals, but it’s incredibly scarce. Its most stable isotope has a half life of just 22 minutes, and bulk francium has never been observed. Famously, there’s at most 30 g of francium in the Earth’s crust at any one time.
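To put that 22-minute half-life in perspective, here’s a quick back-of-the-envelope sketch (my own illustration, not anything from the discovery work) of how fast a sample disappears:

```python
def remaining_fraction(elapsed_minutes, half_life_minutes=22.0):
    """Fraction of a radioactive sample left after a given time,
    using the standard decay law N/N0 = (1/2)^(t / t_half)."""
    return 0.5 ** (elapsed_minutes / half_life_minutes)

# Francium-223's half-life is roughly 22 minutes, so after an hour
# only about 15% of any sample is left:
print(round(remaining_fraction(60), 3))  # 0.151
```

After a full day (1,440 minutes, about 65 half-lives) the remaining fraction is around 2⁻⁶⁵: effectively nothing, which is part of why bulk francium has never been observed.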

Of all the elements I’ve mentioned, this is perhaps the most clear-cut case. Perey deservedly takes the credit for discovering francium. But even then, she wouldn’t have been able to prove so conclusively that the element she found wasn’t something else had it not been for all the false starts that came before. And then there are all the other isotopes of francium, isolated by a myriad of scientists in the subsequent years…


All of which brings us to one of the last elements to be discovered: tennessine (which I jokingly suggested ought to be named octarine back in 2016). As I mentioned above, francium was the last element to be discovered in nature: tennessine doesn’t exist on Earth. It has only ever been created in a laboratory, by firing a calcium beam into a target made of berkelium (Bk) and smashing the two elements together in a process called nuclear fusion.

Element 117, tennessine, was named after Tennessee in the USA.

Like tennessine, berkelium doesn’t occur naturally on Earth and had to be made in a nuclear reactor at Oak Ridge National Laboratory (ORNL) in Tennessee — the reason for the new element’s name. One of the scientists involved, Clarice E. Phelps, is believed to be the first African American to be involved in the discovery of a chemical element in recent history, having worked on the purification of the 249Bk before it was shipped to Russia and used to help discover element 117.

Tennessine’s discovery was officially announced in Dubna in 2010 — the result of a Russian-American collaboration — and the name tennessine was officially adopted in November 2016.

Who discovered it? Well, the lead name on the paper published in Physical Review Letters is Yuri Oganessian (for whom element 118 was named), but have a look at that paper and you’ll see there’s a list of over 30 names, and that doesn’t even include all the other people who worked in the laboratories, making contributions as part of their daily work.

From five to many…

There’s a story behind every element, and it’s almost always one with a varied cast of characters.

As I said at the start, when we talk about discovering elements, our minds often leap to “who” — but they probably shouldn’t. Scientists really can’t work entirely alone: collaboration and communication are vital aspects of science, because without them everyone would have to start from scratch all the time, and humans would never have got beyond “fire, hot”. As Isaac Newton famously said in a letter in 1675: “If I have seen further it is by standing on the shoulders of giants.”


This post was written with the help of Kit Chapman (so, yes: it’s by Kit and Kat!). Kit’s new book, ‘Superheavy: Making and Breaking the Periodic Table’, will be published by Bloomsbury Sigma on 13th June.

Like the Chronicle Flask’s Facebook page for regular updates, or follow @chronicleflask on Twitter. Content is © Kat Day 2019. You may share or link to anything here, but you must reference this site if you do. If you enjoy reading my blog, please consider buying me a coffee through Ko-fi using the button below.



The acid that really does eat through everything

Thanks to the big screen, many of us think of acids as dangerous, burn-through-anything substances.  Think of those scenes in the Alien movies, where the alien’s blood drips through solid metal, destroying everything in its path.

Of course the vast majority of acids are much more boring.  Vinegar (which contains ethanoic acid) and citric acid (found in, guess what, citrus fruits) are common acids that we eat all the time, and they don’t burn holes in your mouth.  There’s an even stronger acid, hydrochloric acid (HCl), in your stomach and not only does it not burn you from the inside out (usually), it actually helps you to digest your food and keeps you safe from nasty bacteria.

But there is an acid that’s really, properly scary.  And its name is hydrofluoric acid.

Hydrofluoric acid has the chemical formula HF, but unlike HCl you won’t find this one in a school laboratory, and if it turns up in your stomach you’re in very big trouble.  In true movie-acid style it’s capable of dissolving many materials, and is particularly well-known for its ability to dissolve glass (which is mainly silicon dioxide).  It will also dissolve most ceramics (which contain aluminosilicates: compounds made of chemically-bonded aluminium, silicon and oxygen).  And, like many other acids, it also reacts with metals, so storing it is a bit tricky.  Where do you put something that eats through its container? Well, these days it’s stored in special plastic bottles, but in the 17th century when it was first discovered chemists had to use glass bottles coated inside with wax, and hope the coating was a good one.
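The glass-dissolving trick comes down to a straightforward, textbook reaction: hydrofluoric acid attacks silicon dioxide to give volatile silicon tetrafluoride and water.

```latex
\mathrm{SiO_2 + 4\,HF \longrightarrow SiF_4\uparrow + 2\,H_2O}
```

With excess aqueous HF, the silicon tetrafluoride reacts further to form hexafluorosilicic acid (H2SiF6), which is why the etching keeps going for as long as acid remains.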

HF has been an important industrial chemical for centuries.  It’s used to etch patterns into, and clean, glass and ceramics, and also to dissolve rock samples, for example to extract chemicals or fossils from rocks.  It’s also used to clean stainless steel and, in more recent times, to prepare silicon wafers (used to make silicon chips) in the electronics industries.

The chemist Carl Wilhelm Scheele (him again – he just keeps turning up doesn’t he?) was the first person to produce HF in large quantities in 1771.  Scheele is particularly famous for his bad habit of sniffing and tasting any new substances he discovered.  Cumulative exposure to mercury, arsenic, lead, their compounds, hydrofluoric acid, and other substances took their toll on him and he died on 21 May 1786 at the age of just 43.  And that’s why your science teacher was endlessly telling you not to eat or drink in the laboratory.

So why is hydrogen fluoride so nasty?  For starters the gas is a severe poison that immediately and permanently damages the lungs and the corneas of the eyes – lovely. Hydrofluoric acid solution is a contact-poison that causes deep, initially painless burns which result in permanent tissue death. It also interferes with calcium metabolism, which means that exposure to it can and does cause cardiac arrest and death.  Contact with as little as 160 square centimetres (25 square inches) of skin can kill – that’s about the area of the palm of your hand.

And now for a gruesome and tragic tale: in 1995 a chemist working in Australia was sitting working at a fume cupboard and knocked over a small quantity (100-230 millilitres, about the equivalent of a drinking glass full of water) of hydrofluoric acid onto his lap, splashing both thighs.  He immediately washed his legs with water, jumped into a chlorinated swimming pool at the rear of the workplace, and stayed there for about 40 minutes before an ambulance arrived.  (Should you ever need to know, the proper treatment for HF exposure is calcium gluconate gel: calcium gluconate reacts very quickly with hydrofluoric acid to form non-toxic calcium fluoride, rendering it harmless.)  Sadly, his condition deteriorated in hospital and, despite having his right leg amputated 7 days after the accident, he died from multi-organ failure 15 days after the hydrofluoric acid spill.  Remember, that was a spill the size of a glass of water.
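The calcium gluconate treatment works by precipitation: calcium ions mop up the fluoride as highly insoluble calcium fluoride. Schematically (and simplifying the biochemistry considerably):

```latex
\mathrm{Ca(C_6H_{11}O_7)_2 + 2\,HF \longrightarrow CaF_2\downarrow + 2\,C_6H_{12}O_7}
```

It’s the free fluoride ion that does the damage, scavenging calcium in the tissues, so locking it up as CaF2 stops the burn from progressing.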

Because hydrofluoric acid interferes with nerve function, burns from it often aren’t painful to begin with. Small accidental exposures can go unnoticed, which means that people don’t seek treatment straight away, making the whole thing worse.  Do a Google image search on ‘hydrogen fluoride burns’ and you’ll see some images that will really turn your stomach.

So which would you rather meet?  An alien with acid blood and a habit of laying eggs in your stomach or an invisible gas that destroys your tissues and leaves you, if not dead from multiple organ failure, then suffering with horribly disfiguring burns?  You might stand a better chance against the alien…

Comments have been turned off for this post. If you’re planning a DIY project, hydrofluoric acid is probably not your friend. Try Google and/or YouTube; there are almost certainly umpteen safer ways to do the thing you’re trying to do.

All content is © Kat Day 2017. You may share or link to anything here, but you must reference this site if you do.

Chemical catastrophes – who were the biggest baddies of chemistry’s past?

As a big fan of chemistry I like to encourage students to believe that it will be a huge force for good in the future, providing us with solutions to problems such as sustainable energy, currently incurable diseases and new materials.  And I hope I’m right about this.  But there’s no escaping the fact that chemistry has a dark, dirty and dangerous past.  In the days before health and safety – oh we take the mickey, but trust me you wouldn’t actually want to be without it – proper regulations and rigorous testing, chemists threw dangerous chemicals around like sweeties.  Quite literally in some cases.  They tasted and smelled toxic and dangerous substances and, worse, they released them on an unsuspecting population with barely a second thought.

So with that in mind, who are my top three biggest baddies of chemistry’s past?

Fritz Haber (1868 – 1934)
The German chemist Fritz Haber gets the number three slot.  In some ways, he’s a bit of a double-edged sword.  He did good along with the bad, inventing – along with Carl Bosch – the Haber-Bosch process for making ammonia.  No matter what your feelings about inorganic fertilisers, we have to accept that without them we wouldn’t be able to feed the population of this country, let alone the world.  There just isn’t that much poo out there.  Some people would argue that the population rise Haber’s process facilitated has been a disaster in itself.  But this is to conveniently forget that, had he not developed it, they probably wouldn’t be here to complain about it.

Haber wasn’t entirely a misunderstood genius though.  He’s also been described as the father of chemical warfare for his work on the use of chlorine and other poisonous gases during World War I.  His work included the development of gas mask filters, but he also led teams developing deadly chlorine gas used in trench warfare.  He was even there to supervise its release.  He was a patriotic German and believed he was doing the right thing, supporting his country in the war effort.  When the Nazis came to power in 1933 they initially sought out Haber’s skills, offering him special funding to continue his work on weapons.  However Haber was Jewish, so in common with other scientists in a similar position he ended up leaving Germany that same year.

Famously, Haber’s first wife disagreed vehemently with his work on chemical warfare.  In fact, perhaps unable to cope with the fact that he had personally overseen the successful use of chlorine in 1915, she committed suicide by shooting herself with his service revolver.  That same morning, Haber left again to oversee gas release against the Russians, leaving behind his grieving 13-year-old son.

Haber was awarded the 1918 Nobel Prize in Chemistry (presented in 1919) for his work on the Haber process.  So the story goes, other scientists at the ceremony refused to shake his hand in protest at his work with chemical weapons.  A tragic story all round.

Carl Wilhelm Scheele (1742 – 1786)
We’ve seen Scheele’s name come up before of course.  In his short – thanks to his bad habit of tasting and sniffing toxic chemicals – life he made a lot of chemical discoveries, but didn’t get the recognition for many of them because he always seemed to publish after someone else.  The ones he is remembered for always seem to be the horribly dangerous ones (maybe no one else wanted the credit).  For example he discovered hydrogen cyanide (a poison beloved of many an Agatha Christie villain), hydrogen fluoride (a highly toxic gas that forms the incredibly dangerous hydrofluoric acid when dissolved in water) and hydrogen sulfide (toxic, highly flammable and stinks of rotten eggs).

But his most harmful contribution to the world was undoubtedly Scheele’s Green, the arsenic-based yellow-green dye that was used to colour fabrics, paints, candles, toys and even, most tragically of all, foodstuffs in the 1800s.  It’s impossible to count, but it was undoubtedly responsible, directly and indirectly, for huge numbers of deaths in the 19th century.  Essentially he invented a deadly poison that ended up in thousands of homes all around the world.  Aren’t you glad we have safety testing these days?

Thomas Midgley (1889-1944)
Who was worse, Scheele or Midgley?  It’s a tough call, but I think Midgley takes it, particularly because he had some inkling of exactly how damaging at least some of his work might turn out to be.

Midgley is famously responsible for synthesising the first CFC, freon.  CFCs, or chlorofluorocarbons, are neither toxic nor flammable, so were considered much safer than other propellants and refrigerants used at the time.  In fact, he was even awarded the Perkin Medal in 1937 for his work.  This, of course, was some time before the terrible consequences of CFCs were realised.  As we now know, they turned out to be very damaging to the ozone layer, and in 1989 twelve European Community nations agreed to ban their production, and they have since been phased out across the world.

Although CFCs were a disaster, Midgley could at least be defended for having no way of knowing how disastrous they would ultimately turn out to be.  Not so for his other famous invention.  Whilst working at General Motors he discovered that adding tetraethyllead, or TEL, to petrol (aka gasoline) prevented ‘knocking’ in internal combustion engines, which is when the air/fuel mixture ignites at slightly the wrong time.  Knocking makes the engine much less efficient, and so preventing it was a big issue.  You’d think Midgley might have accepted that lead in petrol was a bad idea when he had to take a vacation to recover from severe lead poisoning, but no.  In fact he appeared to have been pretty cynical about the whole thing, pouring TEL over his hands at a press conference in 1924 to demonstrate its apparent safety (it isn’t safe at all, and he had to take more time off afterwards to recover).

Unfortunately burning fuel with TEL in it disperses lead into the air where it’s readily inhaled by innocent bystanders, and it’s particularly harmful to children.  Lead exposure has been linked to low IQ and antisocial behaviour, and recently researchers suggested that the ban on leaded petrol across the world in the early 2000s might now be leading to a reduced crime rate.

So for knowingly poisoning people worldwide with lead, and unknowingly taking out a chunk of the ozone layer, Midgley gets my award as biggest chemical baddy.

Would you pick someone else?

Bronze, humbugs, wallpaper and electronics: what’s your favourite element?

As a chemistry teacher I’m sometimes asked for my favourite element. Don’t tell anyone, but I don’t really have a single favourite. That would be a terribly boring answer though, so I usually pick something to make a relevant point. Carbon, for example, for being the stuff of life, for having a whole third of chemistry – organic chemistry – devoted to its compounds, and because diamonds are fascinating and really very pretty things. Or sometimes I go for xenon, for being a noble gas, for its potential use as an anaesthetic, and just because its name starts with an X (have a go at this: name five words that start with X without googling*).

And then, if I think we’ve got time for a story, I might go for the famous and much-maligned element number 33: arsenic (As).  After all, if it weren’t one of the world’s most famous poisons you’d have to love it just for having the word ‘arse’ in its name.


So, a little background. It’s the 20th most common element in the Earth’s crust, and is actually one of the oldest known elements. It was officially first documented around 1250 by a Dominican friar called Albertus Magnus, but it’s been used for more than 3000 years, going back as far as the bronze age when it was added to bronze to make it harder. It’s a metalloid, which means it’s neither quite metal nor non-metal, and these days its most important use is in the electronics industry.

There are many, many interesting stories associated with arsenic. One of my personal favourites, if that’s the right word, is the story of the Bradford Sweets Poisoning. Back in 1858 a Bradford confectioner known as ‘Humbug Billy’ was buying his mint humbugs from another local character called Joseph Neal. At the time, sugar was expensive so Neal was in the habit of cutting it with something called ‘daft’, a mysterious substance that could contain anything from limestone to plaster of Paris. Neal sent his lodger to the local pharmacy to collect the daft. The druggist was ill, and somehow or other his assistant managed to sell Neal’s lodger 12 pounds of arsenic trioxide (you might imagine this was an expensive error, but arsenic was actually surprisingly cheap: half an ounce cost about the same as a cup of tea).

The mistake went undetected, despite the sweetmaker who worked for Neal suffering symptoms of illness during the sweet-making process, and despite the resulting humbugs looking so different from normal that Humbug Billy managed to buy them from Neal at a discount. Humbug Billy himself promptly became ill after eating the sweets, but nevertheless still sold 5 pounds of them from his market stall that day. Subsequently about 20 people died and a further 200 became ill. To start with the deaths were blamed on cholera, common at the time, but soon they were traced to the sweet stall. Later analysis showed that each humbug contained enough arsenic to kill two people.

This tragic tale led to The Pharmacy Act 1868 and the requirement for proper record keeping by pharmacists. Ultimately it also led to legislation preventing the adulteration of foodstuffs, such as for example, oh I don’t know, sneaking horse into something labelled beef.

Historically arsenic was also used in dyes and pigments. Perhaps most famous was Scheele’s Green – also known as copper arsenite, and invented by Carl Wilhelm Scheele in 1775 – which produced a wonderful green colour that was used to dye wallpaper and fabrics, and was added to paints, children’s toys and even sweets. Many poisonings in Victorian times were linked to toxic home furnishings and clothing. In fact, this probably explains the superstition that green is an unlucky colour, especially for children’s furnishings and clothes: arsenic poisoning being very unlucky indeed. Next time you’re near a baby store, have a look: even today (arsenic pigments now long defunct, thank goodness) you still don’t see that many green things.

One of the most famous people to die from arsenic poisoning was probably Napoleon. Originally thought to have been deliberately poisoned, analysis of his hair samples in 2008 demonstrated that his exposure had been long-term rather than sudden, and was probably due to the lovely green wallpaper and paint decorating the room in which he’d been confined.

Then there’s George III, the famously ‘mad King George’. His episodes of madness and physical symptoms were linked to the disease porphyria, and 2004 studies of samples of his hair also found very high levels of arsenic which may well have triggered his symptoms. Ironically, he may have been exposed to arsenic as part of his medical treatment.

In fact historically arsenic was used to treat many medical complaints. It’s even been used as an aphrodisiac, thanks to the fact that small doses stimulate blood flow. In 1851 it was reported that peasants in Styria, a remote region in Austria, were in the habit of swallowing solid lumps of the stuff that, fortunately, passed through their digestive system relatively intact. However they absorbed just enough to give the women a rosy glow and the men an increased libido – resulting in something of a population boom. Upon hearing about this, British manufacturers immediately began selling arsenic-containing beauty products, including soap and skin treatments, with predictably tragic results.

Thanks to its toxicity arsenic is used in pesticides, herbicides and insecticides, although these uses are gradually being phased out. Despite being notoriously poisonous to most organisms, there are, interestingly, some species of bacteria whose metabolism relies on arsenic. Arsenic turns up naturally in groundwater and is absorbed by plants such as rice, as well as turning up, in the form of arsenobetaine, in mushrooms and fish. Don’t worry though, this particular arsenic compound is virtually non-toxic.

Today gallium arsenide, with the brilliant chemical formula GaAs, is one of the biggest uses of arsenic. It’s a semiconductor, used in the manufacture of many electronic devices, including solar cells. Its electronic properties are, in some ways, superior to silicon’s, so despite its inherent dangers it’s important stuff.

So it definitely has one of the most fascinating histories of any of the elements, and I’ve only mentioned a tiny number of the many, many arsenic-related stories out there.  From the bronze age to the computer age, arsenic has been with us, both friend and foe, and will be with us for a lot longer yet.

So, what’s your favourite element? Tell me and maybe I’ll write about it in a future post!


* betcha said xenon (of course), xylophone, xi and xu if you play Scrabble, x-ray and maybe xylem. Am I right?