Current science topics from a Christian viewpoint

These short articles will be updated approximately every two months. Up to July 2020 they were published in the Tyne Valley Express, a local events and advertising magazine distributed free in the Tyne valley area (some minor changes and updates have been made here); there was no article for May 2020 because the coronavirus pandemic meant the magazine was not produced. The article written for September was not published because the editor disagreed with its contents; you can read an account of this disagreement on the Home Page. Articles after this have not been sent to TVE for publication.

22: November 2020

Following the science?

I’ve just had an article accepted by an international science journal: ‘Distortions, deviations and alternative facts: reliability in crystallography’.  The discussion concerns a particular specialist subject, but what about the reliability of science in general?  The claims that governments are ‘following the science’ in their approach to the coronavirus pandemic leave some people unconvinced (they don’t believe it), while others are confused (they don’t know what it means), and yet others object because they’re anti-science anyway.

Part of the problem is that many people don’t really know what science is and how it works.  It isn’t a set of fixed, unchanging hard facts.  It’s an attempt to make sense of what we observe in terms of theories and models that are as simple as possible but as complex as necessary, providing explanations that best fit the available evidence.  As more evidence is found, science develops and changes these explanations, and sometimes has to throw them away and find new ones.  Scientists don’t know all facts with 100% certainty; our models are imperfect, incomplete, and we have to know and respect their limitations.  We’re constantly learning, and so we have to keep revising our ideas to fit what we find out.

If you’re not sure what I mean by scientific models, two of the recorded talks on this website share some common material and provide more explanation and examples: ‘What makes a scientist tick?’, uploaded recently on the Home Page as a PowerPoint-based video, combines a discussion of science with some autobiography; ‘Science and Christian faith – age-old enemies or natural allies?’ has separate audio recording and presentation slides from 2017 and is on the Previous Events page.

Examples of scientific models that have been discarded or amended include phlogiston – a substance credited with negative weight! – supposedly released into the air during burning, proposed before oxygen was discovered and provided a better explanation of combustion; the earth at the centre of the universe (this was a scientific, not just a religious, belief); the existence of only four elements (earth, air, fire and water); the indivisibility of atoms (that’s what the word ‘atom’ originally meant).  Scientists argued for years about the nature of light (waves or particles?) until aspects of both these theories were combined into a single more complex model that we use today.  And just think how doctors used to treat many ailments by removing blood before its vital function was well understood!

We see the same progress of scientific understanding with the SARS-CoV-2 virus.  When it first appeared about a year ago, it was little understood – it was commonly referred to as ‘a novel coronavirus’.  We now know much more about its structure, behaviour and effects, and are still learning: for example, ‘Long Covid’ is a fairly recently recognised issue.  So it shouldn’t be surprising that scientific advice has changed, such as regarding the usefulness of face coverings or the impact of Covid-19 on younger people; it doesn’t mean that scientists have ‘got it wrong’ or ‘don’t know what they’re talking about’.  The imperfection and uncertainty of our knowledge, together with the fact that biology and medicine are inherently less predictable and more variable than physics and chemistry, mean there’s also room for a range of possible conclusions and opinions, with statistical uncertainties about each of them, so it’s unreasonable to expect a 100% definite and unanimous viewpoint from all qualified scientists anyway.  And don’t forget that all scientists, like the rest of us, are fallible human beings who can make mistakes, far removed from the characters we see in some science fiction.

Add to this that medical considerations have to be weighed alongside psychological, emotional, economic and other competing factors – nobody can claim to be an expert on all of them – and we have the real-life situation in which policy and practice can only be ‘guided by science’ rather than led, driven, or dictated by it.

It’s still the case that we will all be much better served by putting a conscious and reasonable degree of trust in the current – and developing – scientific understanding of our pandemic situation and the advice based on it than in the fiction promoted by too much of today’s social and other media, the sort of stuff that I discussed in the previous article a couple of months ago.  There’s a saying that ‘all truth is God’s truth’; science is an important part of that truth, to be valued alongside other parts.  We may not know all the facts, but our knowledge keeps growing, and fact beats fantasy when it comes to our safety and wellbeing.

21: September 2020

It’s a conspiracy!

Some governments claim they’re ‘following the science’ in their response to the coronavirus pandemic, while others are blatantly ignoring all scientific and health advice in favour of politics and economics.  Unfortunately the public are often left confused and ill-informed, many choosing to follow the ‘alternative facts’ provided by unreliable social media.  These include a range of crazy ideas that have been thoroughly debunked and discredited but still attract attention.

A recent reputable survey of a representative cross-section of Americans found that half of regular Fox News viewers believed the pandemic is being exploited by Bill Gates and other super-rich moguls to gain control of the world by using vaccination as a trick to insert microchips into everyone.  This Matrix-like plot combines features of several so-called conspiracy theories, completely unfounded viewpoints based on anti-authority attitudes and a rejection of objective evidence-based truth.

The ‘world domination’ theory is believed by many who accept that the pandemic and the virus causing it are real, but others think the whole thing is a hoax, with reports of infections and deaths invented by those in authority in order to restrict our freedom, or undermine democracy, or some other evil scheme.  This is conspiracy on a massive scale, with huge numbers of officials, scientists and health workers complicit in the fraud!  Believing it must take an enormous suspension of reason and common sense.  Social media carry lots of claims of supposedly positive Covid-19 test results sent to people who’ve never actually been tested, but these always seem to be at best second-hand reports of what happened to someone else, never a personal experience that can be proved.

We all know that advice on face coverings has been variable and confused, but we can certainly reject any notion that their use is dangerous because it prevents oxygen from reaching our lungs or forces us to breathe our own waste carbon dioxide; these ideas have no supporting evidence, they contradict many years of experience of health professionals using PPE (personal protective equipment), and they are ridiculous in view of the size of atmospheric gas molecules compared with the dimensions of fabric fibres.  A simple cloth face covering is not supposed to protect the wearer from infection (unlike clinical PPE); it reduces the risk of transmission from the wearer, who may be infected but showing no symptoms, to other people, for which there is plenty of scientific evidence.

Untrue claims of danger are also made about vaccination, one of the biggest health improvements and lifesavers ever introduced into medical science.  Unfounded scares about vaccines keep cropping up, rather like outbreaks of the nasty diseases they’re supposed to prevent, and those who spread them usually refer constantly to the same old claims that have long been proved false.  The reality is that the viruses causing diseases such as Covid-19, measles, and even the familiar winter flu are far more dangerous than any vaccine that has been developed, properly tested, and approved for use against them.  Granted, I’d hesitate to accept the new rushed Russian vaccine, but any approved for use in the UK will be as safe and effective as possible, with an insignificant risk compared with the virus.

Among the most ridiculous theories is that C-19 is spread by 5G technology, either by direct transmission or by suppressing our immune system.  I’ve discussed 5G scares in a previous article, but these ideas tend to be supported by claims such as that 5G was introduced in Wuhan, China before anywhere else and just before the start of the pandemic – which simply isn’t true.

I haven’t yet come across any widespread claim that the coronavirus is a result of alien abductions, but it wouldn’t surprise me.  People who accept and promote these conspiracy theories tend to go for a whole package of them as an expression of personal freedom and a rejection of conventional truth claims.  You’ll find them popular, for example, among members of the Flat Earth Society and other odd belief systems.  Unfortunately they’re also far too common among those with some strong forms of religious conviction, including many so-called ‘fundamentalist Christians’ in the USA and elsewhere.  This is not mainstream Christian belief!

A recent virtual conference run by the organisation Christians in Science sought to bring Christian scientific perspectives to the current pandemic.  If you’re interested, you can find video recordings at cis.org.uk/online-conference-god-and-pandemics/.  And I’ve just come across ‘a Christian statement on science for pandemic times’ and have added my own signature to it as a reasoned, balanced summary of what I think is the right approach.  It’s at this address: statement.biologos.org and there are already several thousand signatures.

20: July 2020 (a double-length article)

Going viral

Not much doubt about the choice for topical science this time round!  The ‘novel coronavirus SARS-CoV-2’ that causes the disease Covid-19 has an impact on every single one of us, and there’s a huge amount of misinformation, ignorance (even at high levels) and resulting fear making the situation worse.  So let’s take a look at some facts about viruses in general and this current troublemaker in particular.

A virus is a submicroscopic infectious agent that has no independent life of its own but has to invade the cells of a living host in order to replicate (make copies of itself) and so continue to exist: it is a parasite.  The host can be a human, animal, plant, or lower form of life such as a bacterium.  Viruses and bacteria (among other agents) can both cause disease and death, though by no means all of them do, but they are quite different beasts: bacteria are independent single-cell organisms, far more complex than viruses, and big enough to be seen through an optical microscope.  Bacterial infections include salmonella (food poisoning), tuberculosis (TB), and cholera.  The Black Death of the 14th century, sometimes compared to the 1918 flu pandemic and the present problem, was in fact not viral but bacterial.

A virus, by contrast, is essentially just a relatively small piece of genetic material (double-strand DNA or the simpler single-strand RNA) wrapped up in a protective coating of protein; some viruses, including SARS-CoV-2, also have a layer of lipid (a kind of fat) on the outside – that’s why thorough handwashing with soap is so important: it really does destroy the virus.  Spiky parts of the protein attach to a host cell and enable the virus to get inside, where the RNA (or DNA) hijacks the cell’s normal operation and forces it to produce copies of the virus instead of copying its own DNA.  The virus population then leaves to continue spreading, usually destroying the host cell.

There are actually millions of different kinds (families) of virus, a few thousand having been studied and described in detail, but only a very small number are harmful to humans; others attack different species of animals and plants (such as foot-and-mouth disease or tobacco mosaic virus, the first one to be identified at the end of the nineteenth century).  Some viruses that are harmful to bacteria (bacteriophages) can even be used as medicines instead of other antibiotics.  Most viruses are harmless, and some of them, like many bacteria (probiotics), are beneficial and important for our health – we live in harmony with them.  Examples of human viral infections are measles, norovirus, herpes simplex (a cause of cold sores), HIV, and Ebola.  The coronavirus family causes some common colds as well as SARS-related respiratory diseases including Covid-19; flu, by contrast, is caused by a different family of viruses.

The two factors that set the danger level of a particular harmful or ‘pathogenic’ virus are these: how infectious is it (how easily is it passed on), and what damage can it do?  Some well-known viruses are rather infectious but do little harm, like most colds; others can be fatal but do not spread easily, like rabies.  Unfortunately SARS-CoV-2 has a high infection rate for close contacts and is potentially life-threatening for certain vulnerable groups of people.  Hence the lockdowns and other severe measures adopted worldwide to reduce its spread while effective medical treatments are sought.

One of the problems with viruses is that they can change their form.  The change may be in the genetic sequence of the RNA (a mutation, a process that occurs naturally in all RNA and DNA), and this alters the behaviour and effects of the virus.  A virus can also change the structure and shape of its protein coat, which is rather like changing your clothes or going in disguise.  These changes make viruses harder to treat effectively; new flu vaccines have to be developed every year for new virus strains.

How might viruses be treated?  As with all diseases, we can focus on cure and/or reducing the physical results and symptoms, and we can aim for prevention.  The massive efforts of the NHS in recent months have been most visible in the treatment of those who are already infected, especially where the symptoms are severe.  Here the aim is to provide support and relieve symptoms while the body’s own natural immune system fights the invader, but help might also be given through specific antiviral drugs that work with the immune system, especially if this is struggling.  Effective antivirals have been developed for some viruses including HIV and hepatitis C.  The recent clinical trial that showed the anti-inflammatory dexamethasone to provide significant help for severely affected Covid-19 patients is welcome news and many other potential drugs are being tested, but large numbers have already been tried and rejected as ineffective or even dangerous, including the anti-malaria treatment once promoted strongly in the USA.  One important fact here is completely certain: antibiotics are not an effective treatment for a viral infection and should not be taken, as this reduces their proper usefulness against bacteria.  Nor do antibacterial agents in cleaners kill the virus; just use plain soap and disinfectants.

The big aim, of course, is to do something serious, and preferably medical, about prevention.  The emergency imposition of lockdowns and other physical restrictions can provide only a temporary solution.  The usual method to prevent, or at least drastically reduce, viral infections is vaccination, a procedure that was first used against smallpox, a disease that has now been completely eradicated as a result: a great success story.  Vaccination works by prompting the body to produce a defence against a particular virus by presenting it with something similar enough to it but without the danger.  This can be a modified form of the virus (for example, the protein coat but with no RNA inside), or a weakened version, or a closely related but harmless virus from the same family.  Once the body has been trained to recognise the enemy, the defence mechanism will quickly kick into play if the real virus comes along later.  The immune system can be trained to recognise and attack either the protein or the genetic component, and both approaches are being used in the many attempts currently under way around the world – a huge effort of international cooperation, marred by only a few selfish agencies and governments wanting to keep results and sales profits to themselves.

The fight against SARS-CoV-2 and the Covid-19 it causes would be more effective if everyone understood and believed the basic science and its medical implications.  Unfortunately, as with many aspects of life, significant minorities refuse to accept facts and instead pick up and develop ideas that have no basis at all in real evidence.  We live in an age of conspiracy theories, total fantasies that are spread mainly through social media.  Among those that are sadly relevant to the current pandemic are that the whole virus spread is a hoax designed to impose greater control over our lives, and that the virus is spread by 5G technology (related to the baseless 5G cancer scare I’ve addressed in a previous article), perhaps even as a deliberate manmade bioweapon.  There’s also significant resistance to vaccination, even if and when we do have an effective vaccine, particularly in some supposedly advanced and intelligent societies: the so-called antivaxxers continue to believe discredited reports of links between vaccines such as MMR and problems like autism, reports that are known to be based on deliberate scientific fraud.  Some viruses are vastly more risky than any approved vaccination.

As an aside, computer viruses are so called because they behave rather like biological viruses: they invade a host, cause some kind of nuisance or serious harm, replicate themselves, and seek to infect other hosts through electronic contact, often exploiting human ignorance.  They are variously treated by disinfection programs and by front-line prevention offered by antivirus software.

The expression ‘going viral’ describes a piece of information, an idea, a visual image, or something else that – often to the surprise of the person generating it – is widely and rapidly passed around to attract the attention of others who, in turn, transmit it further so that it quickly becomes widespread and often hits the headlines in competition with major news events.  Recent examples include the support for ‘Captain Tom’ in his NHS fundraising garden walks and the responses to the killing of George Floyd leading to Black Lives Matter protests around the world.  While it is good for some things to go viral in this way, the spread of ridiculous conspiracy theories that undermine attempts to tackle the Covid-19 pandemic is itself a social viral threat adding to problems of the original biological viral threat.  The world would be better off without both of these pests.

19: March 2020

It could be you

…but it probably won’t be.  In strict mathematical terms, your chance of winning the National Lottery jackpot is 1 in 45,057,474.  It’s actually easy (with a calculator!) to work this out.  There are 59 numbered balls, so if you pick one at random, then another (not the same one again), and so on for six balls, the total number of possible sets of six is 59×58×57×56×55×54.  This includes many sets with the same six numbers in different orders (like 3-10-18-29-42-55 and 18-42-10-55-3-29); there are 6×5×4×3×2×1 = 720 such sets, which are all identical if the numbers are rearranged in increasing order like the winning lottery ball numbers.  So the number of unique possible combinations of the six numbers is 59×58×57×56×55×54 divided by 720, and this is 45,057,474.  Only one of these wins the jackpot.
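For anyone who’d like to check the arithmetic themselves, here’s a minimal sketch in Python (my illustration, not part of the original magazine article):

```python
from math import comb, factorial

# Ordered ways of drawing 6 balls from 59 without replacement
ordered = 59 * 58 * 57 * 56 * 55 * 54

# Each set of six numbers appears in 6! = 720 different orders
print(ordered // factorial(6))  # 45057474

# The same answer from Python's built-in combinations function
print(comb(59, 6))              # 45057474
```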

If you toss a coin and see which side ends on top, most people will readily agree that heads and tails have an equal probability of 50% (we’ll discount the tiny probability that it lands on its edge and doesn’t fall over).  What’s your prediction after three heads in a row?  Some people think it’s more likely to be tails next time to “even it up”, but in fact it’s still 50% for each of heads and tails – every throw is completely separate and independent of the others.  Mind you, after a series of 25 heads, I’d put my bet on another head, because I’d suspect it’s a double-headed coin; the probability of 25 heads in a row from a normal coin is about 1 in 33 million (that’s 0.5 multiplied by itself 25 times), nearly as unlikely as winning the lottery jackpot.
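The coin arithmetic is just as easy to verify; again, a quick illustrative sketch:

```python
# Probability of 25 heads in a row from a fair coin:
# 0.5 multiplied by itself 25 times
p = 0.5 ** 25
print(p)             # about 3e-08
print(round(1 / p))  # 33554432, i.e. roughly 1 in 33.6 million
```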

Of course, not all probabilities can be worked out exactly from a mathematical formula.  Many are estimates based on experience, and these need to be revised as that experience grows.  Just look at the series of “once in a lifetime” extreme weather events in our own country, let alone the rest of the world.  The probability of such storms is now considered rather higher than it used to be.  Changing our estimate of probabilities applies also to updating the rain forecast during the day, and to things like sports betting odds.

Estimated tiny probabilities and their opposites – near certainties – are particularly tricky to deal with, leading to striking comparisons like a room full of monkeys with word processors producing a Shakespeare play, or a tornado in a scrapyard generating a Boeing 747.  Part of the trouble is that, however improbable it may be, if something is true or has happened, then it’s a fact – the probability becomes irrelevant now.  This sort of argument occurs in debates about the minuscule random probability of the laws of nature and fundamental physical properties being “just right” to support life on earth.  It’s happened and we’re here!
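To get a feel for how tiny the monkeys’ chances are, here’s a back-of-envelope calculation for one short phrase; the 27-key keyboard (26 letters plus a space bar) is my own simplifying assumption, not the article’s:

```python
# Chance of a monkey typing "to be or not to be" in one random attempt,
# assuming a 27-key keyboard (26 letters plus space) hit uniformly at random
phrase = "to be or not to be"
chance = (1 / 27) ** len(phrase)  # 18 keystrokes
print(chance)                     # about 1.7e-26, odds of roughly 1 in 10**26
```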

The atheist biologist Richard Dawkins wrote a whole book (Climbing Mount Improbable) in 1996 to demonstrate the plausibility of evolution based on random genetic changes and the constraints of natural selection, in response to claims that this has an extremely low probability of producing the complexity of life as we know it and so “could not have happened”.  However, just a decade later, in The God Delusion (2006), his supposedly knock-out blow against the existence of God is based on the exact opposite argument: that God (even Dawkins’ distorted version) would be such a complex being that the probability of his existence is extremely small, therefore God almost certainly doesn’t exist.  He was somewhat disappointed that his brilliant logic did not lead to the widespread conversion of religious believers to atheism!  The same argument was probably(!) the basis of the 2008–9 “atheist bus campaign” in London and elsewhere, with the slogan “There’s probably no God; now stop worrying and enjoy your life”.  Responses to both the book and the buses have been robust, in print and in live debates between Dawkins and Christians.  The main weakness of the crude probability argument, apart from the sheer difficulty of providing a sensible probability estimate in the first place, is that it simply doesn’t apply to something that either is true or isn’t and has no parallels.  For that we need to look, not at statistics, but at evidence – of all kinds: scientific, historical, documentary, testimonial (as in a court of law), and personal.  Statistics and probabilities have their rightful place, but they aren’t the only approach to weighing evidence and coming to conclusions about what’s likely to be true, particularly for something that claims to be unique.

And if you’re feeling overdosed on statistics by now, don’t worry.  There probably won’t be any in my next article!

18: January 2020

Not likely!

“There are three kinds of lies: lies, damned lies, and statistics.”  Mark Twain popularised this saying, which he wrongly attributed to Benjamin Disraeli.  (The connection between lies and Prime Ministers is another subject that we won’t explore here!)  Statistics, with which our previous article finished, can certainly be misleading, either by deliberate misuse or because many people just don’t understand it very well.

In mathematical terms, the probability of something happening or being true can be expressed as a number between zero (0), totally impossible, and one (1), a complete certainty; alternatively we can multiply everything by 100 to give a probability between 0 and 100%.  When it comes to dice, throwing just one gives six possible results with equal probabilities of 1/6.  If you repeat this many times over, you expect to get the numbers 1–6 roughly equal numbers of times, but how nearly equal should they be and how much of a deviation should make you suspect dice loading?  The answer depends on how many throws you’ve made and can itself vary if you repeat the whole experiment again.  Statistics handles such questions with formal concepts like probability distributions, standard deviations, and confidence limits.

Throwing two dice together gives 36 possible pairs of numbers (6×6), the combined total ranging from 2 to 12.  The eleven different totals aren’t equally probable because there’s only one way of making 2 and one way for 12, but 6 ways of making 7, which is the most likely total.  So the probability distribution (flat) for the result of a single throw is different from that (with a central maximum) for two dice throws added together.  Statistics recognises many different distributions, some of which occur often in scientific research, affecting experimental measurements and results calculated from them.
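The 36 pairs are easy to enumerate, and doing so shows the peaked distribution directly; a short sketch purely for illustration:

```python
from collections import Counter

# Count the ways of making each total from the 36 equally likely pairs
ways = Counter(a + b for a in range(1, 7) for b in range(1, 7))

for total in range(2, 13):
    print(total, ways[total], round(ways[total] / 36, 3))
# 2 and 12 can each be made in only 1 way (probability 1/36),
# while 7 can be made in 6 ways (6/36 = 1/6), the peak of the distribution
```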

Here are some situations in which statistics are misused or misunderstood.  The first is league tables for things like school exam and test results, university research assessment exercises, and hospital performance.  A lot is made of positions in a league table and how they change from one occasion to another, annually or at other intervals.  This concern ignores at least two factors that can seriously affect the meaning and value of such comparisons.  One is that the various contributions to the league tables involve putting numerical scores on things that aren’t always precisely measurable, and usually no indication is given of the level of uncertainty in these numbers.  So each “total score” that is given as a single number should really have what statisticians call a “confidence interval” around it, such as 56±3 to suggest the answer could easily range from 53 to 59.  In most cases sensible confidence intervals (uncertainties in the scores) are considerably larger than the differences in scores across whole swathes of the table, so the positions themselves are subject to quite large uncertainties.  The other is that judging criteria are often changed from year to year, so movements up and down don’t necessarily reflect a real change in performance.
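A toy simulation shows how much league positions can churn when the measurement uncertainty swamps the real differences; all the numbers here are invented purely for illustration:

```python
import random

random.seed(1)

# Ten hypothetical schools whose "true" scores differ by only half a point,
# measured with a random error comparable to the 56+/-3 example above
true_scores = {name: 55 + 0.5 * i for i, name in enumerate("ABCDEFGHIJ")}

def league_table():
    noisy = {name: s + random.gauss(0, 1.5) for name, s in true_scores.items()}
    return sorted(noisy, key=noisy.get, reverse=True)

print(league_table())  # one "year's" rankings
print(league_table())  # the next: big moves, yet nothing real has changed
```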

Similar criticisms can be made of opinion polls, or at least of the way they are reported and interpreted.  The main problem here, quite apart from whether people are telling the truth when answering questions, is that an opinion poll surveys a small proportion of the entire population and is supposed to give the same result as if everyone were asked.  Statisticians call this sampling a population (or distribution) and it’s vital that the sample is genuinely representative if the results are to be valid.  The size of the sample is also important: the smaller it is, the larger the uncertainties in the results.  To be fair, the reporting of opinion polls before an election has improved in this respect in recent years, with explanations often given of the likely uncertainty levels and sample sizes, but small changes are still seized on by some of the media.
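The standard formula for sampling error makes the size effect concrete; a rough sketch, assuming a genuinely random sample (which real polls can only approximate):

```python
from math import sqrt

# Approximate 95% margin of error when estimating a proportion p
# from a random sample of n people
def margin_of_error(n, p=0.5):
    return 1.96 * sqrt(p * (1 - p) / n)

for n in (100, 1000, 10000):
    print(n, round(100 * margin_of_error(n), 1), "percentage points")
# 100 -> ~9.8, 1000 -> ~3.1, 10000 -> ~1.0: a hundred times more people
# only shrinks the uncertainty tenfold (it goes as the square root of n)
```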

Extremes of probability – near-certainties and very tiny probabilities – are particularly prone to misuse, partly because most people struggle to understand extremely small and large numbers that have to be written in special notation such as 10^82, which means 1 followed by 82 zeros, very approximately the number of atoms estimated to be in the observable universe.  I’ll give you just one example to end what’s been a rather dry article this time – congratulations if you’ve stuck with it so far – and we’ll look at some more misunderstandings next time in, I hope, a lighter vein.  The first British astronaut, Helen Sharman, recently said “Aliens exist, there’s no two ways about it.  There are so many billions of stars out there in the universe that there must be all sorts of different forms of life.”  In a previous article we saw how the probability of extra-terrestrial life can be seen as the result of multiplying together the likely number of planets capable of supporting life (probably a very large number) and the probability of life existing on any one planet (an extremely small number).  Helen Sharman has taken the first of these and ignored the second; for her, the result is essentially a probability of 100% (“no two ways about it”).  I think her expertise in aeronautics is probably greater than her grasp of statistics (but I’m not going to put a figure on the probability!).

17: November 2019

Are you sure?

By the time you read this, the UK will have left the EU – or will we?  There’s a general election coming – or is there?  Politics, not only in Britain, hasn’t experienced such uncertainty for a long time, and predictions of what will happen even in the next week or month are completely unreliable (which doesn’t stop people making them, of course).

Uncertainty is a fact of life that we encounter all the time.  It has its place in science too, contrary to the popular image that science deals in undisputed concrete facts: “Science has proved…” is supposed to be a convincing line in advertisements, arguments at the pub, and attempts to dismiss something you don’t personally believe in.  But it just isn’t so.

After Sir Isaac Newton and his contemporaries developed theories of motion, gravity and other forces, and other famous scientists later provided explanations of electricity and magnetism, the universe was believed to operate like perfect clockwork, governed by laws that could be written down in precise mathematical form.  That view was shattered by experiments around a hundred years ago demonstrating that, when you make observations at the very tiny level of atoms and molecules, strange things happen.  Instead of definite confident predictions, we have to speak of probabilities and a range of possible outcomes.  This was the birth of quantum theory and the introduction of lots of new ideas and new words to understand and express a reality that defies precise description, including the Heisenberg Uncertainty Principle that provides an estimate of the degree of precision (reliability) we can expect in a physical measurement: “Heisenberg probably rules OK” used to be a popular joke.

Fortunately for the vast majority of us, in normal everyday life the Heisenberg uncertainty is so incredibly small that it has no practical importance for us in our understanding of the behaviour of things like cars, food, and footballs.  It does matter a lot, however, for those working at sub-microscopic dimensions, including many in molecular biology, genetics, chemistry, particle physics, and microengineering. 
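To put rough numbers on ‘incredibly small’: the uncertainty principle says Δx·Δp ≥ ħ/2, so the minimum uncertainty in an object’s velocity is ħ/(2mΔx).  Here’s an illustrative comparison (my worked example, not from the original article):

```python
hbar = 1.055e-34  # reduced Planck constant, in joule-seconds

# Minimum velocity uncertainty dv = hbar / (2 * m * dx)
def min_velocity_uncertainty(mass_kg, dx_m):
    return hbar / (2 * mass_kg * dx_m)

# A 0.45 kg football located to within a millimetre
print(min_velocity_uncertainty(0.45, 1e-3))      # ~1e-31 m/s: utterly negligible

# An electron located to within an atom's width (1e-10 m)
print(min_velocity_uncertainty(9.1e-31, 1e-10))  # ~6e5 m/s: enormous
```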

For example, over the years electronic components have been steadily miniaturised, and a current smartphone has vastly more processing power than the room-sized mainframe computers I used in my first research.  Recently there has been much talk of so-called quantum computers, in which this miniaturisation is taken to its extreme and the working components are individual quantum-scale objects such as atoms, ions, and molecules.  Major success has just been claimed in rapid calculations on a prototype quantum computer that would supposedly take many years on even the fastest conventional computer.  The trouble is that quantum computing is subject to quantum-level random behaviour and, in its simple form, is therefore unreliable.  It’s likely to be some years before this development becomes a practical reality.

One of Albert Einstein’s most famous quotations is “God does not play dice with the universe”.  While this has been used both by atheists and by their opponents to support their viewpoints (Einstein was enigmatically neither a conventional religious believer nor an atheist), the emphasis is surely on the dice and the picture of randomness and chance, almost like gambling, that is conjured up by the counter-intuitive weirdness of quantum theory.  Einstein didn’t like quantum theory and never really accepted it, although he couldn’t convincingly argue against it and even some of his own work that earned him a Nobel Prize provided evidence to support it.  We can’t explain scientific observations at the molecular level without quantum theory, and yet we know our current understanding in these terms is incomplete and seriously lacking, because the two big current theories of physics – quantum theory describing the microscopic scale, and relativity describing the cosmic scale – just don’t fit together properly.  The so-called Theory of Everything (used as the title for the 2014 biographical film about Stephen Hawking) still eludes scientists as a single explanation covering both extremes and everything in between.  Science can’t get rid of uncertainty but has to encompass it; it’s an integral part of how things are.

The mathematical language of uncertainty is the area of statistics, with terms like probability, deviation, and expectation.  It finds practical application in many modern technological developments such as image recognition and extracting useful information from “noisy” data, where random influences mask low-level measured signals, as in radio telescope images or low-light photography and video.  Interestingly, the statistical theory that is most widely and successfully used in these areas was developed by an 18th century Presbyterian minister, Thomas Bayes, and is known as Bayesian statistics.  It describes probability in terms of a degree of belief in some proposed explanation, based on initial information and then modified by further information that becomes available.  This is just one of many examples throughout history of a positive interaction between Christian belief and scientific understanding and application.
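To give a flavour of Bayesian updating, here’s a small worked example with invented numbers: a screening test for a condition affecting 1% of people, with a 90% detection rate and a 5% false-alarm rate.

```python
# Degree of belief before the test: 1% of people are affected
prior = 0.01
p_pos_if_yes = 0.90  # chance of a positive result if affected
p_pos_if_no = 0.05   # chance of a positive result if not affected

# Bayes' theorem: P(affected | positive)
#   = P(positive | affected) * P(affected) / P(positive)
p_positive = p_pos_if_yes * prior + p_pos_if_no * (1 - prior)
posterior = p_pos_if_yes * prior / p_positive
print(round(posterior, 3))  # 0.154: the initial 1% belief is revised to ~15%
```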

16: September 2019

The environment’s for everyone

If you live in today’s world, whatever your age, race, sex or beliefs, the environment affects you, and you affect it, whether you like it or not.

This year’s Holiday Club for children, run jointly by the Prudhoe and Stocksfield churches in the first week of the school holidays, had the title “Transformers”.  Alongside the Bible-based teaching, craft construction and junk modelling activities, games and food provided for over 100 5–11 year-olds – together with the involvement of some older children as helper Apprentices, and a first-time toddler group and parent café – we included topical environmental themes.

For 10 minutes each morning we acted out the production of a children’s TV programme with themes including waste and pollution, energy and other resources, water, food, and biodiversity in the context of caring for the good world God has made and entrusted to us.  We talked about reducing, reusing and recycling waste, especially plastics.  Genuine scientific demonstrations, some prepared beforehand, included a simple filter for dirty water, how paper is recycled, and powering a torch with a battery made from a pile of pennies, bits of cooking foil and cardboard with a dose of salt and vinegar.  And can you explain why an orange floats in water – until you peel it?

These topics were easily understood and responded to by the children, who have a strong basic concept of fairness and could see that selfish and greedy western lifestyles are harming the environment and the poorest people in the world are affected the most.  Some will have thought about how they might change their own habits or get involved in environmental projects when they get back to school, or support relevant campaigns.

Of course, for consistency, the Holiday Club organisers had to look carefully at practical arrangements for the week: out went single-use convenience plastics and a “reduce, reuse, recycle” policy was followed.  The adults, too, were seriously challenged to be Transformers in our own lifestyles.

One environmental topic we mentioned only briefly at the Holiday Club was climate change, because we thought it too complex for the youngest children, and also because they would be less able to do something about it themselves.  There’s no doubt, however, that it is increasingly hitting the news, even pushing Brexit out of the headlines.  This summer has brought yet more weather extremes across the world and unusual features with some records broken for both temperatures and rainfall.  It has been reported, and these are definite facts rather than opinions, that the 10 hottest years in the UK since records began have all been since 2000.  We’ve seen the impact in strain on resources and infrastructure, including a major national power failure and a near disaster for a Peak District reservoir dam.  It’s recently been suggested that climate change is increasing the incidence of aircraft flight turbulence – ironic, given that air travel is a significant contributor to greenhouse gas emissions.  Some would say that nature is fighting back in self-defence.

Despite continuing scepticism from those who refuse to accept the overwhelming scientific evidence because of self-interest, there is essentially no doubt that major climate change is a reality and that it is largely caused by human activity.  Scaremongering isn’t going to change people’s habits enough, it seems, so we have to hope that technological advances, such as further reductions in the relative cost of green energy and the development of more environmentally friendly travel, will be rapid and effective; we also need greater investment in these and public demand for them.

We’re approaching the traditional season for harvest festivals.  Although many people no longer appreciate our dependence on the earth’s land resources and their produce, it is still real; indeed, with current environmental concerns, the relevance and importance should be clearer.  Many churches are combining the traditional harvest themes with ecological issues.  For some years international Christian aid agencies such as TearFund have provided useful up-to-date material for use in harvest services, and increasingly now the EcoChurch initiative (ecochurch.arocha.org.uk) is attracting attention and involvement.  My own church, Stocksfield Baptist, for example, has recently joined the EcoChurch movement and our all-age harvest service on the first Sunday in October will follow this theme.  I expect there will be a number of others in the area too.  EcoChurch includes commitments to reduce waste, use green energy (and less of it; not all churches are old and cold!), and follow the “Reduce, reuse, recycle” principles.  What we seek to teach our children we need to adopt for ourselves.

The environment’s for everyone – and it’s everyone’s responsibility.

15: July 2019

Is anyone out there?

There has long been interest in the question whether we are alone in the universe or whether there is life, intelligent or not, anywhere other than on our planet.  The Starship Enterprise’s expeditions “to boldly go” and find other species and ET’s visit to Earth are among many contemporary media examples, but there are earlier well-known novels by H G Wells and Jules Verne, and ideas about extra-terrestrial life have been discussed for many centuries, stretching back at least to the time of Greek philosophers hundreds of years BC.

The modern scientific Search for Extra-Terrestrial Intelligence (SETI) began about 60 years ago and is largely based on looking for evidence of life on other planets and for possible attempts at communication by alien beings.  It is costly and laborious, with no certainty of ever achieving any positive results, and has generated much more interest from the general public than from those who are called on to fund it.  Not all SETI interest is science-based, of course; one famous physicist said that reports of UFOs are more a result of human non-intelligence than non-human intelligence!

Organic carbon-based life has rather strict requirements for its environment, especially regarding the atmosphere and temperature range of liquid water, and a planet capable of supporting life has to orbit its sun in a so-called “habitable zone” (or Goldilocks zone – “just right”).  Until quite recently little was known about planets – even whether they existed – beyond our own solar system, but it is now reckoned, from increasingly sensitive scientific observations, that planets are actually rather abundant in our galaxy (and presumably in others), and growing numbers are being detected that could, in principle, fulfil the necessary conditions for life.  Some physicists and astronomers think, on this basis, that there are probably many, many potential places in the universe that could harbour life, so it’s likely to be out there somewhere.

The trouble with this argument is that the probability of finding life elsewhere, in purely statistical terms, depends on two important factors multiplied together: the number of possible life-bearing planets, and the probability of life existing and developing on any one planet.  The first of these may be huge, but the second, according to our understanding of the appearance and history of life on earth, is extremely tiny.  Because of this, in contrast to the optimistic physicists, the general view of biologists is that extra-terrestrial life is unlikely.  The result of multiplying a huge number by a tiny one, both of them very vague, is quite unknown.  So the SETI people just keep on looking, and the film-makers continue to exercise their fantasy and imagination about “life, but not as we know it”, unrestricted by the problem of the vast distances involved and the consequent hugely long times required for conventional travel and even for communication.
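The indeterminacy is easy to see numerically: multiply a huge guess by a tiny guess and the answer can land almost anywhere.  The figures below are placeholders for illustration, not serious estimates.

```python
planets = 1e20           # possible life-bearing planets: huge, but a guess
p_life = 1e-30           # chance of life arising on any one: tiny, but a guess
print(planets * p_life)  # 1e-10: life elsewhere looks vanishingly unlikely

p_life = 1e-15           # nudge the tiny guess by a few orders of magnitude...
print(planets * p_life)  # 100000.0: now the universe looks full of life
```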

What if we do ever find intelligent life elsewhere?  It raises questions that are not only scientific, but also sociological, political, moral and religious.  It is interesting that, according to surveys, the majority of people with no religious faith think that it would cause a crisis for world religions, while the vast majority of believers see no threat.  We would all, undoubtedly, suffer a severe culture shock!

One of the biggest questions to arise from all this is what it really means to be human.  The same question crops up out of other modern scientific pursuits, especially in psychology and in neuroscience, investigating the relationship between mind and brain.  It is temptingly simple in these areas, as elsewhere in science, to move from an observed connection to an assumption of explanation and to use words like “only” and “nothing but”.  We’ve previously looked at issues and concerns raised by Artificial Intelligence (AI), and these include the same question of the nature of true humanity.  Announcing a recent award of £150 million to Oxford University for research that will include exploring the ethics of AI, the donor Stephen Schwarzman said it was “important for people to remember what being human is”.

Part of the Christian answer to the question comes from the Biblical description “made in the image of God”.  This has many applications; here they include characteristics of rationality, intelligence, creativity, morality, relationship, and responsibility.  Many of the problems we face today come from a failure to respect and achieve these characteristics, falling short of true humanity, marring the image of God.  Would we treat extra-terrestrial forms of life, whatever they may be, as badly as we do other life already known to us including other members of our own species?  Sadly, experience suggests we would.

A 1989 Calvin and Hobbes cartoon said “Sometimes I think the surest sign that intelligent life exists elsewhere in the Universe is that none of it has tried to contact us”.  What is intelligence anyway?  The Bible prizes another God-given quality more highly: wisdom.  We could do with more of it in our crazy world!

14: May 2019

There is no Planet B

According to a recent survey, the UK’s top desirable neighbour is Sir David Attenborough.  If he did live next door, you’d probably find yourself often discussing the environment.  Just the other week he presented a BBC documentary on ‘Climate Change – The Facts’, with dire warnings about the disastrous effects we face unless we take drastic action to reduce the production of carbon dioxide, methane and other greenhouse gases.  In this he joins the almost unanimous opinion of expert climate scientists worldwide.  Attenborough’s popularity, even when saying uncomfortable and unpalatable things, can be seen from the flood of protests when he was criticised on TV by Richard Madeley after the documentary.

The topic is certainly in the news currently, with the Extinction Rebellion protests on the streets of London and elsewhere, and their earlier high-profile disruption of a Brexit debate in Parliament.  The School Strike 4 Climate movement started by Swedish teenager Greta Thunberg last year has been joined by many, with much support from some of their families, teachers, and leading politicians; this article’s title is one of their slogans.  Environmental campaigns are also attracting well-known actors and musicians.

Climate change, while being particularly prominent, is by no means the only environmental problem we currently face.  A marked decline in insects, including bees and butterflies, and many other wildlife species is a consequence of human activity such as intensive farming methods, wasteful fishing practices, and gross international inequalities in food production and consumption.  Mike Pratt wrote about the loss of insects in his article in the last TVE, together with human responsibility for it.

I’ve just returned from a science conference at which the first talk was about the PET-digesting enzyme that was discovered last year (you read about it here!), and now scientists are trying to modify this genetically to make it work on a wider range of plastics.  It would certainly help!  Another speaker, presenting new research on the science of battery materials for practical applications, was pointedly asked by someone in the audience if she drove an electric car; the answer was that she has a hybrid.

Meanwhile highly controversial plans for the development of opencast coal mining have been hotly contested legally for a site at Druridge Bay on the Northumberland coast, and more recently approval has been sought for similar plans in our more immediate local area near Throckley.  The Druridge Bay (Highthorn) scheme has gone right up to the High Court and to two successive government environment ministers, and a delayed final(?) decision is due during May according to the latest information.

I was pleased to hear last month that our local Newcastle University stands very high up in an international league table of universities judged by their environmental impact, taking in teaching, research, policies and practice; the use of 100% green electricity, rapid progress in reducing investment in fossil fuels, an emphasis on recycling, and a strong policy of sustainable food purchasing all contribute to this success, and are institutional models for the sorts of things we can all contribute individually.  Did you know that many people aimed to give up, as far as possible, single-use plastics for Lent this year?  It makes a change from chocolate and alcohol!

If you’re interested in finding out more, you may like to come to a day conference I’m organising on Saturday 18 May in Durham.  The title is ‘Stewards of Creation: Christian perspectives on environmental issues’, but it’s designed for a general audience and no particular scientific or religious background is required or expected.  There’s a small fee (with a reduction for students), but this includes lunch.  You can see further information on our website bigquestions-anyanswers.org, including a link for buying tickets in advance (before 13 May) for catering purposes.  The topics to be presented and discussed include nature conservation, sustainable living, and our use of mineral resources – all very relevant to the issues mentioned above.  And we’ll be asking, and trying to answer, the fundamental questions of why we should bother anyway, and what we can do about it.  What you do personally really can make a difference!

(Note added later: the recorded talks from this conference are available on this website and also at cis.org.uk)

13: March 2019

Artificial intelligence: the biggest science scare of all?

What do you most fear for the world’s future?  Nuclear war, environmental catastrophe, terrorism, alien invasion, asteroid collision, genetic modification – they’ve all been written about and explored in films and other popular culture.  But for many people, the big scare is artificial intelligence (AI).  While we benefit enormously from advances in automation and depend increasingly on rapid easy access to information, there are those who fear that it will grow out of human control.  Will computers, robots and other machines take over the world?

A popular image is the android robot that eventually becomes indistinguishable from a human being but has greater physical and mental capability.  It’s perhaps one of the most common fears about AI developments and was a major ingredient of the writings of Isaac Asimov, probably my favourite science fiction author, with his ‘Three Laws of Robotics’.  The 2001 Spielberg film AI explored this with the story of a robot substitute for a young boy, programmed with the ability to love, with disturbing consequences.

Other fantasy has featured robots and machines resembling humans to a greater or lesser extent but easily told apart from them, from the humorous antics of Short Circuit (1986) to the cyborg menace of The Terminator (1984 onwards) deriving its terror from its power and seeming indestructibility.  The Machines of The Matrix Trilogy (1999–2003) didn’t need to look human: they dominated by subjecting humans to the artificial reality of a cosmic computer simulation.

Outside fiction, nobody has yet come anywhere near developing a convincingly humanoid machine.  The world actually contains many, many robots, but most have shapes designed to suit their particular tasks and purposes, like the multi-jointed arms of production lines such as those in car factories.  This form of AI, with a specific focussed application, is certainly of concern to many because they see their livelihood threatened by such machines that can do their jobs more efficiently and accurately.  And it doesn’t just threaten manual workers: some delicate surgery is now being done successfully by robots, for example, and financial stock dealing is increasingly being done by computer programs.

I went to a ‘Christians in Science’ conference in November on the topic ‘What it means to be human in an age of machines’; it tackled a number of issues connected with AI, including ethical as well as technological matters.  One speaker, a leading computer scientist, gave his opinion that robots, of whatever design and appearance, are not the greatest AI threat we face.  We should instead be concerned about what is known as ‘Big Data’, the collecting together and analysis of masses of information that can be put to all sorts of purposes.  We constantly give away information about ourselves through our electronic activities, whether it be internet searches, online shopping, store loyalty cards, or payment methods such as credit cards and PayPal.  Have you ever thought how much about you is known by the likes of Google, Amazon and Facebook?  And what will they do with it all?  The Chinese government is well known to tap into the country’s centrally controlled main social media service.  It could never happen here… could it?  Electronic and digital security – personal as well as national – is one of the major problems of our time.

Artificial Intelligence is, of course, rather like many other areas of science and technology, such as nuclear energy, medicinal drugs, transport systems, communications and the internet.  In itself it is morally neutral: it is the development and practical application of underlying scientific knowledge about how things work.  But the applications can be for good or bad, and sometimes it can be rather ambiguous.

The potential benefits of AI include greater efficiency and productivity, safer operations, more leisure, and a generally better quality of life across the world.  But this won’t happen without conscious deliberate decision based on a desire to do good by those who have it in their power to do so.  Wrongly used, it can increase inequality, exploitation, and disadvantage for many.  Those who make the decisions need to be held to account and not allowed to hide behind huge faceless organisations, corporations, and government agencies.  We’ve seen a lot of blame shifting and heard many lame excuses from powerful people in this area recently.

Just think what you’re doing when you use social media and the internet, or swipe a plastic card across a reading device.  What are you telling someone about yourself, and what might they do with the information?

12: January 2019

Science scares 2: what’s in a name?

Last time we looked at some scare stories that arise from misunderstanding of science.  This time we focus on just two words that rightly raise some concerns but often generate more fear than they should because of their associations in many people’s minds.

The first is ‘nuclear’.  The really negative image here is of nuclear weapons: their threat hung over the years of the Cold War between the western nations and the Communist USSR and continues today in political tensions and a desire to prevent their wider availability, especially to terrorist groups.  The bombs dropped on Japan at the end of World War 2 were devastating and horrifying at the time, but far more powerful weapons have been developed and stockpiled since then.  I’m sure most of us would wish them all away.

Nuclear weapons represent a deliberately uncontrolled release of some of the huge amount of energy that holds together the ingredients of atomic nuclei, the dense tiny centres of atoms that make up all the elements of the physical substances around and within us.  Some atomic nuclei, especially the heavier ones such as uranium, are naturally unstable and change into other nuclei with the release of energy.  If this is carefully controlled, the energy can be peacefully and constructively used, and this is the basis of nuclear power stations.  These, of course, generate not only electricity but also strongly polarised opinions: while nuclear energy has much in its favour compared with burning fossil fuels, it does raise issues of security and safety, as we saw in the major incidents at Chernobyl and Fukushima, as well as the problem of dealing with the radioactive waste products.  This is one of many examples of science and technology that are morally neutral in themselves, but can be used for both good and evil.

Unfortunately the negative aspects make ‘nuclear’ a scary word.  But it actually just means something to do with a nucleus: atomic nuclei have other interesting and useful properties too, and there are other kinds of nucleus (it just means core or centre) – we can talk of a nuclear family without implying the parents and children are radioactive!  Some atomic nuclei can behave as tiny magnets and this leads to a scientific technique called nuclear magnetic resonance (NMR), a way of exploring the surroundings of these atoms in a material.  It’s widely used in physics, chemistry and biology.  In medical applications its main target is water molecules in the body, which give different NMR effects for different tissues and can be used in all sorts of diagnoses.  It uses large magnets together with radio waves, and many people have experienced and benefited from this.  It’s completely harmless if you don’t have any metal implants in your body.  To avoid scaring people, however, the N word is avoided and it’s called magnetic resonance imaging (MRI) instead.

The second scare word is ‘genetic’, which means to do with genes, the inherited sections of our DNA that control the production of proteins and other workings of our bodies.  We considered the ‘genetic code’, the information content of DNA, in an earlier article.  Scientists also talk about genetic disorders (inherited problems), genetic manipulation, and genetic engineering.  This gives the impression of interfering with natural processes.  While such intervention has been made for generations in medical and surgical procedures, and most of us have no objection to the use of operations, medicines and anaesthetics, tinkering with DNA at the molecular level worries many, because it raises images of unnatural creatures, designer babies, and ‘playing God’ with life.  Concern is expressed about genetic modification of food crops, especially in the UK and EU, though traditional plant (and animal) breeding is also a form of genetic modification, albeit slower and less finely controlled: the fruit, vegetables and particularly cereals we eat now are very different from those of much earlier ages.

Just as with nuclear energy, genetic engineering is morally neutral in itself: it is an ability we can put to good or bad use.  Developing food crops that resist disease, give better nutrition or higher yields, or carry fewer harmful side-effects is a good thing, and making them less susceptible to pest damage is preferable to widespread use of chemical agents that have other undesirable impacts – it will be difficult to feed a growing world population without such changes.  But deliberately making crops infertile, so that new seeds have to be bought every year from the rich multinational companies controlling the industry, is morally questionable or worse.  Much objection to some recently developed medical genetic procedures is fuelled by lack of understanding, misinformation, or inflammatory and misleading descriptions.  These new techniques raise serious ethical issues; the fact that something becomes technically possible does not mean it should be done.  Unfortunately a society that is losing its acceptance of values such as objective truth and universal standards, in favour of individual choice and relative truth, is poorly equipped to address those issues, and science cannot generate its own ethics without other input; where should that input come from?

Next time we’ll make the science scares into a trilogy with a third episode.

11: November 2018

Science scares: fact or fiction?

As midnight on 31 December 1999 approached, many people dreaded a worldwide disaster because of the so-called ‘Millennium Bug’.  In the preceding months some companies did a roaring trade in preventative measures, and supermarkets ran out of food and emergency supplies.  It turned out to be the biggest non-event for years – virtually no computer systems crashed because of the date change – but it shows how a limited understanding of science and technology issues can cause fear and how this can be exploited.

Similar concerns were expressed about 10 years ago when the Large Hadron Collider, the world’s biggest machine, buried underground near Geneva, was prepared for major experiments looking for evidence of the Higgs boson, also known as the ‘God Particle’.  The panic theories included the creation of a black hole that would swallow up the earth or even the whole universe, despite the lack of anything remotely approaching enough energy for this.

Perhaps the biggest scares relate to medicine.  This isn’t surprising when you remember the tragedy of thalidomide leading to major birth defects in the late 1950s, and the frequent, and too often contradictory, advice given on diet and health; these can make it difficult for ordinary people to distinguish real concerns such as obesity and smoking from more debatable points such as the impact of small-scale alcohol consumption.  One of the most notorious cases was the 1998 claim that the triple MMR vaccine could cause autism, which led to a marked drop in vaccinations and subsequent dangerous outbreaks of disease, especially measles, in the following years.  Careful investigation showed that Dr Andrew Wakefield, responsible for the autism claim, based his conclusions on faulty research, probably deliberately, and all other studies have completely discredited them.  Although he was banned from medical practice in the UK, he continues to promote his unfounded views in the USA, where he is regarded as a hero by anti-vaccination campaigners.

It doesn’t help that relatively few people with scientific training work in the non-specialist media, and some newspapers publish complete nonsense in such stories.  Even the more respectable BBC has seemed to think it necessary to give equal time to both sides of an argument even when the reliable evidence points overwhelmingly in one direction (it did the same with climate change until a very recent policy change); that’s how Wakefield managed to get his voice heard so much and why people believed him.

More recently a scare story went round, reporting that Gateshead Council’s new street lighting was run on 5G technology, with a risk of causing cancer in the local population.  In reality there was no such plan.  5G technology is indeed on its way as a faster and more efficient means of communication for mobile phones and other devices, and every advance in this field brings its cancer scares.

All wireless digital communication relies on some form of electromagnetic radiation, which consists of waves of oscillating electric and magnetic fields that transmit energy.  The very small section of the electromagnetic spectrum (the full range of possible radiation) that we can see is, of course, visible light in its various colours.  What distinguishes different colours, and different parts of the spectrum in general, is wavelength and the frequency of oscillation of the waves; frequency goes up as wavelength goes down across the spectrum.  The energy of the radiation, which determines what effect it can have on substances including our bodies, depends directly on the frequency.  Going up in frequency and energy from visible light we have ultraviolet (= beyond violet), X-rays and gamma rays (a form of radioactivity).  These have enough energy to remove electrons from atoms and so change materials chemically, including modifying DNA in our biological cells; such high-energy radiation can cause skin and other cancers.

Lower in energy than visible light we have infrared (= below red, which we experience directly as heat from the sun), microwaves, and a wide range of radio waves.  These cannot cause significant chemical reactions and so are incapable of inducing cancer.  Existing and planned future communications are limited to certain regions of radio and microwave frequencies.  Many studies have been made of their possible impact on human and animal bodies, and there is no evidence whatsoever of anything other than a very tiny heating effect at some of the higher frequencies, and even that is insignificant.  Apart from frequency/energy, the other important property of electromagnetic radiation is intensity or power (measured in watts: an electric kettle uses 3000 watts, a low-energy light bulb around 10 watts).  The power of a typical mobile phone signal is less than 1 watt, a small fraction of the heat output of a single human body, and lower than any light bulb.  Any notion that mobile phone usage can fry or scramble the brain can be completely dismissed.  In fact, advances in technology mean increasing efficiency, and each new generation of devices actually becomes safer in this respect, not more risky.
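
To put rough numbers on the frequency–energy argument of the last two paragraphs, here’s a small sketch using Planck’s relation E = hf; the band frequencies are representative values I’ve assumed for illustration:

```python
# Sketch: photon energy E = h * f (Planck's relation), comparing parts
# of the electromagnetic spectrum with the few electron-volts needed
# to break a chemical bond.  Frequencies are rough typical values.

PLANCK_H = 6.626e-34    # Planck's constant, joule-seconds
EV = 1.602e-19          # one electron-volt in joules

bands = {
    "5G mobile signal": 3.5e9,    # ~3.5 GHz
    "microwave oven":   2.45e9,
    "visible light":    5.5e14,
    "ultraviolet":      1.5e15,
    "X-rays":           3.0e18,
}

for name, freq in bands.items():
    energy_ev = PLANCK_H * freq / EV
    print(f"{name:18s}: {energy_ev:10.2e} eV per photon")
# Breaking a typical chemical bond takes a few eV: only ultraviolet
# and above qualify; radio and microwave photons fall short by many
# thousands of times.
```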

More scary science stories next time!

10: September 2018

Let there be light!

Some of our readers will remember the regular scheduled 4-hour power cuts resulting from the miners’ strike in 1972, when most electricity came from coal-fired power stations.  I was a postgraduate research student, and at these times we gathered in a sitting room with candles and a large fire, and played Monopoly in the semi-darkness.

These days our electricity comes from a much wider range of sources, with an increasing contribution from renewables – sources of energy that are not destructively consumed or can be regenerated on a human timescale, such as biomass, wind and solar energy.  We still depend too much on unsustainable fossil fuels and there is much opposition to nuclear energy, but genuinely ‘clean’ energy is rapidly becoming cheaper, easier and more reliable with much investment in technological development.

According to an official report, nearly 25% of global electricity generation came from renewables in 2016.  At least two countries, Iceland and Norway, have 100% renewable energy sources for electricity.  It is both necessary and desirable to push this improvement forward as fast as possible, for reasons including the limited supplies of fossil fuels (and we need them for better purposes than burning them), climate change driven largely by carbon dioxide from combustion, and other environmental and health issues.

Of course renewable energy is far from new – just think of traditional windmills and water-wheels, though these are now largely part of history.  Hydroelectric power has long been with us, and continues to be a major contributor in some parts of the world.  Among the fastest growing technologies are solar and geothermal sources, and many of us have banks of solar panels on our house roofs.  Scientists continue to find more efficient materials and processes for converting free and abundant sunlight, as well as wind and wave energy, into electricity and heat.  Many governments are aiming to phase out petrol and diesel road vehicles in coming decades, and battery technology is one of the major research and investment areas now in industry and university laboratories, with goals of increased efficiency and capacity so we can drive further between recharges.

There’s another important reason for developing renewable energy supplies.  We take for granted that our electricity is constantly and reliably available through a national distribution grid – at least, when it isn’t disrupted by industrial strikes or severe storms.  But about 1 in 7 of the world’s population have no access to such a supply of electricity, mainly in parts of Asia, Africa and South America.  This isn’t just inconvenient, it’s a major hindrance to development and life improvements, and can be dangerous; just imagine, for example, a pregnant woman going into a difficult labour at night when there is little or no available light.

Renewable energy is ideal for such local, rural situations where an electricity grid does not exist.  Even a single solar panel can generate energy that is stored in a battery and used at night.  Small- and medium-scale projects can transform whole communities and make their lives better and safer.  Such projects are funded and run by major international agencies such as the United Nations (though not as much as they might be by the World Bank, which continues to invest disproportionately in fossil fuels), by aid budgets of individual nations such as the UK, and by relief and development charities.  One such charity, a specifically Christian one called Tearfund, celebrates its 50th year in 2018 and many churches are marking this by holding a ‘Light Service’ using as little electricity as possible (perhaps only to show a relevant short video) or none at all – even for making tea and coffee after the service!  We’ll be doing it at Stocksfield Baptist Church on Sunday 16 September at 10am, and all are welcome to come and take part.  You may learn something valuable and interesting.
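
Returning to that single solar panel: here’s a back-of-envelope sketch in which every figure is assumed, purely to illustrate the orders of magnitude involved:

```python
# Back-of-envelope sketch: every figure below is assumed, purely to
# illustrate scale for an off-grid home.

panel_watts    = 100    # one modest solar panel
sun_hours      = 5      # equivalent full-sun hours per day (sunny region)
led_bulb_watts = 5      # one low-energy LED light

daily_energy_wh = panel_watts * sun_hours          # energy banked per day
hours_of_light  = daily_energy_wh / led_bulb_watts

print(f"Stored per day: {daily_energy_wh} Wh")
print(f"Enough for one LED bulb to run {hours_of_light:.0f} hours,")
print("or several lights, a radio and phone charging through the evening")
```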

When I was a boy, we spent a fortnight of our summer holidays each year with my mother’s parents.  They lived in a big old house next to one of the Norfolk Broads, with no mains electricity, gas or water.  For us as children it was a big adventure.  But for many in developing countries it’s far from fun and life is tough; renewable energy projects bring them new hope.  We can play our part too, not only by donating to development charities, but by reducing our own energy demands, especially for non-renewables.  It just takes some simple, and often quite small, changes of habits and priorities.  And have you switched to a green energy supplier yet?

9: July 2018

What Charles Darwin didn’t know

“What’s the difference between a pattern and a code?”  This is a key question in Dan Brown’s latest novel “Origin”, with a theme of how life on earth began and where it is heading.  According to the hero Robert Langdon, patterns occur everywhere in nature, but codes – by definition – must carry information.  He goes on to say that codes are the deliberate inventions of intelligent consciousnesses.  Some critics reckon this goes too far; for example, the development of spoken language, definitely a code in this sense, could hardly be said to have begun deliberately, though there’s certainly intelligence behind it.

When Charles Darwin developed his theory of evolution by natural selection in the mid-1850s, he based his ideas on natural variations in isolated species populations together with the results of other people’s deliberate selective breeding experiments.  He drew conclusions, still largely accepted and further developed today, about what we now call genetic inheritance, but he had no idea what the actual physical mechanism of this inheritance was, because the modern science of genetics was unknown.  The founder of genetics was Gregor Mendel, an Augustinian monk in what is now the Czech Republic.  He carried out experiments with varieties of peas in his monastery garden at around the same time as Darwin was writing “On the Origin of Species”, worked out how certain characteristics were passed on, and coined the terms dominant and recessive to describe what we now call genes.  His work was largely overlooked until the start of the 20th century.

The genetic code, which is the information of heredity, is carried by the biological molecule DNA.  It is an extremely long molecule with two external backbones of phosphate (related to bones and minerals) and sugar units twisted into a double helix – a shape rather like a spiral staircase or a curved ladder laid out on the slide of an enormously tall helter-skelter.  Each ladder rung connecting the phosphate-sugar rails is a pair of small molecules called nucleobases, one attached to each rail.  Although each molecule of human DNA has over 3 billion such base-pair rungs, there are only 4 different bases, known as C, A, G, and T, the initial letters of their chemical names; and each rung is always a pairing of A with T, or C with G, never any other combination.  This means a DNA molecule can make two identical copies by splitting down the middle of the rungs and each half assembling a matching set of bases on a new second rail.  This is how your DNA reproduces itself and is passed on to your descendants.

But the real power and significance of the genetic code is that the detailed sequence of bases controls how proteins are constructed from their small amino acid building blocks.  Each of the 20 different amino acids in proteins corresponds to three successive bases along the DNA chain; for example GGA means glycine and CAT means histidine.  One of the most remarkable features of this biological code is its efficiency: it uses only 4 characters, in contrast to the 20–30 characters in most western alphabets and the thousands of characters in written Chinese.
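
For readers who like things spelled out, here’s a toy sketch of those two ideas – the pairing rules determine the matching strand, and triplets of bases map to amino acids.  The three-entry codon table is just an illustrative sample, not the full genetic code:

```python
# Toy sketch of the two ideas above: (1) each base pairs only with its
# partner, so one strand determines the other; (2) bases are read three
# at a time (codons) to specify amino acids.

PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

CODONS = {   # a small illustrative sample, not the full code
    "GGA": "glycine",
    "CAT": "histidine",
    "TGG": "tryptophan",
}

strand = "GGACATTGG"

# The matching strand follows automatically from the pairing rules:
partner = "".join(PAIR[base] for base in strand)
print("strand :", strand)
print("partner:", partner)

# Reading the strand three bases at a time gives the protein recipe:
for i in range(0, len(strand), 3):
    codon = strand[i:i + 3]
    print(codon, "->", CODONS.get(codon, "?"))
```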

All humans share 99.9% of the same DNA sequence, a higher proportion than most other species.  What we have in common identifies us as human, and the small differences are what make you unique (unless you’re an identical twin).  The entire sequence of over 3 billion bases in the human genetic code was finally established in a huge research programme, the Human Genome Project, at the turn of the millennium.  Its achievement is helping in the development of medical treatments for genetically inherited diseases.

Some surprises came out of this work.  Less than 2% of the DNA base sequence is actually used in sections that code for protein production (the genes).  The function of much of the rest is still a mystery, though some provides self-repair mechanisms and other maintenance features.

So, if the genetic code carries information for a purpose, does it imply deliberate design?  As Robert Langdon says, “That’s the paradox.”  The characters in Dan Brown’s “Origin” seem to be moving towards an odd position blending atheism with a mystical New Age spirituality and nature worship – is this true of the author too?  The Director of the Human Genome Project at the time of its completion is less ambiguous in his viewpoint: Francis Collins, now the head of the US National Institutes of Health, moved from an atheist stance to a personal Christian faith largely as a result of his research in biology and medicine, and the developing story of DNA and the human genetic code confirmed him in this.  Having begun with a Dan Brown quotation, I’ll end with one from President Bill Clinton, in the speech he and Francis Collins prepared together for the announcement of the first draft of the human genome in June 2000: “Today we are learning the language in which God created life”, to which Collins added “we have caught the first glimpse of our own instruction book, previously known only to God.”

8: May 2018

We need some plastic surgery

Remember CFCs?  That’s chlorofluorocarbons, widely used in refrigeration, air conditioning and aerosols until the 1980s, when it was discovered they were damaging the ozone layer in the atmosphere, which absorbs much of the sun’s ultraviolet radiation, a major cause of skin cancer and other problems.  The Montreal Protocol of 1987 led to a near-total international ban and the introduction of replacement chemicals, and the ozone layer has gradually recovered.  A success story!

We face as serious a challenge today with massive plastic pollution worldwide, and this time legislation won’t solve the problem: plastics just aren’t going to be banned outright and can’t be so easily replaced, though there are recent proposals to ban plastic straws and cotton buds in England, and plastic bag use is already illegal in some African countries.

The problem with plastic is that most of it is chemically very stable and unreactive – one of the reasons for its widespread use – so it takes a long time to decompose.  It’s also not easy to recycle, especially when it is combined with other materials, such as in disposable coffee cups: billions of these are thrown away each year after a single use.

It has been estimated that well over 8 billion tonnes of plastic has been made so far, and over 6 billion tonnes has been discarded as waste.  Less than 10% gets recycled, some gets incinerated, but nearly 80% ends up in landfill or in the oceans as a long-term hazard.  A lot of plastic waste collects in one area of the Pacific because of the pattern of ocean currents, and this “Great Pacific garbage patch” between California and Hawaii is believed to be twice the size of France, about 80,000 tonnes in weight, with nearly 2 trillion items in it – nearly half of that weight being huge plastic fishing nets.  The densest coastal collection of plastic waste is probably on the uninhabited World Heritage site of Henderson Island, with up to 670 items per square metre, brought ashore by the tides!

Of course, this is just what’s visible.  Even more alarming are the tiny particles of plastic, including especially the microbeads in many cosmetics, that are designed to be washed away as waste, and end up in the food chain of marine life (and so in ours too).  And did you know that bottled water, a major use of plastic and generator of waste, often contains significant amounts of microscopic plastic, detracting from its supposed advantage over tap water?

With talk of partial bans, plans for deposit schemes on plastic bottles, surcharges for plastic bags and disposable cups, and demands for technological solutions in modified manufacture and more recycling, this has recently become a very hot topic, hitting the number one spot of BBC News and newspaper headlines.

There are some welcome and surprising bits of good news, including the enzyme that has evolved to digest PET, the plastic most widely used in bottles.  Its molecular structure has been studied (using powerful X-rays at Diamond Light Source in Oxfordshire, a facility I’ve used a lot in my own research – just do a web search for it), enabling scientists to modify it to improve its performance, reducing PET’s durability from hundreds of years to a few days.  A glimmer of hope!

What the recent news items and features don’t say, and they should, is that legislation and technology can only do part of the job of tackling this problem: a lot can also be contributed by ordinary people, just as with other environmental issues.  We all need to play our part, starting with “the 3 Rs”: reduce, reuse, recycle.

Reduce: avoid buying and using plastic where you can.  Shun overpackaged goods at the shops, prefer glass bottles to plastic (they’re making quite a comeback for milk), don’t use a plastic bag at the checkout if you don’t really need it, say no to plastic straws and cutlery.

Reuse: that includes shopping bags, coffee cups, lunch boxes.  Carry a reusable water bottle; filling stations and drinking fountains are becoming popular, and bottled water isn’t really better for you than tap water.

Recycle: plastic recycling facilities are hugely variable across our area and the country as a whole.  If your local authority doesn’t provide plastic recycling, join others in campaigning for it.  And in the meantime see if you can reuse more and throw away less.  I’m pleased to say that Newcastle University is one of the best in the UK for sustainability (including waste management) and well up in international league tables in this respect.

As a Christian, I note that this approach is in tune with the Biblical mandate to take good care of the world we’ve been given – it’s the only one we have, and it’s not ours to exploit selfishly and ruin but rather a resource to value and use wisely.  The old-fashioned word for this is stewardship and it’s been the responsibility of human beings since the very beginning.  It’s good that this moral imperative is recognised by many people whatever their religious beliefs or none; we just need to make sure we act on it consistently and don’t wait for others to solve the problem for us – they won’t, not without our contribution.

7: March 2018

Chasing Easter – and where does our calendar come from?

My first article for 2018, about the Star of Bethlehem, attracted quite a few comments, particularly about the date of Christmas, so I thought this time I’d look at the date of Easter and why it moves around so much.  It turns out to be quite complicated.  You might think science had little to do with it, but important contributions come from astronomy, archaeology (evidence to support the reliability of the New Testament reports), and records of eclipses and earthquakes.

Secular history as well as the Bible tells us that Jesus died at the Jewish Passover festival; his resurrection from death to life occurred two days later on a Sunday, the day between being the Sabbath rest day.  From early days the Christian church wanted to celebrate Easter, its most important historical event, always on a Sunday and to retain the calendar link with the Passover.

And here the difficulties begin.  For a start, Passover is scheduled on a lunar calendar – based on the moon rather than the sun, because the moon’s phases are easier to observe accurately.  Officially, Passover falls on the first full moon following the Spring equinox in the northern hemisphere, the day of equal day and night, which we now recognise as 21 March, though it was once thought to be 25 March (with 25 December as the winter solstice, the shortest day, one reason for its selection as Christmas Day).  However, the date of the full moon depends on your time zone, so the calculations are based on a formal calendar rather than actual astronomical observations.

The situation gets more complicated when you recognise that neither a lunar month – about 29½ days for the cycle of the moon’s phases – nor a solar year – average slightly under 365¼ days for the earth to orbit the sun – has an exact whole number of days, and these two aren’t in a simple ratio.  And these natural astronomical divisions of time aren’t even constant because of variations in the earth’s rotational axis, non-circular orbits, and the minor gravitational effects of other planets.
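
There is, however, one lucky near-coincidence that calendar-makers have exploited since ancient times: 19 solar years contain almost exactly 235 lunar months (the so-called Metonic cycle).  A two-line check with the average values just quoted:

```python
# Check of the 19-year 'Metonic' near-coincidence using the average
# lengths quoted above.

solar_year  = 365.2422   # days
lunar_month = 29.5306    # days

print(f"19 solar years  : {19 * solar_year:.2f} days")
print(f"235 lunar months: {235 * lunar_month:.2f} days")
# They differ by only about two hours, so the moon's phases repeat on
# (almost) the same calendar dates every 19 years.
```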

All of this, compounded by limited accuracy of early astronomy, has made the construction and maintenance of calendars a nightmare and a challenging task, and corrections have had to be made over the centuries.  It’s been important, not only for religious purposes but also for navigation, trade, government, and other communications.  Our modern life depends on accurate timekeeping.

In early Roman times, there were ten months (our March to December) with either 29 or 31 days, roughly following the observed lunar cycle and avoiding even numbers, which were considered unlucky; the days in mid-winter weren’t counted in any month until January and February were added later.  This added up to only 355 days for 12 months.  The extra days to make up a solar year were inserted as an extra month every two or three years (which keeps the months in phase with the moon) until Julius Caesar rearranged the calendar by adding extra days to individual months as we now have them – February lost out because it was the unpopular last month of the year before spring came.  The extra quarter of a day per year went into leap years, one every four years.

By the 16th century it was obvious that these calculations were not quite right and the calendar was many days out of synchronisation with the sun.  In 1582 the modern Gregorian calendar began to replace the previous Julian calendar, and 10 days (5 to 14 October inclusive) were skipped over – to some people’s alarm, as they believed 10 days had been stolen from their allotted lifespan.  Britain adopted the new calendar only in 1752, by which time the difference had grown to 11 days.  Incidentally, that’s the reason for the otherwise puzzling date of 6 April for the start of the UK tax year: it had originally been set at the supposed equinox of 25 March, and the tax authorities did not want to lose the 11 skipped days out of their quarter-year revenue (25 March plus 11 days gives 5 April), with one more day added in 1800, a leap year in the old calendar but not the new.

So the western (Roman Catholic and Protestant) churches now take Easter as the Sunday following the first calendar full moon on or after 21 March; this can be as early as 22 March and as late as 25 April.  Other ‘movable feasts’ follow suit: Lent beginning on Ash Wednesday in February and occasionally early March, and Pentecost which used to give us the Whit Monday bank holiday before this became one of the fixed May holidays.  The eastern (Orthodox) churches continue to use the Julian calendar for their calculations, so their Easter is nearly always later.
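
For anyone curious to see the machinery, the whole western rule compresses into whole-number arithmetic.  The sketch below is the widely published ‘Anonymous Gregorian’ algorithm (often credited to Meeus, Jones and Butcher) rather than anything official from the churches, but it encodes exactly the rule just described:

```python
# The 'Anonymous Gregorian' computus (Meeus/Jones/Butcher form):
# pure whole-number arithmetic encoding "the Sunday after the first
# ecclesiastical full moon on or after 21 March".

def easter(year):
    a = year % 19                        # position in the 19-year moon cycle
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30   # (roughly) the ecclesiastical full moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7  # days to the following Sunday
    m = (a + 11 * h + 22 * l) // 451
    month, day = divmod(h + l - 7 * m + 114, 31)
    return month, day + 1

for y in (2018, 2019, 2038):
    print(y, easter(y))
# Prints (4, 1), (4, 21) and (4, 25): 25 April, as noted above,
# is the latest possible date for western Easter.
```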

I said it was complicated, and now I’ve run out of space.  The archaeology, eclipse and earthquake evidence will have to wait for another time, but I’ll say here that they help to provide us with the most likely date for the very first Good Friday and Easter Sunday: 3 and 5 April 33.  Of course, the dates themselves aren’t really important; it’s what happened on them that matters.

6: January 2018

Twinkle, twinkle, little star…

…how I wonder what you are.  So runs the nursery rhyme, but this question has also been asked of the star of Bethlehem that, according to chapter 2 of Matthew’s gospel in the Bible, led ‘wise men from the east’ to the birthplace of Jesus Christ.  Cynics scoff at this account, and even some theologians have said it is unscientific and without historical evidence.

Far from it!  The description fits well the behaviour of a comet: it made a new appearance, moved slowly across the background of familiar stars, and ‘stood over’ Bethlehem – a phrase used widely of tailed comets in ancient literature.  The rate of movement (consistent with a journey of a couple of months for the wise men) and the reported directions of sighting initially in the east and later the south (the way from Jerusalem to Bethlehem) agree not only with a comet in general, but with a specific comet recorded completely independently by Chinese astronomers as being visible for about 70 days in the year 5 BC – one of three observed by them in the 30-year period from 20 BC.

Who were the ‘wise men’ (also referred to as Magi) and what made them undertake this journey?  The tradition of ‘three kings of orient’ began only about 500 years later and is based on a particular interpretation of some Old Testament verses and the identification of their gifts of gold, frankincense and myrrh for the new-born king.  The gospel account says nothing of kings, nor gives their number or names.  They were without doubt religion-based astronomers/astrologers (there was no distinction in those days, as there was none between alchemy and chemistry in the Middle Ages), who studied the skies and interpreted their observations in terms of significant present and future events.  This was a well-known occupation in the area of Persia (modern Iran), Mesopotamia (Iraq) and Arabia; frankincense and myrrh were valued products derived from Arabian plants.

In 7 BC the large planets Jupiter and Saturn were close together (a conjunction) three times in the sky, in the star constellation Pisces, a fact important enough to be recorded on a contemporary clay tablet found near the site of Babylon; this would have signified a forthcoming birth of a Jewish king.  In 6 BC Mars joined in to give three planets together, something that happens only once every 805 years and would have been taken as a sign of a great historical event about to happen.  So the comet of 5 BC, appearing in the east in the constellation Capricorn, triggered an urgent need to set off and find the newly born special king in Israel.  That the wise men went to Herod in Jerusalem is only natural: where else would they expect to find a king?  They didn’t follow the star there – it was behind them in the east.  It was the Jerusalem priests who identified Bethlehem, from an Old Testament prophecy of the expected Messiah, and now the comet was indeed before the wise men, having moved round to the south in the meantime, as recorded by the Chinese.

As for the date, the comet was visible in the spring of 5 BC; our celebration of Christmas ‘in the bleak mid-winter’ is a tradition established rather later, not at the time, and Jesus was almost certainly born around the time of the Jewish Passover (near Easter for us), when the milder weather allowed sheep and shepherds to be out in the fields at night, as also recorded in Luke’s gospel.  Herod is reckoned to have died in 4 BC, and Jesus lived with his family for a couple of years in Egypt as refugees to keep out of his way, before returning to their home in Nazareth.  The notoriously cruel Herod (who murdered several members of his own family) had all boys in Bethlehem up to 2 years old killed, a limit consistent with the timetable of astronomical events he had heard from the wise men.

So it all fits together, and Matthew’s account in the Bible is fully consistent with what we know from history and from astronomy.  Indeed, it serves to fix the date of Jesus’ birth within a period of just a few weeks in the spring of 5 BC.  Our calendar, with the change from dates BC to AD (or BCE to CE if you prefer) supposedly centred on this birth, was set about 500 years later and proves to be a few years wrong, but that’s not a bad attempt for its time!

As with a string of archaeological discoveries in the Middle East over many years, here history and science support the Bible’s account, previously scorned by experts who lacked this evidence.  January 6th is the traditional day for celebrating the coming of the wise men to Bethlehem, and it’s neither a nursery rhyme nor a fairy tale.

[This material is drawn from a much longer and more detailed scholarly article published in the journal Science & Christian Belief in 1993 (volume 5, pages 83–101), which you can read at http://www.asa3.org/ASA/topics/Astronomy-Cosmology/S&CB%2010-93Humphreys.html; it was written by Sir Colin Humphreys, Professor of Material Science at Cambridge University, who spoke at one of our Tyne Valley ‘Big Questions – Any Answers?’ events in 2017, on the subject of miracles.]

5: November 2017

Where are we heading?

Do you know what’s in Room 101?  No, I don’t mean the BBC TV series!  I first read George Orwell’s ‘1984’ at school, when its setting was still about 20 years in the future.  Along with other so-called dystopian novels depicting scary prospects, such as Aldous Huxley’s ‘Brave New World’, it jumped right up the bestseller lists after the Brexit vote and Donald Trump’s election last year as people tried to imagine what lay ahead.

The question of our direction and where it leads can be asked, not only about politics and economics, but also of modern developments in science and technology, especially in biomedical, genetic and neuroscience research where there has been such rapid progress in knowledge and capability.

A topic of particular concern to many people is artificial intelligence, automation and robotics.  A recent article in The Sunday Times points out that robots can already run factory production lines and lay bricks much faster than humans, we’re now seeing the development of driverless cars, and drones are becoming commonplace in the skies, but this is just the beginning as automation moves increasingly into shopping, banking, transport and communications – indeed, almost every part of our lives.  I remember arriving years ago at a fog-bound Newcastle Airport in one of the first planes to be fitted with an automatic landing system, and what a relief it was not to be diverted, along with all the other flights that day, to another destination.  And it’s not just the routine tasks: robots have also been shown to outperform skilled surgeons in keyhole operations, and they can make decisions very fast – which is fine as long as they’re given the right information. 

While there’s much to be welcomed here, the threat in many people’s minds is to their jobs, while others worry about safety and security: a hacked computer is bad enough, but what about a hacked driverless car – or a drone, especially one fitted with weapons?  Are we facing the prospects of a real-life ‘Terminator’ or ‘The Matrix’?  The Sunday Times article quotes a large group of technology pioneers – the very people who have been driving these innovations – in an open letter to the United Nations calling for a ban on killer robots: “Once this Pandora’s Box is opened, it will be hard to close.”

The last in our series of nine ‘Big Questions – Any Answers?’ talks on topical science issues from a Christian perspective tackles this question: “The rise of the intelligent machines – friend or foe?”  You can hear it, and ask your own questions, in Hexham on Wednesday 8 November, and further details are given on our project’s website at bigquestions-anyanswers.org, where you can catch up on previous topics too.

As we come to the end of the funding for this project, we can also ask “Where are we heading?” in the science-faith discussion.  There’s been a lot of interest and engagement, and there seems to be much desire to continue in some way.  Various approaches are possible, and if you’d like to contribute comments and ideas – or just express your interest – you can do so in a simple online survey: just search for “big questions any answers survey” or go straight to goo.gl/vhuxTi, and answer some or all of the ten questions; it will help us in future planning.  Or, if you prefer, you can just send me an email: two addresses are given below.

There’s also an opportunity to hear four more talks and join in a discussion on a range of scientific topics of current interest on Saturday 11 November in Kingston Park, in an event I’ve organised with the Faraday Institute for Science and Religion based in Cambridge.  This costs £25 (half-price for students), but included in that is lunch as well as other refreshments.  Further details are at goo.gl/Ynu4WU, where you should register in advance (as soon as possible!) so that we will have enough food.  It could be well worth a few hours of your time.

In case you haven’t read ‘1984’ (and if you now plan to do so, skip over the rest of this sentence), Room 101 contains “the worst thing in the world”, whatever that may be for any individual person; it’s the government’s torture room (called the Re-education Room in 1984 Newspeak), designed to promote submission to Big Brother.  Whether or not artificial intelligence comes into this category for you, there are certainly many modern developments in science, technology and medicine that raise big moral, ethical, and more generally human questions that we must all address and not leave to entrepreneurs and politicians to deal with according to their own interests.  The interface between science and Christian faith isn’t just an academic exercise, or a slanging match between militant atheists and fundamentalist Christians; it’s an important and fruitful breeding ground for generating constructive views, practical solutions, and further key challenges relevant to our modern age.  It can, and should, make a positive difference, and we want to keep up the momentum of this year’s activities.

4: September 2017

So whose fault is it anyway?

“There’s absolutely every reason to check if you’ve been sold PPI.”  I don’t know about you, but I’ll be glad when the PPI claims deadline is reached and we can get rid of these annoying adverts, but don’t hold your breath – it’s still two years away.  Even so, some lawyers are appealing against it so they can continue with their profitable “no win, no fee” cases.  We live in an age of litigation, with legal fault claims frequently in the news and encouragement to pursue compensation on the flimsiest of evidence.  We’re told that our car insurance premiums are too high because so many drivers make spurious whiplash injury claims, and the latest wheeze is to say you suffered food poisoning in your holiday hotel a couple of years ago and get the travel insurance to pay up.

When something more serious happens and there’s a real tragedy, alongside the genuine concern and generous response we usually see for the victims, the cry goes up “Who’s to blame?”  The buck has to stop somewhere, and heads must roll, whether it’s for a tower block fire, a terrorist attack, a police cover-up, or a flooded town.  It’s someone’s fault – not mine, of course – and we see public enquiries, private investigations, and tabloid press witch-hunts to find out and point the finger.

There’s a natural human tendency for me to hold others to account for their faults while excusing my own.  “Don’t blame me – I can’t help it – that’s the way I was made”; but don’t try that line if you’ve done something to hurt me!  What has science to say about this?  With rapid advances in the modern field of genetics, we can trace our inheritance of all kinds of characteristics across generations.  It is known that particular genetic abnormalities are associated with the development of certain diseases and disabilities, and in some cases a direct cause can be demonstrated.  In other cases, however, the genetic feature indicates a tendency, but not an automatic cause for something unusual, and perhaps undesirable, to happen: there’s a huge difference between genetic influence (it might happen) and genetic determinism (it will happen).  Despite much research, and contrary to some media hype at times, there is no real evidence for genes that cause laziness, obesity, or a particular sexual orientation: nature and nurture – genetic and environmental influences – are both at work.  The debate goes on: are great musicians or footballers or geniuses born or made?

“Science and morality: can I blame my genes?” is the subject of one of the autumn series of four talks on the science-faith interface I’ve organised for our area, which you’ll see advertised in this issue of Tyne Valley Express and on posters and flyers.  It’s in Stocksfield on the evening of Thursday 28 September, and brings together, in Keith Fox from Southampton, expertise in biochemistry and genetics and a Christian perspective of moral responsibility for our behaviour.

Another dimension of blame arises when we hear of floods, droughts, eruptions and earthquakes.  Have we contributed to some of these through climate change, and do our habits and lifestyles make the effects of natural disasters worse than they might be?  Have you seen the 2001 film “The man who sued God”?  In it, Billy Connolly (knighted in this year’s Queen’s Birthday Honours) is a fisherman whose boat is destroyed by lightning.  The insurance company refuses to pay out, saying it was an “Act of God” and therefore excluded from cover.  With comical logic, he sets about suing God for compensation, which he does by taking all the church denominations to court.  I won’t give away the ending, but neither Billy nor God comes out as loser.

Joking aside, this is a serious issue, especially for millions in Nepal, Haiti, Bangladesh, Pacific islands and many other parts of the world, rich and (more especially) poor.  As many commentators asked in 2004, “Where was God in the Boxing Day tsunami?”  So “Are natural disasters Acts of God?” – a question posed by Bob White from Cambridge, an expert on volcanoes and earthquakes, in another of our autumn series talks, in Ponteland on Wednesday 11 October.  Bob’s book on this topic is called “Who is to blame?” and he approaches the subject as both an eminent geologist and a committed Christian, currently Director of the Faraday Institute for Science and Religion in Cambridge and a Fellow of the Royal Society.

The other talks coming your way are on quite different topics of current interest; one looks at our place on a small planet in a huge universe and asks about human significance in a cosmic context (a theme of many SciFi films), and the other addresses the development and use of artificial intelligence, with some people’s fear that robots will take over our jobs and maybe the whole world (shades of “The Matrix”).  If they do, then I suppose we can stop blaming other people for what goes wrong, or even “the system”, and can blame the machines – but then, since we made them, whose fault are they in the first place?

3: July 2017

Who do you think you’re kidding?

Donald Trump obviously didn’t read my article four months ago, and he didn’t turn up in Prudhoe in June when we had a talk on “Climate change – is it real and does it matter?”  This was the third in a series I’ve arranged, at different venues in the local area, giving a Christian perspective on topical science issues.  The first, in Hexham, asked “Creation or evolution – do we have to choose?” and drew an audience of over 100, and the second, in Wylam, considered “Mine for ever? Our use of the earth’s resources” and touched on various environmental issues.

At the time of writing I’m preparing to give the fourth talk, in Heddon on the Wall, tackling “Science and Christian faith – age-old enemies or natural allies?”; it will be in the past by the time you read this, but the fifth is still to come, in Stocksfield, and is advertised in this issue: “Can we believe in miracles in an age of science?”  More talks will take place in the autumn, looking at artificial intelligence, natural disasters, and the universe (“Life, the universe and everything” perhaps?).

Science and religion (here Christianity specifically): both claim to present truth, but it’s a widespread belief that they are opposed and incompatible, so we have to choose between them – or even be suspicious of both.  We’re told we live in a postmodern age, in which there is no absolute truth and what is true for you may not be true for me; we’re free to choose for ourselves with no-one to tell us how.  Where people in an earlier generation asked “Is it true?”, now the key question is “Does it work?”, and experience matters more than facts and possessions – a view borne out by research into our spending habits.

I rather think the general mood is changing again – of course, you’re quite entitled to disagree with me!  Following a divisive EU referendum in which it’s now clear both sides presented distortions and false claims, the emergence of “alternative facts” and “fake news” in the rise of Donald Trump to power, and an unnecessary UK election that promised stability but delivered uncertainty, people are perhaps thinking that truth might actually matter after all, and that we could do with having some reliable things to trust.  But what?  “What is truth?” asked Pontius Pilate at the trial of Jesus, and it’s generally thought that he was being cynical.  Struggling as he did with claims of both truth and authority, he was a postmodernist long before his time!

Those who push the idea that science and faith have always been in conflict (in fact it’s a late 19th century invention) like to back up their case by appealing to a string of historical events, like Galileo’s imprisonment and torture by the Inquisition, the great Oxford Debate on evolution in 1860, and the Scopes Monkey Trial in Tennessee in 1925.  Careful historical investigation has shown that most of these accounts, as you will find them on the internet, are at best distortions that can rightly be called “alternative facts”.

The chemist Peter Atkins has said “There is no reason to suppose that science cannot deal with every aspect of existence.”  At one time people thought science and technology would solve all our problems, but we know better now and those who really understand science recognise both its great abilities and its major limitations: there are many questions that simply lie outside its scope.  Atkins’ better known colleague, the biologist Richard Dawkins, frequently speaks out against religion as the great enemy of science, and his approach includes making up his own definitions – such as for faith and miracles as well as describing God as an alternative scientific hypothesis – which he can then easily attack.

Whatever such campaigners may tell you, there are actually many people who are both serious scientists and committed Christians, including the speakers in our series of talks.  The two approaches to exploring truth are not incompatible, despite what you see and hear in the media, for which conflict rather than harmony is newsworthy and a good basis for entertainment.  If you think miracles are impossible by definition because they violate the laws of nature, Sir Colin Humphreys’ talk on 10 July will challenge that view and make you think again.  It’s the “science versus faith” position that’s kidding you.

2: May 2017

Is modern health care for everyone?

“In this world nothing can be said to be certain, except death and taxes.”  So said the American scientist and statesman Benjamin Franklin in a letter in 1789, though he wasn’t the first to express this opinion.  These days we often hear of organisations and people trying to avoid taxes – think of Google, Amazon, and some well-known politicians and sports stars – and there are even those who think they can cheat death by having their bodies deep-frozen in the hope that unknown future discoveries can bring them back to life.  That isn’t an option open to all of us – it costs a fortune – and I think they’re completely wasting their money.

Of course, people on the whole are now living longer than previous generations because of improvements in conditions and advances in health care and medicine.  The Queen, herself in her 90s and the longest-reigning monarch of the UK, has been provided with increasing numbers of civil servants to send congratulations to those reaching their 100th birthday.  We have lots more pensioners (self included!) and the state pension age is being pushed up.  Up and up are also going the costs of health services and care homes, constantly in the news, and our National Health Service is in danger of collapse from financial and other pressures.

The government doesn’t have unlimited resources to spend on the NHS, but it also doesn’t want to lose support by making unpopular cuts.  So hospital waiting times are growing, some treatments will no longer be available on prescription, and there’s a shortage of nurses.  While medical research, including world-leading work at our own local Newcastle University, develops new ways of dealing with illnesses and life-threatening conditions, these are often expensive and some of them are controversial, such as Newcastle’s so-called “three-parent baby” therapy for fighting mitochondrial disease.  The NHS faces challenges that are financial, political, and ethical.

So if you were in charge of health policy, what would you decide?  Should we pay for a doctor’s surgery consultation?  Should joint replacements be denied to those over a certain age?  Should the NHS charge for treating people whose problems are a direct result of their binge drinking, overeating, smoking, or other harmful lifestyle choices?  How should the development of new medical drugs be funded? – remember that pharmaceutical companies are profit-making firms, not charities or public bodies; I’ve been involved in some of their research, and it can’t be done on the cheap.

Who actually makes these decisions, and on what basis?  There isn’t a simple answer, with many agencies and complex factors involved.  It’s always difficult to balance demands and priorities that compete with each other.  Whatever you think, someone else will have the opposite opinion; just look at the failure of the US administration in its recent attempt to replace so-called Obamacare because those opposed to it couldn’t agree how to go about it.  And don’t forget that, however much we may worry and complain about the NHS, many countries don’t have any kind of national health care system at all, and only those with enough money get help with their medical problems.  For the majority in the world, modern health care certainly isn’t for everyone.

Although we aren’t expecting easy solutions, some of these issues will be addressed in a one-day conference on Saturday 13 May in Kingston Park, Newcastle, which I’ve organised on behalf of an organisation called Christians in Science.  The conference title is “Playing God? Research, ethics and practice in modern medicine” and there’s a range of expert speakers with an opportunity for questions and discussion.  It’s intended for anyone interested in the topic, not just medical specialists.  If you want to know more, and maybe buy a ticket to attend (£25 covers costs and includes lunch), you can visit cis.org.uk or goo.gl/uKHJCb on the internet.  I hope there may be a free evening event later in the year, somewhere in the Tyne Valley, with a speaker tackling some of these points.

In the meantime, perhaps Benjamin Franklin was right in linking death and taxes: in real life, to fight one, we need the other.

1: March 2017

Do we believe the experts?

Michael Gove famously said during the Brexit campaign, “People in this country have had enough of experts.”  Mind you, he also said you could count him out of any Conservative leadership contest!  To be fair, he was just talking about expert economists “saying that they know what is best and getting it consistently wrong”, and later events showed he had a point.

But what about experts in science?  Though we may not have quite the same expectation as in the 1960s (I’m old enough to remember Harold Wilson’s “White heat of technology” speech and its impact on the 1964 General Election) that science would solve all our problems, it’s still true that we put huge trust daily in the work of scientists and engineers in our dependence on the technology of transport, computing, communications, health and leisure.  We rely on these experts to do their job and give good advice; we don’t stop to question the ability of civil engineers every time we drive across a bridge, though cynics have pointed out that the Titanic was built by experts, and we can all be sceptical about expert pronouncements on health issues such as food, drink and medicines.

So what do we make of scientific expertise on climate change?  The simple answer is that we probably choose whether to accept the evidence based on what we want to believe.  The very latest public survey reported by New Scientist magazine in February suggests that 69% of people in the UK think most scientists say climate change is real and is mainly the result of human activity.  In actual fact, the proportion of scientists with relevant expertise who accept human-induced climate change is about 97% – if you had 100 climate scientists in a room, probably only 2 or 3 of them would oppose this.  97% is about the same as the certainty scientists have that smoking can lead to lung cancer, and who argues against that these days?

That 69% survey result means that about 30% of people don’t actually know (or claim not to know) what the scientific experts say.  Interestingly, that’s roughly the same as the proportion of people who themselves deny the reality of climate change resulting from human action.  64% personally agree with the almost unanimous scientific opinion, up from 59% in 2015.  (These figures are for the UK; in the USA climate change denial is even higher.)

Why is there this mismatch between public and expert opinion, albeit with a gradually reducing gap?  Among the reasons are (1) the way the media tend to present opposing views as if they had equal weight (the same happened in the MMR vaccine and autism controversy some years ago); (2) the powerful voices of those with vested interests opposed to the steps we need to take to tackle the problem; and (3) the short-term horizons of some politicians.  Thankfully, despite the climate change scepticism expressed by Donald Trump, and his removal of all relevant data from the White House website – the so-called Control-Alt-Delete approach – you can still see the scientific facts on the websites of the US Environmental Protection Agency (EPA) and NASA.

And it isn’t just the scientific experts who are warning us.  Ten years ago the authoritative Stern Review demonstrated that there are compelling economic reasons to take climate change seriously and do something about it.  Lord Nicholas Stern, one of the world’s leading economists and with rather more credibility than Michael Gove, recently gave a public lecture at Newcastle University, saying that nothing in the last 10 years has seriously argued against this, and the need for action is now greater than ever.

This isn’t the place to present the scientific evidence.  If you’re interested in important topical science-related issues such as this, watch out for a series of expert talks I’m arranging in the Tyne Valley during 2017.  These talks, with opportunity for questions and discussion, have been funded by an international charity and are open to all without charge; details will follow when the plans are finalised.  You’ll see them in the Tyne Valley Express!  I look forward to seeing some of you.