Science and Sensibility
Dai Davies
Brindabella.id.au

In Sense and Sensibility, Jane Austen contrasts two sisters: Elinor, firmly grounded in the pragmatic reality of everyday life, and Marianne, who indulges her feelings for music and poetry and then, almost fatally, for romantic love. Austen's life spanned a period when the scientific, rationalist and reductionist view was being challenged by a new romanticism that wanted to re-assert the value of human feelings and beauty, and to rejoice in the complexity and diversity of nature – the Age of Sensibility. This dichotomy was acted out in the life of Austen's contemporary Mary Shelley, who moved from her mother's rationalism to marry the Romantic poet Percy Shelley and express her internal conflict in Frankenstein; or, The Modern Prometheus.

Did Austen reflect the thinking of her time – the rise of the naturalist – in commonly referring to the characters in her novels as ‘creatures’, occasionally as a ‘human creature’ or a ‘rational creature’? Significantly, it is usually the characters themselves who refer to each other this way, which suggests that it is being presented as common language of the time rather than used as a stylistic device by the author.

Was it? It occurred to me as I wrote the last paragraph that we now have tools for evaluating word usage in the past, such as Google's Ngram viewer. It uses the company's vast scanned book collection to plot word and phrase usage back to 1600. I've done little to check how reliable this tool is but, as an indication of what was written about and preserved in libraries, here goes.

The phrases ‘human creature’ and ‘rational creature’ rose rapidly in usage in the 1740s, peaked at about 1780 around the time Austen was born, then tailed away over the nineteenth century. Many other related phrases (human being, human race, mankind, rational being, species) follow a similar trajectory, with ‘human’ rising earlier, around 1700. These dates are in line with the rise of rationalism as traced by words such as ‘science’, ‘knowledge’ and ‘logic’. The word ‘rational’ itself rose strongly in the 17th century and peaked in the 1740s.
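For anyone wanting to reproduce these plots, the charts can also be fetched programmatically. The sketch below uses the JSON endpoint that sits behind the Ngram Viewer web page; it is unofficial and undocumented, so the URL, the corpus name and the response format are assumptions that may change without notice.

```python
# A minimal sketch: fetch yearly relative frequencies for some phrases from
# the unofficial JSON endpoint behind Google's Ngram Viewer (assumed URL and
# corpus name; not a supported API, so treat results with care).
import json
import urllib.parse
import urllib.request

def ngram_series(phrases, year_start=1700, year_end=1900):
    """Return the viewer's smoothed time series for each phrase."""
    query = urllib.parse.urlencode({
        "content": ",".join(phrases),
        "year_start": year_start,
        "year_end": year_end,
        "corpus": "en-2019",   # assumed name for the English 2019 corpus
        "smoothing": 3,
    })
    url = "https://books.google.com/ngrams/json?" + query
    with urllib.request.urlopen(url) as response:
        return json.load(response)

# Find the peak year for each phrase over 1700-1900.
for series in ngram_series(["human creature", "rational creature"]):
    ts = series["timeseries"]
    peak = max(range(len(ts)), key=ts.__getitem__)
    print(series["ngram"], "peaks around", 1700 + peak)
```

If the endpoint behaves as described, the peaks should land in the late eighteenth century, consistent with the plots above.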

This period has commonly been called The Age of Enlightenment. I can't associate the word ‘enlightenment’ with the various tyrannies that ended in The Terror of the French Revolution. As an Age of Reason, though, it did see the continued evolution of science as the systematic study of nature. I see a rising schism between the worlds of fantasy and fact.

Steven Shapin, in A Social History of Truth, places a surge in empiricism in the late Elizabethan Era, when even the most innocuous disagreement could lead to a duel. Being armed with a few observations was good for the health. For empiricism, the 18th century ended on an illustrious note with the publication of the observations and correspondence of Gilbert White in The Natural History of Selborne. The forces of Enlightenment and their utopian dreams ended the century in a bloodbath that flung many creative idealists towards a romanticised view of nature – a simplified and distorted one, perhaps, but one emphasising harmony and beauty.


Occasionally I see the sun rise above the horizon in the morning. More commonly I see it arc across the sky as the day progresses. When I take my dog for a walk late on winter afternoons, I'm acutely aware that as it sinks below the horizon we're likely to lose her ball in the fading light. Sometimes, when the dog goes off on a sniff-about in her olfactory world that I'm excluded from – who's been by, how long ago and what they've been eating – I lie back on the grass and watch the clouds. Recently my mind drifted into a solar perspective and I was able to imagine myself on the surface of a ball rotating away from the sun.

It's not something I often do, but I was taught that this was the ‘correct’ view of things as ‘discovered’ by Copernicus, and that Galileo risked martyrdom by publicising it. I'll put aside the question of whether anyone had thought of this before in the history of the human race, but is it really a ‘correct’ view?

Later I learned the power of mathematics and science in finding the best perspective from which to view a problem in order to simplify it mathematically and conceptually. Mathematics provides valid geometrical transforms between an infinity of perspectives. The solar-centric view certainly wins out mathematically as the simpler view, but not a completely correct one. Physics shows that it's just an approximation, since the Sun, the Earth and the other planets revolve about a constantly moving centre of motion that happens to lie near the Sun because of its vastly greater mass, and all of this is rotating about the galactic centre – pedantic, I know, but the whole argument is. Generally, the correctness of a perspective is judged by how well it suits the task at hand, which in the case of the sun's movement is what our senses tell us.
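To put a rough number on ‘near the Sun’: a simple two-body calculation with rounded textbook values (ignoring the other planets) shows that the Sun-Jupiter barycentre lies just outside the Sun's surface, so strictly speaking even the Sun orbits a point it doesn't contain.

```python
# Back-of-the-envelope check: for two bodies, the barycentre sits a distance
# r = a * m2 / (m1 + m2) from the primary. Rounded textbook values below.
SUN_RADIUS_KM = 6.96e5          # solar radius
JUPITER_A_KM = 7.785e8          # Jupiter's mean distance from the Sun
SUN_TO_JUPITER_MASS = 1047.6    # ratio of the Sun's mass to Jupiter's

barycentre_km = JUPITER_A_KM / (SUN_TO_JUPITER_MASS + 1.0)
print(f"Sun-Jupiter barycentre: {barycentre_km:.0f} km from the Sun's centre")
print(f"Solar radius:           {SUN_RADIUS_KM:.0f} km")
# Prints roughly 742,000 km versus 696,000 km: the barycentre lies
# just outside the solar surface.
```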

This led me to wonder what all the fuss was about with Copernicus and Galileo. When I read Galileo's Daughter by Dava Sobel, the issue took on a different complexion. Galileo was born at the start of the Counter-Reformation, the Catholic Church's attempt to maintain some form of unity in Europe – a time of spiritual, political and military unrest, with astronomy playing a peripheral, symbolic role.

My overall impression of Galileo now is that he was a very interesting character who loved to push the boundaries of people's sensibilities, whether by challenging their visual perceptions with the telescope, their sense of our position in the solar system, or their assumption that the speed of falling bodies depends on their mass. He broke social conventions by having his three children out of wedlock. Despite apparently having the Pope's approval to address the Copernican world-view as a fictional dialogue, he pushed his luck almost too far by placing the official church view in the mouth of a simpleton.

Far from attacking the church, he was a devout Catholic and an acquaintance of the Pope. It seems to me that his actions were partly egotistic and partly an attempt from within to encourage the church to show flexibility. Pushed into relative seclusion, he went on to produce some of his greatest work and to reflect on his earlier science and a life lived to extremes.


Moving on to the nineteenth century and that great scientist, Charles Darwin, I find another example of rewriting history to suit the modern secular view and of the construction of secular saints. My first surprise was to find that the idea of evolution was not new – even in intellectual circles. Charles's grandfather Erasmus Darwin, a noted natural historian, had written about it before Charles was born. The idea had been discussed by philosophers, and assumed by animal breeders and others, since before the start of our recorded history.

Erasmus Darwin's life (1731-1802) spanned the rise of rationalism, yet he prefaced his book Zoonomia with a quote from the Roman poet Virgil – a beautiful and succinct expression of the pagan world-view, speaking of an inner spirit infusing nature. Erasmus asks ‘would it be too bold to imagine that all warm-blooded animals have arisen from one living filament’, then hints at natural selection as a mechanism.

It may seem that I'm trying to undermine the stature of Charles Darwin as a great scientist but, on the contrary, the scientific method doesn't really address the generation of new ideas, which tend to evolve in many minds over long periods until they finally start surfacing. Science is a process for evaluating, even verifying, ideas, and this was what Charles did in an exemplary manner: collect evidence and evaluate it. He, like Alfred Wallace, went on a long and dangerous journey of observation, then spent many years piecing together the evidence he had accumulated – sometimes accidentally. He also demonstrated a creative talent with his successful hypothesis of atoll formation.

Some of Charles's contemporaries gleefully used his work to attack biblical literalism and a central role for humans in the panoply of Life. Others realised that the theory of evolution supported traditional social views that had evolved gradually through trial and error over the history of competing human cultures rather than ideological fantasies concocted in the hermetic subcultures of intellectuals.


The Copernican view of the solar system and human evolution confronted people's sense of their place in the universe, but both were intuitively intelligible. By the end of the 19th century electromagnetism had been described theoretically by Maxwell and had moved from the realm of odd localised effects in the laboratory to large-scale practical applications that affected the way we lived. Marconi transmitted electromagnetic signals coding meaningful information over large distances through a mysterious medium labelled the ‘ether’.

Going down beyond the limits of the microscope, others were starting to probe the building blocks of matter. Ludwig Boltzmann and Max Planck had developed the idea that the energy states of matter might be quantised as discrete levels.

In the first years of the 20th century Einstein used the quantisation of energy to explain an apparent paradox to do with the photoelectric effect. Matter was starting to look like a system of vibrational resonances. Light, thought to be a pure wave, was looking like discrete energy packets. At an atomic level, the world was starting to look quite strange and unintuitive.

By the mid 1920s Erwin Schrödinger had produced a mathematical description of Wave Mechanics. It described the world at an atomic level as the movement and interaction of waves or vibrations, and it worked. The immediate problem was to try and explain what it meant – to interpret it. This question has dogged and divided Physics ever since. The standard Copenhagen Interpretation suggests an innate randomness in nature that prompted Einstein's famous contrarian comment about God not throwing dice.
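For reference, the time-dependent equation at the heart of wave mechanics, for a single particle of mass m in a potential V, in its modern notation:

```latex
% Time-dependent Schrodinger equation: the wavefunction \psi of a particle
% of mass m in a potential V evolves smoothly and deterministically.
i\hbar \frac{\partial \psi(\mathbf{r},t)}{\partial t}
    = -\frac{\hbar^{2}}{2m} \nabla^{2} \psi(\mathbf{r},t)
      + V(\mathbf{r})\, \psi(\mathbf{r},t)
```

It is worth noting that the evolution this equation describes is smooth and deterministic; the randomness enters only through the Copenhagen Interpretation's separate postulate of collapse on measurement.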

It was a serious point. The word ‘random’, like ‘infinity’, is an abstract mathematical construct. To apply it to reality is to claim that reality is non-deterministic, which destroys the foundations of the scientific world-view: the consistent mapping of cause and effect. It has no practical meaning that can be defined rationally. But the 1920s was a strange time. Europe was suffering from a collective shell-shock after the horrors of war, and strange spiritualist sects abounded. One of the great scientific expositions of the time, The Science of Life (H.G. Wells, J. Huxley and G.P. Wells), compiled a compelling summary of life and evolution, then jumped at the end into a discussion of spiritualism, even presenting a photograph of ectoplasm oozing from an unconscious subject's mouth. Perhaps Physics caught the fever.

Although de Broglie, Bohm, Einstein and others tried to rationalise the interpretation, phrases such as ‘wave-particle duality’, ‘quantum weirdness’, ‘spooky action at a distance’ and ‘multiple universes’ still dominate the discussion. I think there is a realistic approach that can generate a testable hypothesis for the nature of reality, but that's a topic for another time.


The 20th century saw a major shift from amateur science – science done for the love of it or the glory of God – to the career science of academia. Use of the word ‘scientist’ rose from the 1880s, with ‘research’ and ‘research grant’ rising together in the 1940s. This shift saw a growth in applied science as the advances in basic electrodynamics and thermodynamics played out into a wide range of technological advances. The era also saw the rise of pulp science, as the publish-or-perish world of academic careerism pushed people into writing more and more about less and less, feeding a hungry publishing industry. Scientific disciplines divided into fields, then sub-fields, and so on, as people marked out and defended their patch – the logical limit being a specialist who knows everything about nothing.

The decade or so after WWII was productive for science and technology. In quick succession we had the chemistry and then the coding of genetics, the laser and catalytic fusion from quantum physics, and plate tectonics from Geology. The laser and catalytic fusion are more technological developments than basic science, both products of quantum mechanics: the laser brought quantum phenomena to a visible, macroscopic scale, while catalytic fusion (the quantum cousin of the laser at nuclear energies) has the potential to provide control of individual fusion events, making it the ultimate energy source. The idea of plate tectonics had been around since the first decade of the century, but the weight of evidence finally overcame the resistance by about 1963.

Then what? Can you name a scientific discovery of the last half century that has fundamentally changed our view of nature, rather than routine follow-up from earlier breakthroughs? I've been asking people this question for some years now without receiving a convincing suggestion. A scan of the Nobel recipients shows a list of topics so obscure that few scientists would recognise them or be able to describe their importance, let alone anyone else.

I've been told that the reason is we now know most of what there is to know. Aren't physicists talking about an imminent Theory Of Everything tying all of science up in a neat theoretical package? Well, no. It sounds promising but on close inspection it's just an attempt to patch over a few major problems in theoretical physics – and it's failing.

I see a few examples that to me demonstrate an active avoidance of big issues: catalytic fusion; developments in our understanding of complex systems; and new insights into the architecture of the brain. Each of these is potentially transformational but they have been studiously ignored – if not shouted down – by elements of the scientific community.

Catalytic fusion was bitterly attacked and wrongly denounced as impossible when it surfaced as cold fusion. This is certainly the most significant problem that we could be tackling. Eventually, success will transform our future.

Our understanding of chemical reaction chains, self-replicating chemical networks, and how memory and learning develop has expanded (e.g. Stuart Kauffman's The Origins of Order: Self-Organization and Selection in Evolution). The implications for how we view Life and intelligence are profound – a thin layer of intelligence permeating Life. Not much intelligence, but widely spread, and over long time periods guiding Life in the exploration of its possibilities.

Our understanding of complex systems has also expanded. It has application in many domains, but the emergence of Complexity in the interactions between cells in living tissue, particularly in the brain, has interesting implications. Renato Nobili's work suggesting holographic (distributed) memory in cortical tissue may eventually revolutionise our picture of the brain, but it is still largely ignored more than a quarter of a century later.

I think a real problem with contemporary science is that the bulk of research is confined to either academia or industry. Industry is product-oriented and spends little effort on basic research. Academia is locked into incrementalism. The third way, amateur or citizen science, has played a significant role in astronomy over the years. In agriculture, Australia has Ken Yeomans (Keyline Farming) and Peter Andrews (Natural Sequence Farming), who have developed, or re-developed, important approaches to dealing with water.

With the arrival of the internet we have seen the emergence of crowd-sourced science. At a simple level of involvement, many people provide computing power for mathematical number-crunching. The cold fusion debate saw sceptic Tom Droege set up an experiment in his basement to try to replicate the Fleischmann and Pons experiment, with assistance from people across the internet. It failed, but the attempt was an interesting precedent. Crowd-sourcing and crowd-funding of ideas provide an exciting option for future scientific exploration.


We might view institutional science as being in the hands of steady, unadventurous Elinor, carefully measuring and analysing. Is Marianne banished to obscurity? Perhaps it was ‘the bomb’ that turned her from the pursuit of deep knowledge but, wooed by the press, she has acquired a taste for public glory and throws herself into brief but high-profile liaisons based on fanciful ‘studies’ that provoke a Gothic delight in fear. Perhaps she should be locked in an attic till she comes to her senses.

What then of Frankenstein? We have ‘Frankenfoods’ – so called because they seem to go beyond the limits of natural selection and are also perceived as driven by a blinding commercial ambition and an oversimplified view of nature. Are they a current or potential danger? Probably not, on any grand scale. But what about the idea of engineering a deadly virus that targets particular animal species, or even particular human genetic streams? I think this is possible but I don't know enough to be sure. What I am sure of is that such a virus would mutate and evolve to widen its range of targets. That's what Life does and, given time, does it very well.

Less obvious, perhaps, is the increasing reliance on, and misplaced reverence for, computer models of economies, stock markets and climate. These are Complex Systems in a qualitative, mathematical sense that goes well beyond just being very complicated. It's their dynamics that are complex – islands of relative stability within seas of deterministic chaos that defy the simplistic computational models we have today.
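The flavour of those islands can be seen in the simplest textbook model of deterministic chaos, the logistic map (my illustration here, not one of the models above). Sweeping its single parameter reveals chaotic bands interrupted by windows of stable periodic behaviour, such as the well-known period-3 window near r = 3.83.

```python
# Illustrative sketch: islands of periodic stability in the logistic map
# x -> r * x * (1 - x), the standard minimal example of deterministic chaos.

def attractor_points(r, x=0.5, transient=1000, samples=64):
    """Iterate past the transient, then collect points on the attractor."""
    for _ in range(transient):
        x = r * x * (1.0 - x)
    points = set()
    for _ in range(samples):
        x = r * x * (1.0 - x)
        points.add(round(x, 6))   # rounding merges a periodic cycle's points
    return points

for r in (3.5, 3.7, 3.83, 3.9):
    n = len(attractor_points(r))
    label = f"stable cycle of period {n}" if n <= 16 else "chaotic band"
    print(f"r = {r}: {label}")
# Typical output: period 4 at r = 3.5, chaos at 3.7, period 3 at 3.83
# (an island of stability), chaos again at 3.9.
```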

In exploring natural and inherently Complex systems we are in a similar position to the Polynesians exploring the Pacific Ocean – sailing the seas looking for signs of land: birds, clouds, patterns in the waves. At the simplest levels of nature, the atomic realm, these islands, or stable attractors, form regular patterns whose distribution can be characterised using equations such as the Schrödinger equation. For a broader view, new approaches are needed.

Eerily close to the image of Frankenstein and his monster is a common contemporary fascination for the android robotic form. Do we really need machines made in our image?


To end on a more positive note, I return to catalytic fusion. Its potential is too great for us to dismiss. Even though it has been demonstrated experimentally with muon-catalysed fusion, people still claim that it's impossible. It can't be proved impossible without considering the almost infinite variety of possible atomic configurations that might produce it. We need a wide variety of competing attempts rather than one Big Science institutionalised effort that would quickly be overrun by careerism. Above all, we need to start looking and keep looking. It won't happen soon, or all at once, but it will happen someday. The clock doesn't start ticking till we start looking seriously.