A history of seeing animals, part two

Part one

Christianity had a huge impact on how some parts of the world saw animals.  Its teachings claimed that God had given man the right to rule over animals, that they were made for us and that each animal had a specific purpose.  The bestiaries of the middle ages embody this way of thinking.  Animals were used to teach religious principles and morality through illustrated lessons.

In medieval times, we have people living alongside their animals, often farmers sharing their home with their stock.  This meant they knew each animal individually and valued them because of their contribution.  This type of relationship had been the case for thousands of years before but would soon be changed.

When the black plague hit Europe, animals were looked at with suspicion.  Scapegoats were needed to quell the panic and try and set the world back in order.  In particular, wild and potentially diseased animals were seen as dangerous and were often killed as a way of cleansing the community.  It was around this time that we saw animals being put on trial for crimes, in a serious way, just as humans were.

Renaissance thinking brought a more scientific way of looking at the world and with it, nature became something to be investigated, to be put under a microscope.  Again this was looking at animals as things that were here for us, as instruments, a world view that kept humans at the centre of the universe.

In the 1600s, Descartes presented animals as equipment, as mechanical objects that don’t feel pain and this was another way of rationalising poor treatment.

Moving forward, we find the Enlightenment playing host to conversations and debates about animals as philosophical and ethical subjects.  This was fuelled by urbanisation and commodification of animals, the increase of print media and the popularity of vivisection in science.  Around the same time, farmers began moving animals out of their home, putting distance between man and beast which would of course have an impact on how animals were viewed.  The urbanisation and industrialisation of England would take the urban rural divide and amp it up.  The gulf between human and animals would grow and animals would increasingly be seen as commodities and would thus be treated badly.

By the 18th century, controlled breeding was happening which would change the very species themselves, more so than domestication had, into the most efficient object for our use.  Animals were being turned into the food machines that Descartes saw them as.  Around this time, it was also being argued that domestication was good for animals – they were protected from predators, given a reliable and regular source of food and butchering them was an act of kindness that prevented suffering.

“Farm animals became statistics rather than individuals, which took into account their marketability, the level of meat production, and the density of customer populations.  By the end of the eighteenth century, farm animals were mathematized.”
– Brian Fagan

Up until this point in time, animals determined how humans lived, now humans were determining how animals lived, and even how they grew.  The depersonalisation of animals was increasing at a pace as rapid as industrialisation.

Darwin’s work on evolution, whilst it took a long time to take hold, also changed how we looked at animals.  For some people, it confirmed that (western) humans are the highest evolutionary point, for others it connected us to (some) animals.

During the 1700s and 1800s, pet keeping was becoming more common.  But class mattered.  At first pet keeping was for the upper classes whilst the animals of lower classes were looked down upon.  By the 19th century, pets were much more widespread and this brought with it another change in how we see animals.  It started to be accepted that animals, at least pets, had personalities and were individuals that should be treated well.  Juxtaposed against this increase in pet keeping was an increase in big game hunting which would symbolise dominating nature, conquering the wild and imperialism.

The reputation of Britain also changed over the last few hundred years.  In the 1700s we were perceived as being cruel to animals, as having an indifference towards the suffering of animals and generally thought to be harsh towards them.  By the end of the 1800s, treating animals well had become part of what it meant to be British.  For a while, during the wars, animal kindness took a bit of a backseat but would be revived in the 1960s and 70s.

Today we seem to care about animals as individuals, as status symbols – such as #animalselfie – and sometimes from a conservation perspective.  However, we also still very much see a divide between humans and other animals, with humans being the superior side of this.  This is having, and will continue to have, devastating impacts on the world we live in.  Unless we change how we see non-human animals and nature, sustainable change will not be made.

Resources

A history of seeing animals, part one

“The kinship between humans and animals has never been static, having been at the mercy of changing social norms and fleeting trends… human economic, cultural, and demographic factors play a major role in how we perceive of, and treat, animals.  So do age, education, ethnicity, occupation, religion, and sex.”
– Brian Fagan

As we saw when we looked at bestiality, how we view animals and think of them is time and culturally specific.  As we are at a time when it seems clear we need to rethink our relationship with nature, a quick glance back seemed useful.

Our ancestors developed an awareness and understanding of the animals around them, predators and prey.  At least seventy thousand years ago, human cognitive abilities improved and so did hunting skills and technology.  This would be a move that changed how humans interacted with their world.

Hunters would treat prey as living beings, often seeing them as sacrificing themselves for humans, and thus treated them with respect.  In order to successfully hunt, and hence survive, they had to know their prey.  They had to watch them and understand them, they had to know when not to approach and how to make their prey less fearful.  This created an intimate relationship between predator and prey and we can see this in the cave art and in the stories that we told each other about the world.  Importantly, humans and animals were equal and there was no hard line between humans and other animals.  In this culture, individual wealth wasn’t a concept in the way it would become with domestication.

“Domestication changed the world, its landscapes, animals – and humanity.  About ten thousand years ago – the precise date will never be known – numerous deliberate acts, such as the corralling of young ungulates, turned animal-human relationships on end… Humans were now the masters, so the role of animals changed.  They became objects of individual ownership, tangible symbols of wealth, and powerful social instruments.”
– Brian Fagan

Domestication marks a shift in attention from dead animals to live ones, from communal resources to personal ones.  It’s thought that dogs started to be domesticated around 15,000 BCE and that by about 10,000 BCE, other species had followed, in particular goats, sheep, pigs, cattle, donkeys, horses and camels.

“There were advantages on both sides in these history-changing partnerships.  What were now farm animals, bred in captivity, acquired better grazing and foraging carefully orchestrated by deliberate herding, and security from predators.  Humans acquired predictable meat supplies, milk, and a whole range of valuable by-products – everything from hides and fur to horn and sinew.”
– Brian Fagan

Dogs helped our ancestors to hunt, they offered guard functions, they may even have pulled loads and would become companions in their own right, as we are familiar with today.

Through domestication of sheep and goats, humans were able to settle in an area.  Livestock would provide meat, milk, skins and wool and would be a predictable and more reliable resource.  It also allowed humans to claim a piece of land as theirs, and this land would pass from generation to generation.  This was the time at which animals became more than just resources, they became a symbol and they linked generations.  The size of your herd was a sign of your wealth and thus your status.  Where previously animals were not owned, they were prey for hunters, now animals were property and with this came changes to human existence.  Rules around inheritance arose and this meant marriage ties became more important.

Around 6000 years ago, humans hitched a plough to an ox and established the first source of animal power for food production.  This meant it was possible to create surplus food which meant less time needed to be spent working and created leisure time and a social division of labour.

Then, around 4000 BCE, cities were established and grew, which in turn meant an increased demand for goats and sheep.  This would lead to increased flock sizes which would have its own impact on how we saw and related to livestock.  In more rural areas, farmers and herders knew their animals by name, maintaining a close relationship, but in more urban areas, the relationship between human and beast was changing.  The demand for meat and animal products increased and in response, so did the size of herds.  This led to depersonalisation, and to seeing livestock as commodities rather than living creatures.

By 2500 BCE, pack animals were on the scene.  This involved the donkey, the horse and later the camel, and would allow humans to travel over long distances, carry commodities for trade, support armies and so on.  Areas became linked, empires grew and the world became more interconnected.

“Donkeys have worked alongside people for more than eight thousand years – but “alongside” actually means in the background, for they have always been inconspicuous players in history.  Plodding asses carried food and water, exotic luxuries, and essential commodities.”
– Brian Fagan

Despite the huge role they have played in our history, we still tend to see donkeys as stubborn beasts of burden.  As pack animals were often worked in a caravan, they were treated as a group rather than as individuals, and since relationships form between individuals, it was easier to mistreat or overuse the donkeys.  This highlights a difference in how we have seen donkeys and horses – humans ride horses but tend to use donkeys to carry things; the former is a one to one relationship, the latter isn’t.  Thus there tended to be a bond between man and horse that wasn’t there when it came to donkeys.

Horses were likely domesticated somewhere around 4000 BCE and from there on, we established an, often, intimate relationship with them.  This was a relationship which was beyond function, the horse and rider were bonded, they were a team.  Horses, like cattle before them, became a status symbol.  They were an animal which brought prestige to the owner or rider.  They were noble creatures and as such were named and cherished.

“The Greeks made a clear distinction between the noble horse and the “servile” donkey, which corresponded in broad terms with that between people who were free and slaves.”
– Brian Fagan

Aristotle felt that nature had made animals as food and labour for humans and that they were subservient to us.  This fits with how we used animals and also made it ok for us to use them that way.  Animals were utilised for human benefit and human development.  Whilst some people will have had a personal relationship with some animals, on the whole, they were considered food and labour.  For the Romans, animals were beasts of utility as well as a source of entertainment.  Animals were pitted against each other and against humans, and were slaughtered as a form of amusement.

For more about how we view non-human animals, come back tomorrow!

Resources

Disability in History: Medieval Era

Around 673AD, Christianity began eating away at medical understanding, replacing it with superstition.  Illness began to be understood as something that was a punishment from God in response to sins and transgressions, or conversely the result of witchcraft, or possession by the devil.  The cure for illnesses would be prayer, penitence and pilgrimage.

Visions, such as those experienced by Hildegard of Bingen alongside her migraines, were thought to be messages from the divine.  Joan of Arc is probably the most famous example of this.

Alternative understandings of disability were that disabled people were closer to God because they were suffering purgatory whilst on earth, rather than after death like everyone else.  This would mean they’d reach heaven sooner.  This may be one of the factors that continued the Roman practice of having ‘fools’ in court who were able to tell the monarch the blunt truth where others couldn’t.

During the medieval period, mutilation was used as a punishment for crime, leaving criminals with visible reminders of their status and past.  Unfortunately this had repercussions for the non-criminal disabled population.  The deformed or disabled body became associated with the criminal body and gave weight to the idea that you can judge a book by its cover; that you can tell someone’s character and morality by their appearance.  This meant that non-criminal disabled people ended up having to prove their nature and fight prejudgement.

Most disabled people lived and worked within their community, supported by their friends and family.  Without this support, some people resorted to begging and others were cared for by monks and nuns.  Religious hospitals began to crop up across the country as a way of carrying out the Christian duty to shelter strangers.  As well as caring for those with short term illnesses, they began to take in those with disabilities who were unable to live in their community.

“The idea of the almshouse (sometimes referred to as ‘Maison Dieu’) developed from hospitals. Almshouses were built to provide long-term shelter for the disabled and aged infirm, and soon became a common feature of towns and cities. They were founded and supported with donations from kings, church dignitaries, nobles and merchants, all keen to ease their passage to heaven with good works.”
– Historic England

From the 13th century, the King had duties towards people with learning disabilities, including the duty to ensure they were cared for.  This combined with the almshouses and hospitals feels like a move towards a more caring, concerned society, possibly motivated by the idea of securing oneself a place in heaven.

Then would come the witch hunts.

It’s well known that women were targeted as part of the witch hunts but so were disabled people.  Anyone different, disabled or mentally ill was more likely to be a target.  Think about the stereotypical witch – she has a bent back, a disfigured face and is often seen peering as if she has sight difficulties.

“The concept of the disabled person as sinner, and as being in league with the Devil, or even being its ‘spawn’, gained tremendous traction during the Middle Ages and beyond.”
– Quarmby

The Malleus Maleficarum, the witch hunter’s handbook, was written in 1487 and declared that children who had impairments were born to witches, that disabled people were proof that the devil existed and that visibly impaired children were changelings.  Another popular belief was that disabled people were possessed by the devil.

The focus on disabled people can be understood through a lens of scarcity and fear.  The witch hunts came at a time when resources were limited and if a disabled person couldn’t pull their own weight then they were another mouth to feed.  By casting disabled people as immoral or unworthy, you wouldn’t have to feel compelled to help them, or guilty if you didn’t.  Essentially, disabled people were once again a scapegoat and, as in ancient Greece, removing them from the community cleansed it.

Around the same time, major religions were informing views, such as the Christian view of disability as a punishment and the focus on curing the disabled, as well as the importance of renouncing the sin that caused it.  Alongside this was Judaism, which viewed impairments as ungodly and the result of wrongdoing.  There were also ideas around possession and attempts to exorcise people who likely had epilepsy but at that time were thought to have the devil in them.

These ideas influenced today’s backdrop of disability where we still find reference to disability as a result of something you did wrong, or disability as something you should fight and battle to overcome (even if that’s not achievable).  Think of how we speak about cancer and other illnesses, and how we commend people as brave and inspirational when they do something whilst having a disability.

When Shakespeare was writing in the late 16th century, he depicted Richard III as twisted in body and mind, a short cut to casting him as the villain.  Around the same time, the travelling freakshow was being born.  In Europe, fools were collected and exhibited by aristocrats and royals.  And people with deformities and intellectual impairments were being displayed at village fairs and other festive occasions.

Resources:

Disability in History: The Ancient World

“Our attitudes to imperfection and disability descend directly from the concept of the body beautiful of Greek and Roman culture. Those cultures, and the stereotypes and prejudices they developed towards disabled people, were and remain very powerful.”
– Katharine Quarmby

The attitudes that Quarmby is referring to include the idea of disability as punishment for sin, as a result of curses, as scapegoats and as monsters.  People were sacrificed for the good of the community, were cast aside and/or were rounded up into so called freak shows.  Essentially, they were dehumanised.  These perspectives and attitudes are still prevalent today.

In ancient Greece, when a tragedy had occurred it was thought to be because the gods were displeased with the mortals.  To appease them, a scapegoat was chosen and offered to the gods and these were, on the whole, so called ‘useless’ people, outcasts and beggars.  All labels that suggest a higher than average proportion of disabled people.  The scapegoat would be banished from the city, or killed, as a way of purging the area of whatever pestilence had befallen it.  By removing the scapegoat from the area, you were in effect removing the pollution.

In a similar way, babies born with deformities were also expelled from the community.  To ensure that no disabled babies slipped through the cracks, they were inspected at birth.  If they didn’t meet the standards of the day, they were dropped in a well.  Diodorus went a step further, instructing that anyone who acquired a disability should kill themselves.  Underlying this cruel treatment was the Greek idea that only the fit should survive.  These ideas would be seen again in the early twentieth century with the obsession around eugenics.

“Disability was, as it is too often today, seen as shameful.”
– Quarmby

In ancient Greece, we also see the link between disability and evil, something which would be amplified in the middle ages with the advent of witch hunts.  The only real exception to these attitudes was when the disability was the result of battle.

Similar attitudes prevailed in ancient Rome where again we find that children born with a disability were to be killed, although it would now be by drowning.  The Romans saw disabled people as freaks and a spectacle, attitudes that clearly informed the Victorian fascination with freak shows.  Romans enjoyed watching disabled people fight and if they escaped death at birth, they were in demand as entertainment.  They ‘kept’ disabled people, especially ‘fools’ as pets and Roman emperors had dwarfs in confidential positions in court, something that began in Egypt and which could later be seen in English courts.

“We spit on epileptics in a fit, that is, we throw back contagion.  In a similar way we ward off witchcraft and bad luck which follows meeting a person lame in the right leg.”
– Pliny

“Neither the Greeks or the Romans had a word equivalent to ‘disabled’ but the term that they often use is ‘teras’ (for the Greeks) and ‘monstrum’ (for the Romans). These are the same words they use to describe mythological monsters.”
– New Statesman

Of course, not all disabilities are considered equal and that was the case back then as well.  Certain illnesses or conditions that we might consider to be a disability today may have been less of an issue then.  Remember that a lot of the population were hungry, overworked slaves and that, inevitably, health issues that we’d take a pill for and move on from wouldn’t have been treated.  The lack of a word for disability means we don’t know for sure what would have been considered enough for you to be killed at birth.

Additionally, the family you were born into made a huge difference.  If your parents were rich, then you had better options.

Over in Egypt, the main concern was around genetic contamination but that didn’t mean they were cruel to disabled people.  Indeed, one Egyptian text says:

Do not laugh at a blind man
Nor tease a dwarf
Nor cause hardship for the lame.
Don’t tease a man who is in the hand of the God
Nor be angry with him for his failings.

We see this more inclusive attitude reflected in roles that were available for disabled people, whether as musicians, servants etc.  Disability was also considered socially acceptable in Mesopotamia.  Unfortunately, we have inherited more from Greeks and Romans than we have from Egypt and Mesopotamia.

Resources

A short history of period products

Today we are lucky to have a choice of period products, including disposable and reusable options.  Traditionally, however, people who menstruate haven’t been so lucky…

Back in ancient Egypt, papyrus was used and in ancient Japan paper was used.  The Native Americans made a version of sanitary towels using buffalo skin and moss, which, comparatively, sounds heavenly!

That said, as an aside, it’s not always clear what is true.  Take the case of Tampax telling website readers that in ancient Greece, wool was wrapped around wood and used as a tampon.  As appealing as the myth seems to be, it’s likely just that.  Although it could be a misunderstanding of an ancient Greek treatment for vaginal issues, it could be an attempt by tampon companies at naturalising their product, especially given we’ll see the concerns around virginity later… Another problem is that most of history is written by men… And upper class men at that…

“Part of the reason that there is little extant evidence is located in the dual nature of sanitary protection.  This subject is both taboo and mundane, leading to an apparent lack of contemporary early modern sources.  Menstruation is a commonplace experience for women the world over, yet it is often considered a subject to be left unspoken.”
 – Sara Read

Evidence that Read has found suggests that in the 17th century, some women were using ‘clouts’ or ‘rags’, folded cloths used to absorb blood.  However, this wasn’t across all classes.  She has found other evidence that suggests only higher ranking women would wear sanitary protection.  Sponges may have been used, possibly just by prostitutes.  Her paper, linked below, is a very interesting read about attitudes, beliefs and practices around menstruation in early modern England.

Whilst DIY methods were certainly used, and especially in more recent history we know rags or homemade pads were used, for much of history, it’s likely that free bleeding was the main ‘choice’, especially amongst the poorer strata of society.

In the second half of the 19th century, a variety of patents were filed in the US for various period paraphernalia – and as an aside, much of what I’ve been able to find is based on period products in the US.  These included horrific sounding early takes on a menstrual cup – generally made of metal or hard rubber, rubber pants and period aprons…

“The menstrual apron and pad holder in front are cloth-covered rubber. The wearer pinned absorbent cloth, such as bird’s-eye diaper cloth, onto the inner side of the holder. Of course, the woman wore the whole contraption “backwards,” under her dress, and over her buttocks, to keep the menstrual blood away from her clothing.”
– Museum of Menstruation

You can see an advert for one of these over at MUM.

Whilst all the options pre-20th century sound awful, they were also being used at a time when people had fewer periods.  Between a later age of menarche, more pregnancies, time spent breastfeeding and poor nutrition, menstruation was often suppressed during an adult’s life.

A lot changed at the back end of the 19th century when disposable sanitary towels went on sale.  By the 1890s, middle class women were ordering mass produced sanitary napkins, or buying the fabric to make their own at home.  Disposable options were particularly revolutionary.  Before this, women were trying to deal with bloody cloths and handwashing but now, they could simply be thrown away – funny how things go full circle!

In 1899, a female German doctor wrote the following in a book aimed at middle class women:

“It is completely disgusting to bleed into your chemise, and wearing that same chemise for four to eight days can cause infections.”

This suggests that bleeding into a chemise was a very common practice, and it adds weight to the move towards contraptions to deal with bleeding.  This also came at a time when, in America at least, menstrual blood was considered dangerous and so reusable rags were thought to harbour germs and gases which could contaminate the user.  Advertising claimed that doctors supported disposable options as healthier.  For anyone starting their period at this time, commercial options seemed to be the ideal, a basic necessity.

A common method of protection in the 1900s involved linen belts with napkins pinned to them and by the 1920s, the belts were used with disposable napkins.  During the first world war, nurses had noticed that the cellulose used for bandages absorbed blood better than cotton.  They realised the potential and began using it for their periods.  Kotex saw the market opportunity and manufactured these new, highly absorbent, disposable napkins.

The 1930s saw the arrival of tampons, but for married women only – they were not for unmarried women as it was thought they would break the very precious hymen.  It was in 1929 that Dr Haas created the tampon, likely based on earlier prototypes, and it’s likely that being a male doctor gave him some kudos.  A businesswoman marketed them under the name Tampax in 1936 and whilst they were adopted by some, others thought that because a tampon was worn internally it was little more than a dildo.  In reality, tampons offered freedom from belts, pins, pads and chafing and allowed for physical activity.  Dancers and swimmers in particular welcomed them.

Around this time, Leona Chalmers patented the menstrual cup, but it didn’t take off.

Concerns around tampons were addressed in the 1940s. Dr Robert Latou Dickinson gave tampons a boost when he said that they are narrow enough not to break the hymen and hence are not a threat to virginity.  He also said that any sexual stimulation from the tampon was momentary and nothing compared to how the sanitary pad rubbed against the body.

Despite the expectation that women should work through their periods during World War Two, the 50s found women encouraged to be quiet and restful instead.  It was at this time that PMS was labelled.

In the 1960s, washable cloth pads came back into fashion and with the hippie movement, the menstrual cup was relaunched but again it didn’t take off.  In 1969, the self adhesive pad came out and allowed you to get rid of the belts and pins.

By the 90s, sanitary towels had grown wings and although bulkier than the pads today, they were very recognisable.  Despite many people using tampons, fears over the hymen remained.  A Tampax advert in Seventeen, showed a concerned girl asking if she’d still be a virgin if she used a tampon.

In the 2000s and 2010s, menstrual cups finally took off, period pants hit the market, and reusable and environmentally friendly products grew in popularity.

Links

A short history of wheelchairs

As a wheelchair user, I started to wonder how my life might have been had I been born 100 years ago, 500 years ago or 1000 years ago and (assuming I actually survived) this would be very dependent on the types of wheelchairs that were available.  With this in mind, I ventured into the history of wheelchairs.

Early images of wheelchairs are found in stone carvings in China and on a Greek vase; the former shows a wheeled transport device and the latter a wheeled bed for a child.  But despite these early records, the first known dedicated wheelchair was invented in 1595.  It was made for Philip II of Spain and had small wheels at the end of the chair’s legs, a platform for his legs and an adjustable backrest.  It wasn’t self propelled but then again he was a king so was probably surrounded by servants anyway!

Sixty years later, Stephen Farffler made a self propelling chair which was mounted on a three wheel chassis and had handles on the front wheel which allowed the user to move without assistance.  The handles operated a bit like a hand bike…

Possibly the best known early wheelchair is the Bath chair, named after the city, not the washing facility.  It was created by John Dawson and had two large wheels and one small.  It was steered using a stiff handle but was very heavy and had to be pushed or pulled.  This version of the wheelchair outsold others in the early 19th century but it wasn’t comfortable and so adjustments and improvements were made over time.

In 1869 we have a patent for the first wheelchair with rear push wheels and small front casters, something we would easily recognise today.  Again, this model needed improving and a few years later, hollow rubber wheels were used, pushrims for self propelling were invented in 1881 and in 1900 we find the first spoked wheels.

Injured soldiers returning home from World War Two were more likely to survive certain injuries because of the discovery of antibiotics.  This meant that there was a sudden influx of people with spinal injuries and the like that would previously have killed them.  In turn, this meant an increased need for wheelchairs.  Depending on their injury, some of these veterans would have been unable to self propel and, having previously been active, would have found themselves dependent on others.

It was one of these soldiers, frustrated with his situation, who advocated for a better wheelchair.  This, combined with Canada’s commitment to veteran support, resulted in a request to George Klein to build a brand new type of wheelchair.  After Canadian vets had been given their electrically powered chairs, an effort was made to engage manufacturers, one of which was Everest & Jennings.

Harry Jennings built the first folding, tubular steel wheelchair in 1932 for his friend Herbert Everest.  They then joined forces to set up Everest & Jennings who monopolised the wheelchair market for years.  In 1956 they were the first to mass produce electric wheelchairs.  These were fairly rudimentary, had only two speeds and were very bulky but still, they paved the way for the plethora of electric wheelchairs we have today.

Whilst slightly off topic, it’s worth noting that 1952 saw the beginning of wheelchair sports and by 1960, the first Paralympic games were being held.  The increased visibility of people with wheelchairs alongside the more specialised uses for them, almost certainly aided the refinement and variety of chairs that we are now lucky to have.

Moving forward, in the second half of the twentieth century, developments to the wheelchair happened quickly.  Motors were added to standard wheelchairs, then lightweight aluminium was used and the availability of plastic inevitably led to further innovations. Further, as computer technology boomed in the last fifty or so years, we have seen these enhance and improve the available powerchair technology.

Today we have wheelchairs that can be used in sports, that are very lightweight, that can raise the user up so that we can sit at bars, that can be controlled in different ways and which ultimately allow many more people control over their movement.  Wheelchairs, powered or not, are highly customisable and although I haven’t sat in a pre-20th century one, I can imagine they are significantly more comfortable and allow for a better quality of life.

Aside, please don’t use the term wheelchair bound.  A lot of wheelchair users can walk or stand, and even those who can’t, aren’t tied to their chairs.  It also makes it seem like wheelchairs are a terrible burden and whilst they aren’t perfect, they are amazing and significantly improve people’s lives.

Links

A short history of mental illness

I will also be doing a series of posts on the history of disability and obviously there are overlaps between this topic and that one.  I will also be focusing on England with this post. 

With the history of mental health, it’s harder to establish particular attitudes and practices as you get further back in history.  Often mental illness and physical illness are conflated or not specific and because of this, I decided not to start with the ancient world today.  Obviously our ideas of mental illness today are very different to those in the past and so too is the language used.  Please bear this all in mind as you read on.

“Whether a behavior is considered normal or abnormal depends on the context surrounding the behavior and thus changes as a function of a particular time and culture.”
– Noba Project

Very early understandings of mental illness often attributed a supernatural cause, such as being possessed by the devil or having displeased a god.  Naturally this understanding of the problem influenced the treatment and in the case of having an evil spirit inside you, trepanation may be the answer.  It involved drilling a hole in the sufferer’s skull to release the spirit.

Supernatural causes of mental illness were prevalent between the 11th and 15th centuries as people searched for reasons for natural disasters such as the plagues and famines.  This was also when we had the witch trials and I’ll talk more about that in a different post, for now though, I want to note that not everyone thought mental illness was down to the devil.  Johann Weyer and Reginald Scot believed that people being accused of witchcraft were actually suffering from mental illness and that the mental illness was down to disease, not demons.  The church banned their writings.

As you can see, mental illness was often tied to religion or spirituality and, in keeping with this, it was generally monks and nuns who provided care for people who were ill.

The first English hospital for the mentally ill was Bedlam.  It was established as a hospital in the 13th century and by 1403, ‘lunatics’ made up the majority of the patients.  Originally run by monks, Bedlam was seized by Henry VIII during the dissolution.  Before he died he transferred control to the Corporation of London, making it a civic, not religious, institution.  In 1619 Helkiah Crooke became the first medically qualified ‘keeper’, showing that mental illness was starting to be seen as a medical issue.  Despite this slow change in London, mental illness was still seen by many as supernatural or religious in origin.  Symptoms included those that you would expect today but also included not praying, not feeling pious, talking too much, sexual urges and hatred of your spouse.  As you can see, some of these symptoms were a way of controlling those who didn’t conform.

On the whole, as was the case with disability, most people were cared for by family or the community, although there were a number of mentally ill people living on the streets.  In the eyes of the law, during the 16th and 17th centuries, they were seen as unable to reason and responsibility for their affairs was allocated by the Court of Wards.

Around this time, there was a move towards asylums and this was based on the belief that people who were mentally ill could thrive in a clean, healthy institution such as York Retreat.  Unfortunately, we also find asylums being treated like zoos.  It was considered entertaining to visit and see the patients in Bedlam and I suspect the situation was the same in other asylums.  Conditions were dreadful, people were tortured and forced to live or exist in appalling situations whilst also being displayed in a humiliating fashion.

Attitudes towards mental illness inevitably change as science and medicine evolve and prevalent beliefs alter.  At the end of the 18th century, with the enlightenment, it was thought that people arrived as a blank slate and your outcome was down to nurture.  This obviously affected how people saw disability and mental illness.

The industrial revolution brought vast changes to the landscape and the emphasis was heavily on productive workers.  At this time, there was a rapid expansion of institutions and people with mental illness were moved from home to asylums.  Early ideas focused on moral treatments but professionals quickly lost interest or hope with this approach.

By 1900, more than 100,000 ‘idiots and lunatics’ were living in 120 county pauper asylums and 10,000 in workhouses across the country.  It was thought that financial aid to help people live in the community encouraged laziness.  They didn’t seem especially concerned that asylums were expensive and often people who went in, never came out, spending a long and miserable life there.  Rather, reformers who encouraged the building of asylums, claimed that they would be a safe space to cure people or to teach them useful skills.

The buildings themselves could be made up of long corridors, sometimes ¼ mile long, or rows of blocks.  Men and women were segregated and dormitories could consist of up to 50 beds, which stripped patients of privacy and space – beds were crammed in and the person next to you could be an arm’s length away.  High walls prevented escape and staff lived on site, making them a self contained world.  There was always a cemetery and some even had their own railway station… They were to all intents and purposes a world of their own, and a law unto themselves.  This allowed poor practices and abuse to run riot whilst the outside world was oblivious.

Whilst I’m sure many of the patients did have mental illness, the asylum also feels like it was used as a bit of a dumping ground with people being admitted on dubious grounds.  Those who did have mental illness often suffered from things which we see as very treatable today, such as panic attacks, and it’s highly likely that being in the asylum did more harm than good.

In 1948, the NHS was created and asylums etc were no longer separate from the physical side of health.  Psychiatrists began to experiment with treatments and physical interventions were carried out on the body to help treat the mind, e.g. ECT was widely used to treat depression.  Another bodily based treatment involved giving patients insulin to induce a coma, as a way to treat deep seated issues.  Whilst this all sounds horrific to us today, it was an important shift towards making the treatment of mental illness more of a science.

In the mid 1950s, over half of the NHS beds were for mental health, which was costing a lot of money.  A report in 1957 drew attention to the outdated asylums and mental hospitals and highlighted the idea of community care.  Around the same time, new drugs were being discovered and created that would control some of the behaviours associated with mental illness that had led to people being sent to asylums.  Note I said control, these were often used to tranquillise patients rather than to cure their illnesses.  That said, in the late 50s and 60s, specific drugs became available for specific disorders.  With this huge change, there was less need to confine people and care in the community seemed to be a realistic possibility.

The late 50s and 60s saw the move towards people living in the community and also a better public awareness of the conditions of asylums.  To try and improve standards, open door policies were established and freedom was increased for patients.  There was also the introduction of occupational therapy which showed that people with mental illness weren’t inherently useless…

In the 70s, with the recession, spending on mental health was cut.  Bed by bed, ward by ward, the asylums were closed.  As beds were no longer available, people had to be cared for in the community and on the whole it was charities who picked up the pieces.  The help that was needed to transition patients from asylum to communities never materialised and many people were left facing a new world without support.

Whilst we are still far from perfect in how we, as a society, support and treat people with mental illness, we have come a long way and, as a user, I am very grateful for the help available today.

Resources

A short history of prosthetics

Prosthetics have both a practical purpose and an emotional one, with some people feeling that they help to make them whole.  In ancient Egypt, there is evidence of a woman with a prosthetic toe made from wood and leather, which some people say wouldn’t have affected her ability to walk.  Those people hypothesise that, because the Egyptians were a sandal wearing culture, she felt it important to her identity to have the prosthetic.  Others believe that it would have contributed to her ability to walk.  Either way, I think it’s pretty amazing that we have evidence of prosthetics that old, especially given the materials they were generally made with.  In Egypt they were made from fibre and wood, echoing the importance the Egyptians placed on wholeness.

Another early example is an artificial leg that dates back to about 300 BC.  It was found in Italy and was made of bronze and iron with a wooden core.  It’s thought this was held in place by a leather belt.

Whilst we tend to think of a prosthesis as replacing a limb, they are used to replace other body parts including eyes, breasts and teeth.  And when it comes to teeth we find a lot more literature.  Did you know, for example, that at one time hippo ivory was used to make false teeth as it was stronger than alternative ivory and didn’t yellow so quickly?

Etruscan false teeth from between 8th and 3rd century BCE have been discovered as have sets of false teeth which were made from animal teeth or even human teeth and were connected to intact teeth with a metal band.  Anyway, I don’t like the dentist and all this talk about teeth is too much for me….

Hook hands, peg legs and iron hands were used from Roman times to the end of the middle ages with little advancement in technology.  In the 16th century, a hinged arm and a locking leg were invented.  The heavy iron was replaced by a mix of leather, paper and glue, thanks to a French locksmith of all people.  We also have to thank watchmakers for contributing to the development of prosthetics, as gears and springs were used and the intricate parts needed a careful approach.

The history of prosthetics is largely the history of the prosthetics of the wealthy, or lucky, as is often the case today.  Knights may have been fitted with them because of their status but possibly also because the history of prosthetics has always been intertwined with the history of wars and the soldiers that fight in them.  We know of a Roman general that lost his hand and couldn’t fight, but with the aid of an iron prosthetic that could hold his shield, he was able to retain his identity as a general and presumably return to war…

Around 1800, a breakthrough was made in the mechanics of prosthetic limbs by James Potts.  His ‘Anglesey’ leg had articulated parts and used cat-gut tendons to hinge the knee and ankle, creating a walking motion when the toe was lifted.  This design was further developed by adding a heel spring.

The American Civil War saw many many limbs amputated and the US government supplied these soldiers with prosthetics, allowing them to return to work…. So kind!  This vastly increased demand and presumably there were tweaks to design at the same time.  Midway through the war, a new way of attaching the prosthesis was developed that used suction rather than straps.  Another prosthetic that came from the war was a rubber hand which had fingers and was able to connect to an array of attachments.

World War One also saw an increased demand for prosthetics but poor designs and poor fitting led to many going unused.  Common complaints included pain related to friction between prosthesis and the amputated limb and the weight of the prosthetic.

Throughout most of history, prosthetic limbs were wood or metal although I read about one that was made from plaster and animal glue and another that was iron with a wooden core.  More recently, lighter options have become available.  Lightweight aluminium combined with the suction attachment made for more practical and more affordable options and more recently plastics and electronics have followed.  Another big change is around the look of them.  Historically, prosthetic limbs have been designed to replicate the limb and to make other people feel comfortable but in recent decades, there has been a noticeable move towards function over appearance.

In the 1960s, children affected by thalidomide were born with malformed limbs and technological solutions to medical issues were sought.  These came in the form of personalised prosthetics which sped up the advancement of this area.  Gas powered prosthetics were invented to help children and whilst they may have sounded great, and certainly I’m sure some kids found them helpful, others found them difficult and cumbersome.  They required a lot of time away from home to fit them and teach the children how to use them and this obviously had to be repeated as the child grew.  Further, as the child grew up, they wanted to be able to do more with their prosthesis such as feed themselves, write and go to the toilet by themselves.  To be able to do these tasks would make mainstream school accessible.

Gas had been chosen as a power source because batteries at that point were impractical.  As time went on, other ideas were considered and someone thought that a more modular system might work and by this point technology had shrunk making batteries more practical.

In the 1990s, knees that used computer chips were introduced.  The chip controlled the speed and swing of the knee joint and sensors provided feedback.  In 1998 the first electric arm was fitted.  The i-limb was the first prosthetic to have individually powered fingers and gave the user more control and more feedback.  As well as limbs that allow for walking, we have seen limbs that are designed for running and other sports.

Today we are seeing a more personalised approach to prosthetics including the alternative limb project which seeks to go beyond the replacement of a limb and creates imaginative and personalised options.

Links

A short history of feeding tubes

Whilst you’re probably vaguely familiar with nose and stomach feeding tubes, it hasn’t always been that way… rectal feeds were once the only way… and up until the 1940s the rectum was used for water, saline and glucose solutions.

The first recorded attempt dates back to ancient Egypt when reeds were used to give rectal feedings of chicken broth, wine and eggs. Rectal feeding was used as there was no way to reach the upper GI tract without killing the patient.

There is a long period before any known, recorded developments in artificial feeding.  In Spain, in the 12th century Ibn Zuhr attempted parenteral nutrition, supplying nourishment intravenously to a human with the aid of a hollow silver needle.  It is unknown how successful it was.

A few centuries later, in 1598, Capivacceus used a hollow tube with a bladder attached to one end to reach as far as the oesophagus.  This thinking was developed and in 1617 Fabricius ab Aquapendente used a silver type of NG (nasogastric – nose to stomach) tube that went as far as the pharynx for patients with tetanus.

In 1646 Von Helmont used leather to create a flexible, hollow tube that patients would swallow and it would feed into the top of the oesophagus.  A syringe was used to deliver blended food.

By the mid 17th century, thinking was focused back on parenteral feeding:

“The idea of providing nutrients intravenously in humans was first realised when Sir Christopher Wren injected wine and ale in dogs way back in the middle of the 17th century.”
– Ahmad Fuad Shamsuddin

Wren had invented an IV made of goose quills and porcine bladders and was also able to give opiates to dogs through this.  There were issues and in 1710 Courten concluded that fats needed to be manipulated before being administered through an IV.  Despite these developments, IV feeding is a fairly new therapeutic tool.

In the 1700s physicians experimented with blends of wine, eggs, jellies and milk and in 1710 it was suggested that the leather tube could be used to reach down into the stomach.

Another stepping stone in the history of feeding tubes saw John Hunter, in 1790, using whalebone covered in eel skin attached to a bladder pump to feed a mix of jellies, beaten eggs, sugar, milk and wine.  In the early 1800s, food blends included thick custards, mashed potatoes and pre-digested milk, whatever delightful thing that is…

During the first half of 19th century stomach pumps were used to feed severely mentally ill patients in England but it wasn’t a straightforward technique with complications including stomach lacerations and drowning in beef broth…

Apparently it was in 1837 that the first gastrostomy was suggested.  That is a tube which goes into the stomach through the tummy.  It was attempted around 1845 but there were many complications, including infections which couldn’t be dealt with as antibiotics hadn’t yet been created.

In 1867 Kussmaul introduced a flexible orogastric tube – a tube that goes from mouth to stomach rather than nose to stomach.  Three years later, in 1870, Dr Staton was the first surgeon in the US to perform a gastrostomy with long term survival.  The patient was an 8 year old boy.  Another four years and Ewald and Oser would introduce a soft rubber tube.

It would be 1878 before the first jejunostomy was attempted – that’s a feeding tube which goes into the jejunum, part of the small intestine, instead of the stomach.  But rectal feeding was still about and in 1881 the US president James Garfield was kept alive after being shot by being rectally fed beef broth and whisky.

Moving into the 20th century, we see the early days of the central line, which would lead to IV and parenteral feeding, as well as soft flexible tubes introduced to make artificial feeding more comfortable and more successful.

Unfortunately, paralleling this was the forced feeding of suffragettes.  This was a torturous affair made up of brutal attacks.  A primitive and painful method of feeding was used – the tube through the nose was often too large and any resistance from the prisoner led to further pushing; if the nasal tube failed, a throat tube was used which involved a metal spring gag.

Around 1910, Einhorn began experimenting with NJ tubes and shortly after, in 1916, continuous and controlled delivery of liquid nutrition was suggested when it became clear bolus feeding was not always tolerated.  The Levin tube, introduced in 1921, was very stiff and thicker than the tubes used today, which are made of soft polymers such as silicone and polyurethane, but was presumably progress then.  Another development came in the 1930s with feeding via a pump.

The literally life changing discovery of modern antibiotics in the 1940s changed the landscape of artificial feeding dramatically.  Many of the surgeries that had failed because of infection were now viable.  This was developed further in the late 1940s when polyethylene tubing began to be used and the first enteral feeding pump was developed.

In the 1960s, with the focus on space travel, work was carried out on nutrition to help astronauts get the right food and prevent malnutrition.  This information would later be used to create the formulas used today in tube feeding.  These were further developed in the 1970s.

In 1979, the PEG insertion technique was developed and performed on a 6 month old in the US.  This is a common method still used today which uses a cut in the stomach and an endoscopic tube – hence percutaneous endoscopic gastrostomy.  It’s this kind of insertion that I had.

I’ve written before about how grateful I am for my feeding tube, it has given me back my life and I am also incredibly grateful for all those innovative thinkers and all those unfortunate patients that have gone before me.  Thank you.

Sources and further reading:

York Death Tour

Recently I took it upon myself to drag one of my carers around York to learn about death and graveyards.  Don’t feel too sorry for her, it was a warm day and we stopped for a cup of tea halfway through.

The information I used to cobble together the tour came from a York tourist board self guided tour, The York Graveyard Guide and Tyburn Tales (about executions in York).  It was both interesting and educational and I thought I’d share some of what we discovered in case anyone else is fascinated by the history of death in York…

It is estimated that the city of York contains about half a million corpses and skeletons within the walls alone.  In the middle ages, there were about 50 graveyards and over the years, they filled up, were closed and sometimes built on.

We started off the tour in Museum Gardens at St Leonard’s Hospital Arch.  The hospital was erected on the site of St Peter’s Hospital which was damaged in a fire in 1137.  In medieval times, hospitals cared for the sick, the poor, the old and the disabled but also tended to spiritual health as well as physical health.

In one of the arches of St Mary’s ruins, you can see the tomb of William Etty.  Next we headed round the corner and peered over the fence to St Olave’s graveyard.  In 1853 vaults were available in the church but cost a rather steep £100 to discourage people from being buried there.


Inside Kings Manor there are apparently a couple of fine medieval stone coffins but you need to ask the porter’s permission to see them so we didn’t bother.  We did however learn that the coffins were wedge shaped and this was because Christianity stressed that you should leave the world as you entered it – that is, without any grave goods.  This was very different to the Roman way of doing things, which involved many grave goods such as wine, food and jewellery.  Bodies in Christian burials were wrapped in a shroud, which in the 17th century had to be made of English wool to encourage the English wool trade…!  As with the Romans, the coffin was made of stone to preserve the body for resurrection.

Next stop was Bootham Bar which has been a gateway into York since 71AD.  It was also one of the places where you could find heads impaled on spikes.  If you were found guilty of treason you would be punished horrifically and your head would be boiled in salt water and covered in pitch to preserve it before it was put on a spike on one of the bars (entrances) to York.

Walking through Bootham Bar and heading towards the Minster you find St Michael-le-Belfry Church.  In front of the church is a triangle of pavement which was once part of the graveyard.  Burials are often still close to the surface so building work can disrupt them; on one occasion, a skeletal hand fell out of the floor.  One of the people buried here was Nathan Drake who died in 1778; 47 years later his wife Mary joined him, aged 92.  Another notable person was Dr Alexander Hunter, a medical graduate from Edinburgh.  He came to York to take over a medical practice and was one of the founders of Bootham Park Hospital, which opened in 1777 and was the fifth purpose-built asylum in the country.

There is obviously much that could be said about the Minster and death but that feels like an entire blog post (or series of books!) of its own.

Monk Bar was the next stop.  It is the tallest of all the bars and home to York’s only working portcullis which was last lowered in 1953 for the coronation of the queen.  The rooms above the gateway give access to so called murder holes which allowed enemies to be attacked from above.  At one stage the rooms were used as a prison and in 1631 held a man called Martin Best.  Best had arrived in York having previously been in London, in a house that was infected with the plague.  He was kept in the prison and his goods were burnt as part of the attempts to mitigate the impact of the plague in York.

This tenuously led on to us learning more about the plague.  The worst plague in York was in 1604 and was blamed on the arrival of the Scots.  York was struck again in 1631 but managed to avoid the Great Plague of 1665.  Attempts to control the plague included killing the city’s cats and dogs, which were thought to spread the disease.  The poor who fell ill were moved to camps outside the city and were supplied with food and drink.  Other victims quarantined themselves in their homes.  Money was dipped in vinegar and goods coming into the city, especially cloth, were often impounded.

Nearby was St Maurice’s which has since been pulled down to make way for the inner ring road.  The church was near the County Hospital and when you went in for help, you had to pay a deposit to cover your burial fees.  If you made it out alive, you got that money back.  If not, you were buried in St Maurice’s graveyard.


After a quick chat about St Maurice’s in Monkgate, we moved round the city walls to the Sainsbury’s car park, also known as Jewbury, which might give more of a hint about its relevance.

Jewbury was a cemetery between 1177 and 1290, when Jews were expelled from England.  Before 1177, Jews had to be taken to London for burial, wherever in the country they lived and died.  In 1177, Henry II gave permission for Jewish burial grounds outside about 10 cities, York being one of them.  This was obviously still incredibly inconvenient but a slightly better situation than before.

When Sainsbury’s car park was being built, nearly 500 skeletons were excavated but it’s estimated that the remains of about 1000 individuals were buried there.  As only one of the skeletons showed signs of a violent death, we know these were not the victims of the 1190 massacre.  Most of the burials were in wooden coffins with a few personal items.

Following the walls again, we headed onto Peasholme Green and stopped at St Cuthbert’s Church, whose graveyard is raised above the pavement level.  Graves were originally dug 6 feet down, but as more and more people were buried, the sheer volume of bodies meant that graves got shallower and shallower and began to smell awful.  The authorities dealt with this by heaping earth on the graveyards, which meant that the land rose quickly.  Shallow graves were also vulnerable to body snatchers, and York was well placed to serve the illegal trade for both London and Edinburgh, being close enough to both to get bodies there before they decayed too much.  As a result of this, the rich would pay more to be buried inside churches.

Just past the church, on the right, is a sign to an art gallery and cafe.  Follow it and, amidst the hubbub of cars and buses and so on, you’ll find a wonderful garden, a sanctuary.  You’ll also find the friendly York School House Art Gallery and Cafe, which served a wonderful Turkish apple tea, and I’m told the brownies are also great!



Whilst there are many churches and graveyards in the centre of York, I didn’t want us to get overloaded with death so I’d selected some that were more personal to me, basically just because I had lived near them for a number of years.  They were familiar in the sense that I saw them on a near daily basis going to and from work and yet I didn’t know much about their history and who was buried there.

Starting with St Michael’s on Spurriergate.  This parish covered part of an undesirable area but was also home to shopkeepers.  It had a small graveyard, which was reduced in size in 1337 when it was divided into two parts by Church Lane.  The part now split from the church became a public urinal in 1857.

As you cross over Skeldergate bridge and start to head uphill, you pass a church that is now a nightclub.  We stopped here, on the edge of the busy road, to learn more about St John’s.  In the nineteenth century, this was the second most crowded parish and was next to the most crowded parish.  This busy area meant that the graveyard was reused many times and had to be closed in the mid 1800s.  The graveyard was paved over in 1966 when increased traffic meant the road needed to be widened.

Heading further up the hill, we came to St Martin-cum-Gregory’s, which was once one of the richest parishes in York.  Quite a different congregation to Spurriergate, only a stone’s throw away.  This area was home to nobility, but because the graveyard was crowded, in hot weather the smell of death was in the air…  Amongst those rotting corpses are two monuments to the Cave family.  Thomas Cave was the founder of a dynasty of engravers and his grandson, Henry Cave, created a book with 40 engravings of York buildings, published in 1813.

DSC_1973eweb

Nearby is St Mary Bishophill Junior, confusingly older than St Mary Bishophill Senior.  Anyway, St Mary Junior had what might be one of the most overcrowded graveyards in York.  New burials meant breaking coffins and disturbing remains.


A little further on is St Mary Bishophill Senior, which dated back to the eleventh century.  By the 1930s, worship had ceased and the church began to fall into ruin.  In 1963 it was pulled down and some of its stones were reused to build Holy Redeemer on Boroughbridge Road.

York was a place for the fashionable members of society, and the hair styles of the late eighteenth century meant combs were in demand.  Combs were essential for keeping piles of hair upon the head, and at this time they were made from horn.  Traditionally, comb makers were found near the source of their materials, that is the Shambles, which was a street of slaughterhouses.  Increased demand meant that new workshops were established around Micklegate Bar and Tanner Row.  The invention of a comb-making machine in 1796 would see the end of combmaking in York.  Anyway, the point of this detour into combs was to say that one of the families buried in St Mary Senior was the Rougiers.  Joseph Rougier founded one of the largest and longest-lasting firms of combmakers in York.

Other people buried there included George Benson, a cheese and butter seller, and James Cawthorp, who died in 1852 aged 37.  Cawthorp was a prison governor at the nearby gaol.  Thomas Gowland was killed on 3rd November 1851 when a train he was working on was hit from behind by another train.  He was crushed and died two hours later.  The coroner’s jury noted that no one was to blame and that it was bad luck; in 3 out of 4 cases, the second train would have come off worse and the guard wouldn’t have been harmed.


We did try to find the headstones of a few people who were executed.  At one point, executed criminals could be buried in churchyards, and Tyburn Tales does record the burial locations of many of the executed.  As such, we knew that on Wednesday 2nd August 1672, Robert Driffield (aged 24) and Mark Edmund (22) were executed for setting fire to six corn stacks.  Many people gathered to watch their execution and their bodies were later interred in St Mary Bishophill Senior.  Given the age of the burials, it’s not surprising we didn’t find any trace of them.

And thus concluded this particular portion of my death tour of York!