A history of seeing animals, part two

Part one

Christianity had a huge impact on how some parts of the world saw animals. Its teachings claimed that God had given man the right to rule over animals, that they were made for us and that each animal had a specific purpose. The bestiaries of the middle ages encapsulate this way of thinking: animals were used to teach religious principles and morality through illustrated lessons.

In medieval times, people lived alongside their animals, with farmers often sharing their homes with their livestock. This meant they knew each animal individually and valued it for its contribution. This type of relationship had existed for thousands of years, but it would soon change.

When the black plague hit Europe, animals were looked at with suspicion. Scapegoats were needed to quell the panic and try to set the world back in order. In particular, wild and potentially diseased animals were seen as dangerous and were often killed as a way of cleansing the community. It was around this time that we saw animals being put on trial for crimes, in all seriousness, just as humans were.

Renaissance thinking brought a more scientific way of looking at the world, and with it nature became something to be investigated, to be put under a microscope. Again, this treated animals as things that were here for us, as instruments, a world view that kept humans at the centre of the universe.

In the 1600s, Descartes presented animals as equipment, as mechanical objects that don’t feel pain, and this became another way of rationalising poor treatment.

Moving forward, we find the Enlightenment playing host to conversations and debates about animals as philosophical and ethical subjects. This was fuelled by the urbanisation and commodification of animals, the growth of print media and the popularity of vivisection in science. Around the same time, farmers began moving animals out of their homes, putting distance between man and beast, which would of course have an impact on how animals were viewed. The urbanisation and industrialisation of England would take the urban-rural divide and amplify it. The gulf between humans and animals would grow, and animals would increasingly be seen as commodities and treated accordingly.

By the 18th century, controlled breeding was changing the very species themselves, more so than domestication had, into the most efficient objects for our use. Animals were being turned into the food machines that Descartes saw them as. Around this time, it was also being argued that domestication was good for animals: they were protected from predators, given a reliable and regular source of food, and butchering them was an act of kindness that prevented suffering.

“Farm animals became statistics rather than individuals, which took into account their marketability, the level of meat production, and the density of customer populations.  By the end of the eighteenth century, farm animals were mathematized.”
– Brian Fagan

Up until this point, animals had determined how humans lived; now humans were determining how animals lived, and even how they grew. The depersonalisation of animals was increasing at a pace as rapid as industrialisation.

Darwin’s work on evolution, whilst it took a long time to take hold, also changed how we looked at animals.  For some people, it confirmed that (western) humans are the highest evolutionary point, for others it connected us to (some) animals.

During the 1700s and 1800s, pet keeping was becoming more common. But class mattered. At first, pet keeping was for the upper classes, whilst the animals of the lower classes were looked down upon. By the 19th century, pets were much more widespread, and this brought with it another change in how we see animals. It started to be accepted that animals, at least pets, had personalities and were individuals that should be treated well. Juxtaposed against this increase in pet keeping was an increase in big game hunting, which symbolised dominating nature, conquering the wild and imperialism.

The reputation of Britain also changed over the last few hundred years. In the 1700s, the British were perceived as cruel to animals, indifferent to their suffering and generally harsh towards them. By the end of the 1800s, treating animals well had become part of what it meant to be British. For a while, during the wars, animal kindness took a back seat, but it would be revived in the 1960s and 70s.

Today we seem to care about animals as individuals, as status symbols – think #animalselfie – and sometimes from a conservation perspective. However, we still very much see a divide between humans and other animals, with humans on the superior side. This is having, and will continue to have, devastating impacts on the world we live in. Unless we change how we see non-human animals and nature, sustainable change will not be made.

Resources


A history of seeing animals, part one

“The kinship between humans and animals has never been static, having been at the mercy of changing social norms and fleeting trends… human economic, cultural, and demographic factors play a major role in how we perceive of, and treat, animals.  So do age, education, ethnicity, occupation, religion, and sex.”
– Brian Fagan

As we saw when we looked at bestiality, how we view and think of animals is time and culture specific. As we are at a time when it seems clear we need to rethink our relationship with nature, a quick glance back seemed useful.

Our ancestors developed an awareness and understanding of the animals around them, both predators and prey. At least seventy thousand years ago, human cognitive abilities improved, and with them hunting skills and technology. This was a shift that changed how humans interacted with their world.

Hunters treated prey as living beings, often seeing them as sacrificing themselves for humans, and thus treated them with respect. In order to hunt successfully, and hence survive, they had to know their prey. They had to watch and understand them; they had to know when not to approach and how to make their prey less fearful. This created an intimate relationship between predator and prey, which we can see in cave art and in the stories we told each other about the world. Importantly, humans and animals were equal, and there was no hard line between humans and other animals. In this culture, individual wealth wasn’t a concept in the way it would become with domestication.

“Domestication changed the world, its landscapes, animals – and humanity. About ten thousand years ago – the precise date will never be known – numerous deliberate acts, such as the corralling of young ungulates, turned animal-human relationships on end… Humans were now the masters, so the role of animals changed. They became objects of individual ownership, tangible symbols of wealth, and powerful social instruments.”
– Brian Fagan

Domestication marks a shift in attention from dead animals to live ones, from communal resources to personal ones. It’s thought that dogs started to be domesticated around 15,000 BCE, and by about 10,000 BCE other species followed, in particular goats, sheep, pigs, cattle, donkeys, horses and camels.

“There were advantages on both sides in these history-changing partnerships.  What were now farm animals, bred in captivity, acquired better grazing and foraging carefully orchestrated by deliberate herding, and security from predators.  Humans acquired predictable meat supplies, milk, and a whole range of valuable by-products – everything from hides and fur to horn and sinew.”
– Brian Fagan

Dogs helped our ancestors to hunt, acted as guards, may even have pulled loads, and would become companions in their own right, as we are familiar with today.

Through the domestication of sheep and goats, humans were able to settle in an area. Livestock provided meat, milk, skins and wool, a predictable and more reliable resource. It also allowed humans to claim a piece of land as theirs, land that would pass from generation to generation. This was the point at which animals became more than just resources: they became symbols, and they linked generations. The size of your herd was a sign of your wealth and thus your status. Where previously animals were not owned – they were prey for hunters – now they were property, and with this came changes to human existence. Rules around inheritance arose, and marriage ties became more important.

Around 6,000 years ago, humans hitched a plough to an ox, establishing the first source of animal power for food production. This made it possible to create surplus food, which meant less time spent working, creating leisure time and a social division of labour.

Then, around 4000 BCE, cities were established and grew, which in turn meant an increased demand for goats and sheep. This led to larger flocks, which had its own impact on how we saw and related to livestock. In more rural areas, farmers and herders knew their animals by name, maintaining a close relationship, but in urban areas the relationship between human and beast was changing. The demand for meat and animal products increased and, in response, so did the size of herds. This led to depersonalisation, to seeing livestock as commodities rather than living creatures.

By 2500 BCE, pack animals were on the scene. These included the donkey, the horse and, later, the camel, and they allowed humans to travel long distances, carry commodities for trade, support armies and so on. Areas became linked, empires grew and the world became more interconnected.

“Donkeys have worked alongside people for more than eight thousand years – but “alongside” actually means in the background, for they have always been inconspicuous players in history.  Plodding asses carried food and water, exotic luxuries, and essential commodities.”
– Brian Fagan

Despite the huge role they have played in our history, we still tend to see donkeys as stubborn beasts of burden. As pack animals were often used in caravans, they were treated as a group rather than as individuals, and since relationships form between individuals, it was easier to mistreat or overuse the donkeys. This highlights a difference in how we have seen donkeys and horses: humans ride horses but tend to use donkeys to carry things. The former is a one-to-one relationship; the latter isn’t. Thus there tended to be a bond between man and horse that wasn’t there when it came to donkeys.

Horses were likely domesticated around 4000 BCE, and from then on we established an often intimate relationship with them. This was a relationship that went beyond function; horse and rider were bonded, a team. Horses, like cattle before them, became a status symbol, an animal that brought prestige to the owner or rider. They were noble creatures and as such were named and cherished.

“The Greeks made a clear distinction between the noble horse and the “servile” donkey, which corresponded in broad terms with that between people who were free and slaves.”
– Brian Fagan

Aristotle felt that nature had made animals as food and labour for humans and that they were subservient to us. This fitted with how we used animals and also made it acceptable for us to use them that way. Animals were utilised for human benefit and human development. Whilst some people will have had a personal relationship with particular animals, on the whole they were considered food and labour. For the Romans, animals were beasts of utility as well as a source of entertainment. Animals were pitted against each other and against humans, and were slaughtered as a form of amusement.

For more about how we view non-human animals, come back tomorrow!

Resources

Disability in History: Medieval Era

Around 673 AD, Christianity began eating away at medical understanding, replacing it with superstition. Illness began to be understood as a punishment from God in response to sins and transgressions or, conversely, as the result of witchcraft or possession by the devil. The cure for illness would be prayer, penitence and pilgrimage.

Visions, such as those experienced by Hildegard of Bingen alongside her migraines, were thought to be messages from the divine. Joan of Arc is probably the most famous example of this.

Alternative understandings of disability were that disabled people were closer to God because they were suffering purgatory whilst on earth, rather than after death like everyone else.  This would mean they’d reach heaven sooner.  This may be one of the factors that continued the Roman practice of having ‘fools’ in court who were able to tell the monarch the blunt truth where others couldn’t.

During the medieval period, mutilation was used as a punishment for crime, leaving visible reminders of the criminal’s status and past. Unfortunately, this had repercussions for the non-criminal disabled population. The deformed or disabled body became associated with the criminal body and gave weight to the idea that you can judge a book by its cover; that you can tell someone’s character and morality from their appearance. This meant that non-criminal disabled people ended up having to prove their nature and fight prejudgement.

Most disabled people lived and worked within their community, supported by their friends and family.  Without this support, some people resorted to begging and others were cared for by monks and nuns.  Religious hospitals began to crop up across the country as a way of carrying out the Christian duty to shelter strangers.  As well as caring for those with short term illnesses, they began to take in those with disabilities who were unable to live in their community.

“The idea of the almshouse (sometimes referred to as ‘Maison Dieu’) developed from hospitals. Almshouses were built to provide long-term shelter for the disabled and aged infirm, and soon became a common feature of towns and cities. They were founded and supported with donations from kings, church dignitaries, nobles and merchants, all keen to ease their passage to heaven with good works.”
– Historic England

From the 13th century, the King had duties towards people with learning disabilities, including the duty to ensure they were cared for. This, combined with the almshouses and hospitals, feels like a move towards a more caring, concerned society, possibly motivated by the idea of securing oneself a place in heaven.

Then would come the witch hunts.

It’s well known that women were targeted as part of the witch hunts, but so were disabled people. Anyone different, including disabled and mentally ill people, was more likely to be a target. Think of the stereotypical witch: she has a bent back, a disfigured face, and is often seen peering as if she has sight difficulties.

“The concept of the disabled person as sinner, and as being in league with the Devil, or even being its ‘spawn’, gained tremendous traction during the Middle Ages and beyond.”
– Quarmby

The Malleus Maleficarum, the witch hunter’s handbook, was written in 1487 and declared that children with impairments were born to witches, that disabled people were proof that the devil existed and that visibly impaired children were changelings. Another popular belief was that disabled people were possessed by the devil.

The focus on disabled people can be understood through a lens of scarcity and fear. The witch hunts came at a time when resources were limited, and if a disabled person couldn’t pull their own weight then they were another mouth to feed. By casting disabled people as immoral or unworthy, you wouldn’t have to feel compelled to help them, or guilty if you didn’t. Essentially, disabled people were once again scapegoats and, as in ancient Greece, removing them from the community cleansed it.

Around the same time, major religions were informing views, such as the Christian view of disability as a punishment, with its focus on curing the disabled and on renouncing the sin that caused the disability. Alongside this was the Jewish view of impairments as ungodly and the result of wrongdoing. There were also ideas around possession, and attempts to exorcise people who likely had epilepsy but were thought at the time to have the devil in them.

These ideas influenced today’s backdrop of disability, where we still find references to disability as the result of something you did wrong, or as something you should fight and battle to overcome (even if that’s not achievable). Think of how we speak about cancer and other illnesses, and how we commend people as brave and inspirational when they do something whilst having a disability.

When Shakespeare was writing in the late 16th century, he depicted Richard III as twisted in body and mind, a shortcut to casting him as the villain. Around the same time, the travelling freak show was being born. In Europe, ‘fools’ were collected and exhibited by aristocrats and royals, and people with deformities and intellectual impairments were displayed at village fairs and other festive occasions.

Resources:

Disability in History: The Ancient World

“Our attitudes to imperfection and disability descend directly from the concept of the body beautiful of Greek and Roman culture. Those cultures, and the stereotypes and prejudices they developed towards disabled people, were and remain very powerful.”
– Katharine Quarmby

The attitudes that Quarmby is referring to include the idea of disability as punishment for sin, as the result of curses, as scapegoating and as monstrousness. People were sacrificed for the good of the community, cast aside and/or rounded up into so-called freak shows. Essentially, they were dehumanised. These perspectives and attitudes are still prevalent today.

In ancient Greece, when a tragedy occurred, it was thought to be because the gods were displeased with mortals. To appease them, a scapegoat was chosen and offered to the gods; these were, on the whole, so-called ‘useless’ people, outcasts and beggars, all labels that suggest a higher than average proportion of disabled people. The scapegoat would be banished from the city, or killed, as a way of purging the area of whatever pestilence had befallen it. By removing the scapegoat, you were in effect removing the pollution.

In a similar way, babies born with deformities were also expelled from the community. To ensure that no disabled babies slipped through the cracks, they were inspected at birth. If they didn’t meet the standards of the day, they were dropped in a well. Diodorus went a step further, instructing that anyone who acquired a disability should kill themselves. Underlying this cruel treatment was the Greek idea that only the fit should survive. These ideas would be seen again in the early twentieth century with the obsession with eugenics.

“Disability was, as it is too often today, seen as shameful.”
– Quarmby

In ancient Greece, we also see the link between disability and evil, something which would be amplified in the middle ages with the advent of witch hunts. The only real exception to these attitudes was when a disability was the result of battle.

Similar attitudes prevailed in ancient Rome, where again we find that children born with a disability were to be killed, although now by drowning. The Romans saw disabled people as freaks and a spectacle, attitudes that clearly informed the Victorian fascination with freak shows. Romans enjoyed watching disabled people fight, and those who escaped death at birth were in demand as entertainment. They ‘kept’ disabled people, especially ‘fools’, as pets, and Roman emperors had dwarfs in confidential positions at court, a practice that began in Egypt and could later be seen in English courts.

“We spit on epileptics in a fit, that is, we throw back contagion. In a similar way we ward off witchcraft and the bad luck which follows meeting a person lame in the right leg.”
– Pliny

“Neither the Greeks nor the Romans had a word equivalent to ‘disabled’, but the terms that they often use are ‘teras’ (for the Greeks) and ‘monstrum’ (for the Romans). These are the same words they use to describe mythological monsters.”
– New Statesman

Of course, not all disabilities are considered equal, and that was the case back then as well. Certain illnesses or conditions that we might consider disabilities today may have been less of an issue then. Remember that a lot of the population were hungry, overworked slaves, and that health issues we would take a pill for and move on from inevitably wouldn’t have been treated. The lack of a word for disability means we don’t know for sure what would have been considered enough for you to be killed at birth.

Additionally, the family you were born into made a huge difference.  If your parents were rich, then you had better options.

Over in Egypt, the main concern was genetic contamination, but that didn’t mean the Egyptians were cruel to disabled people. Indeed, one Egyptian text says:

Do not laugh at a blind man
Nor tease a dwarf
Nor cause hardship for the lame.
Don’t tease a man who is in the hand of the God
Nor be angry with him for his failings.

We see this more inclusive attitude reflected in the roles that were available to disabled people, whether as musicians, servants or otherwise. Disability was also considered socially acceptable in Mesopotamia. Unfortunately, we have inherited more from the Greeks and Romans than from Egypt and Mesopotamia.

Resources

A short history of period products

Today we are lucky to have a choice of period products, including disposable and reusable options.  Traditionally, however, people who menstruate haven’t been so lucky…

Back in ancient Egypt, papyrus was used, and in ancient Japan, paper. Native Americans made a version of the sanitary towel using buffalo skin and moss which, comparatively, sounds heavenly!

That said, as an aside, it’s not always clear what is true. Take the case of Tampax telling website readers that in ancient Greece wool was wrapped around wood and used as a tampon. As appealing as the story is, it’s likely just a myth. It could be a misunderstanding of an ancient Greek treatment for vaginal issues, or it could be an attempt by tampon companies to naturalise their product, especially given the concerns around virginity we’ll see later… Another problem is that most of history was written by men, and upper-class men at that…

“Part of the reason that there is little extant evidence is located in the dual nature of sanitary protection.  This subject is both taboo and mundane, leading to an apparent lack of contemporary early modern sources.  Menstruation is a commonplace experience for women the world over, yet it is often considered a subject to be left unspoken.”
– Sara Read

The evidence Read has found suggests that in the 17th century some women were using ‘clouts’ or ‘rags’, folded cloths used to absorb blood. However, this wasn’t the case across all classes. She has found other evidence suggesting that only higher-ranking women would wear sanitary protection. Sponges may have been used, possibly just by prostitutes. Her paper, linked below, is a very interesting read about attitudes, beliefs and practices around menstruation in early modern England.

Whilst DIY methods were certainly used – and especially in more recent history we know rags or homemade pads were – for much of history it’s likely that free bleeding was the main ‘choice’, especially amongst the poorer strata of society.

In the second half of the 19th century, a variety of patents were filed in the US for period paraphernalia – and, as an aside, much of what I’ve been able to find is based on period products in the US. These included horrific-sounding early takes on the menstrual cup (generally made of metal or hard rubber), rubber pants and period aprons…

“The menstrual apron and pad holder in front are cloth-covered rubber. The wearer pinned absorbent cloth, such as bird’s-eye diaper cloth, onto the inner side of the holder. Of course, the woman wore the whole contraption “backwards,” under her dress, and over her buttocks, to keep the menstrual blood away from her clothing.”
– Museum of Menstruation

You can see an advert for one of these over at MUM.

Whilst all the pre-20th-century options sound awful, they were also being used at a time when people had fewer periods. Between a later age of menarche, more pregnancies, time spent breastfeeding and poor nutrition, menstruation was often suppressed during an adult’s life.

A lot changed at the back end of the 19th century, when disposable sanitary towels went on sale. By the 1890s, middle-class women were ordering mass-produced sanitary napkins or buying the fabric to make their own at home. Disposable options were particularly revolutionary. Before this, women had to deal with bloody cloths and handwashing, but now the pads could simply be thrown away – funny how things go full circle!

In 1899, a female German doctor wrote the following in a book aimed at middle class women:

“It is completely disgusting to bleed into your chemise, and wearing that same chemise for four to eight days can cause infections.”

This suggests that bleeding into one’s chemise was a very common practice, and it adds weight to the move towards contraptions to deal with bleeding. It also came at a time when, in America at least, menstrual blood was considered dangerous, so reusable rags were thought to harbour germs and gases which could contaminate the user. Advertising claimed that doctors supported disposable options as healthier. For anyone starting their period at this time, commercial options seemed the ideal, a basic necessity.

A common method of protection in the 1900s was a linen belt with napkins pinned to it, and by the 1920s the belts were used with disposable napkins. During the first world war, nurses noticed that the cellulose used for bandages absorbed blood better than cotton. They realised the potential and began using it for their periods. Kotex saw the market opportunity and manufactured these new, highly absorbent, disposable napkins.

The 1930s saw the arrival of tampons, but for married women only; they were thought unsuitable for unmarried women because they might break the very precious hymen. It was in 1929 that Dr Haas created the tampon, likely based on earlier prototypes, and being a male doctor likely gave the invention some kudos. A businesswoman marketed them under the name Tampax in 1936, and whilst some adopted them, others thought that because they were worn internally they were little more than a dildo. In reality, tampons offered freedom from belts, pins, pads and chafing and allowed for physical activity. Dancers and swimmers in particular welcomed them.

Around this time, Leona Chalmers patented the menstrual cup, but it didn’t take off.

Concerns around tampons were addressed in the 1940s, when Dr Robert Latou Dickinson gave them a boost by saying they were narrow enough not to break the hymen and hence not a threat to virginity. He also said that any sexual stimulation from a tampon was momentary and nothing compared to how a sanitary pad rubbed against the body.

Despite the expectation that women should work through their periods during World War Two, the 50s found women encouraged to be quiet and restful instead.  It was at this time that PMS was labelled.

In the 1960s, washable cloth pads came back into fashion and, with the hippie movement, the menstrual cup was relaunched, but again it didn’t take off. In 1969, the self-adhesive pad came out, allowing you to get rid of the belts and pins.

By the 90s, sanitary towels had grown wings and, although bulkier than today’s pads, were very recognisable. Despite many people using tampons, fears over the hymen remained. A Tampax advert in Seventeen showed a concerned girl asking if she’d still be a virgin if she used a tampon.

In the 2000s and 2010s, menstrual cups finally took off, period pants hit the market, and reusable and environmentally friendly products grew in popularity.

Links

A short history of wheelchairs

As a wheelchair user, I started to wonder how my life might have been had I been born 100, 500 or 1,000 years ago, and (assuming I actually survived) this would have been very dependent on the types of wheelchairs available. With this in mind, I ventured into the history of wheelchairs.

Early images of wheelchairs are found in stone carvings in China and on a Greek vase: the former shows a wheeled transport device, the latter a wheeled bed for a child. But despite these early records, the first known dedicated wheelchair was invented in 1595. It was made for Philip II of Spain and had small wheels at the end of the chair’s legs, a platform for his legs and an adjustable backrest. It wasn’t self-propelled, but then again he was a king, so he was probably surrounded by servants anyway!

Sixty years later, Stephan Farffler made a self-propelled chair mounted on a three-wheel chassis, with handles on the front wheel which allowed the user to move without assistance. The handles operated a bit like a handbike…

Possibly the best known early wheelchair is the Bath chair, named after the city, not the washing facility.  It was created by John Dawson and had two large wheels and one small.  It was steered using a stiff handle but was very heavy and had to be pushed or pulled.  This version of the wheelchair outsold others in the early 19th century but it wasn’t comfortable and so adjustments and improvements were made over time.

In 1869, we have a patent for the first wheelchair with rear push wheels and small front casters, something we would easily recognise today. Again, this model needed improving: a few years later, hollow rubber wheels were used; pushrims for self-propelling were invented in 1881; and in 1900 we find the first spoked wheels.

Injured soldiers returning home from World War Two were more likely to survive certain injuries because of the discovery of antibiotics. This meant there was a sudden influx of people with spinal injuries and the like that would previously have killed them, and in turn an increased need for wheelchairs. Depending on their injury, some of these veterans would have been unable to self-propel and, having previously been active, would have found themselves dependent on others.

It was one of these soldiers, frustrated with his situation, who advocated for a better wheelchair. This, combined with Canada’s commitment to veteran support, resulted in a request to George Klein to build a brand new type of wheelchair. After Canadian vets had been given their electrically powered chairs, an effort was made to engage manufacturers, one of which was Everest & Jennings.

Harry Jennings built the first folding tubular-steel wheelchair in 1932 for his friend Herbert Everest. They then joined forces to set up Everest & Jennings, which monopolised the wheelchair market for years. In 1956, they were the first to mass-produce electric wheelchairs. These were fairly rudimentary, with only two speeds, and very bulky, but still, they paved the way for the plethora of electric wheelchairs we have today.

Whilst slightly off topic, it’s worth noting that 1952 saw the beginning of wheelchair sports, and by 1960 the first Paralympic Games were being held. The increased visibility of wheelchair users, alongside the more specialised uses for chairs, almost certainly aided the refinement and variety of chairs that we are now lucky to have.

Moving into the second half of the twentieth century, developments to the wheelchair happened quickly.  Motors were added to standard wheelchairs, then lightweight aluminium was used, and the availability of plastic inevitably led to further innovations.  Further, as computer technology has boomed over the last fifty or so years, it has enhanced and improved the available powerchair technology.

Today we have wheelchairs that can be used in sports, that are very lightweight, that can raise the user up to sit at bars, that can be controlled in different ways and which ultimately allow many more people control over their movement.  Wheelchairs, powered or not, are highly customisable and, although I haven’t sat in a pre-20th-century one, I imagine they are significantly more comfortable and allow for a better quality of life.

As an aside, please don’t use the term ‘wheelchair bound’.  A lot of wheelchair users can walk or stand, and even those who can’t aren’t tied to their chairs.  The term also makes it seem like wheelchairs are a terrible burden and, whilst they aren’t perfect, they are amazing and significantly improve people’s lives.

Links

A short history of mental illness

I will also be doing a series of posts on the history of disability and obviously there are overlaps between that topic and this one.  As before, I will be focusing on England in this post.

With the history of mental health, it’s harder to establish particular attitudes and practices the further back in history you go.  Mental illness and physical illness were often conflated or described only vaguely, and because of this I decided not to start with the ancient world today.  Obviously our ideas of mental illness today are very different to those in the past, and so too is the language used.  Please bear all this in mind as you read on.

“Whether a behavior is considered normal or abnormal depends on the context surrounding the behavior and thus changes as a function of a particular time and culture.”
Noba Project

Very early understandings of mental illness often attributed a supernatural cause, such as being possessed by the devil or having displeased a god.  Naturally this understanding of the problem influenced the treatment and, in the case of having an evil spirit inside you, trepanation may have been the answer.  It involved drilling a hole in the sufferer’s skull to release the spirit.

Supernatural causes of mental illness were prevalent between the 11th and 15th centuries as people searched for explanations for disasters such as plague and famine.  This was also when we had the witch trials, and I’ll talk more about those in a different post; for now, I want to note that not everyone thought mental illness was down to the devil.  Johann Weyer and Reginald Scot believed that people being accused of witchcraft were actually suffering from mental illness, and that the mental illness was down to disease, not demons.  The church banned their writings.

As you can see, mental illness was often tied to religion or spirituality and, in keeping with this, it was generally monks and nuns who provided care for people who were ill.

The first English hospital for the mentally ill was Bedlam.  It was established as a hospital in the 13th century and by 1403, ‘lunatics’ made up the majority of the patients.  Originally run by monks, Bedlam was seized by Henry VIII during the dissolution of the monasteries.  Before he died, he transferred control to the Corporation of London, making it a civic, not religious, institution.  In 1619 Helkiah Crooke became the first medically qualified ‘keeper’, showing that mental illness was starting to be seen as a medical issue.  Despite this slow change in London, mental illness was still seen by many as supernatural or religious in origin.  Symptoms included those you would expect today but also not praying, not feeling pious, talking too much, sexual urges and hatred of your spouse.  As you can see, some of these symptoms were a way of controlling those who didn’t conform.

On the whole, as was the case with disability, most people were cared for by family or the community, although there were a number of mentally ill people living on the streets.  In the eyes of the law, during the 16th and 17th centuries, they were seen as unable to reason, and responsibility for their affairs was allocated by the Court of Wards.

Around this time, there was a move towards asylums, based on the belief that people who were mentally ill could thrive in a clean, healthy institution such as the York Retreat.  Unfortunately, we also find asylums being treated like zoos.  It was considered entertaining to visit and see the patients in Bedlam, and I suspect the situation was the same in other asylums.  Conditions were dreadful; people were tortured and forced to exist in appalling situations whilst also being displayed in a humiliating fashion.

Attitudes towards mental illness inevitably change as science and medicine evolve and prevalent beliefs alter.  At the end of the 18th century, with the Enlightenment, it was thought that people arrived as a blank slate and that your outcome was down to nurture.  This obviously affected how people saw disability and mental illness.

The industrial revolution brought vast changes to the landscape and the emphasis was heavily on productive workers.  At this time, there was a rapid expansion of institutions and people with mental illness were moved from their homes to asylums.  Early ideas focused on moral treatment, but professionals quickly lost interest or hope in this approach.

By 1900, more than 100,000 ‘idiots and lunatics’ were living in 120 county pauper asylums, and a further 10,000 in workhouses, across the country.  It was thought that financial aid to help people live in the community encouraged laziness.  No one seemed especially concerned that asylums were expensive and that people who went in often never came out, spending a long and miserable life there.  Rather, the reformers who encouraged the building of asylums claimed that they would be a safe space to cure people or to teach them useful skills.

The buildings themselves could be made up of long corridors, sometimes a quarter of a mile long, or rows of blocks.  Men and women were segregated, and dormitories could hold up to 50 beds, which stripped patients of privacy and space – beds were crammed in and the person next to you could be an arm’s length away.  High walls prevented escape and staff lived on site, making each asylum a self-contained world.  There was always a cemetery and some even had their own railway station.  They were, to all intents and purposes, a world of their own and a law unto themselves.  This allowed poor practices and abuse to run riot while the outside world was oblivious.

Whilst I’m sure many of the patients did have mental illness, the asylum also feels like it was used as a bit of a dumping ground, with people being admitted on dubious grounds.  Those who did have mental illness often suffered from things we see as very treatable today, such as panic attacks, and it’s highly likely that being in the asylum did more harm than good.

In 1948, the NHS was created and asylums and mental hospitals were no longer separate from the physical side of health.  Psychiatrists began to experiment with treatments, and physical procedures were carried out on the body to help treat the mind, e.g. ECT was widely used to treat depression.  Another body-based treatment involved giving patients insulin to induce a coma, as a way of treating deep-seated issues.  Whilst this all sounds horrific to us today, it was an important shift towards making the treatment of mental illness more of a science.

In the mid 1950s, over half of all NHS beds were for mental health, which was costing a lot of money.  A report in 1957 drew attention to the outdated asylums and mental hospitals and highlighted the idea of community care.  Around the same time, new drugs were being discovered and created that would control some of the behaviours associated with mental illness that had led to people being sent to asylums.  Note I said control: these were often used to tranquillise patients rather than to cure their illnesses.  That said, by the late 50s and 60s, specific drugs were available for specific disorders.  With this huge change, there was less need to confine people and care in the community seemed a realistic possibility.

The late 50s and 60s saw the move towards people living in the community, and also a better public awareness of the conditions in asylums.  To try and improve standards, open door policies were established and patients were given more freedom.  There was also the introduction of occupational therapy, which showed that people with mental illness weren’t inherently useless…

In the 70s, with the recession, spending on mental health was cut.  Bed by bed, ward by ward, the asylums were closed.  As beds were no longer available, people had to be cared for in the community and on the whole it was charities who picked up the pieces.  The help that was needed to transition patients from asylum to communities never materialised and many people were left facing a new world without support.

Whilst we are still far from perfect in how we, as a society, support and treat people with mental illness, we have come a long way and, as a user, I am very grateful for the help available today.

Resources