The Little Book of Medical Breakthroughs

Dr. Naomi Craft

Description

The Little Book of Medical Breakthroughs explains over 100 seminal discoveries, inventions and theories that have shaped the history of medical practice. Presenting a wide range of the most important medical breakthroughs, it covers a variety of topics, including artificial limbs used in Ancient Egypt, modern-day X-rays, immunisation and sanitation. This user-friendly book is arranged in chronological order and contains illustrations throughout.




Contents

Glass Eyes

Sutures

Artificial Limbs

Urinalysis

Condoms

Caesarean Section

Spectacles

Valves in Veins

Forceps

The Microscope

Harvey and the Circulation of Blood

Vacuum Extraction

Prevention of Scurvy

False Teeth

The Ambulance

Smallpox Vaccine

Colour Blindness

Women Medical Students

The Stethoscope

Intravenous Fluids

Beaumont’s Experiments on the Gastric Juice

General Anaesthetic

The Vaginal Speculum

Spirometry

Appendicectomy

Ophthalmoscope

Plaster of Paris Casts

The Hypodermic Syringe

Sanitation

Nursing

The Treatment of Epilepsy

Mendel and the Birth of Genetics

The Germ Theory of Disease

Temperature Measurement

Paracetamol

Handwashing

Immunizations

Pinard Stethoscope

Sphygmomanometer

Local Anaesthetics

Contact Lenses

Electrocardiogram (ECG)

Psychoanalysis

Latex Surgical Gloves

Phototherapy

Röntgen’s X-Rays

Facemask

Aspirin

Blood Groups

Defibrillator

Corneal Grafting

Discovery of Alzheimer’s Disease

Contraceptive Coils

The Band-Aid

The Operating Microscope

Insulin

EEG

The National Blood Transfusion Service

Electrosurgery

Iron Lung

Discovery of Penicillin

Wheelchair

Thrombolysis

Pulse Oximeter

Dialysis

Sunscreen

Randomized Controlled Trials

Mammography

Amniocentesis

Pacemaker

The Link Between Smoking and Lung Cancer

Antipsychotics

The Structure of DNA

Heart–Lung Machine

The Placebo Effect

Polio Vaccine

Bone Marrow Transplant

Ultrasound

Levodopa for Parkinson’s Disease

Foetal Monitoring

The Eradication of Smallpox

Hip Replacement

The Structure of Antibodies

Oral Rehydration Fluid

Glue for Wounds

Cognitive Behavioural Therapy (CBT)

The Contraceptive Pill

The First Artificial Heart Valve

Methadone Treatment

Angioplasty

Contraceptive Implant

Coronary Artery Bypass Grafting

Palliative Care and the Hospice Movement

First Heart Transplant

The Abortion Act

Folic Acid and the Link with Neural Tube Defects

Salbutamol

The Five Stages of Grief

Tamoxifen for Breast Cancer

Pregnancy Tests

Isotretinoin for Acne

Computerized Tomography (CT)

Evidence-based Medicine

Cholesterol-lowering Drugs

Positron Emission Tomography (PET scans)

Cochlear Implants

Magnetic Resonance Imaging (MRI)

In-Vitro Fertilization (IVF)

Botox

Coronary Arterial Stents

Egg Donation

Keyhole Surgery

Preimplantation Genetic Diagnosis

Surfactant for Premature Babies

Intracytoplasmic Sperm Injection (ICSI)

Combination Therapy for HIV Infection

Viagra®

Telesurgery

Human Genome Project

Eye Injections to Prevent Blindness

Face Transplant

Cervical Cancer Vaccine

Acknowledgements

Index

HOW TO USE THIS BOOK

The New Holland Little Books are easy-to-use comprehensive guides to important subjects. The Little Books feature over 100 entries on key principles or theories essential to understanding the subject. Written in an easily accessible manner, each Little Book explains sometimes very difficult concepts and theories, putting them in their historical context, giving background information on the experts who proposed them in the first place, analysing influences and proposing, where relevant, links to other related entries. The books also feature tables, equations and illustrations, and end with a glossary, where relevant, and an index.

The Little Book of Medical Breakthroughs is arranged chronologically and the country of origin is listed, where appropriate. Each entry includes a clear main heading, the person or people responsible for the discovery, birth and death dates, where relevant, followed by a short introductory paragraph explaining the concept concisely. In some cases, the main essay is also cross referenced to linked subjects. The book ends with a comprehensive index.

Other books in the series include: The Little Book of Environmental Principles and The Little Book of Mathematical Principles.

4800 BC

Glass Eyes

Although known as glass eyes, artificial eyes are now made out of plastic, and are often so lifelike that they cannot be distinguished from the normal eye.

The oldest-known artificial eye was found in 2006 in the remains of a young woman who lived nearly 7,000 years ago at the Burnt City archaeological site in what is now Iran.

It was probably made of natural tar mixed with animal fat. The finest blood vessels on the eyeball were rendered with golden wires less than a millimetre thick, and the eyeball had a hole on each side for fixing it to the socket.

Most Ancient Egyptian artificial eyes were made out of enamelled metal or painted clay attached to cloth and worn outside the socket; these were known as ekblepharons. Not much changed over the following centuries. Writing in the 16th century, the French surgeon Ambroise Paré (1510–1590) described gold or silver versions, known as ekblephara when worn in front of the eyelids and hypoblephara when worn under them.

Late in the 16th century, false eyes began to be made out of enamel and glass. Exactly who made the first glass eye is not known, but the English playwright William Shakespeare (1564–1616) knew about them when he wrote in King Lear:

Get three glass eyes;

And, like a scurvy politician, seem

To see the things thou dost not.

–King Lear to the Earl of Gloucester, Act IV, Scene 6

The first English artificial eyemaker set up business in Ludgate Hill in London in 1681, advertising enamel artificial eyes ‘so exact as to seem natural’. Enamel was attractive but expensive and didn’t last long. More popular were glass eyes. Initially the Venetians, famous for glassmaking, were the main glass eye makers. However, by the mid-19th century, the experts in glass eye making mostly came from Thuringia, a region of eastern Germany. Their products were of such high quality that they became popular all over the world. German craftsmen known as ocularists toured the United States custom-making artificial eyes, stopping for a few days in each city to fit patients for new eyes before moving on to the next. Fabrication secrets were closely guarded, passed down from one generation to the next. Eyes were also fitted by mail order, and an ocularist would keep hundreds of cheaper, pre-made eyes from which patients could be given the closest fit.

Since the Second World War (1939–1945), plastic has become the preferred material for making artificial eyes. There is no risk of breaking, chipping or scratching. A plastic eye can be more easily moulded to irregular contours of the eye socket, and can be worn all the time instead of having to be removed at night.

3000–4000 BC

Sutures

A surgical suture is used to stitch together the edges of a wound after an operation or to repair damaged tissue.

Some sutures dissolve, others don’t. They can be man-made or natural (from silk, linen or catgut). Some are made out of a single filament, which causes less damage to the tissues but is harder to knot; others are made of several filaments braided or twisted together.

Some of man’s earliest records show evidence of sutures. We know needles were used at least 3,000 years ago and archaeological records from ancient Egypt show that the Egyptians used linen and animal sinew to close wounds. In ancient India, physicians used the heads of beetles or ants to staple wounds shut. The live creatures clamped the edges of the wound shut with their pincers. Then their bodies were twisted off, leaving their heads in place. Other natural materials used by doctors in ancient times included flax, hair, grass, cotton, silk, pig bristles and animal gut.

The first description of catgut was in 175 AD. Made from sheep intestine (and bearing no relation to cats), it was easily available from musicians, who used it for strings.

Not much progress was made in the use of sutures until the 19th century, when surgery became a viable option with the invention of adequate anaesthesia. Although surgery was less painful, wound infections were a major cause of death. Sutured wounds seemed more likely to become infected, so many surgeons preferred not to use them.

In 1847, the Viennese obstetrician Ignaz Semmelweis (1818–1865) discovered that handwashing considerably reduced the risk of infection, making surgery much safer. Having realized the benefit of disinfectant, Joseph Lister (1827–1912), Professor of Surgery in Glasgow, Scotland, used carbolic acid to clean his hands and instruments, and also soaked his catgut sutures in it. The infection rate fell dramatically and carbolic-soaked catgut soon became widely accepted.

As well as his contribution to antisepsis, Lister was the first to discover that the body absorbed catgut sutures. Absorbable sutures are useful for a wound that doesn’t need to be supported for more than a few days. Lister realized that if he soaked his sutures in chromic acid, like the tanners who used it to soak their leather, the catgut would last a week or longer. In 1881 chromic catgut was introduced.

By 1890 the catgut industry was firmly established in Germany, thanks to its use in the manufacture of sausages. From 1906 it was also sterilized using iodine.

Catgut was the staple absorbable suture material through the 1930s and, at one stage, Ethicon, one of the major manufacturers of catgut sutures, reported using the intestines of 26,000 sheep a day! Where a nonabsorbable material was needed, surgeons continued to use silk and cotton. Suture technology advanced with the creation of nylon and polyester in the late 1930s. Needle technology also advanced: surgeons began using a needle crimped directly onto the suture, reducing trauma to the wound because the needle and the suture were the same width.

In the 1960s chemists developed new synthetic materials that could be absorbed by the body, such as polyglycolic acid and polylactic acid, and better sterilization technology, so that sutures could be sealed in a package and then sterilized, as they are today.

See: General Anaesthetic, pages 48–49; Handwashing, page 72; Sanitation, pages 61–62

c. 3000 BC

Artificial Limbs

For as long as people have been losing limbs there have been attempts to make artificial ones. A prosthesis is a replacement for a limb (or part of a limb) that has been amputated or may have been missing from birth.

The first known description of a prosthetic limb is in the Rig-Veda, an ancient Indian sacred poem written in Sanskrit between 3500 and 1800 BC. It tells of the warrior queen Vishpla, who lost her leg in battle; once she had been fitted with an iron prosthesis, she was able to return to the fight.

Probably the oldest actual example of a prosthesis is the Cairo toe. It was found attached to the foot of an ancient Egyptian mummy dating from between 1069 and 664 BC. It is made of leather and wood, and is flexible. It looks worn, suggesting it was used and not just added after death. Scientists believe the woman was in her mid-50s and may have lost the big toe through complications of diabetes.

Older still is the Greville Chester Great Toe, named after the man who acquired it for the British Museum, which dates from between 1295 and 664 BC. It is made of linen, glue and plaster blended together, but this one doesn’t bend and was probably cosmetic.

Before the Egyptian toes were discovered, the oldest prosthesis in existence was the Roman Capua Leg, which was found in a grave in Capua, Italy, dating to 300 BC. It was made of bronze, but unfortunately it was destroyed during an air raid in the Second World War. A copy is kept at the Science Museum in London.

Generally, early prostheses didn’t have much function. Pliny the Elder (23–79 AD), a 1st-century Roman scholar, described a typical prosthesis when he wrote about Marcus Sergius, a Roman general who had his right hand amputated in the war against Carthage (c. 218–201 BC). Pliny writes that, to allow him to get back to battle, the general had an iron hand made by his armourer just to hold his shield in place. Others describe artificial legs that fitted into the stirrups, allowing a soldier to balance on a horse but not enabling him to walk.

In the 16th century, the French surgeon Ambroise Paré (1510–1590) started developing prosthetic limbs with basic functionality. ‘Le Petit Lorrain’ was a hand operated by springs and catches, made for a French army captain. He also invented an above-knee prosthesis consisting of a peg leg with a foot prosthesis; it had an adjustable system for attaching it to the body, knee-lock control and other engineering features.

By the 19th century, there had been more advances and greater attempts to make limbs more functional. For example, Douglas Bly of Rochester, New York, invented and patented ‘Doctor Bly’s anatomical leg’ in 1858. As Bly commented, this one still had its limitations:

‘Though the perfection of my anatomical leg is truly wonderful, I do not want every awkward, bigfatted or gamble-shanked person who always strided or shuffled along in a slouching manner with both his natural legs to think that one of these must necessarily transform him or his movements into specimens of symmetry, neatness and beauty as if by magic – as Cinderella’s frogs were turned into sprightly coachmen.’

A copy of an artificial leg in brass and plaster made around 1910 from the original at the Royal College of Surgeons, London. The original was found in a Roman grave in Capua, Italy.

In recent years, there has been more emphasis on developing artificial limbs that look and move more like actual human limbs. Advances in biomechanics, engineering and plastics, combined with the use of computer-aided design and computer-aided manufacturing, have all contributed to the development of more realistic artificial limbs.

One of the latest inventions in this field includes the world’s first commercially available bionic hand, which has five individually powered digits. To work it relies on the electrical signal generated by muscles in the remaining part of the patient’s limb to open and close the fingers. Electrodes sitting on the surface of the skin pick up the signals.

One of the first patients to be fitted with the bionic hand summarized the significance of the development when he said:

‘It’s truly incredible to see the fingers moving and gripping around objects that I haven’t been able to pick up before. The hand does feel like a real replacement for my missing hand.’

1000–2000 BC

Urinalysis

Studying urine has been part of medical diagnosis for thousands of years. Initially all there was to go on was the colour, smell, and even taste. Fortunately, now we have more sophisticated methods to help identify infections, chemicals and crystals.

Ancient Chinese and Indian records mention observations of the urine from 1000–2000 BC. But the most detailed information we have comes from Hippocrates (c. 460–c. 375 BC), often called the ‘father of medicine’, who wrote about urine examination around 400 BC. He observed the different smells and colours of urine. During Hippocrates’s lifetime, it was common practice to pour a sample of urine on the ground to see if it attracted insects. If it did, it was called ‘honey urine’ – later known as diabetes.

Urine examination was developed further around 1100 AD by the physician Abu Ibrahim Ismail al-Jurjani (1045–1137), who noted that one could observe not only the smell and colour of the urine, as Hippocrates had, but also its quantity, consistency, transparency, sediment and froth.

In the Middle Ages, physicians became known as uroscopists because of their ability to examine urine. Typically the sick patient’s servant would bring a urine sample to the physician, and leave it for analysis – for a fee.

Several physicians were consulted, and they would compete to reach the most accurate (or possibly the most attractive) diagnosis by wheedling information out of the servants – often by giving them a drink or two. The physician who did best would get to look after the patient – and therefore a greater fee.

Before written language, symbols were used to represent natural elements, and this was the ancient symbol for urine.

Charlatans went one step further, claiming to be able to tell the future based on examination of the urine. They were known as ‘Pisse Prophets’, and brought the practice of uroscopy into disrepute.

In the 17th century the English physician Thomas Willis (1621–1675) advocated tasting urine to detect the sweetness caused by diabetes. It wasn’t until 1776 that Matthew Dobson (1731–1784), a Liverpudlian physician, evaporated urine from diabetics and found that it left a residue that smelled and tasted like sugar.

Fortunately in the 18th century several tests were developed for testing specific chemicals in urine, including protein and sugar. However, it wasn’t until 1956 that the first test strip for analyzing urine was introduced.

1220 BC Ancient Egypt

Condoms

In the 21st century we have easy access to cheap, single-use polyurethane condoms.

Condoms have been around since Ancient Greek times, although not in their present form. The Greeks used linen ones which, although unreliable, were perhaps more appealing than the tortoiseshell ones that the Japanese favoured, or the leather, animal gut or fish bladder condoms used at different times around the world.

In the 1500s, writing about the prevention of syphilis, the Italian anatomist Gabriele Falloppio (c. 1522–1562) recommended wearing his invention – a linen sheath worn over the glans but under the foreskin, or inserted into the urethra. A more practical version described later by the Italian practitioner Hercules Saxonia (1551–1607) involved a larger linen sheath, soaked in a chemical or herbal preparation, which covered the entire penis.

The name condom probably comes from Condus, the Latin for receptacle. There is also an alternative explanation, probably apocryphal, that in the 1600s the physician of English king Charles II (1630–1685) was called Dr Condom, or Quondam. Allegedly, the doctor made sheaths of oiled animal gut to protect the king from syphilis.

Condoms made from sheep’s intestine were more widely available in the 1700s. The gut was soaked, turned inside out, macerated in an alkaline solution, scraped, exposed to brimstone vapour, washed, blown up, dried, cut and given a ribbon tie. This labour-intensive process meant that the result was quite expensive, so they were often reused and only available to a limited proportion of the population.

In 1843 rubber vulcanization, invented by Charles Goodyear (1800–1860) and Thomas Hancock (1786–1865), made it possible to produce cheaper, more reliable condoms that were stretchier and easier to use. However, men were still encouraged to wash and reuse them.

The manufacture and supply of condoms changed radically in the 1930s with the development of liquid latex condoms, superseding rubber completely. Prices plummeted and mass production began. By the mid 1930s, the 15 largest makers in the United States were producing 1.5 million condoms a day.

Since then, new technology has improved the condom considerably. The Durex Avanti, launched in 1994, is made from a unique polyurethane material, DURON, which is twice as strong as latex, enabling a thinner, more sensitive film.

c. 700 BC

Caesarean Section

The operation to remove a child from its mother’s womb to save the baby’s life was a breakthrough. However, when it was first introduced, it was only done when the mother was dead or dying. It was in effect a medical failure for the woman. The first successful Caesarean in Britain did not take place until the late 19th century.

Caesareans have been carried out since Ancient times. Apollo removed Asclepius from his mother’s womb according to Greek myth, and there are many references to the operation in ancient Hindu, Egyptian, Greek, Roman and other European folklore.

The origin of the word ‘Caesarean’ is unclear. In the 7th century BC the Romans passed a law – the Lex Caesarea – stating that all pregnant women dying while in labour should have surgery to remove the baby. Popular myth suggests that Julius Caesar (100–44 BC) was born this way, although this seems unlikely, because history records that his mother Aurelia (120–54 BC) lived on to see her son’s invasion of Britain. The word may also originate from the Latin verb caedere, meaning to cut, or possibly from the word caesones, the name given to infants who were cut from their mother’s womb.

The word ‘section’ comes from the surgical term, which means to divide tissue. In a Caesarean, the wall of the abdomen and uterus are both divided to get the baby out.

A Caesarean was a last resort, and nobody really expected the baby to survive, let alone the mother. This began to change in the 19th century, when doctors began to learn more about the anatomy of the body and so were more likely to be successful if they operated.

New developments in antisepsis meant that women were less likely to die from infection after a surgical procedure, and the development of anaesthetics transformed the experience for the woman. Whereas in the past surgeons had been afraid to sew up the cut in the uterus – they thought internal stitches might cause infection and a ruptured uterus in subsequent pregnancies – by the late 1880s sutures were commonplace.

As the operation became less risky, obstetricians began to recommend the procedure at an earlier stage, rather than waiting until the baby and mother were almost dead, when their chances of surviving any surgical procedure were limited.

The first recorded Caesarean in which both mother and child survived was probably carried out some time in the late 19th century.

See: General Anaesthetic, pages 48–49; Sanitation, pages 61–62

1280–1300 Italy

Spectacles

The invention of spectacles transformed life from a shapeless blur to sharp focus for many people. It is not clear exactly when they were invented or by whom, but certainly they were in use by the end of the 13th century.

Although there were no spectacles in Ancient times, there were certainly many people with poor vision. Famously, Marcus Tullius Cicero (106–43 BC), the Roman philosopher and orator, wrote to his friend Titus Pomponius Atticus (c. 110–32 BC), one of Rome’s great writers and statesmen, saying that his slaves had to read to him because he could no longer read to himself now that he was old.

Allegedly there was an alternative even in Roman times: the Roman philosopher and dramatist Lucius Annaeus Seneca (4 BC–65 AD) claims to have read ‘all the books in Rome’ by looking through a glass bowl filled with water, which would have acted as a primitive lens.

The theoretical principles behind corrective lenses were already in place, as we have evidence that the Greek astronomer Ptolemy (85–165 AD) had described the basic laws of refraction in roughly 140 AD. However, it wasn’t until the 17th century that these laws were formalized and developed further by Willebrord Snellius (1580–1626), a Dutch astronomer and mathematician.

The illustration above shows a selection of early spectacles.

Reading stones, often made from quartz, became popular in the 8th century, first in Spain and spreading to the rest of Europe by the 11th century. These stones relied on a similar principle to Seneca’s bowl of water. A reading stone was simply a hemisphere of glass placed on top of the words to magnify the letters.

At that time the only people capable of making transparent glass were the glass blowers of Venice, so it is likely that the first spectacles were developed in Italy. One of the earliest inventors was probably the Dominican monk Alessandro da Spina from Florence around 1284, but there is no definite evidence for this.

The first mention of actual glasses is found in a 1289 manuscript, in which a member of the Florentine Popozo family wrote, in a treatise entitled Traité de conduite de la famille:

I am so debilitated by age that without the glasses known as spectacles, I would no longer be able to read or write. These have recently been invented for the benefit of poor old people whose sight has become weak.

The first designs didn’t have sides and had to be held on the nose. Poorer people had theirs mounted in leather, wood, horn, bone or even light steel, while the upper classes had gold or silver frames.

For centuries, nobody found a good way to hold spectacles in place. Spanish spectacle makers tied silk ribbons on to the frames, which could be looped over the ears, and the Chinese added little weights to the end of the ribbons. It was only in 1730 that an English instrument maker, Edward Scarlett, designed spectacles with rigid side arms.

1579 Italy

Valves in Veins

Hieronymus Fabricius ab Aquapendente (1537–1619)

When Fabricius discovered that veins had valves, his work provided the basis upon which his student William Harvey later described the circulation of the blood – often quoted as one of the most influential discoveries in medical history.

When Fabricius was alive, it was widely accepted that the body had blood in the veins and that there was a completely separate supply of blood in the arteries. This idea was based on the theories of Claudius Galen (129–216 AD), a 2nd-century physician who believed that the arteries were the source of vitality and the veins carried the source of nourishment and growth. In his view, blood was made in the liver and attracted to the different organs when the organs needed nourishment.

Fabricius was an anatomist and embryologist working in Padua, Italy. He studied arteries and veins and made a thorough description of the valves in the veins, published in De venarum ostiolis in 1603, after demonstrating his theory in 1597 to his colleagues. He observed that there were only valves in the veins, and not in the arteries, except in the two large arteries at their origin from the heart. He believed (correctly) that their role was to stop the blood from pooling in the extremities, which would prevent central and upper parts from getting blood.

He had an international reputation, which attracted the English anatomist William Harvey (1578–1657) as his pupil between 1600 and 1602. Fabricius’ work on the veins is significant because it was a building block in Harvey’s subsequent theory of the circulation, published in 1628, which completely overturned Galenic theory.

c. 1580 England

Forceps

Peter Chamberlen (1560–1631)

The forceps is a metal instrument designed to ease a prolonged labour and deliver a healthy child. Since its introduction it has saved the lives of millions of women and babies.

Although the mention of the word ‘forceps’ can strike fear into a pregnant woman, when used correctly forceps exert less pressure on the baby’s head than the woman’s birth canal does.

The two blades of the instrument are inserted separately and cradle the baby’s head, rather like two slim hands. The blades are then locked together in that position so that there is no way they can crush the baby’s head. The baby is then pulled out.

It was probably Peter Chamberlen who first used the instrument. He was the son of a French Huguenot who had fled France to escape persecution and settled in Southampton, England. Chamberlen’s success in obstetrics led to his appointment as physician to King James I (1566–1625) of England (also James VI of Scotland) and his wife, Anne of Denmark (1574–1619).

However, when it was first introduced the forceps was kept a secret. Apparently when called to a birth, Chamberlen would hide the instrument in a box so that no-one saw it.

In the 17th century, when Chamberlen first used the forceps, inexperienced midwives with little medical knowledge attended most deliveries. Gradually these midwives were replaced with accoucheurs, or ‘man-midwives’. They were doctors with some knowledge of anatomy and also some medical instruments. By 1730 accoucheurs were also beginning to use the Chamberlen forceps.

The original Chamberlen forceps were found in 1813 in a trunk in the attic of Woodham Mortimer Hall, near Maldon in Essex, home of the late Peter Chamberlen III (1601–1683), a descendant of the instrument’s inventor.

One of the best-known accoucheurs was Scottish-born William Smellie (1697–1763), who was described as ‘a great horse godmother of a he-midwife’ by the midwife Elizabeth Nihell. Smellie was the first person to teach obstetrics and midwifery on a scientific basis. His book Midwifery, published in 1752, provides us with the very first detailed description of the obstetric forceps; he also set down rules for how to use forceps safely.

1590 Holland

The Microscope

Zacharias Janssen (1580–1638)

Being able to magnify an object many hundreds of times has enabled scientists to discover details about the structure of the world around us and more about the inner workings of the body. In medicine today, the microscope is vital to the identification of diseases and infections.

The Romans discovered that if you looked at an object through a piece of glass that was thick in the middle and thin on the edges, it magnified the object. They called these pieces of glass ‘lenses