History Podcasts

What was it like to have type 1 diabetes in the early 20th century?

Someone in my family died in 1924 of type 1 diabetes at the age of 26. What would life have been like for her? Would she have been in the hospital frequently? What was the average life-expectancy for someone with type 1 diabetes back then?


Life for someone with type 1 diabetes in the early 20th century would have been much like life for someone with untreated diabetes today. The consequences of untreated diabetes include:

  • Heart Disease and Stroke
  • Blindness
  • Kidney Failure
  • Diabetic Neuropathy

The discovery of insulin as a treatment for diabetes occurred in the 1920s, so either this family member was already suffering severe consequences of diabetes or she did not obtain sufficient treatment in time. Any treatments available before the discovery of insulin were only experimental.

On a related note, I found an article in the Diabetes journal that confirms msh210's comment that testing was done by tasting the urine.


A history of obesity, or how what was good became ugly and then bad

Chronic food shortage and malnutrition have been the scourge of humankind from the dawn of history. The current worldwide epidemic of obesity, now recognized as a public health crisis, is barely a few decades old. Only after the technological advances of the eighteenth century did a gradual increase in food supply become available. The initial effect of these advances in improved public health and amount, quality, and variety of food was increased longevity and body size. These early favorable outcomes of technological advances notwithstanding, their incremental effect since the Second World War has been an overabundance of easily accessible food, coupled with reduced physical activity, that accounts for the recent increased prevalence of obesity. Obesity as a chronic disease with well-defined pathologic consequences is less than a century old. The scarcity of food throughout most of history had led to connotations that being fat was good, and that corpulence and increased "flesh" were desirable as reflected in the arts, literature, and medical opinion of the times. Only in the latter half of the nineteenth century did being fat begin to be stigmatized for aesthetic reasons, and in the twentieth century, its association with increased mortality was recognized. Whereas early reports listed obesity as a risk factor for mortality from "chronic nephritis," the subsequent recognition of the more common association of obesity with diabetes, hypertension, and heart disease altered the listings and questioned its being a risk factor for kidney disease. An enlarging body of evidence, accrued over the past decade, now indicates a direct association of obesity with chronic kidney disease and its outcomes.


9 Terrifying Medical Treatments from 1900 and Their Safer Modern Versions

The next time you have to endure a boring stay in a doctor’s waiting room, be thankful you don’t live in the early 20th century. Even as medicine was rapidly improving, these downright scary or dangerous treatments were still lingering.

1. Radium Water

Before radioactivity was fully understood, naturally occurring radium was lauded for its seemingly otherworldly benefits. Water was kept in radium-laced buckets, and people would drink the tainted liquid to cure everything from arthritis to impotence. Of course, this was an awful idea, and when people started to drop dead from this miracle water, the connection was made. Now, non-radioactive prescription drugs are used to combat arthritis and impotence.

2. Ecraseur

This obsolete tool had a chain loop that the doctor would tighten around a cyst or hemorrhoid. This constriction would rob the area of blood flow, which would cause the offending lump to fall off. In modern medical offices, creams are used to ease hemorrhoids away, while more delicate surgery is most often used to remove cysts.

3. Plombage

Plombage was a risky early 20th century treatment for tuberculosis in which a surgeon would create a cavity in a patient’s lower lung and fill it with a foreign material such as lucite balls. This procedure would make the upper, infected lung collapse. The theory maintained that a collapsed lung would eventually heal itself. Thanks to modern vaccines, TB has been largely eradicated throughout much of the developed world, although it is far from completely eliminated globally.

4. Peg Legs

Before the advent of advanced prosthetics, wooden pegs had to be jammed into the hollowed-out cavities of an amputee’s leg or strapped to the patient’s waist. The device would be shaped and carved to the correct height, and occasionally the fit was perfect. Some recipients of the procedure were able to walk for miles without noticing discomfort. Still, they were no match for modern prostheses.

5. Gasoline to Cure Lice

In the early 20th century, a patient with a bad case of head lice would douse his or her dome with gasoline or kerosene in an effort to rid the scalp of the unwanted guests. While this treatment may have been somewhat effective, it was also incredibly dangerous to anyone who walked near an open flame. Modern medicine can solve the infestation much more safely with medicated shampoo.

6. Morphine for Teething

Any parent can understand the necessity of soothing a teething baby’s pain, but even into the 20th century some moms and dads were taking incredibly risky or downright dangerous steps to help their tots. In addition to lancing (cutting the gums to give the new teeth a clear pathway to emerge), parents gave children morphine syrups to ease their crying and dusted their gums with powders that contained deadly mercury. Modern parents are luckier and can use non-toxic pain relievers or chilled teething toys.

7. Mercury for Syphilis

For most of history, a syphilis diagnosis was incredibly grim news, and at the turn of the 20th century, most doctors’ best treatment involved administering toxic mercury to the patient indefinitely, giving rise to a popular quip about lovers spending “one night with Venus, a lifetime with Mercury.” Even as medical knowledge improved in the early 1900s, treatments still involved dire measures like taking arsenic or deliberately inoculating the patient with malaria, which would raise the body temperature and kill the syphilis. Thankfully, these scary treatments all went out the window with the introduction of penicillin in 1943.

8. Starvation Diets for Aneurysms

Doctors sought to treat early 20th century aneurysms by diminishing the force with which the heart pumped. One of the questionable regimens used to achieve this goal was known as Tuffnell’s diet, which consisted of bed rest and meager, dry rations. A 1901 medical text spelled out the treatment’s daily menus: Two ounces of bread and butter with two ounces of milk for breakfast, three ounces of meat and four ounces of milk or red wine for lunch, and two ounces of bread with two ounces of milk for dinner. Today many cases can be treated with minimally invasive surgeries.

9. Hydroelectric Baths for Migraines

Taking the toaster into the bathtub may be fatal today, but for several decades starting in the late 19th century, some doctors recommended treating chronic migraines by lounging in a hydroelectric bath – a warm tub with a small current passing through the water. Doctors eventually became skeptical of this method, and today’s migraine sufferers can turn to more effective pharmaceutical treatments.


Blood Typing

Difference Between O Positive & O Negative Blood

In 1901, Karl Landsteiner published a medical paper identifying three blood types – A, B, and C (later renamed O). One year later, his colleagues Alfred Decastello and Adriano Sturli added AB as the fourth and final blood type. Although scientists already understood there were differences in the composition of blood, Landsteiner discovered that human blood is not universally compatible because our immune systems produce antibodies to blood of another type. Landsteiner later won the Nobel Prize in Physiology or Medicine for his groundbreaking blood research.

The history of the syringe

The origins of injecting effectively go back into pre-history, with use of weapons such as blowpipes and poison tipped darts to introduce substances into the body – albeit involuntarily for most of the recipients – in many parts of the world.

At its most basic, a syringe is a type of simple pump, and it is likely that syringe-type devices were produced by many people. The earliest and most common syringe-type device was the ‘clyster’, a device for giving enemas.

It is impossible to be precise about when this developed, and when injecting as we know it began - the origins of the hypodermic syringe are clouded in uncertainty because there were numerous parallel processes of evolution and experimentation that led to the development of devices to inject drugs and medicines.

Because of this, various people have been credited with the 'invention' of the syringe, including Christopher Wren, Robert Boyle and Pascal, and intravenous injection is recorded as early as the 17th century.

The first recorded injections
Christopher Wren is the first person recorded to have employed intravenous injecting in Britain – injecting into a dog at Wadham College, Oxford, in 1656.

This was actually of a psychoactive substance: the dog was injected with alcohol because the effect could be proven through observation when the dog became intoxicated! He also experimented by injecting dogs with opium and other substances (Macht 1916). Wren’s ‘syringe’ for these experiments was a crude device, consisting of a quill attached to a small bladder. In order to gain access to a vein, an incision first had to be made in the skin.

Wren also attempted intravenous injection in humans. His subjects for this included “the delinquent servant of a foreign ambassador”, but it didn't go well:

  • “…the victim either really, or craftily, fell into a swoon and the experiment had to be discontinued” (Macht 1915). As a side note, high risk injecting with crude injecting devices hasn't disappeared even today: in places such as prisons, where access to modern sterile equipment is often absent or limited, makeshift syringes are still sometimes made and used.

In 1807 the Edinburgh Medical and Surgical Dictionary defined a syringe as:

  • “A well known instrument, serving to imbibe or suck in a quantity of fluid and afterwards expel the same with violence. A syringe is used for transmitting injections into cavities or canals.”

In the 17th century, De Graaf made a device that closely resembled the modern syringe, with a metal barrel to which the needle was directly attached. Its purpose was to trace the blood vessels of corpses.

Deliberate subcutaneous injection (under the skin) did not begin until the mid to late 19th century, probably as an extension of the then new practice of inoculation against disease.

The Fergusson syringe of 1853 became the forerunner of the modern syringe when Alexander Wood used it for the subcutaneous injection of opiates for the relief of pain.


Early experiments
Experiments with intravenous injecting continued and techniques were further developed in the 17th century. Numerous drugs were used to attempt to treat various conditions, particularly epilepsy and syphilis.

Opium was one of the first drugs to be injected in this way, but difficulties in reliably accessing veins, the use of substances unsuitable for intravenous injection (such as cinnamon, oil of sulphur and arsenic) gave poor results - which were incorrectly attributed to the route of administration - and probably limited the development of intravenous injecting as a common method of drug delivery.

Absorption of drugs through the skin
The beginning of the 19th century saw an increase in interest in attempts to introduce drugs into the body via the skin itself. Initially, this usually took the form of causing blistering to an area, removing the outer layer of skin and placing a poultice or plaster containing the drug onto it. In 1836, Lafargue further developed this idea by dipping a vaccination lancet in morphine, and pushing it under the skin.

By the middle of the century Lafargue had developed a technique of placing solid morphine-based pellets under the skin. Initially this was achieved by simply making a hole with a large needle and pushing the pellet into the hole. Over time an instrument was developed to aid this procedure, which Lafargue called the ‘seringue seche’ or dry syringe.

Other variations of this method included that of Crombie, who in 1873 used a technique of coating silk thread with morphia and then drawing the impregnated thread under the skin. Crombie developed this method because he felt that the recently developed hypodermic syringe was expensive and easily damaged.

Subcutaneous injecting
Through the 19th and into the early 20th century, subcutaneous injecting was generally seen as a more valuable route of administration than intravenous injection. This may have been because of the earlier interest in the absorption of drugs through the skin, as well as a lack of realisation of the potentially increased potency of intravenous injections.

In 1880, H. H. Kane described intravenous injection as mainly being an unwanted consequence of subcutaneous injection and gave ways to avoid its occurrence. Writing as late as 1916, Macht said:

“however useful intravenous medication may be in special cases, its field of application is certainly more limited than that of hypodermic (subcutaneous) injection…”

The discovery of systemic action
It seems odd now, but early physicians did not realise that injected substances would have a systemic effect, i.e. travel around the whole body; they thought the action of the things they injected would be local.

Early understandings of the pain relieving effects of opiates centred on the belief that most of the drug stayed at the site at which it was injected. In fact, drugs administered by any route of injection will eventually permeate throughout the body. Intravenous injection is the fastest route for injected drugs to reach the brain in concentrated form and subcutaneous injection is the slowest injected route.

Alexander Wood, although recognising some systemic action, believed that the action of opiates administered by subcutaneous injection was mainly localised. The use of the syringe rather than previous methods was thought to allow greater accuracy in administering the drug in close proximity to a nerve, hence, it was thought, facilitating better pain relief.

This belief in localised action influenced many doctors at the time. Dr Francis Anstie, editor of The Practitioner, wrote in 1869 that there was no danger associated with the hypodermic injection of remedies, and later:

“it is certainly the case that there is far less tendency with hypodermic than with gastric medication to rapid and large increase of the dose when morphia is used for a long time together”

Charles Hunter, a house surgeon at St George’s Hospital, made the connection that opiates administered by injection exert a systemic action when he was forced to move away from the original site of injection as a result of abscess formation. He found that the patient still experienced similar relief from pain. This, as Berridge and Edwards have noted, “led to a period of sustained and acrimonious debate between Wood and Hunter” about the existence or otherwise of systemic action.

Subcutaneous injecting with a syringe was initially described and popularised by Wood. It has been suggested that the fundamental misunderstanding that dependence could not occur through injected medication was partly responsible for the creation of a large number of patients dependent on morphine, described in the 19th century as ‘morphinists’. This was partly because the effect of the injected drug was thought to be local rather than systemic, and partly because dependence was thought to be centred on the stomach – so, the theory went, avoiding ingestion through the stomach would avoid dependence.

Common problems with early injections
19th century injecting was by no means without incident or problems. The following late 19th century account of the problems associated with medical injections has powerful echoes for street injectors in the UK and other parts of the world today who continue to need to add acids to the base forms of brown street heroin and crack cocaine in order to render them soluble for injection.

“The active agent to be injected subcutaneously must be in perfect solution. The solution itself should be neutral (i.e. neither acid nor alkaline), clear and free of foreign matter, and not too concentrated. The difficulty of fulfilling all of these conditions has in the past very materially hindered the more general use of this method of treatment. But comparatively a few years ago many of the alkaloids were only to be had as bases. They were more or less insoluble without the addition of some acid and the slightest excess of the latter caused intense local irritation.” (Sharpe & Dhome 1898)

19th century descriptions of frequent subcutaneous injectors can sound similar to the appearance of some frequent injectors of street drugs in the 21st century, particularly those who are having difficulties in accessing veins.

“An extraordinary spectacle was revealed on examination. The entire surface of the abdomen and lower extremities was covered with discolored blotches, resembling small vibices, the marks of the injections. He was spotted as a leopard. For four years he had averaged three or four a day – an aggregate of between 5 and 6 thousand blissful punctures! The right leg was red and swollen, and I discovered a subcutaneous abscess extending from the knee to the ankle and occupying half the circumference of the limb.” (Gibbons 1870)

The growth of the medical use of opiates
A powerful influence on the development of widespread and repeated use of opiates by injection would have been the obvious and immediately beneficial effects of injected morphine, particularly for those experiencing chronic pain. Doctors at the time, with few truly effective treatments available, would have had difficulty in resisting the impulse to treat pain with something as powerful, fast and effective as injected morphine. Courtwright, when discussing 19th century opiate addiction in North America, has said:

“The administration of opium and morphine by physicians was the leading cause of opiate addiction in the nineteenth century…case histories, clinical notes and remarks in the medical literature support the view that although opium and morphine were ultimately given for such unlikely disorders as masturbation, photophobia, nymphomania and ‘violent hiccough’ it was principally in those suffering from chronic ailments that the use of these drugs led to chronic addiction.” (Courtwright 1982)

The combination of the development and spread of injecting, alongside the widespread availability of opiates and opiate-based patent medicines probably contributed significantly to the increase in numbers of injectors of opiates in this period.

Injecting in the 20th century - the growth of intravenous injecting
Throughout the 19th and early 20th centuries, the most common injected route amongst both medical and ‘non-medical’ injectors was by subcutaneous injection.

Interestingly, early accounts of intravenous injection describe it as something unpleasant and to be avoided, although this was probably a result of using too large a dose. The preference for the intravenous route of drug administration seems to have become particularly prevalent among illicit users during the 1920s.

Richard Pates reviewed the literature on the spread of illicit intravenous injecting (Pates 2005) and concluded that early intravenous injectors probably discovered the route accidentally, and learned to use smaller doses than would have been needed for subcutaneous injection. Before 1925 intravenous injection amongst illicit users was relatively rare; by 1945 it had become the norm:

“…in the early 20th century addicts were taking doses that were enormous by today's standards and mostly had overdose experiences when they accidentally hit a vein. But when narcotics started to become more difficult to obtain and the doses became smaller, communication in the drug subculture facilitated the diffusion of the intravenous technique. The fact that (intravenous) injecting is more economical and the enjoyable rapid effect, or 'rush', contributed to the quick diffusion.” (Pates et al 2005)

However, it is very important to understand that medicine was beginning to favour the intravenous route for particular medications in the first decade of the 20th Century, particularly a drug called Salvarsan, a treatment for syphilis. The most effective alkaline form of Salvarsan could only be delivered intravenously. As Patricia Rosales says:

“In order for alkaline Salvarsan to maintain its non-toxicity, it had to be administered intravenously. It therefore required what in 1911 was considered a surgical procedure, a process much more difficult to achieve than today’s shot in the arm.” (Rosales 1997)

Rosales suggests that improvements and standardisation in the design and manufacture of syringes, needles, ampoules and the formulation of drugs, were largely driven by the precision required in the new need to give intravenous injections. It is therefore very likely that medical advances played a crucial part in the diffusion of the intravenous route.

The non-medical intravenous injection of heroin was first described in 1925 (Kolb 1925). Five years previously, B.S. Wyatt had written the following about the intravenous treatment of malaria:

“From the subcutaneous injection to the intramuscular injection was a logical evolution. From the intramuscular injection to the intravenous injection was inevitable. It had to come. It is here to stay. There is every argument for, no argument against, intravenous therapy. Once admitted that the blood is the medium in which medicine is carried to every organ, tissue and cell of the body…” (Wyatt 1920)

The switch to disposable needles and syringes
Patents for glass disposable syringes were being taken out as early as 1903, but they were probably ideas before their time, and do not seem to have entered production.

The first truly disposable ‘syringes’ to be produced in large quantities were originally designed by James T Greeley around 1912. These were collapsible tin tubes (a bit like a modern tube of superglue) that had an attached needle and contained a specific amount of morphine for subcutaneous injection on the battlefield. These were used in the First World War and were further developed during the 1920s and '30s to become the morphine Syrette manufactured by Squibb.

Syrettes were a standard part of the first aid kit carried by U.S. medical orderlies in World War II. Used Syrettes were pinned to the collar of a casualty in an effort to avoid inadvertent overdosing.

Greeley described the reasons for the development of his disposable device in 1912, talking of the problems with existing syringes he said:

“Asepsis is uncertain, the making of the solution is time-consuming and impossible where water is not available; the joints often leak; the piston occasionally sticks, and the needle becomes dull and rusty from boiling.”

Throughout the 20th century, the production of precision-made glass syringes was gradually refined. The first major advance came with the manufacture of syringes and needles with interchangeable parts made to exact specifications, rather than as ‘one-off’ items; as noted above, the impetus for this standardisation was the need to inject the anti-syphilitic drug Salvarsan intravenously.

Until the 1960s, the majority of needles and syringes used outside of warfare were re-useable and were supplied unsterilised. They had to be sterilised before each use.

The development of plastic disposable syringes
There are several competing claims to the design of the first disposable plastic syringe, but the most plausible is that of the Monoject syringe developed in the USA by Roehr products in 1955. The development of the Monoject syringe spurred Becton Dickinson into the development of similar plastic syringes (they had previously been developing glass disposables) and BD introduced their own Plastipack syringe in 1961.

Fears about the transmission of hepatitis B by doctors using inadequately sterilised re-useable syringes (and the resulting lawsuits) led to the takeover of the market by plastic disposables. A 1998 article in the San Francisco Chronicle on healthcare needle-stick injuries quotes a BD executive, Joseph Welch, as saying in 1990 of hepatitis B:

“It was probably the reason Becton Dickinson is a $2 billion company today.”

Becton Dickinson produced the first one-piece insulin syringe with integral needle in 1970.

Difficult to re-use syringes
There are many types of difficult to re-use syringes, each with a different mechanism to prevent a syringe being used more than once. They were developed for hospitals and other health care settings where they can prevent the inadvertent re-use of syringes.

Although it might seem that supplying these syringes to illicit drug users would reduce needle and syringe sharing, it is widely believed that their introduction would lead to those syringes already in circulation being kept, re-used and shared more frequently - leading to an increase in hepatitis C and HIV transmission. The United Kingdom Harm Reduction Alliance and the National Needle Exchange Forum have both warned of the potential dangers of these types of syringe.

We have written a separate article on this issue.

Accidental sharing and the development of the Nevershare syringe
A video study of injecting risk in Glasgow by Avril Taylor from Paisley University highlighted the prevalence of 'accidental sharing', in which injecting drug users had difficulty in avoiding sharing because all their syringes looked the same.

Exchange Supplies made a documentary film with the researchers to disseminate their findings.

One of the key recommendations of the study was that much of the risk of sharing could be removed if injectors (who are well aware of the risks) were more able to tell their syringes apart.

After several years of lobbying syringe manufacturers to ask them to act on these findings, it became clear that they weren't going to do so of their own volition, so Exchange Supplies embarked on its biggest ever product development project, resulting in the launch, in May 2007, of the 1ml insulin-type Nevershare syringe with plungers in five different colours to reduce accidental sharing.

The Nevershare was the world's first syringe developed specifically for injecting drug users. In addition to plungers in a range of colours, it has markings in millilitres rather than insulin units, a barrel clear of print so injectors can see the solution, and a 30 gauge needle to reduce vein damage.

In September 2011, we added a 2ml detachable-needle type Nevershare syringe to the range so that injecting drug users who require a different needle size to the 'traditional' insulin-type syringe can also have access to coloured plungers so they can tell their syringes apart.

References
Macht D I (1916) The history of intravenous and subcutaneous injecting of drugs. The Journal of the American Medical Association, LXVI.

Morris R and Kendrick J (1807) The Edinburgh Medical and Surgical Dictionary.

Kane H H (1880) The Hypodermic Injection of Morphia: Its History, Advantages and Dangers. Chas L Bermingham and Co, New York.

Anstie F E (1871) On the effects of prolonged use of morphia by subcutaneous injection. Practitioner 6: 148-57

Berridge V and Edwards G (1987) Opium and the People: Opiate Use in Nineteenth Century England, pp. 139-40. Yale University Press, USA.

Sharp & Dhome (1898), A brief summary of hypodermic medication, 6th edition pp.8-9. Sharp & Dhome, Baltimore. Quoted in Rosales P, A history of the hypodermic syringe 1850’s – 1920’s. Harvard University Thesis, December 1997

Gibbons H (1870). Letheomania: the result of the hypodermic injection of morphia. Pacific medical and surgical journal 12: 481-495. Quoted in Rosales P, A history of the hypodermic syringe 1850’s – 1920’s. Harvard University Thesis, December 1997

Courtwright D (1982) Dark Paradise: Opiate Addiction in America before 1940, p. 42. Harvard University Press, USA.

Pates R, McBride A, Arnold K (Eds) (2005) Injecting Illicit Drugs. Blackwell Publishing.

Rosales P, A history of the hypodermic syringe 1850’s – 1920’s. Harvard University Thesis, December 1997

Kolb L (1925) Pleasure and Deterioration from Narcotic Addiction. Mental Hygiene, 9.

Wyatt B. S. (1920) The intravenous treatment of malaria, New York Medical Journal 112: 366-369

Editor. (1876) Tetanus after hypodermic injection of morphia. Lancet 2: 873-6

Bartholow R. (1891) A manual of hypodermic medication: The treatment of disease by the hypodermatic or subcutaneous method, 5th Edition, J B Lippincott Company p 38. Philadelphia USA.


Inside the Curl: Surfing's Surprising History

From Thomas Jefferson to drug kingpins, surfing has influenced generations.

What happens when two middle-aged surfers paddle into the lineup at a world-famous California surf break?

When the two surfers in question are serious science historians at preeminent California universities, the result is a fascinating new tome on one of the world's oldest sports: The World in the Curl: An Unconventional History of Surfing.

Peter Westwick, an expert in the history of the aerospace industry at the University of Southern California, and Peter Neushul, a research historian at the University of California, Santa Barbara, were surfing a break called Cojo near Santa Barbara when they decided to combine their passions for history and surfing. But the result isn't just a list of contest winners or a cursory treatment of a pop culture phenomenon that has permeated everything from music to film to fashion.

Instead, Westwick and Neushul attempt to explain major historical events through the lens of what is mistakenly presumed to be one of the most unserious of sports—now a ten-billion-dollar global industry that boasts more than 20 million practitioners worldwide. Along the way they bust several myths about the sport and delve into the rich Hawaiian culture that continues to infuse it, while uncovering fascinating revelations from the surf history vaults.

Who could have guessed that the "pursuit of happiness" phrase in the Declaration of Independence may have been inspired by early accounts of surfing? Or that the modern surfboard actually has deep roots in the military-industrial complex? Or that after Daniel Ellsberg leaked the Pentagon Papers during the Vietnam War, he eased his stress by catching a few waves?

Contributing writer Joel Bourne recently talked story with the authors of the unconventional history of surfing.

You've uncovered some great, little-known moments in surfing history. One of my favorites is that early accounts of surfing may have influenced Thomas Jefferson to make "the pursuit of happiness" an inalienable right in the Declaration of Independence.

PW: That came from another historian named Andy Martin in a book he wrote on the Enlightenment and Romantic period that had fascinating insights. Both the French and American revolutions occurred as these incredible literary images were coming back from explorers in the tropical Pacific. The surfer on a tropical wave is the very antithesis of what we were doing in Europe, which was perfecting the guillotine and better ways to kill each other.

If you are sitting in Europe or colonial America reading these travelers' accounts coming back from the South Pacific who are describing "the most supreme pleasure," it really might give you pause. It might make you think, "Wow, these surfers have it right."

You write that wave riding was actually practiced in many coastal cultures around the tropics, but that it reached its pinnacle in Hawaii not only because of the warm water and constant waves, but also because of the tremendous productivity of the taro fields and fish ponds that made early Hawaiians not only extremely fit, but also able to take three months off each year to surf.

PN: I've been working on aquaculture systems for years. I used to go to Hawaii with my father, who was a famous marine botanist [...]. Captain Cook arrives, he leaves, and when he comes back syphilis is just rampant among the people. After that there really were no more pure Hawaiians. There are people with some Hawaiian blood. They had to bring in Chinese, Filipinos, and Japanese just to work the sugar cane because the Hawaiians were so decimated by disease.

It was similar to what happened to the Native Americans with first contact. They eventually developed resistance and were able to regroup. But when you are on an island there is no getting away from it. It was a real tragedy.

Surfing starts its revival in the early 20th century as a tourist attraction by real estate developers in Hawaii and then California, who wanted to get people into their beachfront hotels. But people tend to forget that the guy largely responsible for reintroducing surfing, as well as Hawaiian culture, to the world was actually an Olympic swimmer and world record holder, Duke Kahanamoku.

PN: Duke was, and will always be, the greatest surfer of all time. How many sports or pastimes have the Michael Phelps of the world as their centerpiece? He was just a phenomenal athlete. The fastest man in the water for 16 years. And he also had this spirit to him, this presence.

But his story is somewhat tragic. To retain his amateur status, he worked at gas stations. He worked for the city of Honolulu. Eventually he worked as the sheriff for a while. Then he became sort of the chief greeter, the representative of aloha. When a president arrived, Duke would be there and he would talk to him. In a way it was sad, because his life was representative of what happened to many of the remaining Hawaiian people. They almost got marginalized in a way. Others made money from his name.

So Hawaii provided the soul of the sport, but you write that California provided the great leaps in technology that enabled it to spread around the world, from lighter, more hydrodynamic boards, to wave forecasting, to wetsuits that expanded surfing into colder climes. Much of that technology came from the military and aerospace research going on at Caltech. Gerard Vultee, a Caltech aeronautical engineer and co-designer of Amelia Earhart's Lockheed Vega with its rigid, hollow fixed wing, was a friend and paddleboard competitor of Tom Blake, who invented the first hollow surfboard along the same lines. Bob Simmons, who brought fiberglass, foam, and advanced hydrodynamics to surfboards, was a Caltech mechanical engineer, while Hugh Bradner, a Caltech engineer who worked on the Manhattan Project, became the father of the modern neoprene wetsuit.

PW: What we were trying to get at was why surfing flourished at particular places at certain times, whether pre-colonial Hawaii or mid-20th century California. The standard historian approach is "why then, why there?" What else is going on that might promote surfing? What's going on in California in the middle of the 20th century is the defense industry, and especially the aerospace industry.

So you try to find connections. And sure enough, even back in the '20s there was Gerry Vultee, then World War II with Bob Simmons, Hugh Bradner, and Walter Munk [the father of wave forecasting], all at Caltech. All these people who were coming through the defense industry were also surfing.

You also delve into the seedier side of the sport that emerged during the 1960s, with the Brotherhood of Eternal Love, a crew of California surfers that created a massive drug-smuggling operation.

PW: Surfers not only reflected the '60s, they also actually helped create the '60s because they were the ones driving this tremendous supply of drugs. This image of surfers as a bunch of longhairs on the beach who can't get their act together may have helped them get away with it.

When you read federal task force reports on the menace of drug smuggling, the feds refused to believe these hippie surfers could possibly pull off something this complex and this organized. It was a major global network that these guys were running out of Laguna. They brought in millions of LSD doses, among other things.

You also address some of the coastal environmental issues that surfers encounter—notably raw sewage that ends up in the ocean.

PN: One of the most shocking things to me is you go out to the North Shore and they don't even have a sewage system there on Oahu. They have cesspools. Basically, on a rainy day you could potentially reacquaint yourself with what you just got rid of that morning.

I took my daughter down to do a science project on fecal coliform counts in the ocean in Goleta, California, where the sewage is treated in our area. The guy there told me that fecal coliform is always there. It's just when they become most acute that you close the beach. So you are always swimming in fecal coliform no matter where you go.

The only issue is whether it's going to be enough to make you sick.

You have a great section in the book about how the rise in women surfers can be traced directly back to the influence of Title IX—part of the federal Education Amendments of 1972 sponsored by legendary Hawaiian Congresswoman Patsy Mink.

PN: I really wanted to know whether Mink ever surfed. So I called her daughter, and she said no, she was a plantation girl. Didn't surf. Surfing is not a collegiate sport. But once Title IX comes you have girls becoming athletes, playing basketball, volleyball, swimming. If you are able to swim, you are able to surf.

Women's swimming takes off because of Title IX and so it had a huge, huge impact on athletics in the United States. I think in the last Olympics we had more women medalists than men medalists.

You also write about the current obsession with riding giant waves and its deadly impact. More surfers have been killed in the last 15 years than in all the previous four decades combined.

PN: Big wave surfing in a way is a revival of the waterman that the Duke was. You have to be really athletic and ocean-wise. It's also an outlet for a different kind of surfer to get remunerative gain in terms of professionalism. Anyone bigger than 6 feet (1.8 meters) tall trying to compete in the current surf contest with the short equipment and focus on airborne performance is not going to do so well.

Yet people are fascinated with riding big waves. There is now a cash prize for the person who gets a picture of himself surfing the biggest wave of the year. And when there's money on the table, people are going to risk their lives to get it.

But let's face it, commercial fishing is a lot more dangerous than surfing. And you are not going to ride those big waves unless you are prepared to do it. You won't even be able to paddle out. The people doing it are well prepared, and despite that people are getting killed. That's the nature of extreme sport.

Final question. Surfing has been around for hundreds if not thousands of years, from early Hawaiians, to Mark Twain, to Duke, to Gidget, to Kelly Slater, and yet its power to attract our attention seems stronger than ever. Madison Avenue is using it to sell everything from cars to cologne. Why does this activity continue to fascinate us so?

PW: I think it traces back to its origins in Hawaii and this idea that surfing is the pure pursuit of pleasure. Its association with tropical paradise and this image of surfing as the antithesis of modern society helps sustain its popularity. We are no longer teenagers, but we still have this identification with it.

I was taking my kid to the skate park the other day and this guy says, dude, you are 45 years old, you should not be out in a skate park anymore. And I was like, well, that's what I do. There's no surf, so I'm going to go down to the skate park with my kids and pretend I'm surfing. It's a perpetual adolescence.

PN: It is pure unadulterated fun. If a good south swell was running and we went up to Cojo, no matter how long you've been surfing you would remember the waves you caught forever. It's a unique wave, very clean. So it's a very unique pastime that creates memories because it is so different. The feel of weightlessness, of the speed, and being in the ocean environment, it stays with you.

It's just a lot of fun. You don't have to be riding a 40-foot wave to get that feel.


An overview of insulin

Insulin is a hormone that is responsible for allowing glucose in the blood to enter cells, providing them with the energy to function. A lack of effective insulin plays a key role in the development of diabetes.

Hormones are chemical messengers that instruct certain cells or tissues to act in a certain way that supports a particular function in the body.

Insulin is essential for staying alive.

In this article, we look at how the body produces insulin and what happens when not enough of it circulates, as well as the different types that a person can use to supplement insulin.

Insulin is an essential hormone for controlling blood sugar and energy absorption.

Insulin is a chemical messenger that allows cells to absorb glucose, a sugar, from the blood.

The pancreas is an organ behind the stomach that is the main source of insulin in the body. Clusters of cells in the pancreas called islets produce the hormone and determine the amount based on blood glucose levels in the body.

The higher the level of glucose, the more insulin goes into production to balance sugar levels in the blood.

Insulin also helps the body store excess glucose for later use and supports the buildup of fats and proteins rather than their breakdown.

A delicate balance of insulin regulates blood sugar and many processes in the body. If insulin levels are too low or high, excessively high or low blood sugar can start to cause symptoms. If a state of low or high blood sugar continues, serious health problems might start to develop.


In some people, the immune system attacks the islets, and they cease to produce insulin or do not produce enough.

When this occurs, glucose stays in the blood, and cells cannot absorb it to convert it into energy.

This is the onset of type 1 diabetes, and a person with this version of diabetes will need regular shots of insulin to survive.

In some people, especially those who are overweight, obese, or inactive, insulin is not effective in transporting glucose into the cells and cannot fulfill its actions. The inability of insulin to exert its effect on tissues is called insulin resistance.

Type 2 diabetes will develop when the islets cannot produce enough insulin to overcome insulin resistance.

Since the early 20th century, doctors have been able to isolate insulin and provide it in an injectable form to supplement the hormone for people who cannot produce it themselves or have increased insulin resistance.


A person can take different types of insulin based on how long they need the effects of the supplementary hormone to last.

People categorize these types based on several different factors:

  • speed of onset, or how quickly a person taking insulin can expect the effects to start
  • peak, or the time at which the insulin reaches its greatest impact
  • duration, or the time it takes for the insulin to wear off
  • concentration, which in the United States is 100 units per milliliter, or U100 (see the sketch after this list)
  • the route of delivery, or whether the insulin requires injection under the skin, into a vein, or into the lungs by inhalation
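As a concrete illustration of the concentration factor above, the minimal Python sketch below converts a dose measured in insulin units into an injection volume, assuming U100 insulin at 100 units per milliliter. The function name and the 12-unit example dose are hypothetical choices made for illustration only, and nothing here is dosing guidance.

```python
# Minimal sketch: convert an insulin dose in units to a volume in milliliters,
# assuming U100 insulin (100 units per milliliter). Illustrative only --
# not medical or dosing advice.

def dose_to_volume_ml(dose_units: float, units_per_ml: float = 100.0) -> float:
    """Return the injection volume in mL for a dose given in insulin units."""
    if dose_units < 0 or units_per_ml <= 0:
        raise ValueError("dose must be non-negative and concentration positive")
    return dose_units / units_per_ml

# Example: a hypothetical 12-unit dose of U100 insulin occupies 0.12 mL.
print(dose_to_volume_ml(12))  # 0.12
```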

People most often deliver insulin into the subcutaneous tissue, or the fatty tissue located near the surface of the skin.

Three main groups of insulin are available.

Fast-acting insulin

The body absorbs this type into the bloodstream from the subcutaneous tissue extremely quickly.

People use fast-acting insulin to correct hyperglycemia, or high blood sugar, as well as control blood sugar spikes after eating.

  • Rapid-acting insulin analogs: These take between 5 and 15 minutes to have an effect. However, the size of the dose impacts the duration of the effect. Assuming that rapid-acting insulin analogs last for 4 hours is a safe general rule.
  • Regular human insulin: The onset of regular human insulin is between 30 minutes and an hour, and its effects on blood sugar last around 8 hours. A larger dose speeds up the onset but also delays the peak effect of regular human insulin.

Intermediate-acting insulin

This type enters the bloodstream at a slower rate but has a longer-lasting effect. It is most effective at managing blood sugar overnight, as well as between meals.

Options for intermediate-acting insulin include:

  • NPH human insulin: This takes between 1 and 2 hours to start working and reaches its peak within 4 to 6 hours. It can last over 12 hours in some cases. A very small dose will bring forward the peak effect, and a high dose will increase the time NPH takes to reach its peak and the overall duration of its effect.
  • Pre-mixed insulin: This is a mixture of NPH with a fast-acting insulin, and its effects are a combination of the intermediate- and rapid-acting insulins.

Long-acting insulin

While long-acting insulin is slow to reach the bloodstream and has a relatively low peak, it has a stabilizing “plateau” effect on blood sugar that can last for most of the day.

It is useful overnight, between meals, and during fasts.

Long-acting insulin analogs are the only available type, and these have an onset of between 1.5 and 2 hours. While different brands have different durations, they range between 12 and 24 hours in total.
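To pull together the timing figures quoted above, here is a small, hedged Python sketch that records the approximate onset and duration of each insulin group as a simple lookup table. The dictionary name and layout are assumptions made for illustration; the figures are the rough ranges given in this article, not precise pharmacological data.

```python
# Approximate timing of the insulin groups described above, stored as a
# simple lookup table. Values are the rough ranges quoted in the article.
# Illustrative only -- not medical advice.

INSULIN_PROFILES = {
    "rapid-acting analog": {"onset": "5-15 minutes", "duration": "about 4 hours"},
    "regular human":       {"onset": "30-60 minutes", "duration": "around 8 hours"},
    "NPH (intermediate)":  {"onset": "1-2 hours", "duration": "over 12 hours in some cases"},
    "long-acting analog":  {"onset": "1.5-2 hours", "duration": "12-24 hours, depending on brand"},
}

# Print a one-line summary of each group.
for name, profile in INSULIN_PROFILES.items():
    print(f"{name}: onset {profile['onset']}, duration {profile['duration']}")
```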


Early Twentieth Century Railroads

By 1900, America's railroads were very nearly at their peak, both in terms of overall mileage and employment. In the 20 years leading up to World War I, however, the foundations of railroading would change drastically. New technology would be introduced, and the nation would go to war, during which time the railroads would be run by the government. Most significantly, the railroads would enter the age of government regulation.

The dawn of the twentieth century was, for the most part, eagerly anticipated by America. There was much to celebrate. Things were going well for business, and that meant there was employment for almost everyone.

Railroads capitalized on the prosperity with colorful brochures promoting top-notch passenger trains. The West was glorified as the nation's wonderland, regularly being featured in railroad-commissioned paintings and in the pages of numerous magazines. Posters featuring dreamy damsels lured vacationers to exotic destinations like California, while fast "Limiteds" raced business travelers across the land.

The nation's railroads were still growing. By 1900, more than 195,000 miles of track were in service, and there were still another 16 years of expansion ahead. The biggest opportunities existed in the West and in the South, where large portions of the landscape were still lightly populated.

During the years preceding World War I, the Florida East Coast Railroad extended its rails all the way to Key West; the Union Pacific reached Los Angeles by crossing through the Utah, Nevada, and California deserts; the Western Pacific completed its line from Salt Lake City to Oakland, California; and the Chicago, Milwaukee, St. Paul & Pacific linked the Midwest to the West Coast.

It was around this time that the passenger train achieved levels of dependability, comfort, and speed that rail passengers would generally enjoy for the next 50 to 60 years. Trains became so reliable as to encourage entire generations of business travelers to schedule meetings in distant cities the next day, and the basic amenities of train travel -- a comfortable lounge, impeccable dining car service, sleeping cars with restrooms and running water, and carpets throughout -- were here to stay. Railroads even began to regularly operate their finer trains at speeds that even today's travelers would consider "fast" -- 80 to 100 miles per hour.


Nowhere was the railroad more evident than in the newsworthy events and popular culture of the day, which often featured colorful tales of railroading and railroaders. Take the story of Casey Jones, for instance. Although apparently due to Casey's own misjudgment, the famous wreck of his passenger train in 1900 at Vaughan, Mississippi -- in which he perished -- resulted in the deaths of no passengers. By sticking with his locomotive until it was too late to save himself, engineer Casey was able to slow the train appreciably, minimizing the collision's effects. The resultant publicity painted Casey as a hero: here was the story of a "brave engineer" who gave his life to save those of his passengers. The tale -- and the popular song that soon followed -- remain a permanent part of American folklore and history today.

With the rise of motion pictures and movie theaters, railroads and railroaders would enjoy a lengthy stay in the cinematic limelight. The success of the 1903 film The Great Train Robbery -- a simple, fast-paced "shoot-'em-up" Western -- guaranteed that more train-related movies would follow.

In 1905, a record-breaking dash by train, from Los Angeles to Chicago via the Santa Fe, was another railroading event that captured the nation's attention. The instigator, Walter Scott -- popularly known as "Death Valley Scotty" and widely remembered for his colorful claims about his mining exploits -- apparently hired the train purely for publicity's sake.

Periodicals, literature, and even the Post Office featured railroads in ways that could not escape public notice. Following its celebrated dash in 1893 at a top speed of 112.5 miles per hour, New York Central's famous locomotive, No. 999, had its likeness reproduced on a two-cent postage stamp. From accounts of cross-country rail travel in Harper's Weekly to Frank Norris's emotional, less-than-flattering fictional account of the struggle between farmers and the railroads in "The Octopus," the nation's train system was constantly being observed and scrutinized.

Threats to Early 20th Century Railroads

Of course, there had to be a downside, too. And indeed there was, in the form of a growing uneasiness among Americans about the ownership and management of the nation's biggest business -- which the railroads had collectively become -- being concentrated in the hands of a relative few. How much power was too much? Was government regulation or control necessary, or were market forces the best way to keep these empires in check? Widely talked about by citizens and politicians alike, and discussed in books such as "The Railroad Question," these were issues that would not go away in the first decades of the twentieth century.

Before the turn of the century, railroads were engaged in an ongoing process of innovation, expansion, and consolidation. Railroads shaped the nation, and were in turn shaped by it.

The new century was no different in a basic sense: the changes continued. But while some of the changes offered promise, others seemed of less use, at least to the railroads. There were even innovations that, down the road, would pose competitive threats to railroads, though these were largely unrecognized at first.

Consider the telephone, for instance. In the early 1900s it was supplanting the telegraph on American railroads. The idea was the same -- electrical impulses carried over wires -- yet "telephony" represented a way to make these transmissions accessible to everyone. Previously, the station agent in many small towns was often the only person who had the "power" to translate telegraphic messages sent in Morse code.

The telephone held tremendous possibilities for business as well. It offered a way to communicate in real words, in real time -- and at a moment's notice. Some observers speculated that there would be less need for traveling and face-to-face meetings in the future; there was even the possibility that telephones might prove useful in the home.

The internal-combustion engine also held promise for railroads. As early as 1890, a primitive 18-horsepower gasoline engine was used near Chicago to demonstrate the usefulness of self-propelled railcars. Just after the turn of the century, primitive gas-mechanical and gas-electric cars (the distinction being manifest in the transmissions) were built for such railroads as the Erie, the Pennsylvania, the Union Pacific, and the Southern Pacific.

As it turned out, self-propelled cars offered savings in the form of labor, but were generally quite troublesome to keep functioning properly. The gasoline engine would turn out to be better suited to the personal automobile, which was also being developed at this time.

Then there was the diesel engine. In the early years of the twentieth century, the diesel -- named for Rudolf Diesel, its German inventor -- was already being put to work in a variety of industrial uses.

The Corliss Engine Works, considered the world's largest in 1902, ran its huge manufacturing plant entirely with diesel power. Brewer Adolphus Busch built the first diesel engine constructed in America for use at his brewery, eventually forming a new firm, Busch-Sultzer, to manufacture diesel engines for American and Canadian users. Even mighty American Locomotive Works, the nation's second-largest builder of steam locomotives, had tested the diesel with favorable results. Still, it would take American locomotive builders another quarter of a century to begin a serious program of building and testing these prototype designs.

Railroad Passenger Improvements

The railroad passenger benefitted greatly from technology's advance. For example, the introduction of steam heating got rid of the coal stove, always prone to uneven warming and unsafe in the event of collision. Following Edison's successful demonstration of the incandescent light bulb, electric lighting was introduced aboard passenger trains (although only on the finer trains; it would take until World War II for many railroads to fully convert to electricity for lighting). Tanks for fresh water would be introduced as well, allowing drinking and washing to be undertaken in good hygiene. And the all-steel passenger car's introduction in 1906 helped to assure greater safety in the event of a collision, at the same time reducing the likelihood of fire if such a misfortune did occur.

Electricity eventually provided clean, safe lighting aboard passenger cars, but a related event in Richmond, Virginia, in 1887 was almost immediately of concern to America's steam railroads. When inventor Frank J. Sprague successfully electrified that city's street railway system, the stage was set for the large-scale application of street railways to towns and cities of all sizes. Up to this time, only the largest cities could support the necessary high ridership or large capital investments required for horse- or cable-propelled railway systems.

In a pre-automobile age, Sprague's success meant that city workers could now get to and from their jobs much more efficiently; it also meant that development was spurred to the edges of cities, a precursor to our modern-day pattern of suburban living.

Soon the new technology of the trolley car was being applied to elevated railways as well, allowing large cities such as New York, Chicago, and Boston to continue to grow rapidly. As the century turned, the boom was on. The electric railway industry mushroomed in size until, by 1920, it was the fifth largest industry in the United States. In 1890, street railways carried two billion passengers; by 1902, the number had risen to five billion, several times the number carried on the nation's steam railroads.

Another variation, the interurban electric railway, competed directly with steam railroads for the first two decades of the twentieth century. These interurbans, as they were called, followed major streets in urban areas, then set out -- often paralleling existing railroads -- across the countryside to serve nearby towns.

Although the trip often was slower than the paralleling steam road's service, it was offered more frequently. Thus the interurban grew to its greatest proportions in regions that had scattered towns and suburbs surrounding a major metropolitan core -- such as Los Angeles and Indianapolis -- or had concentrated development along a population corridor, such as those connecting Chicago-Milwaukee, Cincinnati-Dayton, or Oakland-Sacramento-Chico (California).

The interurban turned out to be little more than a transitional step between sole reliance on the steam railroad for intercity transit and almost sole reliance on the personal automobile (which was still several decades in the future). Although a few interurban systems actually prospered -- usually because they also carried freight, in direct competition with steam railroads -- few industries have grown so rapidly or declined so quickly, and no industry of its size ever had a more dismal financial record.

Not surprisingly, the interurbans began their precipitous decline on the eve of World War I -- as the automobile was becoming available to all -- and during the Depression the industry was virtually annihilated.

Early 20th Century Railroad Competition

Competition is expected to be keen in a free-market society, but railroads prior to the turn of the century were engaged in a particularly cutthroat version. Railroad mileage was expanding, but particularly in the East and Midwest -- where the railroad network by 1900 was densely packed -- this new mileage was often built at the expense of competing lines. "The day of high rates has gone by; got to make money now on the volume of business," said W. H. Vanderbilt, eldest son of "Commodore" Vanderbilt and head of New York Central.

Controlling costs was one way of helping make railroads more profitable, and the many improvements in technology around the turn of the century helped to accomplish just that. At the same time, the American railroad system was going through a period of consolidation that was unprecedented. By 1906, seven major interest groups controlled approximately two-thirds of all railroad mileage in the United States.

The Harriman lines -- Union Pacific, Southern Pacific, and Illinois Central -- comprised 25,000 miles; the Vanderbilt roads -- New York Central and Chicago & North Western -- 22,000; the Hill roads -- Great Northern, Northern Pacific, and the Chicago, Burlington & Quincy -- 21,000; the Pennsylvania group -- the Pennsylvania Railroad, Baltimore & Ohio, and Chesapeake & Ohio -- 20,000; the Morgan roads -- Erie and Southern systems -- 18,000; the Gould roads -- Missouri Pacific and several other southwestern systems -- 17,000; and the Rock Island group -- Chicago, Rock Island & Pacific system -- 15,000.

Consolidation, interestingly, went largely hand-in-hand with a trend toward less expansion. By 1910, the nation's railroads aggregated 240,293 miles; by 1916, the total reached 254,037 -- America's all-time record for railroad mileage.

Railroad employment grew as well, to a 1916 peak of 1.7 million persons, but the trend would be downhill from there. The era of the big-name "empire builders" was also coming to a close; the last, James J. Hill, died in 1916.

Increasingly, business managers and bankers -- rather than entrepreneurs -- would assume the challenges of running the nation's railroads. And difficult though it may be to comprehend today, a number of forces were at work to drastically alter the competitive picture -- just as the railroads, it seemed, had reached some kind of equilibrium.

Those forces had actually been at work for some time.

Early 1900s Railroad Laws

As early as 1871, railroad regulation had been enacted within individual states, in response to agitation by farmers for rate controls. The first significant federal regulation -- the Interstate Commerce Act -- followed in 1887; even then, the railway industry had little to fear, since "supervision is almost entirely nominal," wrote Richard S. Olney, soon to become attorney general, in 1892.

The following year, President Benjamin Harrison signed the Railroad Safety Appliance Act into law, requiring air brakes (replacing manual ones cranked down "at speed" by brakemen atop swaying railroad cars) and automatic couplers (replacing the infamous "link and pin" variety that was responsible for the crushing of dozens of brakemen each year, and the loss of thousands of their fingers) to be phased in on most locomotives and cars around the turn of the century.

Although the Interstate Commerce Commission was largely ineffectual prior to 1900, the onset of the Progressive Movement revived the issue of regulation. Most Americans were of the opinion that more stringent controls were needed to prevent abuses such as those perceived within the financial markets -- and which on occasion had led to great collapses of railroad systems, as well as the resultant loss of investor fortunes. It was obvious that something needed to be done to restore the public's confidence.

In this light, President Theodore Roosevelt in 1901 directed his attorney general to file suit -- under the provisions of the Sherman Anti-Trust Act -- against Northern Securities, a giant holding company formed by railroad consolidationists Edward H. Harriman and James J. Hill. The company was ordered dissolved in 1904, and later that year Roosevelt was reelected to a second term. Before the year was out, Roosevelt asked Congress to increase the powers of the I.C.C. This was done overwhelmingly with passage of the Hepburn Act of 1906, which empowered the commission to establish "just and reasonable" maximum rates.

"Within two years of [the Hepburn Act's] passage, more rate complaints -- some 1,500 -- were made with the I.C.C. than had been filed in the two preceding decades," writes historian John F. Stover in his book "The Life and Decline of the American Railroad." A related bill strengthened the I.C.C.'s powers in 1910, requiring railroads to prove that any future rate hikes were reasonable and necessary. A related piece of legislation in 1913 provided for the regulatory agency to begin assessing the true value of each railroad, information that was needed if rates were to be established that would provide a fair return for investors.

Not unexpectedly, rate increases requested by the railroads were not always granted by the I.C.C. Rates between 1900 and 1916 dropped slightly, even though the nation's general price level increased by almost 30 percent.

Investment in railroads fell, maintenance standards went down, and new freight and passenger equipment was not ordered in sufficient quantities to keep up with the ongoing demands for replacement and modernization of railroad fleets. The nation had succeeded in regulating its railroads, but with unintended results.

Railroads During World War I

On the eve of World War I, America's railroads were afloat in a sea of dramatic contrasts. On the one hand, the railroad's influence could still be felt in the towns and cities of America, and long-distance travel was still almost exclusively the domain of the passenger train.

And yet, in contrast to these healthy signs, wooden passenger cars were still in use on many railroads, as were outdated and underpowered locomotives. Freight-car fleets still were made up, in large part, of older, lower-capacity (30-ton) cars, even though the increasing use of steel had made the 40-ton car a reality by now.

The outbreak of war in August of 1914 at first resulted in decreased American industrial activity. Rail ton-miles decreased four percent in 1914 and another four percent the following year. It was not until 1916 that the allied nations began to draw upon the economic resources of the United States. That year, ton-miles increased dramatically -- 32 percent -- and soon the nation's railroads were feeling the strain. As the flow of traffic was mostly eastward, serious congestion was experienced in the yards, terminals, and ports of the Northeast and New England.

A car shortage developed as a result, primarily in the West and South. Car shortages were not unusual during peak periods of business prosperity, and a number had occurred before this time. Yet this one would be different. Things went from bad to worse, and in January of 1917 the Interstate Commerce Commission reported that, "The present conditions of car distribution throughout the United States have no parallel in our history . . . mills have shut down, prices have advanced, perishable articles of great value have been destroyed. . . . Transportation service has been thrown into unprecedented confusion."

By the time war was actually declared by the United States, in April of that year, the situation had grown intolerable. American railroads had experienced their heaviest traffic in history during the preceding eight months, and the onset of war simply increased the burden. Yet the American spirit of individualism prevailed, and an executive committee called the Railroads' War Board was formed by industry leaders. This body succeeded in lessening car shortages and other problems. Unfortunately, the winter of 1917-1918 struck with a vengeance. That, plus a series of conflicting "priority shipment" orders from the federal government's own war agencies, finally brought things to a standstill.

On December 26, 1917, President Woodrow Wilson finally proclaimed: "I have exercised the powers over the transportation system of the country, which were granted me by the act of Congress of last August, because it has become imperatively necessary for me to do so." He addressed Congress just a few days later, on January 4, 1918, telling all assembled that he had exercised this power "not because of any dereliction on their [the Railroads' War Board's] part, but only because there were some things which the government can do and private management cannot."


History Underfoot: Flooring in the 19th Century Home

I like to go to open houses with friends who are looking to buy, or for myself, to satisfy my curiosity about places in my neighborhood that I’ve always wanted to see. And hey, you never know… To me, the best old houses are the ones that no one has touched in years. The floors are covered in wall to wall carpeting of dubious antiquity, or layers upon layers of linoleum.

The moment of truth arises when you can grab the end of the carpet, or lift up the linoleum and there they are, protected for umpteen years from wear and tear and the foibles of bad decor: parquet floors! Even better is going to a corner and catching sight of an ornate border, ringing the room, the different colored woods forming lines and patterns, artistry in wood. Love it! However, sometimes you can pull up the carpet, and there is nothing special there.

A house with ornate woodwork, marble fireplaces, the works, and there you go: an “eh” floor. What happened? Were the original owners cheap? Did someone tear out the floors? Why do some houses have such wonderful original floors, and others don’t? When did parquet become popular, and what did homeowners use in our Brooklyn homes before that?

‘The Sargent Family’ painted by an unknown artist in 1800 shows stenciled walls and a patterned floor cloth or mat. Image via National Gallery of Art

With the exception of a handful of Colonial era houses, most of the oldest brownstones and frame houses in our oldest neighborhoods are from the 1830s to the late 1850s. In these earliest houses, the original floors were softwood planks, like pine, laid in random widths. The original finish was never a gleaming wax or varnish. These floors were usually cleaned by scrubbing with sand and a wire brush, or sometimes bleached with lye. Most of the time, the floor was either painted or covered.

Painted floors were often stenciled with border or rug patterns. Coverings ranged from woven matting, somewhat similar to our modern day sisal rugs, to heavy canvas painted floorcloths, to a covering called drugget, or carpet. Drugget was a cheap woolen or cotton/flax plain woven fabric, sewn together to the desired width.

‘Joseph Moore and His Family’ by Erastus Salisbury Field in 1839 shows the family in their carpeted parlor. Image via Museum of Fine Arts Boston

Depending upon one’s budget, drugget was often used to cover a better carpet, to protect it, and was also popular underneath the carpet to provide an attractive border where the carpeting stopped. Matting, much of it imported from India and China, was also used as carpet padding and as protection for the carpet in well-traveled areas, such as near stairs and at entrances.

As manufacturing techniques for carpeting improved, more and more households were able to afford carpeting. One popular carpet was the rag rug, often made at home by braiding strips of fabric, or weaving lengths of fabric through a loom, creating the kind of rugs most of us are familiar with today as small bathroom or casual rugs.

Most carpeting of the time was woven on looms in narrow lengths and then sewn together to achieve the desired width. The term broadloom comes from this time, and referred to the first larger looms invented that were able to weave wider and wider carpets. Carpet from this period was reversible, as the weave was not the tufted punched carpet that we are used to today.

The designs and patterns were woven into the rug, like a French Aubusson rug. The jacquard loom was invented in France by Joseph Marie Jacquard in 1804. It utilized punch cards that were read by the steel needles in the loom, which raised and lowered the harness of the loom, allowing different colors to be woven in, creating patterns. The technology came to the US by 1825, and by 1832, jacquard looms were used in the carpet factories of Lowell, Massachusetts, creating a booming rug manufacturing center in the US.

The entry floor of a Bed Stuy brownstone. Photo by Susan De Vries

Up through the 1870s, the trends in flooring stayed much the same. Painted floors were recommended, especially for service areas, hallways and bedrooms. Stenciling was still popular, and a viable substitute for carpet in these areas. Tile floors were becoming more popular, especially encaustic tile in vestibules, hallways and sometimes verandahs and porches. The tile was expensive but long lasting and worth the cost, as it was easy to clean and the patterns were very attractive. Very wealthy homes began to tile their receiving rooms and foyers in the European manner, often with encaustic tile, but also with marble, sometimes in patterns of different colored stone.

Floorcloths were also still popular, as was grass matting, especially in the summer months. Drugget was now used mostly as a rug underlining, and as an insulation for plank floors in winter, when contracting wood allowed drafts to seep up through the cracks and separations in the flooring. But carpeting was king.

‘The Contest for the Bouquet: The Family of Robert Gordon in Their New York Dining Room’ by Seymour Joseph Guy, 1866. Image via The Metropolitan Museum of Art

The consumer of the mid 19th century had options. Carpeting had become so inexpensive that middle class homeowners could afford to cover all the floors in their public rooms with carpet. Carpeting had become a basic household item, not a luxury. Magazines and books on home decor advised “as it is customary in this country to carpet every room in the house, flooring need not be laid with a view to appearance. It is cheap to lay down an undressed floor, covering the joints with slips of brown paper and then spreading old newspapers instead of straw, under the carpet.” (The Economic Cottage Builder) Many floors were covered with what was called Venetian carpet, a narrow woven carpet material made up of striped bands. This was popular on stairs, as well as in larger rooms. Tufted pile carpet, called Brussels or Wilton carpet, had been made in Europe since the late 1700s, but was hugely expensive.

By the mid-1800s, the techniques had become more mechanized, and production had begun in the United States. The looms for these carpets were still relatively narrow, and the carpet was still sewn together to create the width needed for a home. Axminster carpet, another pile carpeting from England, woven with very realistic natural patterns and colors, was also imported for better homes during this time period.

As pile carpets grew in popularity, so did the patterns and colors used. To our eyes today, many of the patterns and colors are extremely fussy and bright, and the patterns overwhelming. Domestic critics of the day thought so too. They decried the realistic-looking shaded floral patterns, which gave an illusion of three dimensions, and said “Carefully shaded flowers and other vegetative decoration always appear out of place upon the floor to be trodden on.”

A wool carpet fragment from circa 1885. Image via Cooper Hewitt

“Crunching living flowers under foot, even to inhale their odor, is a barbarity, but to tread on worsted ones, odorless and without form, certainly seems senseless.” (Rural Homes). Another publication, the World of Art and Industry, said, “One is almost afraid to walk here, lest his inadvertent foot should crunch the beauty of the roses, or tread out the purple juices of the grapes. We do not strew bouquets or pile fruit upon our parlor floors to decorate them . . . common sense should teach that these are inappropriate.”

Unfortunately for the critics, people loved their patterned and floral rugs, and bought miles of them, well into the 1870s and beyond. Wall to wall carpet was king. To help a confused homeowner do it right, numerous publications laid down rules regarding the colors one should use, the size and scale of the pattern in relation to the room, and the most important rule: how the overall colors of the carpet affected the color scheme of the rest of the room, so that rooms could be called the green room, the red room, etc., instead of just the southwest chamber, or the room your grandmother had last summer.

As America rushed headlong through the second half of the 19th century, clutched firmly in the hands of householders in 1872 were the Bible and Charles Eastlake’s “Hints on Household Taste in Furniture, Upholstery and Other Details.” The influence of this English taste master cannot be overstated.

An ad for oriental rugs at Abraham & Straus in 1900. Image via Brooklyn Daily Eagle

Among his many pronouncements, he advocated the return of the hardwood floor. Eastlake maintained that the best flooring for a proper house was an Oriental rug over a hardwood floor.

Along with William Morris, whose own ideas were gaining ground in the US, Eastlake wanted the weaver’s art and craftsmanship of the rug to be noticed and appreciated. He also stated that Oriental rugs were not as wasteful as wall to wall carpet, which could not easily be reused in another room and, most importantly, hid the floor, contrary to the first principles of decorative art, which require that the nature of construction…should always be revealed. American writers agreed, advocating hardwood and parquet floors.

All of a sudden, carpets were reviled as carriers of dirt and filth from outside the home and very liable to putrefy when subjected to damp and warmth. In the space of a few years, the trend shifted from wall to wall carpet to hardwood floors and exotic carpets from the Orient, a trend which would carry well into the next century.

Unfortunately for homeowners not buying new homes, that left them with now inferior and outdated softwood floors. What to do? The critics advocated several options: lay down parquet, which Eastlake wrote was much in vogue in England, but that could be quite expensive.

Parquet border samples from 1892 manufactured by the Interior Hardwood Company of Indianapolis, Ind. Illustrations via “Parquet Floors and Borders“

One could compromise and lay a parquet border around three feet into the room from the walls and put a carpet in the center of the room (what we today would call faking it), paint, or put down a wood carpet. I recently saw a house with the second option. The room had a plain hardwood perimeter extending about four feet into the center of the room.

It had an attractive border, as well. In the middle of the room, which was originally a bedroom, was a large square space which was not parqueted and was originally subfloor, although someone had taken up the carpet, probably long ago, and substituted vinyl tile. This was in a high-end house by a well-known architect, built in the 1890s, so this practice was not restricted to retrofitted earlier houses, nor was it that uncommon. If a large Oriental carpet had been laid, the bed would have covered most of it anyway, and no one would ever have known.

An Axel Hedman designed Bed Stuy brownstone from 1888. Photo via Corcoran

For many homeowners, wood carpeting was the way to go. Wood carpet was thinner than regular parquet, and was composed of strips of hardwood, about a quarter of an inch thick, glued to a heavy muslin backing. It could be installed right on top of existing flooring.

During the 1880s, the Decorative Wood Carpet Company of Warren, Ohio, carried over 50 patterns. The carpet could also be used as wainscoting, and the company also made floor medallions and borders. The price was competitive with that of Brussels and other carpets, at around $1.75 a linear yard.

Painting was also still an option, but now, instead of trying to paint faux carpeting, the critics advocated stains and paint that mimicked inlaid woods or, for less work, better grades of wood. Tile, especially in hallways and entryways, was also more popular than ever, particularly as Minton tiles gained in popularity.

These were expensive, however, and soon American companies were following suit, making both glazed Minton-like tile, as well as encaustic tile. Floorcloths were still popular in hallways and service areas, but the critics condemned the popular practice of painting faux marble and stone cloths, and advocated simple geometric designs.

Wood carpet offered in 1900 by The Foster-Munger Co., Chicago. Image via “Patterns of Grilles, Mantels, Wood Carpet”

Linoleum also had been invented, and soon became a popular flooring, especially in halls and kitchens. Straw matting was still extremely popular for bedrooms, especially in the summer, and in warmer climates. It was now dyed, or stained in colors and patterns, and usually had fabric borders that coordinated with the room, much like today’s sisal and grass flooring.

Ironically, now that wall to wall carpet was at its cheapest, because of so many advances in the technology of weaving, the critics and domestic reformers had thrown it under the bus. Oriental rugs were the new carpet, rugs made by the hand of craftsmen, not machines. Eastlake, Morris, and their American counterparts were adamant on this principle. But the majority of Americans, i.e., those not rich, stuck with their machine made carpets and rugs. The only thing that changed was the popularity of Eastern style geometric patterns, replacing the fad for fruits and flowers and foliage underfoot.

By the last decade of the 19th century, and well into the 20th, hardwood floors became the norm for all new construction. Plain strips of tongue-and-groove flooring, usually oak, now ran throughout most custom-built and spec houses, in hallways, on ground floors, and on upper floors. Most parlor floors featured parquet, usually with a decorative border. The ornateness of the patterns and the types of wood used were dictated by the money spent.

The best parquet was 7/8 inch thick and put together with tongue-and-groove edges. The borders, often quite ornate and complicated, were also tongue and groove, using different woods for the color changes. Good parquet was about 1/2 inch thick and was nailed to the subfloor. Some styles featured alternating blocks made up of strips of wood, others a more European herringbone pattern, and some floors were simply narrow strips of wood laid on the diagonal from one end of the room to the other, with a wide border.

These borders were also nailed down and might be all oak, with some pieces stained a darker color. Wood carpet was still in use and still extremely popular, and it survives in many of our homes today. By the 1890s it was 3/8 inch thick, still glued to a muslin back and then nailed down with finishing nails. Many homeowners today have discovered they have this kind of parquet when too many sandings have rendered it paper thin.

The woodwork of the day was made to show off the natural colors and grains of the wood, whether oak, walnut, mahogany or another exotic wood. The floors would have done the same. We often think the woodwork of the late 19th century was dark and oppressive. Often that is because we are seeing it after more than 100 years of old shellac and varnish, dirt, wear and tear. When the detritus of a century is removed, the natural wood shines. The same holds true for hardwood floors.

In closing, by the end of the century, wood floors were not what they were at the beginning of the century. In 1898, the Wood-Mosaic Company of Rochester, N.Y., wrote in an article entitled “How to Treat a Soft Pine Floor”: “If very bad use it for kindling wood.”

The article elaborated, “Most soft pine floors are very bad. If in fair condition, cover it with thin parquetry or wood carpet. Or, if it must be scrubbed and mopped like a barroom or a butcher’s stall, cover it with linoleum or oil cloth. In this case don’t cover with parquetry. Don’t cast pearls before swine. Or it may be painted. Paint adheres well to pine. Don’t cover it with a dusty, dirty, disease disseminating carpet.” Luckily, today we can make up our own minds.

My source for most of this information was the wonderful “Victorian Interior Decoration: American Interiors 1830-1900” by Gail Caskey Winkler and Roger W. Moss.



After the depression of the 1890s, immigration jumped from a low of 3.5 million in that decade to a high of 9 million in the first decade of the new century. Immigrants from Northern and Western Europe continued coming as they had for three centuries, but in decreasing numbers. After the 1880s, immigrants increasingly came from Eastern and Southern European countries, as well as Canada and Latin America. By 1910, Eastern and Southern Europeans made up 70 percent of the immigrants entering the country. After 1914, immigration dropped off because of the war, and later because of immigration restrictions imposed in the 1920s.

The reasons these new immigrants made the journey to America differed little from those of their predecessors. Escaping religious, racial, and political persecution, or seeking relief from a lack of economic opportunity or famine, still pushed many immigrants out of their homelands. Many were pulled here by contract labor agreements offered by recruiting agents, known as padrones to Italian and Greek laborers. Hungarians, Poles, Slovaks, Bohemians, and Italians flocked to the coal mines or steel mills; Greeks preferred the textile mills; Russian and Polish Jews worked the needle trades or pushcart markets of New York. Railroad companies advertised the availability of free or cheap farmland overseas in pamphlets distributed in many languages, bringing a handful of agricultural workers to western farmlands. But the vast majority of immigrants crowded into the growing cities, searching for their chance to make a better life for themselves.

Immigrants entering the United States who could not afford first- or second-class passage came through the processing center at Ellis Island, New York. Built in 1892, the center handled some 12 million European immigrants, herding thousands of them a day through the barn-like structure during the peak years of screening. Government inspectors asked a list of twenty-nine probing questions, such as: Have you money, relatives or a job in the United States? Are you a polygamist? An anarchist? Next, the doctors and nurses poked and prodded them, looking for signs of disease or debilitating handicaps. Usually immigrants were detained only three or four hours and were then free to leave. If they did not receive stamps of approval -- and many did not, because they were deemed criminals, strikebreakers, anarchists or carriers of disease -- they were sent back to their place of origin at the expense of the shipping line.

Medical examination at Ellis Island, 1910

For the newcomers arriving without family, some solace could be found in the ethnic neighborhoods populated by their fellow countrymen. Here they could converse in their native tongue, practice their religion, and take part in cultural celebrations that helped ease the loneliness. Often, though, life for all was not easy. Most industries offered hazardous conditions and very low wages--lowered further after the padrone took out his share. Urban housing was overcrowded and unsanitary. Many found these conditions very difficult to accept. An old Italian saying summed up the disillusionment felt by many: "I came to America because I heard the streets were paved with gold. When I got here, I found out three things: First, the streets weren't paved with gold; second, they weren't paved at all; and third, I was expected to pave them." In spite of the difficulties, few gave up and returned home.

References:
Kraut, Alan, The Huddled Masses: The Immigrant in American Society, 1880-1921 (1982); Handlin, Oscar, The Uprooted (1951).