Chapter 30
Historical aspects of the major neurological vitamin deficiency disorders: the water-soluble B vitamins
DOUGLAS J. LANSKA*
Veterans Affairs Medical Center, Tomah, WI, USA, and University of Wisconsin School of Medicine and Public Health, Madison, WI, USA

*Correspondence to: Douglas J. Lanska, MD, VA Medical Center, 500 E Veterans St., Tomah, WI 54660, USA. E-mail: [email protected], Tel: +1-608-372-1772, Fax: +1-608-372-1240.

Handbook of Clinical Neurology, Vol. 95 (3rd series): History of Neurology. S. Finger, F. Boller, K.L. Tyler, Editors. © 2009 Elsevier B.V. All rights reserved.
INTRODUCTION
This chapter will review the major neurological disorders associated with deficiencies of the water-soluble B vitamins, including particularly beriberi, Wernicke–Korsakoff disease, pellagra, neural tube defects, and subacute combined degeneration of the spinal cord.
THIAMIN (VITAMIN B1) DEFICIENCY: BERIBERI AND WERNICKE–KORSAKOFF DISEASE

Peripheral nervous system manifestations of thiamin deficiency have been recognized for millennia in Asia in the form of a sensorimotor polyneuropathy called beriberi. People affected by beriberi first develop nonspecific constitutional symptoms including weakness, fatigue, irritability, anorexia, and abdominal discomfort. As the disease progresses, patients develop symptoms of peripheral polyneuropathy with paresthesias, neuropathic pain, and numbness (referred to as “dry beriberi”), often accompanied by congestive heart failure with pedal edema, pleural effusions, and pulmonary edema (“wet beriberi”).
The prevalence of beriberi increased greatly in Asia with a change in the milling process for rice in the late 19th century, around the time that the central nervous system manifestations of thiamin deficiency—Wernicke’s encephalopathy and Korsakoff’s psychosis—were recognized in Europe. Only in the 20th century were these disorders all clearly linked to a deficiency of a specific dietary factor, which was ultimately determined to be the vitamin now called thiamin. The isolation and synthesis of thiamin in the 1930s greatly improved the acute treatment of these disorders, but more importantly made possible the prevention of large outbreaks of beriberi, as well as the prevention of many sporadic cases of all neurological forms of thiamin deficiency, through food fortification.
Bontius’s description of the sensorimotor neuropathy of beriberi
Dutch physician Jacobus Bontius (1592–1631), frustrated by the meager earnings from his practice in Leyden, accepted a job in 1627 as physician for the Dutch East India Company in Batavia (now Jakarta), on the island of Java (in what is now southern Indonesia). In Batavia, Bontius observed and studied a wide range of novel tropical diseases, and gave the first European description of the sensorimotor polyneuropathy of beriberi in a book, De Medicina Indorum Libri IV, which was first published posthumously in 1642 (Bontius, 1745/1945). Bontius noted that the word beri-beri, meaning sheep, was applied because afflicted individuals had a steppage gait that resembled the gait of sheep. Among the clinical features recognized by Bontius were generalized weakness, tremulousness, and paresthesias.
Takaki and the dietary prevention of kakke (beriberi) in the Japanese navy
Although beriberi had been recognized in Asia for several thousand years, its incidence increased dramatically in the 1870s, when it became one of the most common diseases in Asia as an unrecognized consequence of a change in diet of the population. By this time, steam-driven mills had been introduced to Asia from Europe and were replacing the previous milling process (Verhoef et al., 1999). The new steam-powered mills efficiently removed the so-called “polishings,” and with these went essential nutrients, including thiamin. The new polished rice was considered to be superior in taste and quality and became a dietary staple throughout Asia.
From 1878 to 1882, approximately one third of enlisted Japanese sailors reported ill with kakke (i.e., beriberi) annually (Takaki, 1906a, b, c; Itokawa, 1976; Hawk, 2006). Japanese military physician Kanehiro Takaki (1849–1915) noted that the diets of Japanese sailors were relatively deficient in nitrogen content (i.e., protein) compared with the diets of British and German sailors, who were not susceptible to beriberi. As a result, Takaki incorrectly attributed the beriberi among Japanese sailors to a dietary deficiency of protein.
On a training cruise in 1883, 161 of the 278 Japanese sailors (58%) developed beriberi and 25 died (9%), prompting Takaki to push for dietary reforms. After receiving permission for a trial of a modified diet, Takaki arranged for a repetition of the training cruise the following year, with all factors held constant except for the diet, which was modified by increased amounts of meat, barley, and fruit (thus increasing the presumptively deficient nitrogen content). In contrast to the heavy toll the previous year, there were no deaths and only 14 cases of beriberi, all among sailors who refused to eat the full rations of meat and milk. With these dramatic results, the diets of all Japanese sailors were similarly modified, so that, by 1887, Takaki reported that there were only three cases with no deaths over the previous year, compared to more than 1000 cases annually prior to 1884.
Eijkman and the polyneuritic chickens of Java
In 1886, the Dutch government, hoping to find the cause of beriberi that had become a tremendous problem for its colonies in the East Indies, sent a commission to investigate under the direction of Cornelius Pekelharing, Professor of Pathology at the University of Utrecht, with Cornelius Winkler, a neurologist (Eijkman, 1929/1965; Carpenter, 2000). When Pekelharing went to Berlin to learn from Robert Koch (1843–1910) the latest microbiological techniques for use in his investigations of beriberi, he met Dutch military physician Christiaan Eijkman (1858–1930), who had been studying with Koch since 1885 (only 3 years after Koch’s revolutionary discovery that tuberculosis is caused by a specific bacterium). Impressed with Eijkman, Pekelharing asked that he be assigned as an assistant to the commission.
From late 1886 through the summer of 1887, the commission focused on possible infectious causes of beriberi at a laboratory established in the Military Hospital in Batavia, Java (now Jakarta, Indonesia). In late 1887, Pekelharing and Winkler were recalled to Holland (where Winkler was appointed as the first professor of neurology in the Netherlands) and Eijkman was appointed director of the laboratory (Verhoef et al., 1999; Carpenter, 2000). Eijkman subsequently tried unsuccessfully to infect rabbits and monkeys with the microorganisms that his colleagues had isolated from people who had died of beriberi. Undaunted, Eijkman concluded that the responsible infection must be slowly progressive, with considerable time needed for clinically evident manifestations. To make sure that extraneous factors were not responsible for the observed results over a long time interval of disease development, many control animals were needed. By late 1889, he had begun using chickens for these injection studies, presumably because the chickens were cheaper and easier to maintain.
At that point, Eijkman was fortunate to observe a serendipitous event, astute enough to understand its possible significance, and diligent enough to pursue the necessary studies to evaluate the possibilities: a “polyneuritis” broke out among the laboratory’s chickens, which was characterized by an unsteady gait with frequent falls and difficulty in perching, later an inability to stand or fly attributed to ascending weakness, and finally slowed respiration, cyanosis, hypothermia, progressive lethargy, and neck extension preceding death (Eijkman, 1929/1965, 1990). Histological examination of peripheral nerves stained by the Marchi method demonstrated axonal degeneration, most pronounced in the legs, which was thought to resemble the changes seen in the peripheral nerves of people who had died of beriberi (Eijkman, 1990; Carpenter, 2000). Curiously, both injected and control chickens were affected, but, because they had been kept together in large cages, Eijkman suspected that the chickens he had injected with microorganisms had somehow contaminated the control chickens. However, in additional experiments he found that keeping the chickens in separate cages made no difference, causing him to wonder whether the entire institute had become infected. Further studies to elucidate the putative infection were unrevealing, but a new possibility presented itself when Eijkman learned that the onset of polyneuritis in chickens coincided with a change to polished rice, which only resolved when they were serendipitously switched back to brown rice (Eijkman, 1929/1965, 1990).
Eijkman then began “deliberate feeding experiments” with chickens. In 1889, Eijkman found that chickens fed on a diet restricted to cooked polished rice developed polyneuritis generally after 3 or 4 weeks, but recovered if returned to feed-grade unpolished rice (Eijkman, 1990). Despite his dietary experiments, Eijkman had trouble abandoning his initial microbiological framework: in keeping with the late-19th-century concept of “ptomaines,” Eijkman suggested that “cooked rice favored conditions for the development of micro-organisms of a still unknown nature in the intestinal tract, and hence for the formation of a poison causing nerve degeneration” (Eijkman, quoted by Carpenter, 2000, p. 39).
Later experiments, from 1891 to 1895, demonstrated that the difference between polished and whole rice could not be attributed to inadequate preservation or contamination of the polished rice (e.g., by a microbial toxin), because: (1) freshly prepared polished rice could also cause beriberi; and (2) beriberi did not develop from brown or “rough rice” (i.e., with only the coarse husk removed but still containing the “silver skin” or pericarpium and the germ largely intact), even though this form of rice deteriorates much more quickly (Eijkman, 1929/1965, 1990). Disease development also did not depend on whether the polished rice was cooked or raw, on the water used for cooking (as it even developed with artesian or distilled water), or on the presence of coarse rice husks as a source of dietary fiber. Importantly, Eijkman found that the polyneuritis could be cured or prevented by feeding the chickens either unpolished rice or the discarded rice polishings.
Feeding the chickens other starchy plant foods (e.g., sago and tapioca) produced identical results to those with cooked rice, making Eijkman wonder whether beriberi is at least in part due to lack of adequate food, as some of the birds were considerably emaciated—a suspicion seemingly confirmed when the birds recovered when fed only meat. However, birds fed on a combined diet of starch and meat ultimately developed beriberi, even though they did not become emaciated. Furthermore, simple starvation did not produce beriberi. Eijkman (1929/1965) concluded that “inanition in itself could not be the main cause of the disease (any more than ‘protein’ or ‘salt’ deficiency), even though it promoted it.”
Vorderman’s observational studies of beriberi in Java prisons
In late 1895, Eijkman discussed the etiology of beriberi with his friend Adolphe Vorderman (1844–1902), Inspector-General of Public Health, who was the government physician responsible for the medical inspection of prisons across Java. Because it was known that different prisons had different frequencies of beriberi, the two considered the possibility that the different prisons were using rice processed in different ways. Rations were highly standardized in Java prisons (e.g., 750 gm rice, 1 chili pepper, 150 gm of other mixed vegetables, etc.), but the type of rice was not specified and therefore open to the discretion of the prison governors, local market availability, prices, etc. (Carpenter, 2000).
Vorderman wrote a form letter to each prison governor asking for the incidence of beriberi and the type of rice in use (Carpenter, 2000; Vandenbroucke, 2003). By 1896, preliminary results from available replies suggested that beriberi was almost exclusively confined to prisons using polished white rice. Based on these results, the government approved a larger and more in-depth study. Unfortunately, Eijkman was ill with malaria and had to return to Holland.
Vorderman spent the next 5 months visiting all 101 prisons scattered across the large island (some 54,000 square miles), taking samples of the rice used, recording the frequency of beriberi over the previous 18 months, and recording environmental information about each prison. When the rice samples were examined at the laboratory in Batavia, there was considerable variation in the completeness of deskinning, so that the rice had to be categorized as mostly polished (≥75% of grains were deskinned), mostly unpolished (<25% of grains were deskinned), or intermediate. Of the 96,000 people imprisoned at institutions using unpolished rice, less than 1 in 10,000 developed beriberi, whereas of the 150,000 people at institutions using polished rice, 1 in 39 (2.8%) developed beriberi (Verhoef et al., 1999; Carpenter, 2000). Although beriberi prevalence varied dramatically with the type of rice used, it did not vary with the source of the rice (imported or locally produced), the age of the buildings, floor permeability, adequacy of ventilation, or degree of overcrowding. On the basis of these data, Vorderman persuaded governmental authorities to modify the prison diets to include more unpolished rice, and also more beans and other vegetables. Any subsequent change in incidence was apparently never formally studied (Carpenter, 2000), although it was reported that this public health measure rapidly eliminated beriberi from the prison populations (Verhoef et al., 1999; Vandenbroucke, 2003).
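As a rough quantitative gloss on these figures (an illustrative calculation added here for clarity, not part of Vorderman’s report), the reported attack rates imply a risk ratio for polished versus unpolished rice on the order of several hundred:

\[
\mathrm{RR} \;=\; \frac{1/39}{1/10{,}000} \;\approx\; \frac{0.026}{0.0001} \;\approx\; 260
\]

Because the unpolished-rice figure (“less than 1 in 10,000”) is an upper bound on risk, the true risk ratio was, if anything, even larger.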
Vorderman’s study was a valuable early effort at observational epidemiology, but not without limitations. For example, Vorderman relied on retrospective clinical diagnoses (of variable validity), did not assess each prisoner’s duration of exposure to polished rice (especially when some sentences were as short as a few days and others were much longer), and did not address potential confounding factors (e.g., a difference in disease frequency for coastal and central prisons) (Carpenter, 2000).
Grijns’ dietary deficiency explanation of beriberi presaged the “vitamin doctrine”
Dutch physician Gerrit Grijns (1865–1944) from the University of Utrecht was assigned to continue the studies of beriberi in Java after Eijkman returned to Holland in 1896. Having no preconceived ideas concerning etiology, Grijns considered additional possibilities that had not been addressed by Eijkman. Eijkman’s work had already indicated that it was not the protein in the silverskin of the rice that was important for preventing beriberi, so Grijns systematically excluded potential deficiencies of minerals or fats in the polished rice associated with beriberi (Carpenter, 2000). Grijns tested other foods and discovered that both mung beans and “pigeon peas” had antineuritic properties (Eijkman, 1929/1965; Carpenter, 2000). Grijns also excluded a toxic effect specific to rice starch by demonstrating that polyneuritis also developed in chickens fed only on autoclaved meat or on potato flour plus a protein supplement (i.e., mung beans in which the antineuritic properties were destroyed by autoclaving) (Eijkman, 1929/1965; Carpenter, 2000).
What dietary factor was left to consider, Grijns wondered, since the various components of a physiologically complete diet as then understood (i.e., sufficient proteins, carbohydrates, fats, inorganic salts, and water) had all apparently been excluded as possibilities? Grijns noted two well-known but insufficiently appreciated facts that suggested foods might contain previously unidentified nutrients: (1) sailors suffering from scurvy could be cured by fresh meat and fresh green vegetables (or by citrus fruits or their juices, as was well known from James Lind’s experiments in 1747); and (2) infant formulas were not a satisfactory substitute for breast milk even with the same concentrations of protein, sugar, fat, and salts. Grijns’ subsequent attempts at extracting the antineuritic factor from rice bran were unsuccessful, but he discovered that the antineuritic factor had been destroyed by the processing method he used.
In 1901, Grijns considered two possibilities to explain the known facts concerning the etiology of beriberi: first, a “deficiency or partial starvation” of a substance that was necessary in small amounts for maintaining metabolic functions of the peripheral nervous system and the muscles; or second, the lack of a protective dietary factor that normally acts to maintain resistance of the peripheral nervous system to an environmental agent (e.g., a microorganism) that otherwise causes neural degeneration. In either case, Grijns supposed, beriberi was actually caused by a dietary deficiency of a specific natural substance found in certain foods (Carpenter, 2000).
Unfortunately, Grijns’ important work presaging the “vitamin doctrine” was published in Dutch and not widely recognized at the time. In 1929, Eijkman shared the Nobel Prize in Physiology or Medicine with Frederick Hopkins “for their discovery of the growth stimulating vitamins,” but Grijns’ important work was overlooked by the Nobel Prize Committee and Eijkman failed to give Grijns appropriate recognition in his Nobel Prize Lecture. Grijns’ research achieved wider recognition only after colleagues had his work translated and published in English in 1935 (Grijns, 1935).
Grijns’ work did lead to a clinical trial after Vorderman visited a mental hospital in Buitenzorg and discussed the treatment of beriberi with Hulshoff Pol, the physician in charge (Carpenter, 2000). Pol set up a controlled trial of 300 patients to compare the value of mung beans, green vegetables (suggested by Vorderman, because these were a dietary staple in the surrounding villages), regular disinfection (to assess the proposal of another physician that beriberi was spread by cockroaches), and a control with no specific treatment. As the patients were housed in 12 separate pavilions, the different treatments were allocated to four different groups of three pavilions each. After 9 months, none of the patients in the mung bean group had developed beriberi, compared with 19% in the green vegetable group, 42% in the regular disinfection group, and 33% in the control group. The three pavilions receiving beans were protected from developing beriberi, and further tests demonstrated that beans could reverse the pedal edema and congestive heart failure, though they did not restore the function of severely damaged nerves.
Wernicke’s “acute hemorrhagic polioencephalitis superior”

Over a 1-year period in the late 1870s, German neuropsychiatrist Carl Wernicke (1848–1905), working at the Charité in Berlin, treated three patients with an unusual constellation of neurological findings including mental disturbances, ophthalmoparesis, nystagmus, and ataxia (Wernicke, 1881/1977). The first was a 20-year-old woman Wernicke treated in early 1877 after she had swallowed sulfuric acid and developed protracted vomiting. After some degree of recovery, she developed photophobia, impaired vision, drowsiness, and gait ataxia. Examination disclosed somnolence, disorientation, apathy, and later extreme anxiety, nocturnal agitation, convergent strabismus with asymmetric lateral rectus weakness, vertical nystagmus, impaired convergence, optic disc swelling, and multiple flame-shaped retinal hemorrhages. The symptoms progressively worsened and she died within 2 weeks of onset. Wernicke subsequently observed similar symptoms in two alcoholic men, both of whom were hospitalized with agitated delirium.
Wernicke summarized the clinical features of his three cases, noting particularly the progressive ophthalmoparesis, ataxia, and encephalopathy with either agitation or somnolence. In all three cases, pathological examination of the brain at autopsy demonstrated numerous punctate hemorrhages symmetrically arranged in the grey matter around the third and fourth ventricles and the aqueduct of Sylvius. Wernicke considered these pathological changes to have resulted from an acute, inflammatory disease of the rostral brainstem involving cranial motor nerve nuclei of the extraocular muscles, analogous to poliomyelitis and its involvement of the grey matter of the anterior horns of the spinal cord—hence his term, “acute hemorrhagic polioencephalitis superior.”
Korsakoff’s psychosis

From 1887 to 1889, in a series of three articles, Russian psychiatrist Sergei Sergeievich Korsakoff (sometimes spelled Korsakov, 1853–1900) gave a comprehensive description of a cognitive disorder now known as Korsakoff psychosis, occurring in conjunction with peripheral polyneuropathy (Korsakoff, 1889/1955; Victor and Yakovlev, 1955). As early as 1887, Korsakoff felt that the cognitive disorder and the polyneuropathy represented “two facets of the same disease . . . The pathologic cause provoking multiple neuritis may affect several parts of the nervous system, central as well as peripheral, and according to where this cause is localized there will be symptoms either of neuritis or of the brain” (quoted by Victor and Yakovlev, 1955, p. 395); hence his initial terms, “psychosis associated with polyneuritis” and “polyneuritic psychosis.”
By 1889, Korsakoff recognized that, “At times . . . the symptoms may be so slight that the whole disease manifests itself exclusively by psychic symptoms” (quoted by Victor and Yakovlev, 1955, p. 395): therefore, as Korsakoff stated in his final publication, “One might also call it psychosis polyneurotica, but using this designation one must remember that an identical psychic disturbance may occur also in cases in which the symptoms of multiple degenerative neuritis may be very slight or even entirely wanting” (Korsakoff, 1889/1955, p. 402). The other terms suggested by Korsakoff—“toxemic cerebropathy (cerebropathia psychica toxemica)”—were based on his concept that the diverse conditions associated with the disorder could all be “reduced to an incorrect constitution of the blood, developing under their influence and leading to an accumulation in the blood of toxic substances,” which could poison the central nervous system, the nerves, or both (Korsakoff, 1889/1955, p. 402).
Although many modern authors consider only a restricted form of cognitive disorder under the eponym of Korsakoff’s psychosis—i.e., the combination of anterograde amnesia and confabulation—Korsakoff originally described a much wider range of mental states, often occurring sequentially, including an agitated delirium, an apathetic acute confusional state, and a confabulatory anterograde amnestic state. It was, however, the confabulatory amnestic state, typically following an agitated delirium, that most intrigued Korsakoff.
Korsakoff based his conclusions on at least 46 patients—approximately two thirds of whom were alcoholics, with the remainder having a wide variety of conditions often associated with protracted vomiting, including postpartum infections, intestinal obstruction, abdominal tumor, typhoid fever, and jaundice. Korsakoff’s writings do not include a pathologic description of the disease. Also, he was apparently unaware of the important association of the cognitive and neuropathic features with the oculomotor findings and ataxia described by Wernicke in 1881: Korsakoff did mention that “sometimes there are ophthalmoplegia externa, nystagmus, and like manifestations,” but he attached no great significance to this, considered these symptoms among a range of other manifestations that indicated “a disturbance of the entire organism,” and did not pursue this further (Korsakoff, 1889/1955, p. 399).
Relationship between Wernicke’s encephalopathy and Korsakoff’s psychosis
Neither Wernicke nor Korsakoff appreciated the close relationship between the disorders the two of them described. It was not until the early years of the 20th century that Bonhoeffer recognized the close relationship between Korsakoff’s psychosis, delirium tremens, and Wernicke’s encephalopathy (Bonhoeffer, 1901, 1904). Bonhoeffer also recognized that the lesions in Wernicke’s encephalopathy are not inflammatory. By 1904, Bonhoeffer concluded that neuritis (neuropathy) and a memory disorder can be found in all patients with Wernicke’s encephalopathy. During the subsequent decade, several authors noted the frequent co-occurrence of Wernicke’s encephalopathy and Korsakoff’s psychosis.
Because Wernicke’s encephalopathy was often fulminant, autopsies were done on many of the cases, including Wernicke’s first three cases (Wernicke, 1881). In contrast, Korsakoff’s psychosis often required survival from Wernicke’s encephalopathy to be manifest, so that pathological material was less available. Consequently Korsakoff was unable to describe the pathology, despite having evaluated at least 46 patients. The clinico-pathologic overlap was nevertheless ultimately recognized between Wernicke’s encephalopathy and Korsakoff’s psychosis (Gamper, 1928; Kant, 1932–1933; Campbell and Biggart, 1939).
For example, Kant (1932–1933) noted an amnestic disorder in all patients presenting with Wernicke’s encephalopathy, and also found the characteristic brainstem pathology of Wernicke’s encephalopathy in all fatal cases of Korsakoff psychosis.
In the 1920s and 1930s, careful pathological studies noted the selective distribution of symmetric lesions affecting the mamillary bodies, the grey matter immediately surrounding the third ventricle and involving the hypothalamus and the medial portion of the thalamus, the periaqueductal grey matter (including the oculomotor nuclei), the posterior colliculi, and less frequently the floor of the fourth ventricle (involving the dorsal vagal nuclei and the median eminence) (Gamper, 1928; Campbell and Biggart, 1939). Specific histological changes in affected areas included hyperemia and sometimes small hemorrhages, vascular irregularities and proliferation of small blood vessels, relatively slight evidence of damage to nerve cells, variable microglial and astrocytic glial reaction, and absence of inflammatory infiltration. Although predominantly a polioencephalopathy, white matter was sometimes affected, including the columns of the fornix adjacent to the mamillary bodies and the optic nerves.
Subsequent pathological studies have repeatedly documented a high frequency of cases of Wernicke–Korsakoff disease that went unrecognized during life, for example 86% in Harper’s (1979) series of 51 cases. Characteristic clinical findings of Wernicke’s encephalopathy (e.g., a triad of organic mental syndrome, ophthalmoparesis, and ataxia) were typically reported in clinical studies, but in only a minority of cases in pathological studies, suggesting various selection and reporting biases in both types of study, but also that cases with atypical clinical features were seldom being recognized during life (Cravioto et al., 1961).
Isolation and synthesis of thiamin
In the late 1800s and early 1900s, several investigators tried unsuccessfully to isolate the antineuritic substance (Jansen, 1956; Williams, 1961; Carpenter, 2000). In 1926, Jansen and Donath, working in Batavia (where Eijkman had worked), finally crystallized the substance from rice polishings (Jansen and Donath, 1926; Jansen, 1956; Williams, 1961; Carpenter, 2000). Jansen and Donath had known that the protective factor was a relatively small molecule that was dialyzable and probably an organic base (Jansen, 1956). Work was frustratingly slow, however, until they were able to identify a small tropical bird (i.e., the bonbol) that was more susceptible than chickens and therefore developed manifestations of deficiency more quickly and more reliably. Jansen and Donath ultimately isolated about 100 mg of a chemically pure substance that was extraordinarily potent in the prevention of polyneuritis in bonbols and also effective in the treatment of affected pigeons. The investigators sent 40 mg of these crystals to Eijkman in Utrecht where he was able to demonstrate their prophylactic and curative properties in the pigeon polyneuropathy model (Jansen, 1956; Carpenter, 2000).
In 1931, A. Windaus and colleagues in Göttingen isolated the pure, crystalline vitamin from yeast and demonstrated the presence of a sulfur atom in the molecule that had previously been overlooked by Jansen and Donath (Jansen, 1956; Williams, 1961; Carpenter, 2000). With knowledge of the empirical chemical formula, Robert Williams and colleagues proceeded to elaborate the chemical structure, and in 1936 Williams and J. K. Cline completed the chemical synthesis (Williams and Cline, 1936; Williams, 1961), followed nearly simultaneously by the same feat in two other laboratories (Jansen, 1956; Carpenter, 2000). The structure proved to include a pyrimidine ring linked by a methylene bridge to a thiazole ring. As a result of the chemical synthesis of thiamin, dietary supplementation became feasible, and by the 1950s synthetic forms of the vitamin were produced cheaply and used to enrich polished rice (Jansen, 1956).
Vitamin B1 was initially named aneurin (for anti-neuritic vitamin), but was subsequently named thiamine (for thio = sulfur-containing vitamin), or more recently thiamin (Carpenter, 2000).
The metabolic role of thiamin
In the 1930s, Rudolph Peters and colleagues in Oxford developed a biological assay of thiamin deficiency using acute opisthotonus in pigeons as the biomarker. They discovered that: (1) lactic acid was elevated in the brain of thiamin-deficient pigeons, particularly in the brainstem, even before clinical signs were evident (whereas exercise increased brain lactate levels fairly evenly across different brain areas); (2) elevated brain lactate levels were associated with decreased oxygen uptake, especially in the brainstem when pyruvate was the substrate; (3) elevated brain lactate was associated with a decreased rate of oxidation of pyruvate; (4) thiamin increased the respiration of brain tissue, with the essential biochemical step being the addition of thiamin pyrophosphate (cocarboxylase) as a cofactor for pyruvate dehydrogenase; and (5) ingested nonphosphorylated thiamin (i.e., from plant rather than animal sources) must be phosphorylated to be active as a cofactor (Kinnersley and Peters, 1930; Meiklejohn et al., 1932; Peters, 1936; Ochoa and Peters, 1938; Banga et al., 1939; Ochoa, 1939; Victor et al., 1989).
Subsequent studies demonstrated the range of activity of thiamin in intermediary carbohydrate metabolism, including acting as a coenzyme for pyruvate dehydrogenase (in the decarboxylation of pyruvate to acetyl-CoA), α-ketoglutarate dehydrogenase (in the decarboxylation of α-ketoglutarate in the Krebs cycle), and transketolase (in the pentose phosphate pathway) (Platt and Lu, 1939; Horecker and Smyrniotis, 1953; Racker et al., 1953). Although many of the biochemical pathways in which thiamin is utilized have been well studied, how thiamin deficiency actually produces the clinical and pathological manifestations of beriberi and Wernicke–Korsakoff disease is less well understood, with conflicting evidence available concerning the roles of the different thiamin-dependent enzymes (Victor et al., 1989).
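For clarity, the three classical thiamin pyrophosphate (TPP)-dependent reactions mentioned above can be summarized schematically (a modern shorthand added here for illustration; the notation is not part of the historical studies cited):

\[
\begin{aligned}
\text{pyruvate} + \text{CoA} + \mathrm{NAD^+} &\xrightarrow{\text{pyruvate dehydrogenase (TPP)}} \text{acetyl-CoA} + \mathrm{CO_2} + \mathrm{NADH}\\
\alpha\text{-ketoglutarate} + \text{CoA} + \mathrm{NAD^+} &\xrightarrow{\alpha\text{-ketoglutarate dehydrogenase (TPP)}} \text{succinyl-CoA} + \mathrm{CO_2} + \mathrm{NADH}\\
\text{xylulose-5-P} + \text{ribose-5-P} &\xrightarrow{\text{transketolase (TPP)}} \text{sedoheptulose-7-P} + \text{glyceraldehyde-3-P}
\end{aligned}
\]

Blockade of the first two reactions is consistent with the accumulation of pyruvate and lactate that Peters and colleagues observed in thiamin-deficient brain.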
Etiology of Wernicke–Korsakoff disease
After the reports of Wernicke and Korsakoff in the 1880s, alcoholism was recognized as the most frequently associated underlying cause of the disorder. Still, a wide range of other associated conditions was also recognized in which thiamin deficiency could be attributed to inadequate intake, impaired absorption (e.g., protracted vomiting and gastrointestinal disturbances), increased requirements (e.g., fever, carbohydrate loading), or some combination of these. Wernicke’s encephalopathy was often associated with protracted vomiting, particularly in pregnancy (hyperemesis gravidarum) (Wernicke, 1881; Henderson, 1914).
In the 1930s and early 1940s, a deficiency of B vitamins was proposed as the cause of Wernicke–Korsakoff disease, and shortly thereafter the therapeutic effects of thiamin were demonstrated in this condition. In 1933, Bender and Schilder suggested that Wernicke’s encephalopathy was due to a vitamin deficiency rather than to alcohol toxicity. In 1937, Wagoner and Weir reported clinical improvement with B vitamins in Wernicke’s encephalopathy following protracted vomiting during pregnancy. In 1939, Bowman and colleagues reported the therapeutic effects of thiamin in patients with Korsakoff’s psychosis. Campbell and Biggart (1939) implicated thiamin deficiency as the common etiologic factor in the various conditions associated with Wernicke’s encephalopathy, including alcoholism, hyperemesis gravidarum, and carcinoma.
In 1941, Jolliffe and colleagues at the Psychiatric Institute of Bellevue Hospital in New York documented the rapid resolution of ocular palsies, and the slower improvement in ataxia and peripheral neuropathy, with thiamin treatment, whereas the cognitive changes—especially Korsakoff’s psychosis—proved to be intractable. They concluded that the ophthalmoparesis of Wernicke’s encephalopathy responded to thiamin, but the entire syndrome was probably due to a “combination of several nutritional deficiencies” (Jolliffe et al., 1941).
In the late 1930s and early 1940s, similar pathologic changes were produced in animal models by maintaining the animals on thiamin-deficient diets (e.g., rats, foxes, fish, and pigeons) (Prickett, 1934; Alexander et al., 1938; Alexander, 1941). For example, in 1934, C. O. Prickett of the Alabama Polytechnic Institute demonstrated neuropathological changes in the brainstem of rats maintained on a diet deficient in thiamin but supplemented with other vitamins: after 40 days the animals became ataxic and soon died with bilateral petechial hemorrhages in the floor of the fourth ventricle and involving the vestibular nuclei and the nucleus solitarius (Prickett, 1934). Similarly, in 1938, Leo Alexander of Boston and colleagues demonstrated that experimental beriberi in pigeons produced lesions which appeared similar to those described by Wernicke in 1881.
Even into the 1950s, the details of the clinical features associated with isolated thiamin deficiency were contested. In 1952, Phillips and colleagues reported detailed studies of nine patients with classic Wernicke’s encephalopathy (i.e., with ophthalmoparesis, nystagmus, ataxia, and mental disturbances) who were given a diet composed solely of glucose and minerals, with specific vitamins added after periods of observation (Phillips et al., 1952). None of the clinical features improved before administration of thiamin, despite alcohol withdrawal, bed rest, and addition of other vitamins (i.e., niacin, calcium pantothenate, pyridoxine, folic acid, ascorbic acid, riboflavin, or cyanocobalamin). Instead, the ophthalmoparesis progressed and the nystagmus decreased only in association with the increasing oculomotor paresis. With administration of thiamin, the ophthalmoparesis improved markedly within 1 to 6 hours, confirming that the ophthalmoparesis is due to a specific lack of thiamin. The nystagmus and ataxia improved more slowly and less completely, while the mental changes improved only minimally, with improved attention but with somewhat greater confabulation. The authors felt that the evidence for a causal association between thiamin deficiency and the nystagmus and ataxia was “less conclusive,” and that no definite conclusions were possible concerning the relationship between mental disturbances and vitamin deficiency.
Studies from the 1930s and thereafter tried to utilize assays of blood or urine thiamin, or assays of blood pyruvate and lactate, for clinico-pathological correlation, clinical diagnosis, and treatment monitoring. However, neither blood nor urinary thiamin levels are sensitive indicators of tissue stores of thiamin, and elevated blood pyruvate and lactate levels are not sufficiently specific (Platt and Lu, 1939; Wortis et al., 1942; Sauberlich, 1967; Victor et al., 1989).
In the early 1960s, the etiological relationship between thiamin deficiency and Wernicke’s encephalopathy was further supported by chemical analyses that demonstrated reduced erythrocyte transketolase activity in the blood of patients with the disorder, consistent with thiamin deficiency (Brin et al., 1956, 1958; Dreyfus, 1962; Victor et al., 1989). Subsequently transketolase levels have been developed into a useful clinical test for Wernicke’s encephalopathy (Dreyfus, 1962; Dreyfus and Hauser, 1965; Sauberlich, 1984; Victor et al., 1989).
NIACIN DEFICIENCY: PELLAGRA
Delirium, dementia, psychosis, and depression were common neuropsychiatric features of pellagra as it was seen in the 18th and 19th centuries in Europe and in the early portion of the 20th century in the United States (Lombroso, 1892, quoted by Marie, 1910; Lanska, 1996, 2004).
Recognition of pellagra
Pellagra was apparently unknown prior to the introduction of maize into Europe from the New World. Gaspar Casal (1691?–1759), physician to King Ferdinand of Spain, described the signs and symptoms of pellagra in 1735, noted that the condition was known locally as mal de la rosa (disease of the rose), because of the erythematous rash on sun-exposed areas of the body, and linked it with poverty and a diet with little milk, meat, or other foods of animal origin (Marie, 1910; Etheridge, 1972; Bollet, 1992; Rajakumar, 2000). In 1771, the Italian Francesco Frapolli noted that, in Italy, the disease was associated with poverty and a diet largely restricted to maize-based polenta, exacerbated by sun exposure, and known locally as pellagra (pelle, skin, and agra, rough) (Frapolli, 1771/1945; Marie, 1910; Niles, 1916). In addition to the dermatitis recognized by Casal and Frapolli, other clinical manifestations included dementia (or depression), diarrhea, and death—the “4 D’s.”
From the time of Casal’s description, endemic pellagra was recognized across large areas of Europe, particularly Spain and Italy, where peasants subsisted on nutritionally marginal corn-based diets, but also in France, Romania, Bulgaria, Yugoslavia, Austria, Hungary, and Russia, as well as Egypt and North Africa. In the United States, endemic pellagra arose much later as a result of dietary deficiencies arising from the cotton monoculture of the South following the Civil War (Etheridge, 1972; Bollet, 1992; Lanska, 1996). A number of scattered cases of pellagra were reported from the time of the Civil War up into the early 20th century, although not all reported cases were recognized as such at the time of the reports, and the diagnoses in the others were doubted. Beginning in 1907, outbreaks of pellagra were reported in various asylums, and by 1910 the disease was recognized throughout most of the southern states and in several other states (Searcy, 1907; Etheridge, 1972; Bollet, 1992; Lanska, 1996).
Even if there were occasional (at that time unrecognized) cases in the United States prior to 1900, it was only after 1900 that pellagra became a significant public health problem, particularly in the South. As was the case with beriberi in Asia in the 19th century, the epidemic of pellagra in the South followed the introduction of a new grain-processing method that effectively removed much of the vitamin content from the processed grain. Specifically, in the case of pellagra there was a shift from use of coarsely ground corn meal produced in local, water-driven grist mills before 1900 to use of finely bolted meal produced by large milling companies, which was degerminated to prevent development of rancidity during storage and shipment (Sydenstricker, 1958).
Pellagrous dementia
From the earliest descriptions of pellagra in the United States around 1907, several investigators noted prominent neuropsychiatric manifestations including depression, delirium, and dementia. In 1907, Ray et al. (1907–1908) from the State Hospital for the Insane, in Columbia, South Carolina, reported to the South Carolina State Board of Health similarities between the neuropsychiatric features of pellagra and those of syphilitic general paresis and acute delirium. Similarly, in 1909, in a report for the 35th annual meeting of the American Neurological Association, neurologist Eugene Bondurant (?–1950) from Mobile, Alabama, noted that pellagra may begin with lassitude and dysthymia, with subsequent development of emotional lability, psychosis, depression, delirium, and dementia, typically with depression predominating (Bondurant, 1910).
In 1915, psychiatrist H. Douglas Singer of the Illinois State Psychopathic Institute in Kankakee argued that previous statistics concerning the frequency of neuropsychiatric disturbance in pellagra were biased, among other things having been ascertained in psychiatric facilities without careful clinical evaluation and without reference to a population base in the community. Using data from Spartanburg County, South Carolina over the period January 1912 to June 1913, obtained from community surveys conducted by the Thompson-McFadden Pellagra Commission and from persons adjudged insane with and without pellagra in the county, Singer provided data supporting an incidence rate of insanity of approximately 520 cases per 10,000 pellagrins per year, 75 times the rate of diagnosed insanity in the general population of that county (calculated from Singer, 1915, p. 149). Singer also noted that there was an “extraordinary frequency of pellagra arising [de novo] in hospitals for the care of the insane” and that patients with neuropsychiatric disorders were also somehow predisposed to develop pellagra (Singer, 1915, p. 150).
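Singer’s comparison can be unpacked arithmetically (an illustrative reconstruction from the figures quoted above, not Singer’s own presentation): an incidence of 520 cases per 10,000 pellagrins per year corresponds to roughly 5% of pellagrins per year, and if that rate was 75 times the background rate, the implied rate of diagnosed insanity in the general county population was

\[
\frac{520}{75} \;\approx\; 7 \text{ cases per } 10{,}000 \text{ per year},
\]

i.e., under 0.1% per year.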
By the 1920s and 1930s, the triad of neuropsychiatric dysfunction with skin and gastrointestinal manifestations was well known and thought to be specific, but insensitive, for the diagnosis of pellagra, particularly in the early stages (Stevens, 1922; Meakins, 1936). In 1943, Virgil Sydenstricker (1889–1964), Chairman of the Department of Medicine at the University of Georgia School of Medicine (now the Medical College of Georgia), again noted that the neuropsychiatric manifestations of pellagra were variable, not specific, and could be the presenting manifestation of pellagra (Sydenstricker, 1943).
Etiologic theories of pellagra
By the early 20th century, toxic, infectious, nutritional, and hereditary theories of the etiology of pellagra found limited empiric support (Roberts, 1913a; Niles, 1916). Each new theory spawned various, often aggressive therapies, including cecostomy with colonic irrigation, arsenical administration, and dietary manipulation. But initial therapeutic results were misleading and ultimately disappointing. Identification of effective therapies was hampered by poorly controlled studies with small sample sizes, failure to consider placebo effects or natural history, and inclusion of cases with mistaken diagnoses or unrecognized concurrent conditions.
The major etiologic theory of pellagra at the time of initial recognition in the United States was the “zeist” theory (a term derived from Zea mays, the scientific name for maize or Indian corn), which attributed pellagra to the ingestion of corn. Although corn was widely held to be in some way responsible for pellagra, proposed mechanisms varied widely. Support for the corn theory came mainly from ecologic observations in which groups of individuals served as the unit of analysis: (1) The appearance and certainly the recognition of pellagra followed historically the introduction of corn as a staple food into Spain, France, Italy, and other countries of southern Europe; (2) Endemic pellagra occurred only in countries where corn was grown and used extensively among the rural poor; (3) Countries where corn was not grown or used as food were free of pellagra, even if contiguous to or surrounded by pellagrous countries; and (4) A change of food generally resulted in a diminution or disappearance of pellagra, especially if all corn or corn products were removed from the diet. Such evidence, while supportive, was not definitive and not universally accepted.
Opponents of the zeist theory (“anti-zeists”) countered that: (1) Although pellagra was first recognized as a specific disease in the early 18th century, this did not prove it was not present earlier; (2) Pellagra was endemic only over a small part of the extensive area where corn was cultivated, and, indeed, it was absent from many places where maize was a staple food; (3) Cases were reported among people who reportedly had not eaten corn or corn products, and from places where corn was not cultivated; and (4) In many places the apparent frequency of pellagra increased or decreased without any apparent change in dietary habits.
One subset of zeist theories proposed that corn was associated with a pellagra-causing toxin: for example, toxins could be a component of natural corn, could be elaborated by microorganisms such as bacteria or fungi involved in corn spoilage (Reed, 1910; Bass, 1911), or might be produced in the alimentary canal (MacNeal, 1913). By the time pellagra was recognized as endemic in the United States, world opinion was strongly against corn containing a toxic substance, unless the corn had been modified in some way (e.g., by microorganisms), because corn was consumed in many places with apparent impunity.
Some authorities, particularly the Italian physician Cesare Lombroso (1836–1909), suggested that a toxin was produced in spoiled corn, which was not present in good corn, and many investigators subsequently considered pellagra to be analogous to ergotism resulting from ingestion of toxic products of a fungus growing on rye used as a foodstuff (Lombroso, 1892; Marie, 1910; Reed, 1910; Bass, 1911; Voegtlin, 1914; Lanska, 2004); however, the distribution of pellagra did not coincide well with the very irregular and variable distribution of spoiled corn.
Another subset of the zeist theories considered corn to have inadequate nutritional value. The initial formulation of this theory was that a corn-based diet provides insufficient protein, but chemical analyses of corn did not substantiate this and many recognized authorities argued persuasively against this possibility (Lavinder, 1909; Niles, 1916). For example, in 1916, Niles stated “this explanation is inadequate. If corn is lacking in certain nutritive qualities—in gluten, in nitrogenous matter—so is rice, which, nevertheless does not produce pellagra” (Niles, 1916, p. 62). More sophisticated analyses later showed that corn is deficient in certain amino acids, and this imbalanced protein became viewed as the responsible factor.
A number of infectious theories were also proposed, and various investigators and authoritative groups (e.g., Sambon, 1905; Marie, 1910; Siler et al., 1914a,b, 1917) supported an infectious etiology, although no consistent organism was implicated (Sambon, 1905), affected individuals were not febrile, there was no evidence of inflammation, and attempts at transmitting the condition with body secretions or skin scrapings failed (McCafferty, 1909; Anderson, 1911; Lavinder, 1911; Singer et al., 1912; Lavinder et al., 1914). Some considered that a microorganism was eaten with corn and subsequently set up an intestinal infection (which could of course explain some of the gastrointestinal manifestations of pellagra), while others suggested that a filterable virus was responsible (Harris, 1913), and still others considered that a vector, such as a mosquito, common sand fly, or gnat, was responsible for transmitting a parasitic condition similar to malaria, filariasis, or trypanosomiasis (Sambon, 1905; Lavinder, 1910; Marie, 1910; Roberts, 1913a, b).
Based on human-to-monkey transmission experiments using a combination of subcutaneous, intravenous, and intracerebral injections of large quantities of material originating from two fatal human cases of pellagra, Harris (1913) claimed to have produced pellagra in two of three injected monkeys and therefore concluded that pellagra was caused by a filterable virus (Harris, 1913, p. 1950). However, these results were not replicated, and other studies attempting to transmit the disease from human cases to monkeys were “uniformly negative” (Singer et al., 1912, p. 175; Anderson, 1911; Lavinder, 1911; Lavinder et al., 1914). Around 1909, at the Mt. Vernon Hospital “for the colored insane” in Tuscaloosa, Alabama, Assistant Superintendent E. L. McCafferty, MD, also performed unethical human experiments in unsuccessful attempts to transmit pellagra to unaffected institutionalized patients: “We tried to infect some patients by swabbing out the sick one’s mouth and rubbing it into the well one’s mouth; also swabbed the sores on the hands and feet, and scarified well ones’ hands, but failed to get any results,” he wrote (McCafferty, 1909, p. 229). Unfortunately, misuse of vulnerable human subjects for experimental research studies in attempts at transmitting what would later be found to be deficiency diseases was not isolated to this report. For example, as noted by Vedder (1913, p. 144):
Fraser and Stanton, in the course of their work on beriberi [in Malaya beginning in 1907], performed a large number of human experiments, in which they tried by every conceivable method, including insect transmission, to infect healthy individuals from beriberi patients. These experiments were all negative, but were unfortunately suppressed by the [British] Government for political reasons.
Indeed, these experiments are not reported in a volume of Fraser and Stanton’s collected papers on beriberi (Fraser and Stanton, 1924).
Joseph Goldberger and the “P-P factor”
In 1914, with pellagra morbidity and mortality expanding rapidly in the South, the US Public Health Service commissioned Dr Joseph Goldberger (1874–1929) to study the disease. From 1914 to 1929, Goldberger completed well-designed epidemiologic investigations, tested theories with human experiments, and developed an animal model (Parsons, 1943; Terris, 1964; Kraut, 2003).
Goldberger’s first paper was published in June 1914. He emphasized three epidemiologic facts that had been repeatedly observed: (1) In various medical institutions, the inmates developed pellagra after varying periods of residence, but ward personnel, attendants, and nurses were uniformly exempt despite prolonged and close contact with affected patients; (2) Pellagra preferentially affected rural areas; and (3) Pellagra was associated with poverty (Goldberger, 1914a). Based on these considerations, Goldberger boldly proposed a plan of prevention that emphasized dietary changes with a reduction in cereals, vegetables, and canned foods, and an increase in fresh meats, eggs, and milk (Goldberger, 1914a; Terris, 1964).
Goldberger initiated a series of observational studies to clarify factors that might have etiologic significance, and that could be tested with more formal experiments or applied with presumptive preventive approaches. In 1914, surveys of state institutions showed that those developing pellagra ate an unbalanced diet low in protein. In 1920, community surveys showed that pellagrous households had diets deficient in animal protein (lean meat, milk, butter, cheese, eggs), recognized vitamins (particularly the “fat soluble A” factor), and minerals (Goldberger et al., 1920a,b,c,d; Sydenstricker, 1933; Kasius, 1974).
No differences were observed between pellagrous and nonpellagrous households in terms of cereal supplies (particularly maize), caloric content of the diets, or the proportion of calories derived from carbohydrate and fat combined (Goldberger et al., 1920a,b,c,d; Sydenstricker, 1933; Kasius, 1974). The incidence of pellagra varied by age, gender, family income, and season, but there was no clear relation to sanitary conditions (Goldberger et al., 1920a,b,c,d; Goldberger and Sydenstricker, 1927; Sydenstricker, 1933; Kasius, 1974).
Goldberger proceeded to test his theory and exclude alternatives with careful human experiments. From 1915 to 1923, he and his colleagues showed that initiation of a well-balanced diet at two Mississippi orphanages and the Georgia State Asylum eliminated pellagra from these institutions, places where it had previously been prevalent (Goldberger et al., 1915a,b, 1923). Those fed a diet of fresh meat, milk, and vegetables, instead of a corn-based diet, did not develop pellagra, and those already affected rapidly recovered (Goldberger, 1914a,b; Goldberger et al., 1915a,b, 1923). A subsequent return to the institutional diet resulted in a return of pellagra, which again disappeared with resumption of the well-balanced diet (Goldberger et al., 1923).
In 1915, with the cooperation of Governor Earl Brewer, Goldberger experimented on 11 healthy volunteer prisoners at Rankin State Prison Farm near Jackson, Mississippi, with the prisoners offered pardons in return for their participation (Goldberger and Wheeler, 1915, 1920a,b; Bollet, 1992; Harkness, 1996; Kraut, 2003). The prisoners were fed a milk- and meat-free cornmeal diet, and six were felt to have developed cutaneous manifestations of pellagra within 6 months, although later studies suggested that some of the cutaneous manifestations (particularly the scrotal lesions) were those of riboflavin deficiency (Sebrell and Butler, 1938; Oden et al., 1939; Horwitt et al., 1949, 1956; Carpenter and Lewin, 1985).
Proponents of competing theories, particularly those championing an infectious etiology, challenged Goldberger's results (e.g., MacNeal, 1916). Intending to convert his critics, Goldberger performed a remarkable series of demonstrations. In 1916, he showed that pellagra could not be transmitted by injection of blood from pellagrous patients, by swabbing nasal and pharyngeal secretions onto healthy volunteers, by ingestion of capsules containing scabs of pellagrins' rashes, or by ingestion of capsules containing pellagrins' fecal material (Goldberger, 1916; Kraut, 2003). Goldberger used himself, his wife, and his colleagues as the subjects of these "filth parties." None of them contracted pellagra, but they did not convert all of their critics either (MacNeal, 1916; Kraut, 2003).
In the 1920s, Goldberger and colleagues determined that pellagra developed despite supplementation with minerals, known vitamins, and a liberal supply of protein of presumably good biological quality (Goldberger and Tanner, 1922, 1925; Goldberger et al., 1926; Goldberger, 1927). Goldberger and colleagues demonstrated that yeast extract, milk, and fresh lean beef are capable of preventing pellagra, whereas soy beans, cowpeas, butter, cod-liver oil, and canned tomatoes are not (Goldberger and Tanner, 1925; Terris, 1964). After raising the possibility that pellagra resulted from a deficiency or imbalance in dietary amino acids (Goldberger and Tanner, 1922), Goldberger ultimately concluded that pellagra is a dietary deficiency disease that could be cured by a pellagra-preventive factor or "P-P factor" (later, "vitamin P-P") that was lacking in corn, but that could be found in meat and milk.
Goldberger and colleagues also began experiments with dogs, after learning in 1922 that the dog disease named "black tongue" (sometimes spelled blacktongue) was the canine equivalent of human pellagra (Chittenden and Underhill, 1917; Goldberger et al., 1922, 1926, 1928; Goldberger and Wheeler, 1928). From 1922 to 1928, Goldberger and colleagues demonstrated that black tongue could be produced experimentally with a diet containing mainly corn meal (Goldberger et al., 1926, 1928; Goldberger and Wheeler, 1928). Further studies evaluated the black tongue-preventive properties of a wide variety of foods and correlated these results with the pellagra-preventive properties of the same foods.
By 1926, Goldberger and associates had established that a small amount of dried brewer's yeast could cure or prevent pellagra less expensively than fresh meat, milk, and vegetables (Goldberger and Tanner, 1925). A heat-stable component of yeast was shown to prevent the development of black tongue (Goldberger et al., 1928). Enterprising yeast manufacturers, in need of business during Prohibition, seized the opportunity and filled newspapers with propaganda: "only brewer's yeast will cure and prevent pellagra." A gullible populace of worried-well "pellagraphobiacs" responded by buying more and more yeast. After 1928, yeast was provided free in endemic areas by state and county health departments and the American Red Cross (Davies, 1964).
After Minot and Murphy showed that liver and liver extracts could cure pernicious anemia (1926–1928), Goldberger and Sebrell showed that liver extract could also prevent black tongue (Goldberger and Sebrell, 1930). In the early 1930s, many physicians tried liver extracts in the treatment of pellagra with equally positive results, but although liver extracts were more potent than yeast, they were prohibitively expensive and often ineffective parenterally in the dosages used (Sydenstricker, 1958; Davies, 1964).
Goldberger never identified the elusive P-P factor, because his research was cut short by his death in 1929, but the year he died the Committee on Vitamin B Nomenclature of the American Society of Biological Chemists recommended naming the P-P factor vitamin G in his honor (Seidell et al., 1929).
Endemic pellagra was a manifestation of complex social issues
Once the relationship between poverty, diet, and pellagra was established—primarily by Edgar Sydenstricker (1881–1936) and Goldberger—it became clear that pellagra was a manifestation of complex social issues
and could not be easily eradicated even with the availability of a medicine that was curative (Sydenstricker, 1915, 1933; Goldberger et al., 1918, 1920a,b,c,d; Goldberger and Sydenstricker, 1927; Kasius, 1974). Indeed, elimination of endemic pellagra would require improving the diet of a large portion of the rural Southern population, a behavioral and social change of tremendous magnitude and complexity. Nevertheless, with the impetus of further economic collapse of the Southern cotton monoculture and the manpower requirements of World War II, the needed political resolve developed and sufficient social reform occurred, so that eradication of endemic pellagra in the United States was accomplished by the 1950s.
The antebellum cotton- and tobacco-based agriculture of the South had initially been compromised by the lack of slaves after the Civil War, but soon produced further economic servitude in the form of sharecropping, in which the landowners got a quarter to a half of the crop. Financially strapped sharecroppers devoted all available land to the cash crops (i.e., cotton and tobacco), and lived on inadequate, unbalanced diets consisting largely of cheap corn meal. This entrenched sharecropping system would not be abandoned voluntarily by the powerful landowners or by some indebted sharecroppers, who had no ready alternatives.
Unlike the elimination of slavery, which required political resolve and a civil war, the social change needed to eliminate sharecropping was initiated in large part by a tiny beetle. The boll weevil crossed the Rio Grande from Mexico in the early 1890s and spread east and north to affect all the cotton-growing regions of the United States by 1920. The boll weevil contributed greatly to the increasing economic woes of Southern farmers during the 1920s, and consequently to a marked rise in pellagra incidence and mortality in the late 1920s. Boll weevil devastation of the cotton crops was a major reason for the subsequent development of crop diversification and crop rotation principles, including those developed and promoted by agricultural chemist George Washington Carver (1864–1943).
During the Depression, cotton was no longer an economically viable crop. The agricultural extension services encouraged farmers to reduce the acreage under cotton, keep livestock for personal food use, and diversify cultivated crops. Southern farmers learned to alternate soil-depleting cotton or tobacco crops with soil-enriching crops, such as peanuts, peas, soybeans, sweet potatoes, and pecans. From 1928 to 1933, total acreage under cotton or tobacco declined markedly and the production of vegetables and farm products for home use increased equally dramatically (Davies, 1964). With these changes, pellagra mortality declined precipitously in the early 1930s, and subsequently plateaued until an effective treatment (i.e., niacin) was discovered in 1937.
Niacin
In 1937, Conrad Arnold Elvehjem (1901–1962), an agricultural biochemist at the University of Wisconsin, finally isolated the P-P factor from active liver extracts, showed that the P-P factor is nicotinic acid (subsequently named niacin, for nicotinic acid vitamin), and demonstrated that nicotinic acid and nicotinic acid amide cure black tongue in dogs (Elvehjem et al., 1937, p. 938).
Soon after Elvehjem's initial report in 1937, further animal trials demonstrated that niacin cured pellagra in pigs and monkeys (Harris, 1938), and human clinical trials confirmed that niacin had dramatic therapeutic effects and rapidly cured pellagra in people, including the cutaneous and cognitive manifestations (Fouts et al., 1937; Smith et al., 1937; Elvehjem et al., 1938; Matthews, 1938; Schmidt and Sydenstricker, 1938; Spies, 1938; Sydenstricker et al., 1938; Spies et al., 1938a,b,c,d, 1939). In 1937, Paul Fouts of the Lilly Laboratory for Clinical Research, Indianapolis City Hospital, Indiana University, along with colleagues, noted that "All patients showed distinct improvement in general condition and mental attitude within 48 h of onset of therapy" (Fouts et al., 1937, p. 406). Also in 1937, David Smith and colleagues from Duke University noted a dramatic recovery in a 42-year-old pellagrin treated with intravenous nicotinic acid (Smith et al., 1937). In 1938, Tom Spies of the University of Cincinnati College of Medicine and the Cincinnati General Hospital, along with colleagues, reported that treatment with nicotinic acid had dramatic results in 60 pellagrins with acute or subacute psychosis (Spies et al., 1938d). Spies subsequently lauded the beneficial effects of nicotinic acid in the treatment of pellagra, but nevertheless recognized that simply replacing the niacin deficiency was not a long-term solution; instead, maintenance of an adequate diet was necessary (Spies et al., 1939).
Dietary modification and food fortification
Dietary modification was the first truly effective approach for the prevention of pellagra, but on a wide scale social reform was needed to ensure implementation. Initially dietary modification proved impractical because of economic conditions and dietary habits. Even with free distribution of yeast by the American Red Cross and by state and county health agencies, or with the availability of inexpensive nicotinic acid, pellagra persisted in endemic areas because of ignorance, inertia, poor food habits, and an "enormous backlog of chronic malnutrition" (Sydenstricker, 1958, p. 413).
Subsequently, the fortification of foodstuffs by vitamin supplementation was implemented largely to ensure adequate numbers of fit soldiers for World War II. Despite reasoned pleas for food fortification by academics in the late 1930s (e.g., Jolliffe, 1938; Cowgill, 1939) and patriotic campaigns in the early 1940s, initial efforts to enrich bread and flour were not very effective because of limited public interest and lack of economic incentive for millers and bakers (Wilder, 1956; Park et al., 2000, 2001; Backstrand, 2002; Bishai and Nalubola, 2002; Mensah et al., 2004). At the beginning of World War II, only 40% of the nation's manufactured flour was enriched, because small companies produced cheaper unenriched flour to compete with the larger manufacturers of enriched flour. In 1940, the Council of National Defense requested that the National Academy of Sciences establish a Food and Nutrition Board, which in 1941 established recommended intake levels for about a dozen nutrients, including niacin (Committee on Food and Nutrition, 1943).
Because a significant percentage of recruits were ineligible for military service as a result of nutritional deficiency disease (Jolliffe, 1938; Cowgill, 1939; Krupp, 1942), the US Army decided in 1942 to purchase only enriched flour, which encouraged many more manufacturers to produce enriched flour. Enrichment of bread with niacin, thiamin, riboflavin, and iron was subsequently mandated by Food Distribution Order No. 1, issued on December 29, 1942 and effective as of January 18, 1943. This action, combined with improved economic conditions as a result of wartime increases in employment, augmented the decline in pellagra morbidity and mortality that had begun by 1930—and finally resulted in the eradication of endemic pellagra in the United States.
The vast majority of sporadic cases in the United States and other developed countries are now seen in alcoholics, although rarely other patients develop the disease because of malabsorption, iatrogenic situations (e.g., total parenteral nutrition with inadequate supplemental niacin), or subsistence on bizarre diets owing to mental illness or extraordinary circumstances (Sydenstricker, 1958; Spivak and Jackson, 1977).
The niacin-tryptophan connection and niacin neurochemistry
By the 1940s, it became clear that the total niacin content of foods was not the only factor in the development of pellagra. Diets in some areas where pellagra was rare were found to contain less niacin than did corn diets where pellagra was common, and some foods that were known to be effective in the prevention of pellagra (e.g., milk) were found to have a low content of niacin (Goldsmith, 1958). Subsequent work showed that the amino acid tryptophan is converted to niacin in vivo, and that this conversion is not dependent on intestinal bacteria, because it occurs with parenterally administered tryptophan. Approximately 60 mg of dietary tryptophan are needed to produce 1 mg of niacin (Goldsmith et al., 1961).
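This 60:1 conversion ratio underlies the later convention of expressing total dietary niacin activity in "niacin equivalents" (NE), a formalization that postdates Goldsmith's measurements. As a worked illustration (the food values below are hypothetical, chosen only to show the arithmetic):

\[
\mathrm{NE\ (mg)} \;=\; \mathrm{preformed\ niacin\ (mg)} \;+\; \frac{\mathrm{tryptophan\ (mg)}}{60},
\qquad \text{e.g.,}\quad 10 + \frac{600}{60} = 20\ \mathrm{mg\ NE}.
\]

On this accounting, milk—low in preformed niacin but comparatively rich in tryptophan—supplies substantial niacin activity, consistent with its observed pellagra-preventive effect.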
Corn-based diets were found to cause pellagra largely because they had low concentrations of tryptophan, and because any endogenous niacin was bound in the form of nicotinyl esters, which are not hydrolyzed on digestion, so that the nicotinic acid is not bioavailable. Pellagra was produced experimentally in human subjects given diets low in niacin and tryptophan (Goldsmith, 1956), and tryptophan was found to be effective in treating pellagra (Vilter et al., 1949). Thus, in the 1940s and 1950s pellagra was reformulated as a deficiency disease due to inadequate niacin and its amino acid precursor tryptophan (Goldsmith, 1956, 1958; Goldsmith et al., 1956, 1961). Later studies showed that both riboflavin and pyridoxine are necessary for the metabolism of tryptophan, and that pyridoxine is specifically needed for the synthesis of niacin from tryptophan.
The general term "niacin" now includes nicotinic acid and its amide, i.e., nicotinamide, and any derivatives convertible in vivo to biologically active compounds. Two derivatives in particular—nicotinamide adenine dinucleotide (NAD) and nicotinamide adenine dinucleotide phosphate (NADP)—are essential to all cells, and are involved in multiple biochemical reactions, including glycolysis, pyruvate metabolism, and pentose, fatty acid, and sterol biosyntheses (Goldsmith, 1965). Previous terminology (e.g., coenzyme I or diphosphopyridine nucleotide for NAD, and coenzyme II or triphosphopyridine nucleotide for NADP) is no longer used.
The basis for the cognitive manifestations of pellagra has not been fully elucidated, but is probably related at least in part to disrupted brain serotonin metabolism (Krishnaswamy and Ramanamurthy, 1970). Serotonin is synthesized from tryptophan by a simple pathway involving sequential hydroxylation and decarboxylation. In pellagra, much of the limited available tryptophan is utilized to maintain nitrogen balance, and most of the remainder is metabolized in the liver to synthesize nicotinamide, thus greatly limiting the tryptophan available for serotonin synthesis. The most important factor regulating synthesis of serotonin in the brain is the availability of tryptophan, which can be further limited by concentrations of other large
neutral amino acids (e.g., leucine, isoleucine, valine, methionine, and phenylalanine) that compete for blood–brain barrier uptake through a single amino acid carrier mechanism. Low platelet levels of serotonin and low urinary excretion of 5-hydroxyindoleacetic acid (i.e., the main metabolite of serotonin) have been documented in pellagrous dementia (Krishnaswamy and Ramanamurthy, 1970).
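Schematically, the pathway and its metabolic competition can be summarized as follows (a simplified sketch; enzymes are named generically and cofactors are omitted):

\[
\text{tryptophan} \xrightarrow{\ \text{hydroxylation}\ } \text{5-hydroxytryptophan} \xrightarrow{\ \text{decarboxylation}\ } \text{serotonin} \longrightarrow \text{5-hydroxyindoleacetic acid}
\]

In pellagra the tryptophan entering this pathway is doubly constrained: hepatic diversion toward nicotinamide reduces the circulating pool, and competing large neutral amino acids restrict transport of what remains across the blood–brain barrier.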
FOLATE DEFICIENCY: NEURAL TUBE DEFECTS
Folate deficiency was initially recognized clinically as a macrocytic anemia in the 1920s, and was only clearly separated from pernicious anemia by the mid-20th century. When folate was finally isolated in the mid-1940s, it was shown to correct the macrocytic anemia associated with pernicious anemia, while the neurological manifestations progressed. Beginning in the 1960s, folate deficiency was increasingly recognized as the major cause of preventable neural tube defects. In the early 1990s, well-designed randomized trials established that folate supplementation could prevent neural tube defects. Subsequent studies have established genetic predispositions for neural tube defects in offspring in the form of gene polymorphisms for enzymes involved in folate-dependent homocysteine metabolism. These latter findings help to explain how the genotype of the mother, the genotype of the unborn child, and environmental factors (e.g., folate intake) can all impact on the risk of neural tube defects.
Isolation and synthesis of folic acid
Folate was identified as the active substance in brewer's yeast in the late 1930s. In 1941, Mitchell and colleagues isolated this factor from spinach leaves and named it folic acid, a derivative of folium, the Latin word for leaf (Mitchell et al., 1941; Hoffbrand and Weir, 2001). In 1945, Angier and colleagues reported the successful synthesis of folic acid (Angier et al., 1945), several years before the isolation of vitamin B12 (in 1948). Angier and colleagues showed that folic acid is composed of a pteridine ring, para-aminobenzoic acid, and glutamic acid, and they called it pteroylglutamic acid (Angier et al., 1945; Stokstad, 1979). Soon after the synthesis of folic acid was achieved in 1945, it was demonstrated to be effective in the treatment of macrocytic anemias that are generally refractory to treatment with refined liver preparations, including the macrocytic anemias of malnutrition, pregnancy, sprue, and celiac disease.
Naturally occurring folates were subsequently found to vary in composition from the synthetic pteroylglutamic acid, with, for example, multiple polyglutamate residues, additional single-carbon units attached to the nitrogen atoms, etc. (Hoffbrand and Weir, 2001). The term "folic acid" now refers to the synthetic compound, pteroylglutamic acid, which is not present in natural foods, while the term "folate" refers to the large family of natural or synthetic compounds with similar vitamin activity, including synthetic folic acid and the natural folates (Hoffbrand and Weir, 2001; Wald, 2001).
Risks of administering folic acid to patients with pernicious anemia
In 1939, M. M. Wintrobe reported that yeast or yeast extracts could produce a hematological response in patients with pernicious anemia (Wintrobe, 1939), a finding confirmed by others in the early 1940s (Vilter et al., 1945). Although administration of folic acid was subsequently found to be safe in normal people and those with various neurological disorders, and to be temporarily effective in correcting the anemia of Addisonian pernicious anemia (Moore et al., 1945; Vilter et al., 1945; Harvey et al., 1950; Weissberg et al., 1950), beginning around 1946 a number of reports appeared indicating that folic acid would not prevent the progression of the central nervous system dysfunction in patients with pernicious anemia, and several anecdotal reports suggested that administration of folic acid might accelerate neurological dysfunction or even precipitate abrupt neurological worsening in some cases (Vilter et al., 1946; Heinle and Welch, 1947; Vilter and Spies, 1947; Berk et al., 1948a; Bethel and Sturgis, 1948; Haden, 1948; Wagley, 1948; Wills, 1948; Dickinson, 1995).
Despite the concern at the time (and since), available data from historical cases and case series (although methodologically limited) do not support a difference in the rates of progression of neurological deterioration in patients with untreated pernicious anemia (i.e., without administration of vitamin B12) with and without administration of folic acid (Dickinson, 1995). The rate of progression of neurological manifestations in patients with untreated pernicious anemia is widely variable, and in some cases marked deterioration develops over periods of a few weeks (Dana, 1899a,b; Duckworth, 1900; Richmond and Williamson, 1905; Kennedy, 1913; Globus and Strauss, 1922; Dickinson, 1995).
Folate metabolism and the expanding role of folates in the pathogenesis of various diseases
In the 1950s and 1960s, the biochemical reactions involving folates were elucidated, and folates were found to be essential for transfer of single-carbon units in the conversion of the amino acid homocysteine to methionine (by the B12-dependent enzyme methionine synthase), in purine (adenine, guanine) and pyrimidine (thymine) biosynthesis (and thus in the biosynthesis of DNA and RNA), in DNA methylation, and in numerous other cellular reactions. The need for folate was found to increase with rapid tissue growth and cell division (at least in part because of the need for folate in DNA and RNA biosynthesis), as in hematopoiesis, epithelial growth and differentiation, spermatogenesis, pregnancy, and fetal development. Among the earliest clinical manifestations of folate deficiency are hematological changes, including hypersegmentation of neutrophils, production of megaloblasts in the bone marrow, and eventually development of macrocytic anemia. The increased rate of folate-dependent tissue growth and differentiation during pregnancy (McPartlin et al., 1993) increases dietary folate needs by about 0.2–0.3 mg per day (Czeizel, 1995); as a result, pregnant women with marginal folate levels, as in the cases studied by Wills in India in the 1930s (Wills and Mehta, 1930, 1931; Wills, 1931), were found to be susceptible to potentially fatal macrocytic anemias and to fetal malformations, particularly neural tube defects (Hibbard, 1964).
From the 1950s to the 1990s, the range of folate-responsive disorders expanded. Recognition that folic acid therapy enhances tumor growth led to the development of folate antagonists for anticancer therapy, including methotrexate. Studies of children with inborn errors of metabolism identified forms of homocystinuria with methylmalonic aciduria, impaired methionine synthase, and resulting defective remethylation of homocysteine; pathologic studies in such cases demonstrated marked vascular pathology, suggesting an association between vascular disease and hyperhomocysteinemia. Subsequent studies demonstrated abnormal methionine metabolism in significant proportions of patients with unexplained atherosclerotic cardiovascular disease, and later studies demonstrated elevations of serum homocysteine in patients with cerebrovascular and peripheral vascular disease.
Prevention of neural tube defects with folic acid
In approximately one in 500 to one in 1000 pregnancies, the neural tube fails to close properly by 28 days after conception, producing a neural tube defect—either with failure of closure of the cranial end, producing anencephaly or encephalocele, or with failure of closure of the caudal end, producing spina bifida or myelomeningocele. Neural tube defects are among the most common congenital malformations and are thought to have multifactorial causes, including a combination of both genetic and environmental factors.
In 1964, British obstetrician Brian Hibbard suggested an association between fetal neural tube defects and maternal deficiency or defective metabolism of folates (Hibbard, 1964). In 1976, Richard Smithells at the University of Leeds and colleagues demonstrated that women with megaloblastic anemia during pregnancy have a high frequency of neural tube defects in their offspring (Smithells et al., 1976). In 1980, Smithells and colleagues reported a non-randomized trial of multivitamin supplementation among women who had previously given birth to one or more infants affected with neural tube defects (Smithells et al., 1980): there was a 5% recurrence rate in the non-supplemented group compared with a 0.6% recurrence rate in the supplemented group.
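Taken at face value, the Smithells recurrence figures imply a protective efficacy of roughly 88% (a crude point estimate from the reported rates, ignoring the non-randomized design and any confidence limits):

\[
\text{relative risk reduction} \;=\; 1 - \frac{0.6\%}{5\%} \;=\; 1 - 0.12 \;=\; 0.88 \;\approx\; 88\%.
\]

Selection bias in who accepted supplementation could account for part of this apparent effect, which is why the randomized trials described below proved decisive.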
Additional observational studies and non-randomized clinical trials published during the 1980s and 1990s documented protective effects of higher folic acid intake, or of vitamin supplements containing folic acid, during the periconceptional period (i.e., from 1 month before pregnancy through the first trimester), both among women who had not previously had a pregnancy affected by a neural tube defect (occurrence studies) and among women who had a previous pregnancy affected by a neural tube defect (recurrence studies). These studies showed a wide range of estimated efficacy of folic acid supplementation against the occurrence of neural tube defects, but the summary estimate across the various studies indicated an overall 50% reduction in risk of neural tube defects (Wald, 1993).
The strongest evidence in support of periconceptional folic acid supplementation came from two large randomized trials published in the early 1990s (MRC Vitamin Study Research Group, 1991; Czeizel and Dudas, 1992; Wald, 1993; Czeizel et al., 1994; Czeizel, 1993a,b, 1995). The Medical Research Council study, under the direction of Nicholas Wald at St. Bartholomew's Hospital Medical College in London, was a multi-center, multinational, randomized, double-blind, controlled, recurrence prevention trial conducted in 33 centers in seven countries (MRC Vitamin Study Research Group, 1991; Wald, 1993). The MRC study found a 72% reduction in recurrence of neural tube defects with 4 mg of folic acid daily over the period from before conception through the first trimester among women with a previous neural tube defect-associated pregnancy (MRC Vitamin Study Research Group, 1991; Wald, 1993). The Hungarian study, conducted by Andrew Czeizel of the National Institute of Hygiene in Budapest, was a randomized, double-blind, controlled, occurrence prevention study with periconceptional supplementation with multivitamins,
including 0.8 mg of folic acid (Czeizel and Dudas, 1992; Czeizel et al., 1994; Czeizel, 1995). Among approximately 5000 women with confirmed pregnancy and an "informative offspring," maternal periconceptional folic acid supplementation produced a significant decrease in the first occurrence of neural tube defects compared to a placebo-like (i.e., trace element) control group.
A meta-analysis of data from these trials and a previous small (underpowered and not statistically significant) trial by Laurence and colleagues collectively indicated that periconceptional folate administration reduces both the occurrence and recurrence risks of neural tube defects by at least 70% (Laurence et al., 1981; MRC Vitamin Study Research Group, 1991; Wald, 1993). Subsequent studies have generally supported these findings and suggest that periconceptional multivitamin supplementation can significantly reduce the occurrence of other congenital abnormalities in addition to neural tube defects (Kirke et al., 1992; Czeizel, 1993a,b; Czeizel et al., 2004).
Data from the randomized controlled trials have been used to establish governmental recommendations concerning folic acid intake, and have also been used to establish health policy concerning vitamin fortification of foodstuffs (Honein et al., 2001). In 1991, the US Centers for Disease Control published a review of the evidence for the prevention of recurrent neural tube defects and recommended 4 mg of folic acid daily for women who had previously had an infant or fetus with a neural tube defect (Centers for Disease Control, 1991). In 1992, the US Public Health Service recommended that all women capable of becoming pregnant should consume 0.4 mg (400 µg) of folic acid daily (Centers for Disease Control and Prevention, 1992; Cornel and Erickson, 1997). Because naturally occurring folate is less readily absorbed than synthetic folic acid, in 1998 the Institute of Medicine recommended that women of childbearing age consume 0.4 mg daily from dietary supplements or fortified foods for the primary prevention of neural tube defects (Institute of Medicine, 1998).
Potential strategies for increasing folate levels among women are dietary modification, folic acid supplementation, and food fortification (Centers for Disease Control and Prevention, 1992; Wald and Bower, 1995; Czeizel, 2000; McNulty et al., 2000). Despite various education campaigns, the estimated dietary folate intake for US women averages only 0.2 mg daily, and it was considered impractical to have women systematically increase their intake of folate-rich foods (e.g., fruits, leafy green vegetables, and grains) sufficiently to raise daily folate intake to 0.4 mg daily (Centers for Disease Control and Prevention, 1999). Folic acid supplementation can also be effective, but vitamins are used consistently by less than a third of women of childbearing age, and the remainder do not consider taking vitamin supplements until after they discover that they are pregnant (McNulty et al., 2000). Unfortunately, neural tube defects develop in the fourth week post-conception, i.e., before a pregnancy is confirmed. Furthermore, about half of pregnancies are unplanned (Grimes, 1986; Forrest, 1994), and even women who plan their pregnancies are poorly compliant with folate supplementation (Clark and Fisk, 1994; Scott et al., 1994; Wild et al., 1997; McNulty et al., 2000). Therefore, an approach relying on supplementation will not prevent most of the folate-preventable cases of neural tube defects. Food fortification, in contrast, can cost-effectively increase folate levels across the population without requiring a change in behavior (Romano et al., 1995).
In 1996, the US Food and Drug Administration selected flour, corn meal, pasta, and rice for mandatory folic acid fortification beginning in January 1998, at a level of 140 µg per 100 g of cereal grain product (Food and Drug Administration, 1993a,b, 1996a,b). This was estimated to result in an average adult consumption of approximately 100 µg of folic acid daily from fortified cereal grain products, effectively ensuring that about half of reproductive-age women would receive the recommended 0.4 mg daily from all sources (Food and Drug Administration, 1993a,b; Romano et al., 1995). This level of fortification—deemed the best possible accommodation between concerns for the fortification of the target population of women of childbearing age and the safety of the much larger non-target population—was expected to prevent many but not all neural tube defects that might be prevented by sufficient maternal folic acid intake. Folic acid fortification was limited to relatively low levels because of a fear that folic acid would correct the hematological abnormality in patients with vitamin B12 deficiency, potentially delaying diagnosis and allowing progression of central and peripheral nervous system manifestations of vitamin B12 deficiency (e.g., Reynolds, 2002). Many have challenged the logic and ethics of this rationale and the resulting national fortification decisions (e.g., Wald and Bower, 1994), but levels of fortification remain modest. As a result, dietary modification and folic acid supplementation continue to be necessary and appropriate modes of intervention.
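The FDA's intake estimate is consistent with simple arithmetic: at 140 µg of folic acid per 100 g of grain product, an added intake of about 100 µg per day corresponds to average consumption of roughly 70 g of fortified grain products daily (a back-calculation offered for illustration, not a figure stated in the regulations):

\[
\frac{100\ \mu\mathrm{g/day}}{140\ \mu\mathrm{g}\,/\,100\ \mathrm{g}} \;\approx\; 71\ \mathrm{g/day\ of\ fortified\ grain\ product}.
\]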
Since 1996, folic acid fortification has produced a significant improvement in population folate status in the United States (Jacques et al., 1999; Lawrence et al., 1999; Honein et al., 2001; Erickson, 2002; Mathews et al., 2002; Rader, 2002; Centers for Disease Control and Prevention, 2004; Pfeiffer et al., 2005), and similar improvements have been observed in other countries that have adopted this strategy (Ray et al., 2000). In 1999, data from the Framingham Offspring Cohort showed that fortification of enriched grain products with folic acid was associated with a substantial improvement in folate status of the population (Jacques et al., 1999; Rader, 2002). Similar results were demonstrated in populations enrolled in large managed care plans (Lawrence et al., 1999), and in representative samples of women participating in the National Health and Nutrition Examination Survey (NHANES) (Centers for Disease Control and Prevention, 2000; Pfeiffer et al., 2005).
By 2001, findings using birth certificate data for live births in 45 states and the District of Columbia between 1990 and 1999 suggested that a decline of approximately 20% in the prevalence of neural tube defects at birth followed fortification of the US food supply with folic acid (Honein et al., 2001). A later analysis by the Centers for Disease Control and Prevention (2004) suggested a 27% decline in the average annual proportion of pregnancies affected by neural tube defects after fortification (i.e., 1999–2000 compared with 1995–1996).
Gene polymorphisms for enzymes involved in folate-dependent homocysteine metabolism
In the 1990s, several studies demonstrated hyperhomocysteinemia in mothers of children with neural tube defects, despite the absence of folate or vitamin B12 deficiency in these mothers, while other studies identified hyperhomocysteinemia in children with spina bifida, despite the absence of folate or vitamin B12 deficiency in these children. Because of this, several groups examined enzymes involved in homocysteine metabolism for potential mutations or polymorphisms linked to an increased risk of neural tube defects, suspecting that disturbed homocysteine metabolism in either mothers or their offspring could cause neural tube defects.
In 1995, Nathalie Van der Put and colleagues reported that a common thermolabile variant (i.e., 677C→T) in the 5,10-methylenetetrahydrofolate reductase gene among mothers is associated with decreased function of the enzyme and with increased risk of neural tube defects in their offspring (Van der Put et al., 1995, 1996, 1997, 2001). Methylenetetrahydrofolate reductase catalyzes the formation of 5-methyltetrahydrofolate, the biologically active form of folate needed for the remethylation of homocysteine to methionine, so either inadequate folate, a defective methylenetetrahydrofolate reductase enzyme, or a combination of these increases plasma homocysteine concentrations. Among folate-deficient people, homozygotes for the 677C→T mutation (i.e., the TT genotype with two mutant T alleles) have higher plasma homocysteine concentrations than those with the CC genotype.
Other studies have noted the same polymorphism in children with hyperhomocysteinemia and spina bifida (Bjørke-Monsen et al., 1997; Shaw et al., 1998; Shields et al., 1999), and Denis Shields and colleagues found a much stronger relationship between the genotype and phenotype of the child than between the genotype of the mother and the phenotype of the child (Shields et al., 1999). When the genotypes of both the mother and the child were considered, the 677C→T mutation accounted for at most 25% of the observed protective effect of folate (Van der Put et al., 1995), suggesting that other defective enzymes in either folate metabolism or folate transport may also be involved (Van der Put et al., 1998; Brody et al., 2002). Subsequent studies have demonstrated that elevated maternal homocysteine concentrations in women with such polymorphisms can be lowered by supplemental folate intake, even in women with normal folate levels to begin with (Kang et al., 1988; Nelen et al., 1998; Brouwer et al., 1999; Fohr et al., 2002).
Although methylenetetrahydrofolate reductase gene polymorphisms are only moderate risk factors for neural tube defects (Van der Put et al., 1997), at a population level they contribute to a significant portion of the observed burden of neural tube defects. They also help to explain how the genotype of the mother, the genotype of the unborn child, and environmental factors (e.g., folate intake) can all impact on the risk of neural tube defects.
VITAMIN B12 DEFICIENCY: PERNICIOUS ANEMIA AND SUBACUTE COMBINED DEGENERATION OF THE SPINAL CORD

Pernicious anemia was recognized clinically in the mid-19th century, but the associated neurological manifestations—particularly the myelopathy now known as subacute combined degeneration of the spinal cord—were not recognized clinically and linked with pernicious anemia until the end of the 19th century. In the 1920s, Minot and Murphy showed that large quantities of ingested liver could be used to effectively treat pernicious anemia. Cyanocobalamin (vitamin B12) was finally isolated by the mid-20th century, and this greatly improved the treatment of pernicious anemia and the associated neurological manifestations.
Pernicious anemia
Originally at a meeting of the South London Medical Society in 1849, and subsequently in a monograph in 1855, Thomas Addison (1793–1860) at Guy's Hospital in London described several cases with "idiopathic"
anemia characterized by pallor, weakness, and progressively worsening health leading to death (Addison, 1855/1942). Later this condition was called Addisonian anemia, at least until Biermer of Zurich named it perniciöse Anämie (i.e., pernicious or fatal anemia) when describing 15 cases of severe anemia (of mixed etiologies) in 1872 (Biermer, 1872; Haden, 1948).
In 1870, Fenwick in London associated stomach atrophy with this form of anemia and demonstrated that stomach mucosa from the fresh cadaver of an affected patient could not digest boiled egg white even with prolonged incubation, whereas mucosa from a control stomach could (Fenwick, 1870; Mackay and Whittingham, 1968; Florkin, 1973; Okuda, 1999). Subsequently, Cahn and Mering showed that a patient with pernicious anemia had no hydrochloric acid in the stomach contents, a finding later demonstrated to be pervasive in this disorder (Cahn and Mering, 1886; Faber and Block, 1900; Hurst and Bell, 1922; Levine and Ladd, 1924). Soon it was recognized that achlorhydria precedes the development of anemia (Hurst and Bell, 1922).
It was not until the 1850s – after Addison's original communication – that the first red cell counts were done by Vierordt, and that hemoglobin was discovered by Funke (Funke, 1851; Vierordt, 1851; Haden, 1948). In 1875, William Pepper (1843–1898) of Philadelphia noted the extreme hyperplasia of the bone marrow in patients with pernicious anemia (Pepper, 1875). In 1880, Paul Ehrlich (1854–1915), using aniline dyes developed by his cousin Carl Weigert (1845–1904), identified large erythroid precursor cells that he called "megaloblasts" in stained blood smears of patients with pernicious anemia (Ehrlich, 1880). Subsequent hematologists noted characteristics of megaloblastic anemia in the peripheral blood (i.e., macrocytes, poikilocytes, and hypersegmented neutrophils) and bone marrow (e.g., megaloblasts, metamyelocytes, and megakaryocytes) (Billings, 1902). Later, Francis W. Peabody of the Thorndike Memorial Laboratory in Boston hypothesized that this macrocytic anemia was due to maturational arrest of erythroblasts in the bone marrow (Peabody, 1927; Florkin, 1973; Kass, 1978).
Subacute combined degeneration of the spinal cord
In 1884, Lichtenstein described cases of pernicious anemia with neurologic manifestations felt to be suggestive of tabes dorsalis (Lichtenstein, 1884; Haden, 1948). Following Lichtheim's report in 1887 of progressive myelopathy associated with pernicious anemia in three cases, two with autopsy, similar cases were reported by a number of authors over the next several decades (Dana, 1891a,b, 1899a,b, 1908; Putnam, 1891; Minnich, 1893; Taylor, 1895; Lloyd, 1896; Russell, 1898; Russell et al., 1900; Putnam and Taylor, 1901; Billings, 1902; Bramwell, 1915; Woltman, 1918; Hurst and Bell, 1922; Weil and Davison, 1929; Greenfield and O'Flynn, 1933).
Clinicians of the period struggled to distinguish the clinical and pathologic features of this condition from those of other recognized causes of progressive myelopathy (Dana, 1891a,b), particularly those that could affect multiple white matter tracts of the spinal cord, i.e., what New York neurologist Charles Loomis Dana (1852–1935) had referred to categorically as "combined sclerosis or mixed-system myelitis" (Dana, 1887, p. 1). In the absence of a valid diagnostic test (i.e., before recognition of vitamin B12 deficiency as the cause), early cases were particularly hard to distinguish from other conditions, and many of the early reports included cases with other etiologies (Pant et al., 1968). Nevertheless, subacute combined degeneration of the spinal cord was clearly linked with pernicious anemia by about 1900.
In 1891, Boston neurologist James J. Putnam (1846–1918) of Harvard Medical School and the Massachusetts General Hospital reported eight cases, four with autopsy. The neurological presentation and course in these early cases typically included progressive paresthesias, incoordination, impaired sensation and position sense in the arms and legs, preserved and even exaggerated muscle stretch reflexes and ankle clonus, and weakness progressing to terminal paraplegia (Putnam, 1891).
Early neuropathological studies by Putnam and Dana, along with later neuropathological studies, demonstrated that the degeneration of white matter tracts was initially uneven and patchy, with small foci enlarging and coalescing to involve entire white matter columns, and with the most severe involvement generally affecting the posterior columns of the lower cervical and thoracic cord, extending in some cases to the medulla (Putnam, 1891; Dana, 1891a,b; Pant et al., 1968). The degenerative process affects most dramatically the myelin sheaths (although axons are also damaged), with marked swelling of myelin sheaths giving a vacuolated "sieve-like" appearance to stained spinal cord sections, evident even in some of the earliest published pathological illustrations (Putnam, 1891; Dana, 1891a,b; Pant et al., 1968).
Sensory symptoms and signs are early and prominent features of the disease. Although Putnam initially remarked that "the nerve roots were more or less diseased" (Putnam, 1891, p. 72), by 1901 he and Taylor reported on a "common freedom from degeneration of nerve roots, both motor and sensory, and peripheral nerves" (Putnam and Taylor, 1901, p. 92). Even if in individual cases "somewhat imperfectly staining bundles [within the nerve roots] may be made out . . . [the] dorsal roots at no point show the changes which are an essential part of the pathological process in tabes [dorsalis]" (Putnam and Taylor, 1901, p. 76). Severe cases showed some involvement of the posterior roots (Putnam, 1891), but not enough to explain the severe degeneration of the posterior columns as a secondary phenomenon (Pant et al., 1968).
Involvement of peripheral nerves was suggested clinically (Woltman, 1918), but initial pathological reports of peripheral nerve involvement with subacute combined degeneration of the cord were scanty and inconsistent (Putnam, 1891; Putnam and Taylor, 1901). Peripheral nerve involvement was recognized pathologically by the 1930s and 1940s (Greenfield and Carmichael, 1935; van der Scheer and Kock, 1938; Foster, 1945; Ungley, 1949), with later studies also demonstrating slowed peripheral nerve conduction velocities (Mayer, 1965).
Clinical manifestations of optic atrophy were elaborated beginning in the 1930s by a number of authors (Courville and Nielson, 1938; Kampmeier and Jones, 1938; Turner, 1940), with degeneration of the papillomacular bundle previously demonstrated pathologically by Bickel (1914) and more extensive degeneration of the optic nerves anterior to the optic chiasm demonstrated later by Adams and Kubik (1944).
The association of subacute combined degeneration of the spinal cord with anemia was recognized by many of the early authors describing this form of myelopathy, although only some emphasized a strong or universal relationship with pernicious anemia specifically (Lichtheim, 1887; Billings, 1902; Bramwell, 1915; Woltman, 1918; Hurst and Bell, 1922; Weil and Davison, 1929). By the 1920s, subacute combined degeneration of the spinal cord was clearly associated with both pernicious anemia and gastric achlorhydria (Hurst and Bell, 1922; Greenfield and O'Flynn, 1933).
In 1900, Russell and colleagues suggested the name "subacute combined degeneration of the spinal cord" (Russell et al., 1900, p. 40). Other names proposed included Putnam's earlier "system sclerosis, associated with diffuse collateral degeneration" and "primary combined sclerosis" (Putnam, 1891), and later "diffuse degeneration of the spinal cord" (Putnam and Taylor, 1901); Dana's earlier "sub-acute combined sclerosis of the spinal cord" (Dana, 1899a) and "subacute ataxic paralysis and combined sclerosis" (Dana, 1899b); "combined sclerosis of Lichtheim-Putnam-Dana type"; and "combined systems disease" (Pant et al., 1968). Although Putnam and Taylor (1901) objected to this terminology, after about 1910 the term most often used was "subacute combined degeneration of the spinal cord" (Bramwell, 1915; Hurst and Bell, 1922; Weil and Davison, 1929; Greenfield and O'Flynn, 1933; Weir and Gatenby, 1963; Robertson et al., 1971).
Toxic and infectious theories for pernicious anemia and subacute combined degeneration
At the beginning of the 20th century, pernicious anemia and the associated subacute combined degeneration of the spinal cord were considered by many investigators to result from infectious or toxic causes (Dana, 1899a,b; Russell et al., 1900; Hunter, 1901; Billings, 1902; Bramwell, 1915; Hurst and Bell, 1922; Weil and Davison, 1929; Weiss, 1991). As early as 1901, Hunter suggested that pernicious anemia is the result of infection and release of an exotoxin. He attributed the glossitis to a specific microorganism, which when swallowed produced gastritis and ultimately gastric atrophy, and he further attributed the anemia to the release of a toxin in the intestinal tract, which when absorbed into the portal circulation caused hemolysis (Hunter, 1901; Haden, 1948).
In 1922, Hurst and Bell similarly proposed that pernicious anemia resulted from "oral sepsis, absence of free hydrochloric acid from the stomach contents, and consequent intestinal infection and intoxication" (Hurst and Bell, 1922, p. 266). In support of this theory, they noted the constant association of pernicious anemia with gastric achlorhydria, evidence that the gastric achlorhydria precedes the development of anemia, the frequent presence of gastrointestinal symptoms (e.g., diarrhea, "bilious attacks" with occasional vomiting, "flatulent dyspepsia," etc.), the fact that gastrointestinal symptoms precede the development of neurological manifestations, the frequent finding of bacteria in the sockets of infected teeth, and the finding of identical bacteria colonizing the atrophic stomach. Hurst and Bell's bacterial toxin theory led to specific (albeit ineffective) treatments, in an attempt to rid the body of the causative infection and the secondary generation of the putative toxins, including extraction of all teeth; tonsillectomy; ingestion of large doses of hydrochloric acid and milk soured by active lactic acid bacilli; administration of a vaccine prepared from streptococci isolated from extracted teeth, duodenal contents, or feces; administration of arsenic; blood transfusion; etc. (Hurst and Bell, 1922).
In 1926, Minot and Murphy noted—at the time of their initial presentation of the first truly effective therapy for pernicious anemia (i.e., oral liver)—that the bacterial toxin theory was widely held. As they predicted, proponents of toxic or infectious theories were not easily swayed by studies demonstrating improvement with certain diets, even if others at the time felt
that "the prompt and regular beneficial effect of liver feeding on erythropoiesis seemed unlikely to be the result of elimination of bacterial infection in the bowel" (Castle, 1974, p. 25), and even if intestinal flora were unchanged after such patients were returned to (relative) health by regularly consuming a diet with large amounts of liver (Castle, 1929). Many investigators nevertheless continued to consider the presence of pathologic bacterial colonization as persuasive evidence in support of a bacterial toxin theory, without adequately considering that such colonization could be a non-causal association.
Microbial exotoxin theories gained further support in the early 1930s with the discovery in Finland of a macrocytic anemia associated with infestation of the small bowel by a tapeworm, particularly when it was shown that the anemia promptly improved after elimination of the parasite (Birkeland, 1932; Castle, 1974). Even in the 1950s, after the identification of vitamin B12, toxic and infectious theories of the pathogenesis of pernicious anemia were still debated (Crosby, 1983). However, the postulates of Robert Koch (1843–1910) for establishing an infectious cause for pernicious anemia were never fulfilled for any of the putative organisms, nor was a systematic experimental effort ever undertaken with that goal in mind: specifically, the investigators never showed that the putative organisms isolated from people would cause pernicious anemia in susceptible animals, nor did they demonstrate that the organisms could then be recovered from such animals and re-grown in pure culture.
Minot and Murphy and the liver therapy for pernicious anemia
During the first quarter of the 20th century, a wide variety of therapies were employed in the treatment of pernicious anemia – hydrochloric acid, iron, arsenic, removal of sources of infection including all teeth, drainage of the intestinal tract, splenectomy, blood transfusion, and special diets (Haden, 1948). With the exception of transfusion, which could prolong life somewhat, these therapies were largely ineffective, and life expectancy was less than 2 years.
Beginning around 1917, George H. Whipple (1878–1976) and colleagues, first in San Francisco and later in Rochester, New York, established that certain supplemental foods, such as spinach, beef muscle, and particularly liver, "had a powerful effect upon hemoglobin regeneration" (Whipple, 1934/1965, p. 347; Weiss, 1991). Although Whipple did not test or apply his ideas in people, he did suggest in 1922 that pernicious anemia is a disease "in which all pigment factors were present in the body in large excess but with a scarcity of stroma-building material or an abnormality of stroma-building cells" (Whipple, 1922, 1934/1965, p. 348). By 1925, Whipple with Frieda Robscheit-Robbins (1893–1973) demonstrated that liver increased the amount of hemoglobin regenerated in dogs maintained on a basal diet and kept chronically anemic by weekly bleedings accomplished by aspiration from the jugular vein (Robscheit-Robbins and Whipple, 1925). Although not immediately recognized, the magnitude of assimilation of inorganic iron was the most important factor in hemoglobin production in this experimental paradigm.
Prior to 1925, dietary supplementation with small amounts of liver had been tried in a small number of cases of pernicious anemia without marked or consistent results. But these early investigators had neither systematically nor persistently fed these often anorexic patients large quantities of liver (e.g., a half pound or more daily), nor had they carefully quantified the amounts consumed or the results (Weiss, 1991). In 1925, George R. Minot (1885–1950), at the Peter Bent Brigham Hospital in Boston, and William P. Murphy (1892–1987), at the Collis P. Huntington Memorial Hospital of Harvard University, decided to hospitalize a group of patients with pernicious anemia to systematically assess liver as a treatment.
By 1926, Minot and Murphy reported clinical and hematological improvement in 45 patients with pernicious anemia treated with a dietary regimen that incorporated large quantities of liver—"From 120 to 240 Gm., and even sometimes more" (Minot and Murphy, 1926, p. 472). Clinically the patients improved, often dramatically so, in conjunction with improvements in hematological indices. This clinical improvement could be sustained for many years, well beyond the previous life expectancy of such patients (Murphy, 1934/1965). A major component of the hematological response in such patients was, in retrospect, likely due to folate, which the subjects could readily absorb, rather than to vitamin B12, which the subjects with pernicious anemia could at best only marginally absorb (Chanarin, 1991). Importantly, patients with (relatively mild) neurological dysfunction also improved, for example with significant improvements in gait, suggesting in retrospect that sufficient vitamin B12 was also absorbed with this regimen. However, patients with more severe neurological dysfunction showed at best slow and limited improvement (Minot and Murphy, 1926).
Using the reticulocyte response as an index of erythropoiesis, Minot and Murphy were able to recognize liver as the essential component of the regimen. With Edwin J. Cohn, a physical chemist in the Laboratories of Physiology at Harvard Medical School (later recognized for his fractionation of plasma proteins), they then tried unsuccessfully "to isolate the active principle" (Minot, 1934/1965). Nevertheless, they did demonstrate that potent extracts could be given parenterally in very small quantities (Minot, 1934/1965; Murphy, 1934/1965).
Whipple, Minot, and Murphy were subsequently recognized jointly with the 1934 Nobel Prize in Physiology or Medicine "for their discoveries concerning liver therapy in cases of anemia" (Minot, 1934/1965; Murphy, 1934/1965; Whipple, 1934/1965).
Castle's intrinsic and extrinsic factors—linking gastric and hematological abnormalities
In 1926, after Minot and Murphy's success with the liver therapy for pernicious anemia, William B. Castle (1897–1990)—then an assistant resident at the Thorndike Memorial Laboratory of Boston City Hospital, which had recently come under the direction of Minot as successor to Francis Peabody—decided to pursue his belief that gastric achlorhydria ("achylia gastrica") was etiologically linked to pernicious anemia (Castle, 1966, 1974; Kass, 1978; Crosby, 1983; Herbert, 1984; Jandl, 1995). Castle, in considering why normal individuals do not have to eat large amounts of liver every day to maintain a normal blood count, noted that: (1) gastric achlorhydria precedes the other clinical manifestations of pernicious anemia (including the hematological and neurological manifestations); and (2) even when the blood of a patient with pernicious anemia is returned to normal with liver feeding, there is "a total lack of any amelioration in the secretory incapacity of the stomach" (Castle, 1929, 1974, p. 5; Johansen, 1929). Based on these observations, Castle proposed that "a virtual dietary deficiency might be produced in the presence of a diet entirely adequate for a normal individual, by the notable defect in the process of gastric digestion necessarily imposed by the absence of functional gastric juice . . ." (Castle, 1929, 1974, p. 6). Castle suggested first that an essential step of gastric digestion was impaired, thereby disrupting absorption of an essential dietary factor, and second that this defective process might be circumvented by utilizing gastric juices or some other component of the gastric digestive process from individuals with normal stomachs.
To test this idea and its subsequent elaborations, Castle devised and implemented an ingenious series of experiments (Castle, 1929; Castle and Townsend, 1929; Castle et al., 1930; Castle and Ham, 1936). Castle's first patient was a 60-year-old woman with untreated pernicious anemia who was first given two rare beef patties (200 g) daily for 10 days, without a reticulocyte response (Castle, 1929). Castle then fed himself raw beef patties daily instead of breakfast and an hour later regurgitated his semi-liquid stomach contents using pharyngeal stimulation (Castle, 1929; Kass, 1978; Weiss, 1988). This mixture was treated with hydrochloric acid and incubated for 6 to 30 h, then filtered and neutralized, and immediately introduced into the stomach of the patient using a Rehfuss tube. Within 5 days the subject's reticulocyte count began to increase, peaking at 10 days, with a subsequent increase in the red blood cell count of over 1 million red cells per cubic millimeter within 30 days (Castle, 1929). Similar responses were observed in seven of nine additional patients, but two patients remained refractory (apparently because they lived in another city and the predigested material had to be transported to them, necessitating a prolonged delay after neutralization with sodium hydroxide). Although not stated in the original paper, Castle later acknowledged being the source of the normal human gastric juice (Weiss, 1988, p. 157).
Castle concluded "that in contrast to the conditions within the stomach of the pernicious anemia patient, there is found within the normal stomach during the digestion of beef muscle some substance capable of promptly and markedly relieving the anemia of these patients" (Castle, 1929, 1974, p. 13). He and his coworkers further demonstrated that this response is not due to gastric juice alone (Castle and Townsend, 1929), and that contact between normal gastric intrinsic factor and dietary extrinsic factor is necessary for an erythropoietic response (Castle and Ham, 1936).
Castle subsequently labeled the essential substance secreted by a normal stomach as "intrinsic factor," and the substance present in food as "extrinsic factor" (Castle et al., 1930). Castle and colleagues showed that intrinsic factor is a thermolabile substance present in gastric juice, but not present in saliva or duodenal contents free of gastric juice (Castle et al., 1930). Although Castle did not clearly identify the role of intrinsic factor as an intestinal transport vehicle for extrinsic factor (later identified as vitamin B12), he did establish that: (1) the intrinsic and extrinsic factors have to interact for effective erythropoiesis in patients with pernicious anemia; and (2) nutritional deficiencies could result from malabsorption or impaired metabolism in addition to inadequate intake.
Around this time, desiccated, defatted whole hog stomach as a replacement source of intrinsic factor was shown to be modestly successful in clinical trials. Later studies demonstrated that intrinsic factor is a glycoprotein with a molecular weight of 60 kD secreted by gastric parietal cells (Hoedemacher et al., 1964).
Isolation, structure, synthesis, and biochemical reactions of vitamin B12
Over two decades, from the late 1920s until the late 1940s, increasingly potent liver extracts were manufactured that could be given either intramuscularly or intravenously (Castle, 1966).
Progress was slow in isolating the active substance in these extracts, in part because of the initial need for bioassays using untreated cases of pernicious anemia, and in part because of inadequate separation methods (Castle, 1966; Okuda, 1999). In 1947, following the development of microbiological assay techniques (Shorb, 1947) and improved chromatographic techniques, vitamin B12 was finally isolated as pink crystals of cyanocobalamin (containing cobalt, nitrogen, and phosphorus) by Karl Folkers and colleagues at Merck and Company in the United States, and nearly simultaneously by E. Lester Smith at Glaxo Laboratories in England (Rickes et al., 1948; Smith, 1948; Smith and Parker, 1948; Okuda, 1999). Shortly thereafter, Castle and colleagues identified vitamin B12 as Castle's extrinsic factor, but found that oral vitamin B12 even with a source of intrinsic factor was still not as potent as parenteral vitamin B12 (Berk et al., 1948b).
By 1955, Dorothy Crowfoot Hodgkin (1910–1994) of Oxford University had determined the molecular structure of cyanocobalamin using computer-assisted x-ray crystallography, work for which she received the 1964 Nobel Prize in Chemistry (Hodgkin et al., 1955; Hodgkin, 1964/1972). The complex structure of vitamin B12 included a single cobalt atom at the center of a tetrapyrrole or "corrin" macro-ring structure. A complete chemical synthesis of vitamin B12 was finally achieved in the early 1970s by an international consortium of chemists. Subsequent biochemical work demonstrated that only two enzyme systems require forms of vitamin B12 in man: adenosylcobalamin in the conversion of methylmalonyl coenzyme A to succinyl coenzyme A by methylmalonyl-coenzyme A mutase, and methylcobalamin in the conversion of homocysteine to methionine by methionine synthase (Sakami and Welsh, 1950; Flavin and Ochoa, 1957).
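These two reactions may be summarized schematically as follows (a sketch in LaTeX notation; the methyl-group donor for methionine synthase, 5-methyltetrahydrofolate, is supplied here for completeness and is not named in the passage above):

\[
\text{methylmalonyl-CoA} \xrightarrow[\text{adenosylcobalamin}]{\text{methylmalonyl-CoA mutase}} \text{succinyl-CoA}
\]
\[
\text{homocysteine} + 5\text{-methyl-THF} \xrightarrow[\text{methylcobalamin}]{\text{methionine synthase}} \text{methionine} + \text{THF}
\]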
Selected clinically important studies after isolation of vitamin B12
Shortly after the isolation of vitamin B12, Randolph West (1948) demonstrated the efficacy of injected vitamin B12 in pernicious anemia, and West and Reisner (1949) were among the first to assess the response of the neurologic manifestations of subacute combined degeneration of the cord to parenteral vitamin B12 in patients with pernicious anemia. In five patients with spinal cord lesions, West and Reisner observed "varying degrees of improvement" and noted that "none has become worse," and that "All of these patients with spinal cord lesions are walking readily" after treatment. Changes noted included improvements in ambulation, improvements in position sense with improvement or normalization of a previously positive Romberg sign, and in some cases resolution of abnormal muscle stretch reflexes or cutaneous reflexes, but generally no evident improvement in vibratory sense abnormalities.
In 1957, Dorscherholmen and Hagen demonstrated that there are two mechanisms involved in B12 absorption (also see Schilling, 1958). With physiologic (i.e., 1–2 μg) doses of oral radioactive vitamin B12, radioactivity appeared in plasma within several hours and reached a peak at 8–12 h, a process dependent upon intrinsic factor; but if much larger oral doses were administered, the radioactivity appeared in plasma much sooner as a result of passive diffusion, independent of the presence or absence of intrinsic factor. The passive diffusion mechanism has subsequently been utilized clinically for the treatment of pernicious anemia with large (1000 μg) daily oral doses of vitamin B12.
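A rough calculation illustrates why such large oral doses suffice (assuming the commonly cited estimate, not given in the sources above, that only about 1% of a large oral dose is absorbed by passive diffusion):

\[
1000\ \mu\text{g/day} \times 0.01 \approx 10\ \mu\text{g/day},
\]

which comfortably exceeds the few micrograms per day the body requires, even in the complete absence of functional intrinsic factor.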
In 1957, Booth and Mollin showed that patients in whom the ileum had been resected did not absorb vitamin B12 well, and then that radioactive vitamin B12 administered orally prior to laparotomy was localized to the terminal ileum several hours later using a Geiger counter during surgery (Booth and Mollin, 1957, 1959). Additional studies showed that when vitamin B12 is released from foods by peptic digestion it is bound to intrinsic factor, affording partial protection against gut microorganisms and parasites during transport through the gut to the terminal ileum, where the complex binds to microvilli of the intestinal epithelial cells. The vitamin B12 is released into the interior of these cells, and then enters the bloodstream, where it is transported by specific serum proteins (particularly transcobalamin II) to target cells.
In 1957 and 1958, Michael Schwartz and colleagues in Copenhagen observed that the therapeutic efficacy of hog intrinsic factor preparations declined with use in pernicious anemia patients (Schwartz et al., 1957, 1958). Shortly thereafter, and through the 1960s, several lines of evidence converged in support of an autoimmune basis for pernicious anemia: (1) corticosteroids improved B12 absorption and reduced anemia; (2) gastric and serum autoantibodies to intrinsic factor and to gastric parietal cells were found in the majority of patients; and (3) other autoimmune diseases (e.g., Hashimoto's thyroiditis, insulin-dependent diabetes mellitus, Addison's disease, and vitiligo) were common in such patients (Mackay and Whittingham, 1968; Okuda, 1999; Whittingham and Mackay, 2005).
Pernicious anemia is now understood to begin with an autoimmune gastritis in which parietal cell antibodies produce atrophic gastritis, with a resultant decline in intrinsic factor production over decades. Although the recognition of antibodies to hog intrinsic factor led to discovery of the autoimmune nature of pernicious anemia, these antibodies apparently do not decrease the effectiveness of hog intrinsic factor in promoting the absorption of vitamin B12, and the waning efficacy of hog intrinsic factor was therefore attributed to "local effects in the intestinal tract" (Castle, 1966). In 1988, the principal target of these antibodies was identified by Karlsson and colleagues as the acid-producing H+/K+-adenosine triphosphatase (ATPase) in the cell membrane of gastric parietal cells (Karlsson et al., 1988).
REFERENCES
Adams RD, Kubik CS (1944). Subacute degeneration of the brain in pernicious anemia. New Engl J Med 231: 1–9.
Addison T (1855/1942). On the constitutional and local effects of disease of the supra-renal capsules. In: Clendening L (Ed.), Source Book of Medical History. Dover Publications Inc., New York.
Alexander L (1941). Wernicke's disease: Identity of lesions produced experimentally by B1 avitaminosis in pigeons with hemorrhagic polioencephalitis occurring in chronic alcoholism in man. Am J Pathol 16: 61–69.
Alexander L, Pijoan M, Myerson A (1938). Beriberi and scurvy. Trans Am Neurol Assoc 64: 135–139.
Anderson JF (1911). An attempt to infect the rhesus monkey with blood and spinal fluid from pellagrins. Public Health Rep 30: 1003–1004.
Angier RB, Boothe JH, Hutchings BL et al. (1945). Synthesis of a compound identical with the L. casei factor isolated from liver. Science 102: 227–228.
Backstrand JR (2002). The history and future of food fortification in the United States: A public health perspective. Nutr Rev 6: 15–26.
Banga I, Ochoa S, Peters RA (1939). Pyruvate oxidation in brain. VI. The active form of vitamin B1 and the role of C4 dicarboxylic acids. Biochem J 33: 1109–1121.
Bass CC (1911). Pellagrous symptoms produced experimentally in fowls by feeding maize spoiled by inoculation with a specific bacterium. JAMA 57: 1684–1685.
Bender L, Schilder P (1933). Encephalopathia alcoholica; Polioencephalitis haemorrhagica superior of Wernicke. Arch Neurol Psychiatry 29: 990–1053.
Berk L, Bauer JL, Castle WB (1948a). Folic acid: A report of 12 patients treated with synthetic pteroylglutamic acid, with comments on the current literature. S Afr Med J 22: 604–611.
Berk L, Castle WB, Welch AD et al. (1948b). Observations on the etiologic relationship of achylia gastrica to pernicious anemia. X. Activity of vitamin B12 as food (extrinsic) factor. New Engl J Med 239: 911–913.
Bethel FH, Sturgis CC (1948). The relation of therapy in pernicious anemia to changes in the nervous system. Early and late results in a series of cases observed for periods of not less than ten years, and early results of treatment with folic acid. Blood 3: 57–67.
Bickel H (1914). Funikulare Myelitis mit bulbaren und poly-