Malaria is one of the most prevalent and dangerous diseases known to man. It has existed for centuries and affects myriad people in tropical regions. Even today, with newly developed treatments for many tropical diseases, over 10% of the people infected with malaria each year who do not receive proper treatment die. In Africa alone, over 1 million children die each year because of malaria, and new cases are reported frequently. Yet the protozoan that causes malaria has existed since man came into being.
Fossils of mosquitoes 30 million years old contain the vector for malaria. Throughout written history, many civilisations have known of malaria. The Greek physician Hippocrates described the symptoms of malaria in the 5th century BC. The name malaria is derived from the Italian words mal and aria, meaning “bad air”, because people of earlier times believed that the disease was caused by polluted air near swamps and wetlands in Europe. The cause of malaria was not scientifically identified until 1880.
The French army physician Charles Laveran, while stationed in Algeria, noticed strangely shaped red blood cells in certain patients, identified the disease scientifically, and linked it to a particular protozoan. Although the disease had been identified, it was not until 1897 that the British army physician Ronald Ross, studying birds, discovered that the malarial protozoan was transmitted through mosquitoes. Soon after, two Italian scientists noted that mosquitoes spread malaria to humans as well. Many attempts have been made to eradicate the disease.
As early as 7 AD, in Rome, swamps were drained to try to prevent the “bad air” from reaching nearby cities. More recently, in the 1950s and 1960s, about 25 years after the development of DDT, the United Nations World Health Organisation tried to wipe out the disease through the use of DDT. Although the number of cases was reduced in many areas, it later rose again. Scientists today believe that malaria can never be eradicated, because the protozoan can mutate easily and become resistant to a drug that is overused.
The mosquitoes that spread malaria are also becoming resistant to insecticides. Malaria can, however, be treated on an individual basis with drugs and other treatments. To understand these treatments, one must first understand what happens to a malarial protozoan. The disease is caused by the protozoan Plasmodium, which lives in tropical regions all around the world. Only four species of this protozoan cause malaria in humans: Plasmodium ovale, Plasmodium vivax, Plasmodium malariae, and Plasmodium falciparum.
These protozoans are spread from infected to healthy people through the bite of the Anopheles mosquito, through blood transfusions, or through hypodermic injections. This makes malaria one of the most easily communicable diseases in the world. Once in the body, the parasites enter red blood cells, where both sexual and asexual cycles continue. Malaria is spread only by the females of the roughly 60 types of Anopheles mosquito that transmit it, as the males do not feed on blood. The symptoms of this disease are many, and a physician must be consulted promptly to avoid putting the patient at risk.
Many drugs are used today to treat malaria, some in forms dating back to the 1500s and 1600s. Physicians diagnose malaria by identifying Plasmodia in a patient’s blood. Once identified, malaria can be treated with chloroquine and primaquine. Since some forms of Plasmodium falciparum have become resistant to these, quinine, mefloquine, or halofantrine are used. Almost all cases of malaria can be treated if this is done in the proper way. To avoid the pain and illness of malaria altogether, however, people can use many preventive measures.
Swampy areas should be avoided, as should tropical water that may be contaminated, and local food. If people simply protect themselves from mosquitoes, the risk of infection will be tremendously lowered. This can be done with impregnated bednets, which surround the bed with netting sprayed with certain compounds, normally pyrethroids or organophosphates, creating an effective barrier between the mosquito and its blood meal. Alternative ‘barrier’ methods are insect repellents.
These are chemicals that, when applied to the skin as a spray or lotion, are quite effective at deterring the mosquito from landing on a person in order to feed. Other methods of controlling malaria are the use of insecticides and vaccines. Insecticides are chemicals such as pyrethrum, which are sprayed within people’s living quarters. This is intended to kill the female mosquito, preventing it from spreading malaria and laying further eggs, as long as it has no means of escaping the room before spraying.
Vaccines work by stimulating antibody production to destroy a foreign organism in the body. Because the foreign material in the vaccine has the same surface antigens as the pathogenic organism, the antibody that the body produces to destroy the antigenic material in the vaccination will be equally effective against the pathogenic organism. The lymphocytes that produce the antibody remain in the bloodstream, and when the pathogenic organism enters the body they are triggered to produce the antibody again and destroy the invading organism.
At the moment this is where much malaria research is centred – in trying to produce a malaria vaccine. Man evolved as a hunter-gatherer, with population densities low compared with those of other primates. At these low densities man would not have been the preferred host of many parasites, but would have experienced malaria as a zoonosis. It is postulated that the development of agriculture around 10,000 to 7,000 years ago resulted in man-made changes in nutrition, the environment and population density. These changes are so recent in genetic terms that the species has not adapted.
The success of our species, expressed as population expansion, has been at the cost of widespread disease, of which malaria-related diseases are a common manifestation. The burden is heaviest on pregnant women and children under five years old. Over 8 million of the 13 million under-five deaths in the world each year can be put down to diarrhoea, pneumonia, malaria, and vaccine-preventable diseases. But this simple way of classifying hides the fact that death is not usually an event with one cause but a process with many causes.
In particular, it is the conspiracy between malnutrition and infection that pulls many into the downward spiral of early death and, in children, poor growth. Now, a new study has attempted to quantify the role of malnutrition in child deaths. Using data from 53 developing countries, researchers from Cornell University have concluded that over half of those 13 million child deaths each year are associated with malnutrition. Further, they show that more than three-quarters of all these malnutrition-assisted deaths are linked not to severe malnutrition but to mild and moderate forms.
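As a rough check on what those proportions mean in absolute numbers, here is a minimal sketch; “over half” and “more than three-quarters” are taken as exactly 0.5 and 0.75 for illustration, so the results are lower bounds:

    # Illustrative arithmetic on the Cornell figures (Python).
    # The fractions 0.5 and 0.75 stand in for "over half" and
    # "more than three-quarters", so these are lower bounds.
    total_child_deaths = 13_000_000                    # under-five deaths per year
    malnutrition_assisted = 0.5 * total_child_deaths   # at least 6.5 million
    mild_moderate = 0.75 * malnutrition_assisted       # at least ~4.9 million
    print(f"{malnutrition_assisted:,.0f} malnutrition-assisted deaths")
    print(f"{mild_moderate:,.0f} linked to mild or moderate malnutrition")

On these figures, roughly 4.9 million child deaths a year are associated with mild and moderate malnutrition alone.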
This suggests that nutrition programmes focusing only on the severely malnourished will have far less impact than programmes to improve nutrition among the much larger number of mildly and moderately malnourished children. As discussed in the 1994 edition of The Progress of Nations, low-cost methods of reducing all forms of malnutrition are available and have been shown to work. And action on both fronts – to improve nutrition and to protect against disease – could save many more lives (and be far more cost-effective) than action on either front alone.
Malnutrition receives few banner headlines, unlike the AIDS crisis. There is no excuse for starvation, with technology and science making food as plentiful as it is. Yet famine and malnutrition are not the same thing. Many of these children may be getting food. What they are missing are the nutrients they need to grow into healthy and productive adults. A report by UNICEF indicates that at least 100 million young children and several million pregnant women have damaged immune systems not because of HIV or AIDS, but because of malnutrition.
It is thought that malaria can be prevented and the risk of infection lowered through various aspects of nutrition. These include minerals such as iron and zinc, vitamins A, C, D and E, antioxidants, fatty acids and carbohydrates. Over the years, as the control of diseases such as malaria has improved, the significance of malnutrition has emerged more clearly. There is a need to understand its causes to ensure secure foundations for schemes of prevention, and thus to prevent disease.
Nutrition and many tropical infections such as malaria interact, not least because of the extensive geographical overlap between areas where malaria occurs and areas where nutrient deficiencies are common. The clinical and public health implications and the range of such interactions are becoming increasingly appreciated. It is evident that in many countries malnutrition, along with disease, is responsible for the high mortality in children. It is with children and pregnant women in particular that most of the research on nutrition and malaria has been done. Malaria is truly a grave problem and can affect anyone who is unaware of the risks.
However, if a person takes certain precautions and avoids exposure to mosquitoes, they might just be safe from being one of the 300,000,000 people who are infected each year or, even worse, one of the 1,500,000 people who die of malaria annually. Most people are familiar with the Recommended Daily Allowances (RDA) for vitamins and minerals that have been established by the Food and Nutrition Board of the National Research Council. The RDA is defined as the level of intake of an essential nutrient that is judged to be adequate to meet the known needs of healthy people.
At these levels, in other words, people should not develop the deficiency illness associated with a lack of that nutrient. The RDA does not apply to people with special nutritional needs, nor does it suggest that these are the optimal dietary levels for these nutrients for normal people. We now know that mild to moderate deficiency of basic nutrients, while not causing the classic deficiency illnesses, may contribute to a host of other illnesses, especially in today’s world, where stress and poor lifestyle habits may tax the body’s nutritional resources.
Scientific data suggest that the consumption of many nutrients above the RDAs may prevent or combat many common illnesses. For example:

Vitamin C (60 mg): citrus fruits, strawberries, tomatoes, cantaloupe, broccoli, asparagus, peppers, spinach, potatoes.
Vitamin E (30 IU): vegetable oils (soy, corn, olive, cottonseed, safflower, and sunflower), nuts, sunflower seeds, wheat germ.
Beta-carotene (15-50 mg): dark green, yellow, and orange vegetables including spinach, collard greens, broccoli, carrots, peppers, and sweet potatoes; yellow fruits such as apricots and peaches.
(IU = international units; mg = milligrams)

Investigations into interactions between nutrient status and infectious disease are seriously complicated by the difficulties of assessing the status of many nutrients during the acute phase response to infection. Many nutrients are acute phase reactants: for example, plasma retinol, zinc and iron and the degree of transferrin saturation all decrease, and plasma copper, ferritin and erythrocyte protoporphyrin increase, in response to infection or trauma (Filteau, S M, and Tomkins, A M, 1994).
There is an urgent need for research into the nutritional assessment of infected individuals and populations, since these are frequently the people whose nutritional status is of most concern. The consistent alterations of micronutrient metabolism suggest that these may confer advantages in the fight against infection; the alterations in iron metabolism, for example, have been suggested as a means of limiting pathogen replication (Thurnham, 1990). The redistribution of zinc to the liver and bone marrow after induction of inflammatory cytokines may serve to support acute phase protein synthesis and haematopoiesis.
Patients with chronic inflammatory conditions have increased concentrations of zinc in mononuclear leukocytes, which may indicate that cells of the immune system are also favoured with zinc during inflammatory responses. The potential benefits of retinol fluxes during infection have not been explored, although it is clear that a decreased plasma concentration of a nutrient during infection may be a beneficial adaptation rather than a harmful deficiency (Filteau, S M, and Tomkins, A M, 1994).
The problems of assessing nutrient status during infection have made it difficult to determine whether infections decrease status itself over the long term. Several factors could contribute to impaired status, including decreased appetite, decreased absorption due to diarrhoea, or an increased requirement for nutrients for immune functions or tissue repair. Neither the American Heart Association nor other professional medical societies endorse vitamin E supplements, though, mainly because most of the published research is observational. To date, there have been only two controlled clinical trials evaluating vitamin E.
There is some evidence that vitamin E (α-tocopherol) plays a role in the development of malaria infection. The malaria parasite is sensitive to oxidant stress, and antioxidant agents such as vitamin E may potentiate the infection in vivo (Skinner-Adams, T, et al 1998). Addition of vitamin E to cultures in vitro has been found to improve the growth of Plasmodium falciparum in old, normal red blood cells. In addition, vitamin E deficient mice are resistant to Plasmodium yoelii infection, while low serum vitamin E levels in humans with falciparum malaria are associated with a relatively short parasite clearance time.
Vitamin E, like other antioxidant vitamins, can also behave as a pro-oxidant under certain conditions and may therefore paradoxically inhibit the growth and development of malaria parasites at high blood concentrations. In one study, the results showed that vitamin E has limited ability to influence the growth of P. falciparum in vitro at concentrations which span and exceed those present in normal blood serum (Skinner-Adams, T, et al 1998). Some inhibitory activity was seen at concentrations equivalent to the upper limit of normal human plasma concentrations.
Subphysiological vitamin E concentrations may, through increasing oxidant stress and perhaps membrane effects which impair merozoite invasion, inhibit the development of P. falciparum in humans. At supraphysiological concentrations, vitamin E behaves as a pro-oxidant and inhibition is again seen. As malaria infection is associated with depressed serum vitamin E concentrations in humans, the maintenance of a relatively high oxidant stress environment should aid in the treatment of malaria.
Treatment with vitamin E may, however, have an unpredictable effect on parasite burden, reflecting factors such as dose, pre-treatment plasma concentrations, and liver stores. Because of this, supplementation with vitamin E may not be appropriate in the acute phase of the illness. Beta-carotene is a provitamin A compound found in plants; the body converts beta-carotene to vitamin A. Vitamin A can be found in fresh apricots, asparagus, broccoli, cantaloupe, carrots, endive, kale, leaf lettuce, liver, mustard greens, pumpkin, spinach, winter squash, sweet potatoes and watermelon.
Vitamin A has many beneficial uses: 1) it aids in the treatment of many eye disorders, including prevention of night blindness and formation of visual purple in the eye; 2) it promotes bone growth, teeth development and reproduction; 3) it helps form and maintain healthy skin, hair and mucous membranes; 4) it builds the body’s resistance to respiratory infections; 5) it helps treat acne, impetigo, boils, carbuncles and open ulcers when applied externally. It is thought that the vitamin helps in shortening the duration of illnesses and in fighting infection. Vitamin A deficiency may also lead to diarrhoea and malaria-related disease.
Clinical vitamin A deficiency in children, although still of public health significance in many countries, is currently rare in the United States and other industrially developed countries. While clinical vitamin A deficiency is also becoming less common in less industrialised countries, subclinical deficiency, also termed marginal vitamin A status, is still prevalent there. In this regard, mortality among pre-school children in many less industrialised countries is reduced by approximately 30% when vitamin A supplements are provided.
Each year, vitamin A deficiency contributes to the deaths of between 2 and 3 million children, to approximately 500,000 cases of permanent blindness, and to increased morbidity for many adults, especially among pregnant women. Vitamin A deficiency in children is common in countries where rice is a primary food staple. With support from The Micronutrient Initiative, PATH completed a feasibility study on the introduction of vitamin A-fortified Ultra Rice in the rural province of East Nusa Tenggara Timur in Indonesia.
The project was implemented by PATH, several local non-governmental organisations (NGOs), and Bon Dente Nutrition, a private company involved in the development of food products and fortificants. The trial verified the stability of vitamin A under field conditions, validated a mixing procedure for small rice millers, demonstrated consumer acceptability of the product, and confirmed the feasibility of selling vitamin A-fortified rice in local outlets. Furthermore, the trial attracted national, provincial, and local government interest in the fortification of rice.
Vitamin A is often deficient in individuals living in malaria-endemic areas; it is essential for normal immune function, and several studies show it could play a part in potentiating resistance to malaria. Studies have shown that vitamin A deficient rats and mice are more susceptible to malaria than normal animals, and this susceptibility is readily reversed by vitamin A supplementation. Also, a genetic locus, which includes cellular retinol-binding protein, influences malaria mortality and parasitaemia in mice.
In vitro, addition of free retinol to P. falciparum cultures reduced parasite replication in one study but not in another (Shankar A H, et al 1999). In humans there has been evidence for a protective role of vitamin A, but it has not been proven. Cross-sectional studies in children and adults have shown that low plasma vitamin A concentrations are associated with increased blood parasite counts; however, increased parasite counts can trigger an acute phase response, which transiently depresses the circulating vitamin A concentration.
The number of episodes of falciparum malaria among children in Papua New Guinea was 30% lower in children who received vitamin A supplementation than in those who received a placebo. At a cost of US$0.03 per supplement and US$0.25 per delivery, vitamin A supplementation ranks among the more cost-effective non-pharmacological interventions for malaria. The mechanism by which vitamin A affects morbidity due to P. falciparum remains unknown. Also, the beneficial effects of vitamin A are less evident in children younger than 1 year (Shankar A H, et al 1999).
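A minimal sketch of the cost arithmetic behind that ranking: only the per-supplement and per-delivery costs come from the text, while the three-doses-per-year schedule is an assumption for illustration:

    # Cost per child per year of vitamin A supplementation (sketch, Python).
    # US$0.03 per supplement and US$0.25 per delivery are from the text;
    # 3 doses per year (one roughly every 4 months) is an assumed schedule.
    cost_per_dose = 0.03 + 0.25            # supplement + delivery, US$
    doses_per_year = 3                     # assumption
    annual_cost_per_child = cost_per_dose * doses_per_year
    print(f"US${cost_per_dose:.2f} per dose delivered")
    print(f"US${annual_cost_per_child:.2f} per child per year (assumed schedule)")

Even under generous assumptions the annual cost per child stays well below one US dollar, which is the sense in which the intervention is described as cost effective.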
Nutrient status influences immune function and resistance to disease, and it is thought that other nutrients such as zinc and thiamine may also reduce malaria morbidity. The cost, safety, and potential efficacy of targeted nutritional supplementation suggest that a rational approach to the development of such interventions for malaria would be useful. These could be integrated with other controls such as treated bednets, chemoprophylaxis, and rapid detection and treatment. Vitamin A supplementation may be an effective, inexpensive, and programmatically feasible way of controlling P. falciparum malaria.
Vitamin A deficiency is a serious public health problem in Guatemala, affecting an estimated 22% of all children under five (Phillips, M, et al, 1996). There is considerable international evidence that rectifying vitamin A deficiencies offers important health benefits at relatively low cost, making such programs highly cost effective, though in the case of Guatemala some approaches may be more efficient than others (Phillips, M, et al, 1996). There are three main strategies for combating vitamin A deficiency world-wide: food fortification, capsule distribution and diet modification.
Guatemala has examples of each of these three strategies in operation. The sugar fortification programme, initiated in 1987-88, established by law that all sugar processed and marketed for direct household consumption in the country should contain 15 µg of vitamin A per gram of sugar, a level originally designed to meet 100% of young children’s vitamin A requirements given their average daily sugar consumption. This national fortification program has been complemented by geographically targeted interventions in areas where localised deficiencies were detected.
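To see what that level delivers in practice, a short sketch: the 15 µg per gram figure is the legislated level as described above, while the child’s sugar intake is an assumed round number for illustration:

    # Daily vitamin A intake from fortified sugar (illustrative sketch, Python).
    fortification_ug_per_g = 15      # legislated level, micrograms retinol per gram of sugar
    sugar_intake_g_per_day = 20      # assumed daily sugar intake for a young child
    intake_ug_per_day = fortification_ug_per_g * sugar_intake_g_per_day
    print(f"{intake_ug_per_day} micrograms of vitamin A per day")  # 300 micrograms

An intake of roughly 300 µg of retinol a day is on the order of a young child’s recommended allowance, which is how a fortification level of this kind could be designed to meet 100% of requirements.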
These include distributing vitamin A capsules and promoting the production and consumption of vitamin A rich foods in areas with a high prevalence of vitamin A deficiency (Phillips, M, et al, 1996). In contrast to the capsule and food production/education programs, fortification reaches individuals regardless of their need for vitamin A and, unlike the capsule program, is not specifically targeted at women and children. The low cost of distributing the fortificant through sugar compensates for the fact that quite a substantial amount of the vitamin A reaches consumers who do not need it.
The only time fortification looked less attractive was in the 1989 program, when very low fortificant levels were detected in sugar samples despite adequate amounts of vitamin A being imported. The cost of the capsule and food production/education programs has been high, as the areas where they are implemented are often dispersed rural areas, meaning transportation costs are high. The capsule method nevertheless seems to be more effective when considering high-risk groups (Phillips, M, et al, 1996).
A suitable vehicle for fortification must also be considered if it is to be implemented. The food should be one which is consumed in a fairly homogeneous fashion by the targeted group, one which it is technically and economically feasible to fortify, and one which will be culturally acceptable after fortification. With a very small budget it would probably be more worthwhile to invest in a focused capsule distribution or perhaps a food production/education program in a high-deficiency area rather than in fortification, whose effects would be highly diluted.
Where universal coverage is not possible, it may be necessary to assess the relative efficiency of targeting interventions at different geographical areas (West, K P, et al 1984). Vitamin C is responsible for a number of benefits: it promotes healthy capillaries, gums and teeth; aids iron absorption from the intestines, which helps treat anaemia, especially iron-deficiency anaemia; contributes to haemoglobin and red-blood-cell production in the bone marrow; and blocks the production of nitrosamines.
Pregnancy calls for vitamin C supplements because of the demands made by bone, teeth and connective-tissue formation in the fetus, and breast-feeding calls for vitamin C supplementation to support the rapid growth of the child. Anaemia, as we know, is a major public health problem. As in many developing countries, the most vulnerable population groups are pregnant and lactating women and pre-school and school-age children. School-age children are highly vulnerable to iron deficiency because their iron requirements for growth often exceed the dietary iron supply.
Several strategies have been proposed to overcome this problem, including the use of iron supplements. This approach is effective, but its usefulness is often limited by low compliance. Food fortification with iron is generally considered the most effective way to increase iron intake and can be achieved by fortifying a dietary staple such as cereal flour or by fortifying widely consumed foodstuffs such as sugar and salt. This strategy supplies everyone in the population with extra iron, including people who do not need it, such as adult men and postmenopausal women.
The preferred approach to target children would be to fortify a speciality food for that age group. One possibility would be to fortify a chocolate-flavoured milk drink with iron, as was done in a recent study (Davidson, L, et al 1998). Such chocolate drinks, like milk itself, contain inhibitors of iron absorption. A way around this is to add vitamin C (ascorbic acid), as is done in industrially produced foods. The study showed that the effect of added ascorbic acid on iron absorption from the chocolate-flavoured drink was clear.
The geometric mean iron absorption increased from 5.4% to 7.7% when the ascorbic acid content was doubled from 25 to 50 mg. The enhancing effect of ascorbic acid on iron absorption is believed to be due to its ability to reduce ferric iron to ferrous iron, which binds less strongly with polyphenols and phytic acid (found in the test meal) to form insoluble complexes (Fairweather-Tait, S, and Hurrel, R F, 1996). Erythrocytic malaria parasites live in the blood, which is rich in haemoglobin, a ready source of nutrients but also a potential source of toxic forms of iron.
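Before turning to the parasite’s iron economy, it is worth quantifying the absorption result just quoted; a quick check using only the study’s two figures:

    # Relative gain in iron absorption when ascorbic acid is doubled (Python).
    absorption_at_25mg = 5.4   # geometric mean absorption, %, with 25 mg ascorbic acid
    absorption_at_50mg = 7.7   # geometric mean absorption, %, with 50 mg ascorbic acid
    relative_gain = (absorption_at_50mg - absorption_at_25mg) / absorption_at_25mg
    print(f"Relative increase in absorption: {relative_gain:.0%}")  # about 43%

Doubling the ascorbic acid thus raised the fraction of iron absorbed by roughly 43%, a meaningful gain for a fortified drink aimed at children with marginal iron status.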
In acquiring nutrients the parasites take up large quantities of haemoglobin. In this process, globin is hydrolysed to free amino acids and haem is converted to haemozoin. Globin hydrolysis is presumed to provide the bulk of amino acids for parasite protein synthesis, and haem processing is thought to both detoxify haem molecules and provide necessary parasite iron. The processes of haemoglobin catabolism and iron utilisation are targets for a number of compounds with antimalarial activity.
Erythrocytic parasites require iron for the synthesis of iron-containing proteins such as ribonucleotide reductase, superoxide dismutase and the cytochromes, and for de novo haem biosynthesis. The source of free iron for malaria parasites is not known; three possible sources are serum iron, free erythrocytic iron and haemoglobin. There are some reports of iron uptake from serum by parasitised erythrocytes, supporting a serum source for parasite iron. This is consistent with observations that iron-deficient individuals are partially protected against malaria infection.
However, studies showing a lack of transferrin receptors on parasitised erythrocytes argue against a serum source for parasite iron (Peto, T E A, Thompson, J L, 1986). Observations show that cell-impermeant, serum-depleting iron chelators have no antimalarial activity in culture. A report showed that the antimalarial effects of iron chelators in mice are independent of host iron status, and a study showed that the course of malaria in children is unaffected by iron supplementation (Peto, T E A, Thompson, J L, 1986).
Arguing against free erythrocytic iron as the source of parasite iron are observations that iron chelators inserted into the erythrocyte cytoplasm are non-toxic to cultured parasites. Considering this, the large amount of haemoglobin that is degraded by erythrocytic parasites, and the observation that small amounts of iron are released from haem after incubation at the pH of the food vacuole, it is logical that haemoglobin is the principal source of parasite iron (Rosenthal P J and Meshnick, S R, 1996), although this has never been tested.
The best-studied antimalarial iron chelator is deferoxamine (desferrioxamine B, DFO). Its antimalarial activity has been demonstrated in vitro, in animals, and in patients with both moderate and severe P. falciparum infections. The entry of DFO into the parasite is essential for antimalarial activity. DFO treatment of patients with cerebral malaria had a much greater effect on coma recovery time than on parasite clearance time, suggesting that iron chelation may have an effect on the disease process beyond its antiparasitic effect (Rosenthal, P J, 1996).
This suggests that iron deposition in tissue may be partially responsible for severe malaria. Indeed, haemozoin deposition in the brain was significantly higher in mice with a cerebral malaria-like illness than in mice with ordinary malaria. Although DFO has shown promising activity, it is unlikely to be of practical use, as it is expensive and must be administered by continuous infusion. A number of other iron chelators have shown antimalarial activity in vitro and in vivo, and one of these may prove to be more clinically useful than DFO.