Wednesday, March 25, 2015

Key Role Of T Cells Is Highlighted By The Mapping Of Ovarian Cancer Tumor Cell Microenvironment


Dr. Enrique Jacome
Understanding the chemokine landscape of papillary serous ovarian cancer, the most common form of the disease, is important for the development of new immunotherapy strategies where the goal is to increase the presence and function of T cells in the tumor microenvironment. New research examining the molecular environment of ovarian cancer is the first to map the presence of chemokines, immune-system proteins that mobilize T cells in this disease. The study was conducted at the Perelman School of Medicine at the University of Pennsylvania and led in part by Emese Zsiros, MD, PhD, FACOG, a Penn fellow at the time the research was conducted and now an Assistant Professor in the Department of Gynecologic Oncology at Roswell Park Cancer Institute (RPCI). Findings have been published online ahead of print in the journal Clinical Cancer Research.
"Ovarian cancer patients who have T cells within their tumor at the time of diagnosis live significantly longer. These patients show an immune response to their disease, which slows down tumor progression and recurrence," said Dr. Zsiros. "The trafficking of T cells and other immune cells is mainly governed by chemokines, which are small signaling proteins that determine cell migration."
T cells reach tumors and metastases through a complex multistep process, and the ability of these re-engineered T cells to infiltrate is regulated by the patient's own tumor and its chemokine environment. In their analysis of the papillary serous ovarian cancer environment, researchers found that, although the chemokine landscape is very diverse, tumors from a significant number of patients expressed a high number of known T cell-recruiting chemokines.
Successful immunotherapy depends on the ability of the re-engineered T cells to travel into the tumor environment. The scientists also report that tumors lacking T cells may nonetheless express some of these quasi-universal chemokines. The challenge is to manipulate the ovarian cancer chemokine landscape in a way that promotes recruitment of tumor-specific T cells, while repressing the recruitment of other immune cells that dampen the patient's antitumor response.
Researchers also report that the ovaries, the primary disease site for these tumors, have the same chemokine expression as metastatic deposits outside the pelvis - an important factor to consider when designing immunotherapy trials targeting chemokines or chemokine receptors on T cells.
"Chemokines are important regulators of circulation, homing and retention of T cells, and thus the characterization of the tumor chemokine microenvironment is key to developing effective immunotherapy against solid tumors such as ovarian cancer," said principal investigator and senior author George Coukos, MD, PhD, faculty leader in tumor biology at the Ludwig Centre of the University of Lausanne. Dr. Coukos is an adjunct professor of Obstetrics and Gynecology at the Perelman School of Medicine at the University at Pennsylvania, where this research was conducted.
Roswell Park researchers will continue to focus on developing and testing new chemokine and chemokine-receptor-blocking agents to enhance the patients' own immune response with a goal of achieving longer survival and better quality of life for patients diagnosed with ovarian cancer.

Thursday, March 19, 2015

BU Study Finds Underlying Subfertility May Affect ART Birth Outcomes

Dr. Enrique Jacome
Birth outcomes for babies whose mothers used assisted reproductive technology (ART) are better in some cases, and worse in others, than for subfertile women who did not use ART, according to a first-of-its-kind study led by Boston University School of Public Health researchers.
Those findings, published online in the journal Fertility and Sterility, suggest that underlying subfertility, distinct from the use of ART, may account for some of the elevated risks in birth outcomes attributed to the use of in vitro fertilization and other ART procedures.
Researchers found that the risks of preterm birth and low birth weight were higher for singletons (single babies) born to mothers who had ART than for those who had fertility problems but did not use ART. But the risks of perinatal death - stillbirths or deaths within one week of birth - were no higher for mothers with ART than for fertile women, while they were significantly higher for singletons born to subfertile mothers. And for twins, the risks of death among ART births were significantly lower than for either subfertile or fertile women.
The study is the first population-based U.S. comparison of birth outcomes for women who received fertility treatment and those with subfertile indicators who did not use ART. The growth in the use of ART has raised concerns about a range of perinatal outcomes, including an excess of preterm births, low birth weight, and neonatal death. But because previous studies have compared ART birth outcomes with spontaneous conceptions, it has remained unclear whether differences in outcomes are related to underlying subfertility or other factors, such as the older average age of mothers.
The new study suggests that both underlying subfertility and ART itself may influence outcomes. Among singletons, the differences in rates of preterm birth and low birth weights were more pronounced when comparing the ART group to fertile women than to subfertile women, indicating that underlying fertility problems may play a role in those outcomes.
"Overall, these (findings) suggest an underlying risk associated with subfertility, distinct from that which may result from ART," the authors said.
Among twins, the study suggests that babies born after ART appear to have better outcomes. ART twin births compared with subfertile births had a longer mean gestational age, lower rate of very premature delivery and very low birth weight, and a much lower rate of perinatal death.
While the authors reached no conclusions, they said one possible explanation for the better ART outcomes was "the special baby" hypothesis, which posits that extra attention given to ART births by both parents and caregivers may contribute to more favorable outcomes. Mothers in the ART group tended to be older and more likely to deliver by cesarean section than the other groups studied.
"It may be possible that ART-related births involved greater attention to care, " the authors said, adding that "more sensitive and comprehensive measures of prenatal care than are currently available on a population basis will be necessary to determine if a 'special baby' hypothesis is supported."
The study is the first published by a collaboration dubbed MOSART - for the Massachusetts Outcomes Study of Assisted Reproductive Technologies - which brings together BU School of Public Health childbirth experts with researchers from five other institutions to probe how ART influences health outcomes for women and children. The collaboration is funded through a five-year grant from the National Institutes of Health's National Institute of Child Health and Human Development.
The study linked detailed clinical information on ART treatment from all ART clinics in Massachusetts from 2004-2008 with data on births, fetal deaths and hospital records in the state. The subfertility comparison group was developed by means of an algorithm that conservatively identified births to women with indicators of subfertility, either through diagnosis codes or maternal reporting.
Lead author Eugene Declercq, professor of community health sciences at BUSPH, said the study is an important first step in "extricating the possible risks of ART from underlying infertility or maternal demographic and health risks." He encouraged further research taking advantage of the power of linking clinical and population data to examine ART outcomes for both infants and mothers.
Infertility affects an estimated 12 to 15 percent of women of reproductive age. The use of in vitro fertilization and other fertility-enhancing treatments has risen steadily in the U.S. Treatment with assisted reproductive technologies resulted in 65,160 live born infants in the U.S. in 2012, representing 1.6 percent of all U.S. births.

Tuesday, March 3, 2015

FDA Approves LILETTA - Levonorgestrel Releasing Intrauterine System

Dr. Enrique Jacome
Actavis plc, a leading global specialty pharmaceutical company, and Medicines360, a nonprofit women's health pharmaceutical company, have announced the approval of LILETTA (levonorgestrel-releasing intrauterine system) by the U.S. Food and Drug Administration (FDA) for use by women to prevent pregnancy for up to three years. LILETTA is placed in the uterus by a healthcare professional and works by continuously releasing levonorgestrel, a progestin, to prevent pregnancy.
Actavis and Medicines360's groundbreaking partnership will allow women, regardless of income and insurance coverage, to access this new and effective contraceptive option. Through the collaboration, LILETTA will be available in the U.S. commercially as well as at a lower cost to public health clinics enrolled in the 340B Drug Pricing Program.
"At Actavis, we are committed to developing alternative forms of contraceptive options. With the FDA's approval of LILETTA, we are pleased to offer women a novel IUD which provides three years of safe and effective contraception," said David Nicholson, PhD., Executive Vice President, Actavis Global Brands R&D.
"The FDA's approval of LILETTA marks an important milestone for women, providers, and the reproductive health community. LILETTA was designed from the beginning to be accessible by women, regardless of socioeconomic status," said Pamela Weir, Chief Operating Officer, Medicines360. "In the past, many barriers including expensive upfront costs or lack of insurance coverage have prevented women from obtaining IUDs." 
The approval of LILETTA was based on the largest hormonal IUD trial conducted in the U.S., ACCESS IUS (A Comprehensive Contraceptive Efficacy & Safety Study of an IUS), in which 1,751 enrolled women received LILETTA. LILETTA was safe and effective for a broad range of women, with a cumulative three-year efficacy rate of 99.45 percent. LILETTA is indicated for women regardless of parity or BMI.
LILETTA is a small, flexible plastic T-shaped system which is 32 mm x 32 mm in size. It works to prevent pregnancy by slowly releasing levonorgestrel (LNG), a progestin, at an initial release rate of 18.6 mcg/day with an average in vivo release rate of LNG of approximately 15.6 mcg/day over a period of three years. Generally, LILETTA can be inserted at any time if the provider is reasonably certain that the woman is not pregnant. While LILETTA is intended for use up to three years, it can be removed by a healthcare professional at any time. LILETTA can be replaced at the time of removal with a new LILETTA, if continued contraceptive protection is desired.
"This new hormonal IUD was proven more than 99 percent effective in the largest ever IUD trial conducted in the U.S. It offers a long-term, highly-effective yet reversible option to prevent pregnancy for many women regardless of whether or not they've had a child before," said David L. Eisenberg, M.D., assistant professor of obstetrics and gynecology at Washington University in St. Louis and principal investigator and lead author of the ACCESS IUS study. "This long-acting reversible contraceptive is a desirable option for women looking to prevent pregnancy."
Actavis and Medicines360 expect that LILETTA will be available for use in the U.S. by Q2 2015.
About the Clinical Trial for LILETTA
The approval of LILETTA is supported by the largest hormonal IUD trial (ACCESS IUS) conducted in the U.S. designed to reflect the U.S. population. This multicenter open-label clinical trial included 1,751 women who received LILETTA. LILETTA was found to be 99.45 percent effective in preventing pregnancy in women regardless of age, parity (previous births), or BMI. The trial is ongoing to evaluate the use of LILETTA for up to four, five and seven years.
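For a rough sense of what the 99.45 percent cumulative three-year efficacy figure implies per year, the short Python sketch below converts it into an approximate constant annual failure rate. This is an illustration only, not a calculation from the trial report, and it assumes the failure risk is spread evenly across the three years:
# Illustrative only: convert the reported cumulative three-year efficacy
# into an approximate constant annual failure rate.
cumulative_efficacy = 0.9945                    # reported three-year figure
cumulative_failure = 1 - cumulative_efficacy    # 0.0055, i.e. 0.55%
years = 3
# If p is the annual failure probability, then (1 - p) ** years = cumulative_efficacy
annual_failure = 1 - cumulative_efficacy ** (1 / years)
print(f"Cumulative 3-year failure: {cumulative_failure:.2%}")   # ~0.55%
print(f"Approximate annual failure: {annual_failure:.3%}")      # ~0.184% per year
Under that simplifying assumption, the trial figure corresponds to a failure probability of a little under 0.2 percent per year.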
LILETTA was studied in women aged 16-45 of various races and ethnicities, with a BMI range of 15.8-61.6 kg/m2 (mean BMI 26.9 kg/m2). Most women were Caucasian (78.4 percent) or Black/African American (13.3 percent); 14.7 percent of women were of Hispanic ethnicity. Nearly 58 percent of trial participants were nulliparous (had not previously given birth), the largest percentage of nulliparous IUD patients ever studied.
In an analysis of women who discontinued the study early, 97 percent returned to menses within three months after LILETTA was removed. Furthermore, in a group of women trying to conceive, 87 percent became pregnant within one year of removal, and some conceived as soon as 12 days after removal. Approximately 19 percent of women treated with LILETTA experienced amenorrhea (absence of menstruation) within one year of treatment, and more than one-third experienced amenorrhea by the third year of treatment.
The incidence of ectopic pregnancy in the clinical trial with LILETTA, which excluded women with a history of ectopic pregnancy who did not have a subsequent intrauterine pregnancy, was approximately 0.12 per 100 woman-years.

Tuesday, February 24, 2015

Over The Past Decade IUDs And Hormonal Implants Have Become Five Times As Popular

Dr. Enrique Jacome
A new Centers for Disease Control and Prevention report finds a shift in preferences for birth control among American women, who are increasingly opting for long-lasting reversible contraceptives.
Long-acting reversible contraceptives (LARCs) include intrauterine devices (IUDs) and subdermal hormonal implants. Although IUDs were used more commonly in the US during the 1970s, concerns over their safety prompted a decline in use of these devices.
Since then, however, IUDs have been redesigned with safety in mind, and the Centers for Disease Control and Prevention (CDC) report shows that there has been growing interest in IUDs and 5-year contraceptive implants - which were approved in 1990 - because these contraceptive methods are highly effective at preventing unintended pregnancies.
IUDs are placed inside the uterus, where they release hormones or copper to prevent pregnancies. The CDC say that the failure rate for IUDs is below 1%, making them more effective than the birth control pill, which - partly due to users sometimes forgetting to take the pill - has a failure rate of about 9%.
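To illustrate why that difference matters over time, the sketch below compounds the approximate typical-use failure rates quoted above (about 9% per year for the pill, under 1% per year for IUDs) over several years of use. The specific rates and the five-year horizon are assumptions chosen for illustration; real-world figures vary by method and by user:
# Hypothetical illustration: chance of at least one unintended pregnancy
# over several years, assuming the same independent failure probability each year.
def cumulative_pregnancy_risk(annual_failure_rate: float, years: int) -> float:
    return 1 - (1 - annual_failure_rate) ** years
print(f"Pill (~9%/yr) over 5 years:  {cumulative_pregnancy_risk(0.09, 5):.0%}")   # ~38%
print(f"IUD (~0.8%/yr) over 5 years: {cumulative_pregnancy_risk(0.008, 5):.0%}")  # ~4%
Small annual differences compound: under these assumed rates, roughly 4 in 10 pill users would experience an unintended pregnancy within five years, versus about 1 in 25 IUD users.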
According to the CDC's National Center for Health Statistics (NCHS) data, the use of LARCs waned during 1982-88 and remained stable up until 2002, but it has increased nearly five-fold over the last decade.
Among women aged 15-44, use of LARCs has increased from 1.5% in 2002 to 7.2% during 2011-13. The data also show LARCs are most popular among women aged 25-34 - more than twice as many women in this age group used LARCs during 2011-13 as women in other age groups.
Women who have given birth at least once were also found to be more likely to use LARCs compared with women who have not previously given birth. The report also found that this difference has increased over time.
CDC report tracks divergence in LARC use by race

There has also been a divergence in LARC use by race over the past 30 years. LARC use tripled among non-Hispanic white women between 2002 and 2006-10 and quadrupled among non-Hispanic black women, but LARC use among Hispanic women declined by 10% during this period.
Between 2006-10 and 2011-13, LARC use increased by 30% among non-Hispanic black women; however, there were much larger increases in use among Hispanic (129%) and non-Hispanic white (128%) women. According to the report, non-Hispanic white women have traditionally used LARCs at the lowest rate.
When informed about the range of contraceptive options available to them, women are more likely to opt for long-acting contraceptive methods over the oral contraceptive pill or transdermal patch, recent research has found.
Despite this, the birth control pill remains the most popular contraceptive, used by 16% of women in the US.
Leading gynecologists from the American Congress of Obstetricians and Gynecologists (ACOG) say that while women are increasingly taking up LARCs as contraceptive options, use is still relatively low in the US:
"In part, high unintended pregnancy rates in the US may be the result of relatively low use of long-acting reversible contraceptive methods, specifically the contraceptive implant and intrauterine devices."

Monday, February 2, 2015

Research Shows Girls Who Drink Sugary Drinks Every Day May Start Periods Early

Dr. Enrique Jacome
Consumption of sugar-sweetened beverages has been associated with increased risk of obesity and type 2 diabetes. Now, a new study finds girls who frequently drink such beverages are likely to start menstruation earlier than those who do not consume sugary drinks, potentially putting them at higher risk of breast cancer.
The research team, led by Karin Michels, associate professor at Harvard Medical School in Boston, MA, published its findings in the journal Human Reproduction.
According to the Centers for Disease Control and Prevention (CDC), approximately half of the US population consumes sugary drinks on any given day, including around 60% of females aged 2-19 years.
Sugar-sweetened drinks have grown to be a major public health concern, with numerous studies associating the beverages with increased weight gain in children and adolescents. In October 2014, Medical News Today reported on a study that also linked sugary drink consumption to premature aging of immune cells.
This latest study, however, is the first to associate sugary drink consumption in girls with the age of first menstruation, or menarche.
To reach their findings, Prof. Michels and her team analyzed data from 5,583 girls aged 9-14 years who were part of the Growing Up Today Study, which involves 16,875 children of participants from the Nurses' Health Study II.
At study baseline in 1996, none of the girls had started menstruation. They were followed up until 2001, by which point all but 159 girls (3% of the participants) had started menstruation.
At several points during the 5-year study period, the girls were required to complete a dietary questionnaire that disclosed their consumption of sugary drinks - drinks that contain added sugars like sucrose, glucose and corn syrup.
They were asked how often they consumed a serving of these drinks, such as one glass or can of soda or one glass, can or bottle of sweetened iced tea. To assess the effects of artificially and naturally sweetened drinks, the girls were also asked how often they consumed a serving of diet soda or fruit juice.
1.5 servings of sugary drinks a day linked to starting menstruation 2.7 months earlier

The researchers found that, on average, girls who consumed more than 1.5 servings of sugar-sweetened beverages each day started menstruation 2.7 months earlier than girls who consumed two or fewer servings of these drinks each week.
What is more, the team found that at any age between 9 and 18.5 years, girls who consumed more than 1.5 servings of sugary drinks each day were approximately 24% more likely to begin menstruation in the next month than girls who drank two or fewer servings each week.
Overall, girls who drank the most sugary drinks started menstruation aged 12.8 years, while those who drank the least began menstruation aged 13 years.
The team found no association between consumption of artificially and naturally sweetened drinks and age of first menstruation.
These results remained significant even after the researchers accounted for other factors that could influence the age of first menstruation, such as body mass index (BMI), birth weight, height, physical activity, ethnicity/race, family composition and how often the girls ate dinner with their family.
Commenting on the findings, Prof. Michels says:

"Our study adds to increasing concern about the widespread consumption of sugar-sweetened drinks among children and adolescents in the US and elsewhere.

The main concern is about childhood obesity, but our study suggests that age of first menstruation occurred earlier, independently of body mass index, among girls with the highest consumption of drinks sweetened with added sugar. These findings are important in the context of earlier puberty onset among girls, which has been observed in developed countries and for which the reason is largely unknown."
The team notes sugary drinks have a higher glycemic index than naturally sweetened drinks, which can trigger a rise in insulin concentrations. An increase in insulin concentrations can lead to a rise in concentrations of sex hormones, which can cause earlier menstruation - a potential explanation for the team's findings.
Findings may indicate an increased risk of breast cancer

The researchers say their findings raise concern as earlier menstruation has been associated with increased risk of breast cancer. 
They say a 1-year decrease in age at first menstruation is estimated to raise the risk of breast cancer by 5%. "Thus, a 2.7-month decrease in age at menarche likely has a modest impact on breast cancer risk."
"The amount of sugar-sweetened beverages consumed by girls in our highest category of consumption, more than 1.5 servings per day, however, is likely low compared with consumption in certain other populations, in which we would expect an even more dramatic decrease in age at menarche," they add.
"Most importantly, the public health significance of sugar-sweetened beverage consumption at age at menarche, and possibly breast cancer, should not be overlooked, since, unlike most other predictors of menarche, sugar-sweetened beverage consumption can be modified."

Wednesday, January 28, 2015

Mouse Model Discovers Two Genes Behind The Most Severe Form Of Ovarian Cancer

Dr. Enrique Jacome
In a new study reported in the journal Nature Communications, cancer researchers describe how they developed a mouse model of a very aggressive ovarian cancer that accurately portrays the disease as it occurs in humans. The model has helped them identify two mutated genes whose interaction appears to trigger, then hasten, the development of the cancer.
The researchers, from the University of North Carolina (UNC) at Chapel Hill, hope their findings will open new avenues to better treatments and much-needed diagnostic screens.
Ovarian cancer affects the ovaries, the reproductive organs responsible for producing eggs and female hormones in women.
Ovarian cancer is hard to detect, because the symptoms - such as feeling bloated and experiencing changes in appetite - are often mistaken for other conditions. There is also no effective diagnostic screen for early detection of ovarian cancer.
When ovarian cancer is found early, it can be highly treatable, and the 5-year survival rate in such cases is over 90%. Unfortunately, only 1 in 5 cases is found early; most patients learn that their cancer has already spread and reached an advanced stage that is hard to treat, leaving them with a stark prognosis.
While the pace is slow, the situation is gradually improving. Thirty years ago, the survival rate for women diagnosed with ovarian cancer in the US was 10-20%. Nowadays, it is nearer 50%. 
According to the American Cancer Society (ACS), a woman's risk of getting ovarian cancer during her lifetime is about 1 in 75, and her lifetime chance of dying from the disease is about 1 in 100.
The ACS estimate that in 2015, about 21,290 women in the US will receive a new diagnosis of ovarian cancer and about 14,180 women will die from the disease.

New mouse model depicts aggressive ovarian cancer just as it presents in humans

The senior author of the new study is Terry Magnuson, the Sarah Graham Kenan Professor and chair of UNC's Department of Genetics. He says the new model portrays an extremely aggressive form of ovarian cancer - ovarian clear cell carcinoma - just as it presents in women.
Not all mouse models of human conditions are able to portray them as accurately as they occur in humans. But the model that Prof. Magnuson and colleagues have developed is based on genetic mutations found in human cancer samples.
Prof. Magnuson says they used the mouse model to show how mutations in two genes - ARID1A and PIK3CA - interacted to trigger the aggressive ovarian cancer:
"When ARID1A is less active than normal and PIK3CA is overactive, the result is ovarian clear cell carcinoma 100% of the time in our model."
The team was also able to use the mouse model to show that BKM120, a drug that suppresses PI3 kinases - proteins that are involved in cancer cell growth, survival and proliferation - directly inhibited ovarian tumor growth and significantly prolonged the lives of mice.
BKM120 is currently undergoing human trials for the treatment of other cancers.

Two gene mutations interact to trigger aggressive ovarian tumor formation and growth

The study arose from previous work that found the ARID1A gene was highly mutated in several types of tumor, including ovarian clear cell carcinoma. But that work also found deleting the gene in mice did not trigger tumor formation or growth.
This is how the UNC team found the gene needed to interact with another gene, as Dr. Ronald Chandler, a postdoctoral fellow in Prof. Magnuson's lab, explains:
"We found that the mice needed an additional mutation in the PIK3CA gene, which acts like a catalyst of a cellular pathway important for cell growth. Too little expression of ARID1A and too much expression of PIK3CA is the perfect storm; the mice always get ovarian clear cell carcinoma. This pair of genes is really important for tumorigenesis."
Dr. Chandler says their research also "shows why we see mutations of both ARID1A and PIK3CA in various cancers, such as endometrial and gastric cancers."
The team also discovered that ARID1A and PIK3CA mutations were involved in overproduction of a protein that helps trigger inflammation. They say they do not know if inflammation causes ovarian clear cell carcinoma, but they do know it is important for tumor cell growth.
Speaking about the protein - Interleukin-6, or IL-6 - Prof. Magnuson says: "We think that IL-6 contributes to ovarian clear cell carcinoma and could lead to death. You really don't want this cytokine circulating in your body."
When the team treated tumor cells with an antibody that targets IL-6, cancer cell growth was inhibited, he adds, suggesting that reducing levels of IL-6 could help patients.

The potential for new ways to diagnose ovarian cancer

The researchers say that while their work will help identify better treatment targets, it may also lead to new ways of diagnosing ovarian cancer.
Dr. Chandler says maybe they could find a biomarker that could be used to screen women. Perhaps there is a cell surface protein "downstream of ARID1A," he suggests, adding that:
"Right now, by the time women find out they have ovarian clear cell carcinoma, it's usually too late. If we can find it earlier, we'll have much better luck successfully treating patients."
The study was funded by the National Institutes of Health, with additional support from the ACS and the Ovarian Cancer Research Fund.
In March 2014, Medical News Today learned how scientists achieved another breakthrough by finding the genetic cause of a rare, aggressive ovarian cancer that most often strikes girls and young women. 
In that study - which used groundbreaking genomic techniques - the international team found several strong links between a mutation in the SMARCA4 gene and a large majority of patients with small cell carcinoma of the ovary, hypercalcemic type (SCCOHT).

Tuesday, January 6, 2015

The Health Benefits Of Iron

Dr. Enrique Jacome
Iron deficiency anemia is the world's most common nutritional deficiency disease and is most prevalent among children and women of childbearing age. Anemia develops due to an inadequate amount of iron in the diet or poor iron absorption.

Iron deficiencies can be caused or exacerbated by injury, blood loss, hemorrhage or gastrointestinal diseases that impair iron absorption. Inadequate intake of folate, protein and vitamin C can also contribute to iron deficiency.

This MNT Knowledge Center feature is part of a collection of articles on the health benefits of popular vitamins and minerals. It provides an in-depth look at the recommended intake of iron, its possible health benefits, foods high in iron and any potential health risks of consuming iron.

Recommended intake
The Recommended Dietary Allowance (RDA) for iron depends on age and gender.
Children:
  • 1-3 years - 7 milligrams 
  • 4-8 years - 10 milligrams.
Males:
  • 9-13 years - 8 milligrams
  • 14-18 years - 11 milligrams
  • 19 years and older - 8 milligrams.
Females:
  • 9-13 years - 8 milligrams
  • 14-18 years - 15 milligrams
  • 19-50 years - 18 milligrams
  • 51 years and older - 8 milligrams.
Pregnancy:
  • 27 milligrams.
An estimated 8 million women of childbearing age in the US suffer from iron deficiency severe enough to cause anemia. Iron deficiency during pregnancy may raise the risk for preterm delivery.
Iron supplements are available, but it is best to obtain any vitamin or mineral through food first. It is not the individual vitamin or mineral alone that makes certain foods an important part of our diet, but the synergy of a food's nutrients working together. It has been shown time and again that isolating certain nutrients in supplement form does not provide the same health benefits as consuming the nutrient from a whole food. First focus on obtaining your daily iron requirement from foods, then use supplements as a backup.

Possible health benefits of consuming iron

Iron deficiency can cause many health problems. Common problems associated with iron deficiency include delayed cognitive function, poor exercise performance and lowered immune function. In children, iron deficiency anemia can cause psychomotor and cognitive abnormalities that can result in future learning difficulties.

Healthy pregnancy

Low iron intakes increase a woman's risk of premature birth and the risk of her infant having low birth weight, low iron stores and impaired cognitive or behavioral development.

More energy

Not getting enough iron in your diet can affect how efficiently your body uses energy. Iron carries oxygen to the muscles and brain and is crucial for both mental and physical performance. Low iron levels may result in a lack of focus, and an increase in irritability. 

Better athletic performance

Iron deficiency is more common among athletes, especially young female athletes, than among sedentary individuals. Iron deficiency in athletes decreases athletic performance and weakens the immune system. A lack of iron in hemoglobin can greatly reduce physical work capacity by decreasing oxygen transport to exercising muscle.

Foods high in iron

Iron has a low bioavailability, meaning that it has poor absorption within the small intestine and low retention in the body, decreasing its availability for use. The efficiency of absorption depends on the source of iron, foods consumed with the iron, and overall iron status of the person. In many countries, wheat products and infant formulas are fortified with iron.
There are two types of dietary iron - heme and non-heme. Most animal products and seafood contain heme iron, which is easier to absorb than non-heme. Non-heme iron sources include beans, nuts, vegetables and fortified grains. The recommended iron intake for vegetarians is 1.8 times higher than for those who eat meat in order to make up for the lower absorption level from plant-based foods.
Proton pump inhibitors (lansoprazole [Prevacid®] and omeprazole [Prilosec®]), used to reduce the acidity of stomach contents, can inhibit the absorption of iron. The polyphenols and tannins in coffee and tea also decrease non-heme iron absorption. Eating foods that are high in vitamin C, on the other hand, helps to increase iron absorption.
Clams contain a significant 24 mg of iron per 3 oz.
  • Clams, canned, 3 oz: 24 milligrams
  • Cereal, fortified, one serving: 1-22 milligrams
  • White beans, canned, 1 cup: 8 milligrams
  • Chocolate, dark, 45-69% cacao, 3 oz: 7 milligrams
  • Oysters, cooked, 3 oz: 6 milligrams
  • Spinach, cooked, 1 cup: 6 milligrams
  • Beef liver, 3 oz: 5 milligrams
  • Blueberries, frozen, ½ cup: 5 milligrams
  • Lentils, boiled and drained, ½ cup: 3 milligrams
  • Tofu, firm, ½ cup: 3 milligrams
  • Chickpeas, boiled and drained, ½ cup: 2 milligrams
  • Tomatoes, canned, stewed, ½ cup: 2 milligrams
  • Ground beef, lean, 3 oz: 2 milligrams
  • Potato, baked, medium: 2 milligrams
  • Cashew nuts, roasted, 1 oz: 2 milligrams
  • Egg, 1 large: 1 milligram.
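As a quick worked example, the short Python sketch below compares a few of the food values above with the RDA figures listed earlier and applies the roughly 1.8x multiplier suggested for vegetarians. The serving sizes and milligram values are those given in the list; the code itself is only illustrative arithmetic, not nutritional guidance:
# Illustrative only: share of the daily iron requirement covered by one serving.
RDA_MG = {
    "women 19-50": 18,
    "men 19+": 8,
    "pregnancy": 27,
}
vegetarian_rda_women = RDA_MG["women 19-50"] * 1.8   # roughly 32 mg, per the 1.8x guidance above
foods_mg = {
    "clams, canned, 3 oz": 24,
    "spinach, cooked, 1 cup": 6,
    "lentils, boiled, 1/2 cup": 3,
}
for food, mg in foods_mg.items():
    share = mg / RDA_MG["women 19-50"]
    print(f"{food}: {mg} mg covers {share:.0%} of the 18 mg RDA for women 19-50")
# clams ~133%, spinach ~33%, lentils ~17%
A single serving of clams more than covers the 18 mg daily requirement for women aged 19-50, while plant sources such as spinach or lentils cover a third or less, which is why varied meals and vitamin C pairing matter for non-heme sources.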
Potential health risks of consuming iron

The tolerable upper intake level for iron is 40-45 milligrams per day, depending on age. Adults with healthy functioning gastrointestinal systems have a very low risk of iron overload from dietary sources.
Taking iron supplements of 20 milligrams or more on a frequent basis can cause nausea, vomiting and stomach pain, especially if the supplement is not taken with food. In severe cases, iron overdoses can lead to organ failure, coma, seizure, and even death.
Some studies have suggested that excessive iron intake can increase the risk of coronary heart disease and cancer.
Iron supplements can interact with several medications, including levodopa (used to treat restless leg syndrome and Parkinson's) and levothyroxine (used to treat hypothyroidism, goiter, and thyroid cancer). 
It is the total diet or overall eating pattern that is most important in disease prevention and achieving good health. It is better to eat a varied diet than to concentrate on individual nutrients as the key to good health.