
Artificial Sweeteners: Possible Link to Increased Cancer Risk


Aspartame, a well-known artificial sweetener, is for example present in thousands of food products worldwide. © Mathilde Touvier/Inserm

Artificial sweeteners are used to reduce the amounts of added sugar in foods and beverages, thereby maintaining sweetness without the extra calories. These products, such as diet sodas, yoghurts and sweetener tablets for drinks, are consumed by millions of people daily. However, the safety of these additives is the subject of debate. In order to evaluate the risk of cancer linked to them, researchers from Inserm, INRAE, Université Sorbonne Paris Nord and Cnam, as part of the Nutritional Epidemiology Research Team (EREN), analyzed data relating to the health of 102,865 French adults participating in the NutriNet-Santé cohort study and their consumption of artificial sweeteners. The results of these statistical analyses suggest a link between the consumption of artificial sweeteners and an increased risk of cancer. They have been published in PLOS Medicine.

Given the adverse health effects of consuming too much sugar (weight gain, cardiometabolic disorders, dental caries, etc.), the World Health Organization (WHO) recommends limiting free sugars1 to less than 10% of one’s daily energy intake2. Therefore, in order to ensure that foods maintain that sweet taste so sought after by consumers worldwide, the food industry is making increasing use of artificial sweeteners. These are additives that reduce the amount of added sugar (and calories) without reducing sweetness. What is more, in order to enhance flavor, manufacturers use them in certain products that traditionally contain no added sugar (such as flavored potato chips).

Aspartame, a well-known artificial sweetener, is for example present in thousands of food products worldwide. While its energy value is similar to that of sugar (4 kcal/g), its sweetening power is 200 times higher, meaning that a much smaller amount is needed to achieve a comparable taste. Other artificial sweeteners, such as acesulfame-K and sucralose, contain no calories at all and are respectively 200 and 600 times sweeter than sucrose.
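The figures above lend themselves to a back-of-the-envelope comparison. In the sketch below, the 100 g reference amount of sucrose is an arbitrary choice for illustration, not a value from the study; the sweetness factors and energy densities are those quoted in the paragraph above.

```python
# Energy density of sucrose, as quoted above.
SUCROSE_KCAL_PER_G = 4.0

def match_sweetness(reference_sugar_g, sweetness_factor, kcal_per_g):
    """Grams of sweetener, and its calories, needed to match the sweetness
    of a given amount of sucrose."""
    grams = reference_sugar_g / sweetness_factor
    return grams, grams * kcal_per_g

# Aspartame: same energy density as sugar (4 kcal/g) but 200 times sweeter.
grams, kcal = match_sweetness(100, 200, 4.0)
print(f"{grams} g of aspartame (~{kcal} kcal) matches "
      f"100 g of sucrose ({100 * SUCROSE_KCAL_PER_G:.0f} kcal)")

# Sucralose: 600 times sweeter and calorie-free.
grams, kcal = match_sweetness(100, 600, 0.0)
print(f"{grams:.3f} g of sucralose ({kcal} kcal) matches 100 g of sucrose")
```

Even though aspartame carries the same calories per gram as sugar, the 200-fold sweetness factor means only half a gram, about 2 kcal, replaces 100 g (400 kcal) of sucrose.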

Although several experimental studies have pointed to the carcinogenicity of certain food additives, robust epidemiological data on a link between the everyday consumption of artificial sweeteners and the development of various diseases have been lacking. In a new study, researchers sought to examine the links between the consumption of artificial sweeteners (total and most often consumed) and the risk of cancer (overall and according to the most common types of cancer) in a large population study. They used the data provided by 102,865 adults participating in the NutriNet-Santé study (see box below), an online cohort initiated in 2009 by the Nutritional Epidemiology Research Team (EREN) (Inserm/Université Sorbonne Paris Nord/Cnam/INRAE), which also coordinated this work.

The volunteers reported their medical history, sociodemographic data and physical activity, as well as information on their lifestyle and health. They also gave details of their food consumption by sending the scientists full records of what they consumed over several 24-hour periods, including the names and brands of the products. This made it possible to accurately evaluate the participants’ exposure to additives, and more particularly to artificial sweeteners.

After collecting information on cancer diagnoses over the NutriNet-Santé study period so far (2009-2021), the researchers conducted statistical analyses in order to study the links between the use of artificial sweeteners and the risk of cancer. They also took into account various potentially confounding factors, such as age, sex, education, physical activity, smoking, body mass index, height, weight gain over the study period so far, family history of cancer, as well as intakes of energy, alcohol, sodium, saturated fatty acids, fiber, sugar, whole grain foods and dairy products.

The scientists found that compared with those who did not consume artificial sweeteners, those who consumed the largest amounts of them, especially aspartame and acesulfame-K, were at increased risk of developing cancer, irrespective of the type.

Higher risks were observed for breast cancer and obesity-related cancers.

“In accordance with several in vivo and in vitro experimental studies, this large-scale, prospective study suggests that artificial sweeteners, used in many foods and beverages in France and throughout the world, may represent an increased risk factor for cancer,” explains Charlotte Debras, PhD student and lead author of the study. “Further research in other large-scale cohorts will be needed in order to replicate and confirm these findings.”

“These findings do not support the use of artificial sweeteners as safe alternatives to sugar, and they provide new information in response to the controversy regarding their potential adverse health effects. They also provide important data for their ongoing re-evaluation by the European Food Safety Authority (EFSA) and other public health agencies worldwide,” concludes Dr. Mathilde Touvier, Inserm Research Director and study coordinator.

NutriNet-Santé is a public health study coordinated by the Nutritional Epidemiology Research Team (EREN, Inserm / INRAE / Cnam / Université Sorbonne Paris Nord) which, thanks to the commitment and loyalty of over 170,000 participants (known as “Nutrinautes”), advances research into the links between nutrition (diet, physical activity, nutritional status) and health. Launched in 2009, the study has already given rise to over 200 international scientific publications. In France, new participants are currently being encouraged to join in order to continue to advance research on the relationship between nutrition and health.

By devoting a few minutes per month to answering various online questionnaires relating to diet, physical activity and health, participants contribute to furthering knowledge of the links between diet and health. With this civic gesture, we can each easily participate in research and, in just a few clicks, play a major role in improving the health of all and the wellbeing of future generations. These questionnaires can be found on the secure platform www.etude-nutrinet-sante.fr.

 

1 Sugars added to foods and beverages and sugars naturally present in honey, syrups, and fruit juices.

2 World Health Organization, 2015


 

Increased Ischemic Stroke Risk Associated with Certain Medications for Nausea and Vomiting

Every year in France, 140,000 people have a stroke. © Adobe Stock

Every year in France, 140,000 people have a stroke1. Around 80% are ischemic strokes, or cerebral infarctions, which occur when a brain artery is obstructed by a blood clot. Studies have shown that the risk of ischemic stroke is increased by the use of antipsychotics: medications with antidopaminergic2 properties that are commonly prescribed in psychiatry. Researchers from Inserm and Université de Bordeaux (Bordeaux Population Health Research Center) and Bordeaux University Hospital evaluated the ischemic stroke risk associated with exposure to other antidopaminergic medications—antiemetics, which are very commonly used in the symptomatic treatment of nausea and vomiting. The findings of this study, obtained by analyzing French national health insurance reimbursement data, show a link between the use of these medications and the risk of ischemic stroke. They have been published in the British Medical Journal.

Since the early 2000s, the use of antipsychotic medications indicated in psychiatry has been associated with an increased risk of ischemic stroke. This risk, demonstrated for all of these medications with antidopaminergic properties, increases with age and with the existence of dementia. The mechanisms behind this phenomenon have not been elucidated, but one possible hypothesis involves the antidopaminergic action of these medications itself. Other medications possess this property: the antiemetics—domperidone, metoclopramide and metopimazine—used in the symptomatic treatment of nausea and vomiting of various causes (acute gastroenteritis, migraine, chemo- or radiotherapy, or following an operation).

Prior to this work, no published studies had evaluated the risk of ischemic stroke associated with exposure to antidopaminergic antiemetics. Yet these are very commonly used medications: in 2017, over 4 million people in France had at least one reimbursement for metopimazine, the most frequently used antiemetic.

Researchers from Inserm, Université de Bordeaux and Bordeaux University Hospital decided to take a closer look at this risk. They conducted a “case-time-control” study3 using French national health insurance reimbursement and hospital admissions data. In this type of study, the potential use of the medication in the period immediately preceding the stroke (here 14 days) is compared with the same use during an earlier period (here more than one month preceding the stroke) in which it could not have caused the event. More extensive use in the period immediately preceding the stroke suggests that the medication may well have played a role.

This method, in which the subject is their own reference, automatically takes into account personal risk factors for ischemic stroke, such as smoking, body mass index, physical activity, and dietary habits. Other factors that may vary over the short follow-up period in a given individual were considered, including the use of medications that present a risk for stroke (such as vasoconstrictors) or those that on the other hand prevent this risk (anticoagulants, antiplatelet agents).
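The case-time-control logic described above can be sketched in a few lines. The counts below are purely hypothetical, chosen for illustration only (the study’s actual figures are not reproduced here): within-subject exposure odds are computed among discordant subjects for cases and for controls, and the control ratio corrects the case ratio for time trends in medication use.

```python
def exposure_odds(exposed_risk_window_only, exposed_reference_window_only):
    """Within-subject odds of exposure among discordant subjects: those exposed
    only in the 14-day risk window vs. only in the earlier reference window."""
    return exposed_risk_window_only / exposed_reference_window_only

# Hypothetical counts, for illustration only.
case_or = exposure_odds(120, 60)     # cases: more initiations just before the stroke
control_or = exposure_odds(80, 80)   # controls: capture time trends in overall use

# Dividing by the control ratio removes population-wide fluctuations in use.
case_time_control_or = case_or / control_or
print(case_time_control_or)  # 2.0 with these toy counts
```

A ratio above 1 with these toy numbers would indicate more initiations in the days immediately preceding the stroke than expected from background use, which is the pattern the study reports.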

The researchers began by analyzing the data for 2,612 adults hospitalized for a first ischemic stroke and who had started antiemetic treatment in the 70 days prior to it. In these subjects, they found that the use of antiemetics was more extensive in the days preceding the stroke, with a peak in treatment initiations during this period. This finding suggests an increased risk of ischemic stroke in the first days of using these medications (Figure below).

 

antiémétiques

Distribution of antiemetic treatment initiations over the 70 days preceding the ischemic stroke (N=2,612 subjects with ischemic stroke).

To eliminate bias that could arise should medication use vary greatly over time in the general population (for example during outbreaks of acute gastroenteritis), the study went on to consider, over the same period, a randomly selected group of 21,859 people who had not had a stroke. In this group, no peak or excess use of antiemetics comparable to that seen in the patients who had presented a stroke was found (Figure below).

 

Distribution of antiemetic treatment initiations within the 70 days preceding the reference date (N=21,859 subjects without ischemic stroke).

The results of this study suggest an increased risk of ischemic stroke associated with antidopaminergic antiemetics. This increased risk was found for all three antiemetics studied—domperidone, metoclopramide and metopimazine—and appeared highest in the first days of use.

“This first study provides a strong signal concerning medications that are widely used in the general population. In the immediate term, it appears to be very important to have these findings replicated in other studies—studies which could also provide indications on the frequency of this adverse effect, which we could not measure here given the methodology chosen. Having accurate information on the subtypes of ischemic stroke and their location would also allow us to explore the mechanisms involved,” explains Anne Bénard-Laribière, one of the authors of the study.

 

1Stroke, the leading cause of acquired disability in adults, Inserm, 2019 (only available in French)

2 Antidopaminergics block the action of dopamine, a neurotransmitter involved in regulating behavior, muscle tone and movement coordination, among other things.

3 This study was conducted as part of the work program of the DRUGS-SAFER Center, funded by the French medicines agency (ANSM) and partner of the EPI-PHARE GIS.

COVID-19: “Reactive” Vaccination, Effective in Case of High Viral Circulation?

Scientists are considering new strategies to continue to promote vaccination among the populations that remain hesitant © Mat Napo on Unsplash

Although the majority of the French population is fully vaccinated, the virus continues to circulate actively in France. As health restrictions are lifted, fears of a resurgence of the epidemic and of the emergence of new, more contagious variants are leading scientists to consider new strategies to continue to promote vaccination among populations that remain hesitant. A new modeling study by researchers from Inserm and Sorbonne Université at the Pierre Louis Institute of Epidemiology and Public Health shows that a “reactive” vaccination strategy targeting the homes, schools and workplaces where cases are detected could have beneficial effects, reducing the number of COVID-19 cases in certain epidemic situations. The findings of this research have been published in Nature Communications.

Mass COVID-19 vaccination campaigns in many countries have greatly reduced the pandemic. However, the vaccination rate is now stalling in Europe and the USA due to logistical constraints and the vaccine hesitancy of part of the population.

In March 2022, 79% of French people were fully vaccinated with a two-dose regimen and 53% had received the third (booster) dose. While these figures are high, efforts to counter the epidemic must be maintained: against a background of ever-intense viral circulation and the lifting of health restrictions, a resurgence of the epidemic remains possible – and with it the appearance of more contagious variants.

In such a context, and to improve efficacy, many scientists therefore believe that other vaccine strategies promoting accessibility and acceptability should be tested.

Researchers from Inserm and Sorbonne Université were therefore interested in a “reactive” vaccination strategy, which involves vaccinating the homes, schools and workplaces where cases have been detected. This approach is already used in other epidemics, for example against outbreaks of meningitis. For COVID-19, it has occasionally been used on the ground in France, for example in Strasbourg at the Haute école des arts du Rhin (HEAR), following the discovery of a cluster of the Delta variant.

What is “ring vaccination”?

In other epidemic contexts, for example during some Ebola epidemics, other innovative strategies have been deployed to reach as many people as possible. The most well-known is that of ring vaccination, which involves immunizing contacts of confirmed cases or contacts of those contacts.

The research team wished to evaluate the effects of this reactive approach on viral circulation and the number of cases of COVID-19 in different epidemic scenarios. In order to build their model, the scientists used National Institute of Statistics and Economic Studies (INSEE) data to model a typical population with the sociodemographic characteristics, social contacts, and professional situations of a population the size of an average French city.

Several parameters were also incorporated into the model, such as disease characteristics, vaccination coverage, vaccine efficacy, restrictions on contact in workplaces or in the community, travel, and the implementation of contact tracing strategies.

The scientists were then able to study the impact of a reactive vaccination strategy on several scenarios of epidemic dynamics. They show that in the majority of the scenarios, with the same number of vaccine doses, a reactive strategy is more effective than other vaccination strategies in reducing the number of COVID-19 cases.

For example, in a context where vaccination coverage is approximately 45% and viral circulation is high, the reduction in the number of cases over a two-month period rises from 10% to 16% when reactive vaccination is set up in parallel with mass vaccination, compared with mass vaccination alone.
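To make the quoted percentages concrete, the arithmetic can be run against a hypothetical baseline of 1,000 cases over two months; only the 10% and 16% reduction figures come from the study, the baseline is invented for illustration.

```python
# Hypothetical baseline case count, chosen only to make the percentages concrete.
baseline_cases = 1000

# Reductions quoted in the study for this scenario (~45% coverage, high circulation).
mass_only = round(baseline_cases * (1 - 0.10))           # mass vaccination alone
mass_plus_reactive = round(baseline_cases * (1 - 0.16))  # mass + reactive vaccination

print(mass_only, mass_plus_reactive)   # 900 840
print(mass_only - mass_plus_reactive)  # 60 additional cases averted
```

With the same number of vaccine doses, adding the reactive component averts 60 additional cases out of 1,000 in this toy scenario.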

The findings suggest that this strategy is especially effective when vaccination coverage is low and when combined with robust contact tracing measures.

When vaccination coverage is high, a reactive strategy is less useful, as most of those in contact with an infected person are already vaccinated. Nevertheless, such an approach would still have the benefit of reaching people who are not vaccinated and convincing them more easily of the utility of the vaccine. Indeed, exposure to the virus increases one’s perception of the risks and tends to make vaccination more acceptable.

“The model we built enables reactive vaccination to be considered as an effective strategy for increasing vaccination coverage and reducing the number of cases in some epidemic scenarios, especially when combined with other measures such as effective contact tracing. This is a tool that can also be reused and adapted in France should another variant emerge, in which case the efficacy of a reactive strategy would need to be tested before administering any boosters. This modeling may also be of interest to other countries with sociodemographic characteristics similar to France, but lower vaccination coverage,” explains Chiara Poletto, Inserm researcher and last author of the study.

Discovery of an immune escape mechanism promoting Listeria infection of the central nervous system


Section of a cerebral vessel in an infected animal model containing infected monocytes adhering to endothelial cells. Listeria is marked in red, actin in white (including the actin tails propelling Listeria), nuclei in blue and macrophages in green. © Biology of Infection Unit – Institut Pasteur

Some “hypervirulent” strains of Listeria monocytogenes have a greater capacity to infect the central nervous system. Scientists from the Institut Pasteur, Université Paris Cité, Inserm and the Paris Public Hospital Network (AP-HP) have discovered a mechanism that enables cells infected with Listeria monocytogenes to escape immune responses. This mechanism provides infected cells circulating in the blood with a higher probability of adhering to and infecting cells of cerebral vessels, thereby enabling bacteria to cross the blood-brain barrier and infect the brain. The study will be published in Nature on March 16, 2022.

The central nervous system is separated from the bloodstream by a physiological barrier known as the blood-brain barrier, which is very tight. But some pathogens manage to cross it and are therefore able to infect the central nervous system, using mechanisms that are not yet well understood.

Listeria monocytogenes is the bacterium responsible for human listeriosis, a severe foodborne illness that can lead to a central nervous system infection known as neurolisteriosis. This central nervous system infection is particularly serious, proving fatal in 30% of cases.

Scientists from the Biology of Infection Unit at the Institut Pasteur (Université Paris Cité, Inserm) and the Listeria National Reference Center and WHO Collaborating Center led by Marc Lecuit (Université Paris Cité and Necker-Enfants Malades Hospital (AP-HP)) recently discovered the mechanism by which Listeria monocytogenes infects the central nervous system. They developed a clinically relevant experimental model that reproduces the different stages of human listeriosis, and involves virulent strains of Listeria[1] isolated from patients with neurolisteriosis.

The scientists first observed that inflammatory monocytes, a type of white blood cell, are infected by the bacteria. These infected monocytes circulate in the bloodstream and adhere to the cerebral vessels’ cells, allowing Listeria to infect the brain tissue.

The research team then demonstrated that InlB, a Listeria monocytogenes surface protein, enables the bacteria to evade the immune system and survive in the protective niche provided by the infected monocytes. The interaction between InlB and its cellular receptor c-Met blocks the cell death mediated by cytotoxic T lymphocytes, which specifically target Listeria-infected cells. InlB therefore enables infected cells to survive attack by cytotoxic T lymphocytes.

This mechanism extends the life span of infected cells, raising the number of infected monocytes in the blood and facilitating bacterial spread to host tissues, including the brain. It also favors the persistence of Listeria in the gut tissue, its fecal excretion and transmission back to the environment.

“We discovered a specific, unexpected mechanism by which a pathogen increases the life span of the cells it infects by specifically blocking an immune system function that is crucial for controlling infection,” explains Marc Lecuit (Université Paris Cité and Necker-Enfants Malades Hospital (AP-HP)), head of the Biology of Infection Unit at the Institut Pasteur (Université Paris Cité, Inserm).

It is possible that other intracellular pathogens such as Toxoplasma gondii and Mycobacterium tuberculosis use similar mechanisms to infect the brain. Identifying and understanding the immune escape mechanisms of infected cells could give rise to new therapeutic strategies to prevent infection and also pave the way for new immunosuppressive approaches for organ transplantation.

This research was funded by the Institut Pasteur, Inserm and the European Research Council (ERC) and also received funding from the Le Roch-Les Mousquetaires Foundation.

[1] Uncovering Listeria monocytogenes hypervirulence by harnessing its biodiversity, Nature Genetics, February 1, 2016
Press release: https://www.pasteur.fr/en/listeria-hypervirulent-strains-cerebral-and-placental-tropism

Child Malnutrition: New Strategy Treats More Children at Lower Cost

Weighing of a child being screened in the community for acute malnutrition as part of the OptiMA-DRC clinical trial, Kasaï Province, Democratic Republic of the Congo. © ALIMA

Malnutrition affects millions of children worldwide and has been made worse by the COVID-19 health crisis. In a new study, researchers from Inserm and Université de Bordeaux at the Bordeaux Population Health Research Center, in collaboration with the Research Institute for Sustainable Development (IRD) and the NGO ALIMA (The Alliance for International Medical Action), have developed and tested a new strategy, called OptiMA, to simplify and optimize the treatment of malnutrition in order to help a larger number of children. A randomized clinical trial conducted in the Democratic Republic of the Congo has shown this strategy to treat 30% more children while using 20% less nutritional product than standard programs. These findings were published on March 16, 2022 in The Lancet Global Health.

In 2019, malnutrition affected 47 million children under 5 years of age. Although all of the world’s regions were affected, a quarter of the children were in Africa. After two years of the COVID-19 pandemic and its attendant health system disruptions and increased food insecurity in many countries, the situation is far from improving.

At present, acutely malnourished children are treated differently depending on whether they have “severe acute malnutrition” or “moderate acute malnutrition” (see box). The management is not the same and includes the use of different nutritional products.

In severe acute malnutrition, “ready-to-use therapeutic food” is used. The amount given is calculated according to weight, with the dose increasing as the child gets closer to their normal weight, despite it appearing more logical to decrease the dose as the child recovers.

Children with moderate acute malnutrition receive a different product with a similar composition, known as “ready-to-use supplementary food”. Both types of product are brought into the countries concerned through different supply channels overseen by different United Nations agencies, thereby complicating the care of children on the ground. What is more, these nutritional products, along with the attendant human resources, account for the highest proportion of the costs of the programs aimed at fighting acute malnutrition.

At a time when these programs face a significant lack of funding, simplifying and standardizing the treatment of all types of malnutrition is a priority in order to increase efficacy, reduce costs, and ultimately help more children. At present, only 25% of the children estimated to be severely malnourished are receiving treatment.

Working hand in hand

Researchers from Inserm, Université de Bordeaux and the Research Institute for Sustainable Development (IRD), and humanitarians from the NGO ALIMA have pooled their expertise to develop and propose an evaluation methodology and implement a clinical trial to evaluate a new strategy called “OptiMA”, in which children suffering from acute malnutrition, regardless of the stage of the disease, receive one type of nutritional treatment – namely ready-to-use therapeutic food – at a dose that progressively decreases as the child improves.

By proposing the use of one type of product for all forms of acute malnutrition, and by optimizing the dose, the scientists hoped to be able to treat children more simply and in greater numbers, while preventing the development of the severest forms of malnutrition.

Their study, conducted in the Democratic Republic of the Congo, included nearly 1,000 malnourished children between 6 months and 5 years of age, recruited from four health zones in the Kasaï Province comprising some 60 villages and four health centers. The study design provided for each child to be followed for six months, in order to monitor their clinical outcome after the end of treatment and to assess the risk of relapse.

This involved twice-monthly visits to the homes of hundreds of children, made possible thanks to the know-how of ALIMA, both logistically and operationally and in gaining the community’s acceptance of this research project and new strategy. The combined expertise of the researchers and the humanitarians ensured that almost no children were lost to follow-up, meaning that the scientific evaluation of the OptiMA strategy had as little bias as possible.

A superior and less costly strategy

The researchers were able to show that the OptiMA strategy was not only as effective as conventional programs, but actually more effective. Treating acute malnutrition with a single program and a single therapeutic product at progressively reduced doses led to improved child nutritional status and fewer relapses over a six-month period. In particular, it led to a more rapid improvement in the health of the most moderately affected children, who were also less likely to relapse during the follow-up period or progress to more severe stages of malnutrition.

In addition, using only one nutritional product at optimized doses reduced the cost per child. The scientists show that, in comparison with the programs usually implemented, OptiMA makes it possible to treat 30% more children while using 20% less nutritional product.

“Our strategy allows us to treat malnutrition at an earlier stage, by decompartmentalizing the treatment of moderate and severe malnutrition. In the current context, after two years of the COVID-19 pandemic, we are facing a growing number of malnourished children and complex situations in African health centers. The proposal to simplify and optimize the treatment of child malnutrition is appropriate to the changing context in which we live,” emphasizes Renaud Becquet, Inserm researcher and joint leader of the Global Health in the Global South team at the Bordeaux Population Health Research Center (Inserm/Université de Bordeaux).

The same teams are now testing the strategy in another clinical trial, this time in Niger. They are comparing it with the national protocol and another simplified strategy to see if its benefits can be replicated in other settings and other populations.

What is acute malnutrition in children under 5 years of age?

Malnutrition is characterized by an imbalance between nutrient intake and the needs of the body. When this intake is insufficient, the body becomes weaker and loses weight.

Acute malnutrition mainly affects children under 5 years of age. It is characterized by sudden, severe weight loss due to insufficient calories and nutrients, following an infectious disease such as diarrhea, respiratory infections, malaria and/or lack of access to a varied diet in sufficient quantity.

Various criteria are used to diagnose acute malnutrition: calculation of the weight-for-height Z score (WHZ score), measurement of the mid-upper-arm circumference (MUAC), or the presence of edema. For example, moderate acute malnutrition is defined as a child with an MUAC between 115 and 125 mm, and severe acute malnutrition as a child with an MUAC of less than 115 mm.

The new OptiMA strategy proposes using only MUAC to screen for acute malnutrition because it is simple to measure: requiring few resources, it can quickly be taught to mothers, enabling them to detect malnutrition in their children at an early stage.
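The MUAC thresholds given in the box above translate into a very simple screening rule. A minimal sketch follows; it deliberately ignores the WHZ score and edema, which are part of the full diagnostic criteria, and is illustrative rather than a clinical tool.

```python
def classify_muac(muac_mm):
    """Classify acute malnutrition from mid-upper-arm circumference (MUAC)
    alone, using the thresholds quoted above. WHZ score and edema, also part
    of the diagnostic criteria, are not considered here."""
    if muac_mm < 115:
        return "severe acute malnutrition"
    if muac_mm < 125:
        return "moderate acute malnutrition"
    return "no acute malnutrition (by MUAC)"

print(classify_muac(110))  # severe acute malnutrition
print(classify_muac(120))  # moderate acute malnutrition
print(classify_muac(130))  # no acute malnutrition (by MUAC)
```

The simplicity of this rule is precisely what makes community-level screening by mothers feasible: a single measurement against two fixed thresholds.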

MICA: A New Immune Response Gene That Predicts Kidney Transplant Failure

Histological image of a kidney transplant rejection mediated by antibodies. © Sophie Caillard/Jérôme Olagne (Inserm U1109)

Although a kidney transplant is the only curative treatment for end-stage kidney disease, the risk of the patient’s body rejecting the graft means that success is not guaranteed. To reduce this risk, physicians are now able to look at a certain number of genetic and immunological parameters in order to evaluate the histocompatibility between donor and recipient – i.e. how compatible their organs and tissues are. Nevertheless, rejections continue to remain common, and many are unexplained. In a new study, researchers from Inserm, Université de Strasbourg and Strasbourg University Hospitals at Unit 1109 “Molecular Immunology and Rheumatology”, and their partners from the Laboratory of Excellence (LabEx) Transplantex, report that the MICA gene is a new histocompatibility gene, in that it helps to better explain and predict the success or failure of a kidney transplant. Their findings have been published in Nature Medicine.

Kidney transplant is currently the best way to treat patients with end-stage kidney disease. In France, an average of around 4,000 kidney transplants are performed each year (around 20,000 in the US). The kidneys mainly come from deceased donors, although the number of kidneys from living donors has been gradually increasing each year over the last two decades.

The possibility of rejection of the graft considered “foreign” by the recipient’s body is currently the main limitation of this procedure. While the use of immunosuppressant drugs[1] helps to reduce the risk, it does not eliminate it completely. “Chronic” rejection, which occurs over the years following the transplant, remains a major problem.

The discovery of the HLA system in the mid-20th century by French researcher Jean Dausset and his colleagues enabled major advances. This system is a set of proteins encoded by the HLA genes and present on the surface of our cells, particularly white blood cells.

Highly diverse and specific to each individual, this system makes it possible to assess the histocompatibility between donors and recipients – i.e. how compatible their organs and tissues are. The closer the HLA genes between donors and recipients, the lower the risk of rejection.
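As a purely illustrative sketch of what “closeness” of HLA genes can mean in practice, the fragment below counts donor alleles absent from the recipient’s genotype. The allele names and the two-allele-per-locus representation are hypothetical simplifications, and this is not a clinical matching algorithm.

```python
# Hypothetical two-allele genotypes per HLA locus; a crude mismatch count.
def hla_mismatches(donor, recipient):
    """Count donor alleles, locus by locus, that the recipient does not carry.
    Fewer mismatches means closer histocompatibility in this toy model."""
    total = 0
    for locus, donor_alleles in donor.items():
        recipient_alleles = set(recipient.get(locus, ()))
        total += sum(1 for allele in donor_alleles
                     if allele not in recipient_alleles)
    return total

donor = {"HLA-A": ("A*01", "A*02"), "HLA-B": ("B*07", "B*08")}
recipient = {"HLA-A": ("A*01", "A*03"), "HLA-B": ("B*07", "B*07")}
print(hla_mismatches(donor, recipient))  # 2 (A*02 and B*08 are unmatched)
```

Real histocompatibility assessment works at much finer resolution and over more loci, but the principle is the same: the fewer the donor alleles foreign to the recipient, the lower the expected risk of rejection.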

However, even when donor and recipient HLA genes are compatible, unexplained transplant rejections still occur. This phenomenon suggests that other as yet unidentified histocompatibility genes may play a role.

A role for the MICA gene

Researchers from Inserm, Université de Strasbourg and Strasbourg University Hospitals and their partners from LabEx Transplantex were therefore interested in a gene discovered almost thirty years ago by Seiamak Bahram[2] who coordinated this new research.

This gene, called MICA, encodes a protein expressed on several cell types. Previous studies had already suggested that this gene was important in predicting the outcome of a transplant, but the numbers of patients studied were too small (among other methodological limitations) to establish that it was a histocompatibility gene. Furthermore, these studies did not look at the MICA system as a whole, that is, at both its genetics (histocompatibility) and its serology (the presence of anti-MICA antibodies in the recipient’s blood).

In this latest study, the team studied MICA in over 1,500 kidney transplant recipients and their donors. Analyses of the MICA gene sequences show that when recipients and donors have a different version of the gene, the survival of the graft is reduced.

Furthermore, the researchers show that these MICA gene incompatibilities lead to the synthesis of antibodies directed against the donor’s MICA proteins, which are involved in transplant rejection. These antibodies are produced when the donor’s MICA proteins differ too greatly from those of the recipient.

These findings suggest that MICA is a relevant histocompatibility gene to consider when envisaging a transplant, and that testing for anti-MICA antibodies may also be useful in predicting the success or failure of the graft. They must now be validated in large-scale prospective studies in which MICA will be considered in the same way as classic HLA genes.

“Following this research, we can now consider the inclusion in routine clinical practice of MICA gene sequencing and the identification of anti-MICA antibodies in patients prior to transplantation to assess histocompatibility with the donor and post-transplant to improve the prevention of rejection. Finally, we also envisage studying the role of MICA in the transplantation of other solid organs, such as the heart, lung and liver,” emphasizes Seiamak Bahram.


[1] Treatments that limit the action of the immune system used in autoimmune diseases and transplants.

[2] University Professor-Hospital Practitioner, Director of Inserm Unit 1109 and LabEx Transplantex, and Head of the Department of Clinical Immunology Laboratory at Strasbourg University Hospitals.

Exposure to Air Pollution Linked to Increased Risk of Poor Cognitive Performance

Air pollution is linked to thousands of deaths each year in France. © Unsplash

Forty percent of dementia cases could be avoided or delayed by acting on modifiable factors, one of which is air pollution1. In order to explore the subject in greater depth and obtain accurate data on this risk factor, researchers from Inserm, Université de Rennes 1 and the EHESP School of Public Health at Irset sought to identify the impact on cognitive performance of three pollutants linked to road traffic (fine particles of less than 2.5 microns in diameter, nitrogen dioxide and black carbon). They compared the results of cognitive tests performed by a large sample of people according to their level of exposure to these different pollutants. The results of the study suggest a link between exposure to higher levels of pollutants and lower levels of cognitive performance, a link that differs depending on the pollutant. These results have been published in The Lancet Planetary Health.

The adverse health effects of air pollutants, even at low levels of exposure, are well documented. Recent research has suggested that, in addition to increasing the risk of developing cardiovascular and lung disease, air pollution could accelerate cognitive decline, which is a warning sign of a neurodegenerative disease such as Alzheimer’s2 or other forms of dementia.

In recent years, air pollution has been recognized as a “modifiable” risk factor for dementia, in that it is possible to take action by changing the regulations governing tolerated levels of pollution. However, until now, there had been no studies that simultaneously investigated different types of pollutants and their respective potential effects on the various areas of cognition.

Researchers from Inserm, Université de Rennes 1 and the EHESP School of Public Health at Irset studied how the level of exposure to air pollutants affects cognitive performance. Since a previous study had revealed that cognitive performance can begin to decline as early as 45 years of age3, the research team used data from over 61,000 participants aged 45 and older in the Constances epidemiological cohort.

All participants completed a series of tests measuring their cognitive performance in three major areas of cognition: memory, fluidity of verbal expression (or verbal fluency) and the capacity to make decisions (or executive functions4). The researchers established a cognitive performance score for each test, taking into account each participant’s sex, age and level of education.

To measure each participant’s exposure to pollution, the research team used so-called “exposure” maps that estimate the concentration of pollutants at their place of residence. These maps take into account several variables, such as road traffic density and the proximity of the place of residence to roads. Three pollutants related to road traffic were considered in the study: fine particles of less than 2.5 microns in diameter (PM2.5), nitrogen dioxide (NO2) and black carbon.

By comparing the results of the cognitive tests with the level of exposure to the three air pollutants, the study shows that exposure to higher levels of these pollutants is significantly linked to a lower level of performance in the three cognitive domains studied.

For the most exposed participants, the researchers found a difference of 1% to almost 5% in the cognitive performance score compared with the least exposed participants.

“The capabilities most affected are verbal fluency and executive functions,” specifies Bénédicte Jacquemin, the Inserm researcher who led the study. “Nitrogen dioxide and PM2.5 particles have a greater impact on verbal fluency, whereas black carbon has a greater impact on executive functions.”

She concludes: “The next step in our research is to look at how the cognitive functions of these adults change over time, in order to see whether exposure to pollution is also linked to a decline in cognitive function over time, which may reflect the early signs of dementia, be it Alzheimer’s disease or another form of dementia in the elderly.”


1 Livingston G, Huntley J, Sommerlad A, et al. Dementia prevention, intervention, and care: 2020 report of the Lancet Commission. The Lancet 2020; 396: 413–46.

2 Livingston G, Huntley J, Sommerlad A, et al. Dementia prevention, intervention, and care: 2020 report of the Lancet Commission. The Lancet 2020; 396: 413–46.

3 Singh-Manoux A, Kivimaki M, Glymour MM, Elbaz A, Berr C, Ebmeier KP, et al. Timing of onset of cognitive decline: results from Whitehall II prospective cohort study. BMJ 2012.

4 Set of cognitive processes (reasoning, planning, problem-solving, etc.) that enable us to adapt to the context or to new situations.

Significant Increase in Infant Mortality in France

In France, for the first time in peacetime, the infant mortality rate has risen significantly in the last ten years. ©Adobe Stock

The infant mortality rate (IMR) is a key indicator of population health. In the absence of updated data on the statistical trends of this indicator in France, researchers from Inserm, Université de Paris, the Paris public hospitals group (AP-HP) and Nantes University Hospital, in collaboration with teams from the University of California, analyzed civil registry data from the French National Institute of Statistics and Economic Studies (INSEE) from 2001 to 2019. They identified a significant increase in the IMR since 2012, thereby setting France apart from other high-income countries. The findings, published in The Lancet Regional Health – Europe, underscore the importance of more in-depth research into the precise causes of the approximately 1,200 excess deaths observed each year in France before one year of age.

The United Nations has made the elimination of preventable child deaths by 2030 one of its priority objectives. Given that the vast majority of child deaths occur during the first year of life, the infant mortality rate (IMR)1 is used to track progress towards this goal.

The IMR serves as a key indicator of population health, given its strong relationship with a country’s socio-economic development and the quality of its preventive and curative care. In some high-income countries, such as Finland and Sweden, the IMR has decreased continuously since World War II. In other countries, such as France, this decrease appears to be slowing down.

Scientists from Inserm, Université de Paris, the Paris public hospitals group (AP-HP), Nantes University Hospital and the University of California therefore carried out a more detailed statistical analysis of the evolution of the French IMR over the 2001–2019 period.

During this study period, the deaths of 53,077 infants were recorded for 14,622,096 live births, giving an average IMR of 3.63/1,000 (4.00 for boys, 3.25 for girls). Around one quarter of the deaths (24.4%) occurred during the first day of life and half (47.8%) in the early neonatal period – the first week following birth.

An in-depth statistical analysis identified two inflexion points, in 2005 and 2012 (see figure above). The IMR decreased sharply from 2001 to 2005, then more slowly from 2005 to 2012. From 2012, a significant 7% increase in the IMR was observed: infant mortality rose from 3.32 deaths per 1,000 live births in 2012 to 3.56 in 2019. Sensitivity analyses2 showed this trend to be unrelated to changes in registration practices or in medical practices for the management of newborns with serious conditions. Subgroup analyses showed the increase to be mainly due to a rise in the IMR in the early neonatal period.
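For readers who wish to check the headline figures, the arithmetic is straightforward: the average IMR follows directly from the reported death and birth counts, and the 2012–2019 change corresponds to the stated 7% rise. A minimal sketch, using only the values quoted in the study:

```python
# Reproducing the reported figures from the study's own numbers.
deaths = 53_077            # infant deaths recorded, 2001-2019
live_births = 14_622_096   # live births recorded, 2001-2019

avg_imr = deaths / live_births * 1000
print(f"Average IMR: {avg_imr:.2f} per 1,000 live births")  # 3.63

imr_2012, imr_2019 = 3.32, 3.56
increase_pct = (imr_2019 - imr_2012) / imr_2012 * 100
print(f"Increase 2012-2019: {increase_pct:.0f}%")  # 7%
```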

“Thanks to in-depth statistical analyses, we have identified a significant increase in the infant mortality rate in France since 2012. When comparing the data against other European countries with similar economies, such as Sweden and Finland, we observe that every year in France there is an excess of around 1,200 deaths of children under one year of age,” explains Prof. Martin Chalumeau, last author of the study. “It is essential to be able to explore in detail the causes of this increase by having, for example, systematic information on the specific medical and social circumstances of these deaths and by making this population, which is the most vulnerable, a real research and public health priority, which is not the case at present,” the researcher concludes.

1 Infant mortality rate (IMR) is defined as the number of deaths of children under one year of age (D0-D364) per 1,000 live births over a given period

2 Additional analyses to support the robustness of the main analyses
