
Cell contraction drives the initial shaping of human embryos

Human embryo at the blastocyst stage, ready to implant. The nuclear envelopes of the cells appear in blue and the actin cytoskeleton in orange. © Julie Firmin and Jean-Léon Maître (Institut Curie, Université PSL, CNRS UMR3215, INSERM U934)

Human embryo compaction, an essential step in the first days of an embryo’s development, is driven by the contractility of its cells. This is the finding of a team of scientists from the CNRS, Institut Curie, Inserm, AP-HP and the Collège de France. Published in Nature on 1 May, these results contradict the presumed driving role of cell adhesion in this phenomenon and pave the way for improved assisted reproductive technology (ART).

In humans, embryonic cell compaction is a crucial step in the normal development of an embryo. Four days after fertilisation, cells move closer together to give the embryo its initial shape. Defective compaction prevents the formation of the structure that ensures the embryo can implant in the uterus. In assisted reproductive technology (ART), this stage is carefully monitored before an embryo is implanted.

An interdisciplinary research team1 led by scientists at the Genetics and Developmental Biology Unit at the Institut Curie (CNRS/Inserm/Institut Curie) studying the mechanisms at play in this still little-known phenomenon has made a surprising discovery: human embryo compaction is driven by the contraction of embryonic cells. Compaction problems are therefore due to faulty contractility in these cells, and not a lack of adhesion between them, as was previously assumed. This mechanism had already been identified in flies, zebrafish and mice, but is a first in humans.

By improving our understanding of the early stages of human embryonic development, the research team hopes to contribute to the refinement of ART as nearly one third of inseminations are unsuccessful today.2

The results were obtained by mapping cell surface tensions in human embryonic cells. The scientists also tested the effects of inhibiting contractility and cell adhesion and analysed the mechanical signature of embryonic cells with defective contractility.

Human embryo at the 4-cell stage. Cell DNA appears in red and the actin cytoskeleton in blue. The cell on the right has just split its genome in two and is about to divide. © Julie Firmin and Jean-Léon Maître (Institut Curie, Université PSL, CNRS UMR3215, INSERM U934)

 

1 – Scientists from the following entities also took part in the study: the Centre interdisciplinaire de recherche en biologie (CNRS/Collège de France/Inserm), the Reproductive Biology Department – CECOS (AP-HP) and the Institut Cochin (CNRS/Inserm/Université Paris Cité).

2 – Source: Agence de la biomédecine

The consumption of certain food additive emulsifiers could be associated with the risk of developing type 2 diabetes

Emulsifiers are among the most commonly used additives. They are often added to processed and packaged foods such as certain industrial cakes, biscuits and desserts, as well as yoghurts and ice creams. © Kenta Kikuchi on Unsplash

Emulsifiers are among the additives most widely used by the food industry, helping to improve the texture of food products and extend their shelf life. Researchers from Inserm, INRAE, Université Sorbonne Paris Nord, Université Paris Cité and Cnam, as part of the Nutritional Epidemiology Research Team (CRESS-EREN), studied the possible links between the dietary intake of food additive emulsifiers and the onset of type 2 diabetes between 2009 and 2023. They analysed the dietary and health data of 104 139 adults participating in the French NutriNet-Santé cohort study, specifically evaluating their consumption of this type of additive using dietary surveys conducted every six months. The findings suggest an association between chronic consumption of certain emulsifier additives and a higher risk of diabetes. The study is published in The Lancet Diabetes & Endocrinology.

In Europe and North America, 30 to 60% of dietary energy intake in adults comes from ultra-processed foods. An increasing number of epidemiological studies suggest a link between higher consumption of ultra-processed foods and higher risks of diabetes and other metabolic disorders.

Emulsifiers are among the most commonly used additives. They are often added to processed and packaged foods such as certain industrial cakes, biscuits and desserts, as well as yoghurts, ice creams, chocolate bars, industrial breads, margarines and ready-to-eat or ready-to-heat meals, in order to improve their appearance, taste and texture and lengthen shelf life. These emulsifiers include for instance mono- and diglycerides of fatty acids, carrageenans, modified starches, lecithins, phosphates, celluloses, gums and pectins.

As with all food additives, the safety of emulsifiers had been previously evaluated by food safety and health agencies based on the scientific evidence that was available at the time of their evaluation. However, some recent studies suggest that emulsifiers may disrupt the gut microbiota and increase the risk of inflammation and metabolic disruption, potentially leading to insulin resistance and the development of diabetes.

For more information: read Inserm’s report on type 2 diabetes

For the first time worldwide, a team of researchers in France has studied the relationship between dietary intakes of emulsifiers, assessed over a follow-up period of up to 14 years, and the risk of developing type 2 diabetes in a large general-population study.

The results are based on the analysis of data from 104 139 adults in France (average age 43 years; 79% women) who participated in the NutriNet-Santé web-cohort study (see box below) between 2009 and 2023.

The participants completed at least two days of dietary records, collecting detailed information on all foods and drinks consumed and their commercial brands (in the case of industrial products). These dietary records were repeated every six months for 14 years and matched against databases to identify the presence and amount of food additives (including emulsifiers) in the products consumed. Laboratory assays were also performed to provide quantitative data. This made it possible to measure chronic exposure to these emulsifiers over time.

During follow-up, participants reported the development of diabetes (1056 cases diagnosed), and reports were validated using a multi-source strategy (including data on diabetes medication use). Several well-known risk factors for diabetes, including age, sex, weight (BMI), educational level, family history, smoking, alcohol and levels of physical activity, as well as the overall nutritional quality of the diet (including sugar intake) were taken into account in the analysis.

After an average follow-up of seven years, the researchers observed that chronic exposure – evaluated by repeated data – to the following emulsifiers was associated with an increased risk of type 2 diabetes:

  • carrageenans (total carrageenans and E407; 3% increased risk per increment of 100 mg per day)
  • tripotassium phosphate (E340; 15% increased risk per increment of 500 mg per day)
  • mono- and diacetyltartaric acid esters of mono- and diglycerides of fatty acids (E472e; 4% increased risk per increment of 100 mg per day)
  • sodium citrate (E331; 4% increased risk per increment of 500 mg per day)
  • guar gum (E412; 11% increased risk per increment of 500 mg per day)
  • gum arabic (E414; 3% increased risk per increment of 1000 mg per day)
  • xanthan gum (E415; 8% increased risk per increment of 500 mg per day)
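
These per-increment percentages can be projected to other intake levels. The sketch below assumes, as is conventional for hazard ratios of this kind, that the reported increase compounds multiplicatively with each increment; the function name and the 250 mg/day example intake are illustrative, not taken from the study.

```python
def scaled_risk_increase(pct_per_increment, increment_mg, intake_mg):
    """Project a per-increment risk increase to an arbitrary daily intake.

    Illustrative helper: assumes the reported percentage increase compounds
    multiplicatively per increment; the study's own statistical model may differ.
    """
    hazard_ratio = 1 + pct_per_increment / 100
    return hazard_ratio ** (intake_mg / increment_mg) - 1

# Carrageenans: 3% increased risk per 100 mg/day.
# A hypothetical intake of 250 mg/day would then correspond to
# roughly an 8% increase, since (1.03)**2.5 ≈ 1.077.
print(f"{scaled_risk_increase(3, 100, 250):.1%}")
```

Under this assumption the risk scales smoothly with intake rather than in steps, which is how per-increment hazard ratios are usually read.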

This study constitutes an initial exploration of these relationships, and further investigations are now needed to establish causal links. The researchers mentioned several limitations of their study, such as the predominance of women in the sample, a higher level of education than the general population, and generally more health-promoting behaviours among the NutriNet-Santé study participants. Therefore caution is needed when extrapolating the conclusions to the entire French population.

The study is nevertheless based on a large sample size, and the researchers have accounted for a large number of factors that could have led to confounding bias. They also used unique, detailed data on exposure to food additives, down to the commercial brand name of the industrial products consumed. In addition, the results remain consistent through various sensitivity analyses[1], which reinforces their reliability.

‘These findings come from a single observational study and cannot, on their own, be used to establish a causal relationship. They need to be replicated in other epidemiological studies worldwide, and supplemented with toxicological and interventional experimental studies, to further elucidate the mechanisms linking these food additive emulsifiers and the onset of type 2 diabetes. However, our results represent key elements to enrich the debate on re-evaluating the regulations around the use of additives in the food industry, in order to better protect consumers,’ explain Mathilde Touvier, Research Director at Inserm, and Bernard Srour, Junior Professor at INRAE, lead authors of the study.

Among the next steps, the research team will be looking at variations in certain blood markers and the gut microbiota linked to the consumption of these additives, to better understand the underlying mechanisms. The researchers will also look at the health impact of additive mixtures and their potential ‘cocktail effects.’ They will also work in collaboration with toxicologists to test the impact of these exposures in in vitro and in vivo experiments, to gather more arguments in favour of a causal link.

NutriNet-Santé is a public health study coordinated by the Nutritional Epidemiology Research Team (CRESS-EREN, Inserm/INRAE/Cnam/Université Sorbonne Paris Nord/Université Paris Cité) which, thanks to the commitment and loyalty of over 170 000 participants (known as Nutrinautes), advances research into the links between nutrition (diet, physical activity, nutritional status) and health. Launched in 2009, the study has already given rise to over 270 international scientific publications. In France, a call to recruit new participants is still ongoing in order to continue to further public research into the relationship between nutrition and health.

By devoting a few minutes per month to answering questionnaires on diet, physical activity and health through the secure online platform etude-nutrinet-sante.fr, the participants contribute to furthering knowledge, towards a healthier and more sustainable diet.

[1] Sensitivity analyses in epidemiology aim to test the robustness of statistical models by varying certain parameters, hypotheses or variables in the model to assess the stability of the associations observed. For example, in this study, additional account was taken of sweetener consumption, weight gain during follow-up and other metabolic diseases.

Preventing cardiovascular risk thanks to a tool for measuring arterial stiffness

© Adobe Stock

Cardiovascular diseases represent the leading cause of death worldwide[1]. Preventing cardiovascular risk by identifying the people most susceptible to these diseases is a major public health challenge. In a new study in this field, researchers from Inserm, Université de Lorraine and Nancy Regional University Hospital opted to focus on arterial stiffness and how it changes with age, given that ageing is associated with a loss of arterial flexibility. Thanks to health data collected from over 1,250 Europeans, their research confirms that the stiffer the arteries, the greater the cardiovascular risk. The scientists suggest measuring arterial stiffness as a way to predict a patient’s cardiovascular risk, and emphasise the utility of a specific clinical tool called the Cardio-Ankle Vascular Index (CAVI). These findings have been published in eBioMedicine.

Cardiovascular risk is the probability of developing a cardiovascular disease or accident (problems affecting the heart and arteries). Finding a measurement that can predict this risk through the early detection of the factors that can influence it is a major challenge for research. The risk factors that are already well-known are high blood pressure, smoking, diabetes, high cholesterol, excess weight and sedentary lifestyle.

Previous studies have shown that ageing affects the flexibility of our arteries in that they become increasingly rigid over time. What is more, the scientific literature indicates that this phenomenon may be accelerated by other factors during ageing, such as hypertension or diabetes, and is associated with an increased cardiovascular risk. On the basis of these factors, it had been suggested that looking at arterial stiffness could be useful in preventing cardiovascular risk. However, testing for arterial stiffness is not on the list of recommended clinical practices.

In a new study, researchers from Inserm, Université de Lorraine and Nancy Regional University Hospital looked at a tool for measuring arterial stiffness called the Cardio-Ankle Vascular Index (CAVI), hypothesising that its use in clinical practice could predict patient cardiovascular risk.

The scientists were specifically interested in CAVI because of its accuracy, non-invasiveness and the fact that it is not influenced by blood pressure but reflects the structure of the artery itself. CAVI is measured using four cuffs – one around each arm and one around each ankle – assessing stiffness from the femoral artery to the tibial artery. A microphone is also placed on the heart. The tool measures the speed of blood flow and calculates an index: the higher the number, the stiffer the arteries[2].

During their research, the scientists followed 1,250 people from 18 European countries, all over the age of 40[3]. They provided their medical history and underwent a physical examination, including an assessment of their arterial stiffness using CAVI. They were then invited for a follow-up examination 2 years after the first measurement, and for some, up to 5 years after the first measurement. The aim of the follow-up was to assess the progression of the arterial stiffness and correlate this with the participants’ general state of health.

Thanks to their measurements, the researchers were able to observe that each one-point increase in CAVI, which corresponds to an approximate 10% increase in arterial stiffness, was associated with a 25% increased risk of a cardiovascular event in the years following the measurement.
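
The per-point figure can be extrapolated in the same spirit. The short helper below assumes the 25% increase per CAVI point compounds multiplicatively across points; both the function and the example values are illustrative, not the study’s statistical model.

```python
def compounded_cardiovascular_risk(cavi_increase_points, risk_per_point=0.25):
    """Relative risk of a cardiovascular event for a given CAVI increase.

    Illustrative helper: assumes the reported 25% increase per one-point
    rise in CAVI compounds multiplicatively; the study's model may differ.
    """
    return (1 + risk_per_point) ** cavi_increase_points

# One extra point -> 1.25x the baseline risk; two points -> about 1.56x.
```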

The researchers also looked at what might influence arterial stiffness. They saw that age affected not just the CAVI value but also its progression, in that it increases more rapidly with age. They also observed the impact of blood pressure: the higher the blood pressure, the higher the CAVI.

The scientists then tried to determine a threshold for arterial stiffness that would be associated with an increased cardiovascular risk and could be commonly recognised and adopted by clinicians, in order to implement more intensive patient monitoring. They found that a CAVI of over 9.25 was associated with a high cardiovascular risk from the age of 60.

Finally, they observed that treatment for cholesterol or diabetes affected the rate of progression of arterial stiffness. Although these observations are still being studied, they do suggest that certain treatments could help slow the progression of arterial stiffness.

‘Our findings suggest that CAVI could be a quick, easy and non-invasive tool for predicting cardiovascular risk. In the future, it could be included on the list of tests recommended in clinics to predict a person’s cardiovascular risk and provide preventive monitoring,’ explains Magnus Bäck, first author of the study.

‘As well as being easy to deploy, we could use CAVI to determine the actual age of the cardiovascular system,’ explains Athanase Benetos, the study’s final author.

 

[1] WHO data: https://www.who.int/fr/health-topics/cardiovascular-diseases#tab=tab_1

[2] An index of 10 is already a sign of pronounced arterial stiffness.

[3] TRIPLE-A-Stiffness is an international longitudinal cohort study that recruited more than 2,000 participants over 40 years of age from 18 European countries. Of these, 1,250 subjects (55% of them women) were followed for a median duration of 3.82 (2.81–4.69) years.

Menstrual cycle regularity: a biological clock driven by the moon?

© Adobe Stock

Because of their cyclical rhythm and similar durations, the menstrual and lunar cycles have often been assumed to be linked, despite no solid evidence so far to support this. To gain a better understanding of the origin of the rhythmic regularity of the menstrual cycle, an international research team involving Inserm, CNRS and Université Claude Bernard Lyon 1 compared a large amount of data on cycles collected from studies conducted in Europe and North America. Its findings show that the menstrual cycle is finely regulated by an internal clock, which in turn is occasionally influenced by the lunar cycle. This research, to be published in Science Advances, argues in favour of further study of this potential link, to identify the potential relevance of chronobiology in the treatment of fertility disorders.

The average length of a typical human ovarian cycle, or menstrual cycle, is 29.3 days, varying from one person to another and from one cycle to another within the same person. It begins on the first day of menstruation and consists of three phases, each dedicated to a specific process linked to ovulation, which occurs around day 14 of the cycle.

Some studies have suggested that each phase may be influenced by an internal clock, the disruption of whose rhythm is associated with irregularities in the menstrual cycle.

In humans, the best-known internal clock is the circadian clock, whose period is very close to 24 hours and which maintains the sleep-wake cycle and various physiological rhythms. It is synchronised with the day-night cycle under the influence of light. When the circadian clock is disrupted – such as with jet lag – it takes a few days to return to its normal rhythm, resynchronising with the new day-night cycle.

In the case of menstrual cycles, the involvement of an internal clock could manifest in a similar way: the cycle length should be highly stable within individuals and, if disrupted, its optimal rhythm should be restored by mechanisms of adaptation through synchronisation with external conditions.

So what could this ‘external synchroniser’ be? One recurring theory suggests that the lunar cycle could play this role, but scientific evidence is lacking to date.

An international research team led by Claude Gronfier, Inserm researcher at the Lyon Neuroscience Research Centre (Inserm/CNRS/Université Claude Bernard Lyon 1), has investigated the potential existence of an internal biological clock that would regulate the menstrual cycle and which may be synchronised with the lunar cycle. Using a large database of menstrual cycles collected in European and North American studies, the team was able to compare a total of approximately 27,000 menstrual cycles in 2,303 European women and approximately 4,800 cycles in 721 North American women.

The researchers began by examining the stability from one menstrual cycle to another at an individual level, by comparing the lengths of successive cycles. They observed that the average length of each participant’s cycle was stable overall, even though out of a series of successive cycles some were longer or shorter than that person’s ‘standard’ cycle.

‘These observations suggest the existence of a mechanism that corrects the difference between the length of the current cycle and that of a typical menstrual cycle in the person concerned,’ explains René Écochard, first author of the study, a doctor at the Hospices Civils de Lyon and professor at Université Claude Bernard Lyon 1. ‘A few shorter cycles could therefore compensate for a series of longer cycles, so that the total length remains around the person’s usual cycle length. The length of a cycle could therefore depend on the length of previous cycles.’

‘The observation of this phenomenon argues in favour of the existence of an internal clock that finely regulates menstrual cycles, itself synchronised by a cyclical environmental event,’ adds Gronfier.

Secondly, the research team looked at the potential relationships between the onset of menstruation in the cycles studied and the phases of the moon at the time the data was collected.

The team observed an occasional but significant association between the menstrual cycle and the lunar cycle, with, however, a major difference between the European cohorts and the North American cohort that the research could not explain: in the European cohorts, the cycle began more often during the waxing phase of the moon, whereas in the North American cohort it began more often at full moon.

‘Despite this astonishing difference, which we are unable to explain at present, the links identified in this research between the lunar and menstrual cycles, through their proximity to certain phenomena observed in chronobiology, suggest that the periodicity of menstruation and ovulation could be influenced, in a modest but significant way, by the lunar cycle,’ adds Gronfier.

These findings therefore argue in favour of an internal clock system with a near-monthly rhythm, synchronised to a small extent by the lunar cycle. However, they need to be studied in more depth and confirmed by laboratory studies and larger epidemiological studies.

‘Thanks to smartphone apps for recording cycles, the emergence of large databases containing information on the cycles of several hundreds of thousands of women could provide new opportunities for studies,’ says Écochard.

‘Confirming the existence of an internal clock coordinating the menstrual cycle, as well as the mechanisms involved in its synchronisation, could make it possible to apply personalised “circadian” medicine approaches – which are already used in oncology and for the treatment of sleep disorders or depression, for example – to problems such as ovulation and fertility disorders,’ concludes Gronfier.

Respiratory allergies: newly discovered molecule plays a major role in triggering inflammation

Microscopic visualisation of immune cells (in green) activated by the alarmins TL1A and interleukin-33 during the onset of allergic inflammation in the lungs. ILC2 immune cells produce large quantities of interleukin-9, a key mediator of allergic inflammation. They are located near collagen fibres (blue) and blood vessels in the lung (red). © Jean-Philippe GIRARD – IPBS (CNRS/UT3 Paul Sabatier).

  • Inflammation plays a major role in allergic diseases, affecting at least 17 million people in France, including 4 million asthmatics.
  • One of the molecules that initiates this process in the respiratory tract has just been identified.
  • This molecule, a member of the alarmin family, is a major therapeutic target for the development of new treatments for respiratory allergies.

One of the molecules responsible for triggering the inflammation that causes allergic respiratory diseases, such as asthma and allergic rhinitis, has just been discovered by scientists from the CNRS, Inserm and the Université Toulouse III – Paul Sabatier. This molecule, from the alarmin family, represents a therapeutic target of major interest for the treatment of allergic diseases. The study, co-directed by Corinne Cayrol and Jean-Philippe Girard, is published in the Journal of Experimental Medicine on 10 April1.

The inflammation process plays a crucial role in allergic respiratory diseases, such as asthma and allergic rhinitis. Although the pulmonary epithelium, the carpet of cells that forms the inner surface of the lungs, is recognised as a major player in the respiratory inflammation that causes these diseases, the underlying mechanisms are still poorly understood.

A research team has identified one of the molecules responsible for triggering these allergic reactions, in a study co-led by two CNRS and Inserm scientists working at the Institut de pharmacologie et de biologie structurale (CNRS/Université Toulouse III – Paul Sabatier). This molecule from the alarmin family, named TL1A, is released by lung epithelial cells a few minutes after exposure to a mould-type allergen. It cooperates with another alarmin, interleukin-33, to alert the immune system. This double alarm signal stimulates the activity of immune cells, triggering a cascade of reactions responsible for allergic inflammation.

Alarmins, therefore, constitute major therapeutic targets for the treatment of respiratory allergic diseases. In a few years’ time, treatments based on antibodies blocking the TL1A alarmin could benefit patients suffering from severe asthma or other allergic diseases. In France, at least 17 million people are affected by allergic diseases2 with the most severe forms of asthma being responsible for several hundred deaths every year3.

 

  1. This study was supported by the ANR.
  2. According to the Ministère du travail, de la santé et des solidarités : https://sante.gouv.fr/sante-et-environnement/air-exterieur/pollens-et-allergies/article/effets-des-pollens-sur-la-sante; 13/04/2023
  3. According to Santé Publique France : https://www.santepubliquefrance.fr/maladies-et-traumatismes/maladies-liees-au-travail/asthme; 25/10/2023

A study on the pre-COVID-19 health of French citizens to help decision-makers reduce the disease burden

A new study provides an overview of the health of French citizens in 2019, just prior to the pandemic, and its evolution from 1990 to 2019 © Unsplash

In order to implement suitable public health policies, it is crucial to know the health status of the population and its evolution over time. This knowledge is all the more important given the severe disruptions caused by the COVID-19 pandemic to healthcare systems worldwide. For the first time, a study by teams from Inserm, Université de Bordeaux and Bordeaux University Hospital, in collaboration with Santé publique France, French National Health Insurance (CNAM) and the Global Burden of Disease (GBD) study collaborators, provides an overview of the health of French citizens in 2019, just prior to the pandemic, and its evolution from 1990 to 2019. This research also compares the French situation with that of other European countries, resulting in an accurate assessment to guide public decision-making and inform reflection on the broader impact of COVID-19 on French health. The study findings have been published in The Lancet Regional Health.

The COVID-19 pandemic has disrupted the organisation of healthcare systems and exacerbated problems that many countries, including France, were already facing, such as inequalities in healthcare access, hospital pressures and shortages of healthcare professionals. In order to address these challenges beyond the usual surveillance systems, it is essential to make a more accurate diagnosis of the health of French citizens prior to the COVID-19 crisis, particularly in an attempt to analyse its specific impact on different health indicators.

This is the proposal of a new study by teams from Inserm, Université de Bordeaux and Bordeaux University Hospital, in collaboration with Santé publique France and French National Health Insurance (CNAM). This study evaluated several French health indicators in 2019 prior to the pandemic, as well as their evolution from 1990 to 2019, and offers a comparison of France’s situation with that of other Western European countries over the same period.

The added value of GBD data

The Global Burden of Disease (GBD) study, coordinated by the Institute for Health Metrics and Evaluation, has been conducted since 1990 by a global network of 5,647 collaborators across 152 countries and territories. The 2019 study analyses 286 causes of death, 369 diseases and injuries, and 87 risk factors in 204 countries and territories. GBD has been used to inform health policies in many nations and local jurisdictions, as well as by international organisations such as the World Bank and the World Health Organization.

However, this wealth of data had never previously been used and presented specifically for France to describe the health evolution of its citizens through indicators reflecting sociodemographic status, life expectancy, healthy life expectancy and years lived with disability.

Some definitions

Life expectancy: this is the life expectancy at birth, which represents the average life expectancy of a fictitious generation subject to the age-specific mortality conditions prevailing that year.

Healthy life expectancy: this is the average life expectancy in good health – i.e. without irreversible limitation of activity in daily life or disability – of a fictitious generation subject to the conditions of mortality and morbidity prevailing that year.

Years of life lost: this is an indicator of premature mortality. It represents the number of life years lost due to a disease having caused premature death in relation to the life expectancy of the population.

Years lived with disability: this is an indicator used to estimate morbidity – i.e. the ‘weight of a disease’ in terms of disability – in years lived with the disability for a given disease. This number of years is weighted according to the nature of the disability.

Disability-adjusted life-years (DALYs): this is the number of years of healthy life ‘lost’ due to illness, disability or death, and is the sum of the two previous indicators (years of life lost + years lived with disability).
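
The arithmetic tying these definitions together can be sketched in a few lines of Python. The simplified formulas (YLL as deaths times remaining standard life expectancy, YLD as prevalent cases times a disability weight) follow the usual GBD construction, and the numbers are made up purely for illustration:

```python
def years_of_life_lost(deaths, remaining_life_expectancy):
    # Premature deaths weighted by the standard life expectancy
    # remaining at the age of death.
    return deaths * remaining_life_expectancy

def years_lived_with_disability(cases, disability_weight):
    # Cases weighted by severity: 0 = full health, 1 = equivalent to death.
    return cases * disability_weight

def dalys(yll, yld):
    # DALYs are the sum of the two component indicators.
    return yll + yld

# Illustrative figures: 1,000 premature deaths losing 20 years each,
# plus 50,000 prevalent cases with a disability weight of 0.2.
total = dalys(years_of_life_lost(1000, 20),
              years_lived_with_disability(50000, 0.2))
# total = 20,000 + 10,000 = 30,000 DALYs
```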

Improved life expectancy and healthy life expectancy

The results of the analysis confirm that over the period considered (1990 to 2019), life expectancy at birth in France improved from 77.2 years in 1990 to 82.9 years in 2019, ranking it seventh among the 23 Western European countries studied. What is more, the French live longer in good health on average, with healthy life expectancy having also increased, from 67 to 71.5 years, placing the country in fourth position.

The increase in healthy life expectancy suggests improvements in certain areas, such as better care or more appropriate prevention, which limit the development of diseases. The study also makes it possible to estimate the burden of diseases according to their impact on the various GBD indicators.

Compared to other European countries, cardiovascular diseases play a smaller role in morbidity and mortality in France.

‘We have observed that the stroke and ischaemic heart disease burden is lower in France than in other Western European countries. This finding, which has been seen in the past, could be explained by a lower prevalence of many cardiovascular risk factors (hypertension, diabetes) and a healthier lifestyle (exercise, diet) in France. Efforts must continue to be made to prevent and treat these diseases, whose prevalence nevertheless continues to remain high,’ explain the authors.

It is also important to provide better care for mental health disorders (particularly depressive and anxiety disorders) and musculoskeletal disorders (particularly low back pain), which represent the main reasons for years lived with disability.

The study also highlights the progress that needs to be made in preventing cancer, particularly by continuing efforts to combat smoking. In France, as in other European countries, cancer is still the leading cause of mortality.

‘Overall, these findings highlight a marked trend towards the improvement of health in France. They should encourage decision-makers to design intervention strategies to reduce the burden of morbidity and mortality, paying particular attention to causes such as cancer, cardiovascular diseases, mental health and musculoskeletal disorders,’ the authors point out.

This study is therefore a valuable resource, complementing the regular epidemiological surveillance conducted by Santé publique France in particular, to guide public policies and implement relevant measures to improve prevention initiatives and access to healthcare. It is also an important first step towards improving our understanding of the impact of the COVID-19 pandemic on the health of French people. The same study will now have to be carried out with the data collected at the end of the health crisis on these same key indicators, in order to highlight any changes in the population’s health.

Amyotrophic lateral sclerosis: a new avenue for improving patient diagnosis and follow-up

Noradrenergic neurons in the mouse locus coeruleus, whose dysfunction contributes to cortical hyperexcitability in ALS. © Caroline Rouaux

Amyotrophic lateral sclerosis, or Charcot’s disease, is a neurodegenerative disease that results in progressive paralysis and subsequent death. Diagnosing it is difficult and no curative treatment exists to date, making both key challenges for research. In a new study, Inserm researcher Caroline Rouaux and her team at the Strasbourg Biomedical Research Centre (Inserm-Université de Strasbourg), in collaboration with researchers from Ludwig Maximilian University in Munich, CNRS and Sorbonne Université, show that electroencephalography could become a diagnostic and prognostic tool for the disease. Thanks to this type of examination, the scientists were able to reveal an atypical brain wave profile that could prove to be specific to the disease. Through this research, published in Science Translational Medicine, a potential therapeutic target has also been discovered. These are fundamental advances that could ultimately benefit patients.

Amyotrophic lateral sclerosis (ALS), otherwise known as Charcot’s disease, remains a veritable challenge for clinicians. This neurodegenerative disease, which most often develops between the ages of 50 and 70, leads to progressive paralysis and death within just two to five years. It is caused by the death of the motor neurons – the nerve cells that control the muscles, both in the brain (central motor neurons) and in the spinal cord (peripheral motor neurons).

Diagnosing ALS is difficult because its initial signs vary from person to person: weakness or cramps in an arm or leg, trouble swallowing, slurred speech, etc. In addition, there is no biomarker specific to the disease. It is therefore diagnosed by ruling out other conditions that can lead to motor disorders, which usually takes one to two years after the onset of symptoms, delaying the deployment of therapeutic measures and reducing the chances of early inclusion in clinical trials.

It was with the aim of shortening this time frame that Caroline Rouaux’s team at the Strasbourg Biomedical Research Centre, in collaboration with the teams of Sabine Liebscher in Munich and Véronique Marchand-Pauvert, Inserm researcher in Paris, tested the use of electroencephalography1. This inexpensive and easy-to-use technique involves placing electrodes on the surface of the skull to record brain activity in the form of waves.

The examination performed in subjects with ALS and in corresponding animal models revealed an imbalance between two types of waves associated with excitatory and inhibitory neuron activity, respectively. This imbalance, in favour of greater excitatory neuron activity to the detriment of inhibitory neurons, reflects cortical hyperexcitability.

‘This phenomenon is no surprise: it had already been described with other investigation methods, but these are rarely used because they are difficult to implement and only work at the very beginning of the disease. Electroencephalography, however, is minimally invasive, very inexpensive, and can be used at different times during the disease. In addition, the atypical brain wave profile revealed by electroencephalography could prove to be specific to the disease,’ explains Rouaux, Inserm researcher and last author of the study.

Indeed, analysis of the electroencephalographic recording of the brain’s electrical activity reveals various types of brain waves of differing amplitudes and frequencies. One of these, the theta wave, reflects the activity of the excitatory neurons that transmit stimulating messages to other neurons, while another, the gamma wave, reflects that of the inhibitory neurons that block the transmission of nerve messages.

The study reveals that in humans and animals with ALS, the interaction between these two wave types is atypical, revealing an imbalance between the excitatory and inhibitory activities. Not only was this imbalance found in all the subjects tested, but the scientists also showed that the more the symptoms of the disease progress, the greater the imbalance. In addition, this atypical wave pattern was detected in animals even before the onset of the first motor symptoms.

If these initial findings are confirmed, electroencephalography could in the future serve as a prognostic tool for already-diagnosed patients in order to evaluate, for example, the response to a medication, or even as a diagnostic tool in the event of symptoms suggestive of the disease.

In the second part of this research, the scientists were able to study, in patients and mice, the mechanisms behind the hyperexcitability observed. First, they measured the levels of the different neuromodulators produced by the neurons to communicate with each other, and found a deficiency in one of them: noradrenaline was present in smaller amounts in the brains of the patients and mice with ALS compared to healthy brains.

To verify the role of noradrenaline, they blocked the production of this neuromodulator in healthy animals, and showed that doing so causes cortical hyperexcitability, such as that observed in the disease. And conversely, by administering molecules that stimulate the action of noradrenaline in a mouse model of ALS, the scientists reduced the hyperexcitability and restored brain activity equivalent to that of healthy mice.

‘This discovery could mark the opening of a new therapeutic avenue in ALS, provided that cortical hyperexcitability is indeed associated with disease progression. Indeed, while we have seen an association between the two in our study, no causal link has been established for the moment. This is what we will be checking in the coming months,’ concludes Rouaux.

 

1 Electroencephalography is commonly used for research purposes in neurology but also in clinical practice. The examination provides information on brain activity in the event of sleep disorders, after a stroke, or even in the case of coma. It can also be used to diagnose encephalitis, epilepsy or confirm brain death.

Physical and mental well-being of older adults: a positive impact of meditation and health education

Learning mindfulness meditation improves self-compassion, while health education promotes an increase in physical activity. © AdobeStock

A team from Inserm and Université de Caen Normandie, in collaboration with researchers from the University of Jena (Germany) and University College London (UK), has studied the potential benefits of meditation and health education interventions in people who feel that their memory is in decline. This research was performed as part of the European H2020 Silver Santé Study programme coordinated by Inserm[1]. It shows that learning mindfulness meditation improves self-compassion, while health education promotes an increase in physical activity. These findings, published in Alzheimer’s & Dementia: Diagnosis, Assessment & Disease Monitoring, propose new avenues to support healthier ageing.

‘Subjective cognitive decline’ is when people feel that their cognitive faculties have deteriorated without this being apparent in standard cognitive tests. Studies have shown that such people have a higher risk of developing actual cognitive decline.

Previous studies had concluded that mindfulness meditation and health education (a practice in which people implement preventive measures and actions that are beneficial for their health) had a positive impact, which was still present six months later, on anxiety in people reporting subjective cognitive decline.

More generally, self-compassion (feeling of kindness toward oneself, having a sense of common humanity, and having an awareness of negative thoughts and feelings without over-identification) and exercise have previously been associated with better mental health, itself associated with improved general health, well-being, and quality of life.

A European research group coordinated by Julie Gonneaud, Inserm researcher at the Physiopathology and Imaging of Neurological Disorders laboratory (Inserm/Université de Caen Normandie), Olga Klimecki, researcher at the University of Jena (Germany), and Nathalie Marchant, researcher at University College London (UK), studied the impact of eight weeks of mindfulness meditation and health education courses on self-compassion and physical activity in people reporting subjective cognitive decline.

The trial included 147 patients from memory clinics in France, Spain, Germany and the UK. One group took meditation classes for eight weeks, while the other took health education classes. The impact of the interventions was evaluated using blood tests, cognitive assessments and questionnaires.

The researchers observed that the participants who did the mindfulness meditation training showed an improvement in their self-compassion. The participants who did the health education training showed an increase in their physical activity. These changes were still present six months later.

These findings support complementary effects of mindfulness meditation and participation in health education programmes on certain factors contributing to improved mental well-being and lifestyle in older adults reporting subjective cognitive decline.

The fact that these improvements appear to be sustained after six months of follow-up suggests that these new skills and habits have been incorporated into the participants’ lives.

According to Marchant, who led the trial, ‘more and more people are living to an advanced age, and it is crucial that we find ways to support the mental and physical health of older adults.’

‘Self-compassion can be of great importance to the elderly. It could improve psychological well-being in order to promote healthy ageing,’ adds Klimecki. ‘Our findings are an encouraging first step towards a mindfulness-based intervention that could be used to strengthen self-compassion in older adults.’

Gonneaud adds: ‘Although physical activity has been scientifically associated with better physical, cognitive and mental health, how to promote it in everyday life remains a challenge. Given the particularly harmful effect of a sedentary lifestyle on the health of ageing populations, showing that health education intervention programmes can strengthen commitment to physical activity among the elderly is particularly promising for promoting healthy ageing,’ she concludes.

 

[1] This research is funded by the European Union and forms part of the SCD-Well study of the H2020 Silver Santé programme (www.silversantestudy.eu), which has received 7 million euros in funding and is coordinated by Inserm. The Silver Santé study, funded for a five-year period, examines whether mental training techniques, such as mindfulness meditation, health education, or language learning, can help improve the mental health and the well-being of the ageing population.

A promising vaccine against Nipah virus infection

A scanning electron micrograph shows the Nipah virus (yellow) budding from the surface of a cell. © National Institute of Allergy and Infectious Diseases, NIH

The WHO recently classified the Nipah virus (NiV) as one of the eight main emerging pathogens likely to cause major epidemics in the future. In a context where no treatment or vaccine is yet available, a team comprising researchers from Inserm (Unit 955-VRI) and from the Université Paris-Est Créteil (UPEC) is presenting the preclinical results of an innovative vaccine against this virus. Most candidate vaccines target the viral surface proteins required for entry into human cells. To develop its new vaccine, the team at the VRI (Vaccine Research Institute of the ANRS MIE/Inserm) focused on the central role played by antigen-presenting cells (APCs) in the development of protective responses. The candidate vaccine, called CD40.NiV, carries specific parts of the surface proteins of the NiV-B virus, the Bangladesh strain. Following infection with the Nipah virus in animals, CD40.NiV demonstrated immunogenicity, neutralisation and complete protection, representing an important step towards the clinical development of a vaccine against the infection. The results of this work have just been published in the March 2024 issue of Cell Reports Medicine.

The Nipah virus (NiV) is a zoonotic virus, meaning it is transmitted from animals to humans. However, it can also be transmitted via contaminated food or directly between individuals. The clinical presentation can range from asymptomatic infection to acute respiratory infection to fatal encephalitis. First identified in Malaysia in 1999, the virus has since spread regularly through outbreaks in Bangladesh and India. Mortality linked to these outbreaks is estimated to be between 75% and 90%.

The virus has recently been included on the WHO list of priority emerging pathogens. There is currently no approved treatment or vaccine. Numerous candidate vaccines are under study or development. Most target the G and F proteins on the surface of the virus, which are necessary for it to enter human cells and spread throughout the body.

The teams at Inserm and UPEC have developed an original approach involving antigen-presenting cells (APCs), in particular dendritic cells, which play an important role in the immune response. To construct the CD40.NiV vaccine, specific parts (or epitopes) of the G, F and N proteins of the Bangladesh strain of Nipah virus (NiV-B) were attached to an antibody recognising the CD40 receptors on the surface of dendritic cells. The epitopes are thus presented directly to the cells of the immune system.

Immunogenicity (the ability to induce an immune response) of the vaccine was assessed in mice and non-human primates after two administrations of the CD40.NiV vaccine (the so-called ‘prime-boost’ strategy). As early as 10 days after the first vaccination with CD40.NiV (prime), NiV-specific IgG and IgA antibodies, as well as neutralising antibodies (specific antibodies that prevent infection by blocking viral entry into target cells), were produced. The neutralising antibody response was maintained for at least 100 days after the peak of the immune response. In addition, the team showed that the antibodies induced against NiV also neutralise various strains of NiV (Malaysia, Cambodia) as well as the Hendra virus, an infectious agent transmitted by bats that causes a highly fatal infection in horses and humans – a property known as cross-neutralising immunity.

To ensure that the vaccine was effective, the animals were infected with the NiV virus 60 days after the second injection of CD40.NiV (boost). Protection was complete.

This preclinical study demonstrated that the CD40.NiV vaccine candidate confers protection against Nipah virus disease, with 100% survival of immunised animals until the end of the study, 28 days after infection. The absence of significant clinical signs or virus replication suggests that the candidate vaccine provides ‘sterilising immunity’, meaning that it can prevent both the disease and its transmission.

Overall, results obtained with CD40.NiV are highly promising for fighting NiV infection and represent an important milestone towards the clinical development of a vaccine against this virus.

The brain mechanisms behind our desire to dance

The ‘groove’, or the desire to dance. © AdobeStock

Why does some music make us want to dance more than other music? This is the question that a research team from Inserm and Aix-Marseille Université tried to answer by studying the desire to dance (also called the ‘groove’) and the brain activity of 30 participants who were asked to listen to music. Their findings show that the groove sensation is highest for a moderately complex rhythm and that the desire to move is reflected in the brain by an anticipation of the music’s rhythm. This research, to be published in Science Advances, also designates the left sensorimotor cortex[1] as the centre of coordination between the auditory and motor systems.

Dancing means action. But to dance to the sound of a melody, you still have to coordinate your actions with the rhythm of the music. Previous studies have already shown that the motor system (consisting of the motor cortex[2] and all the brain structures and nerve pathways which, under its control, participate in the execution of movement) plays a crucial role in the brain’s processing of musical rhythms.

‘Groove’ is the spontaneous desire to dance to music. But while some music has us immediately heading for the dance floor, other music leaves us indifferent. So what is it that makes some music more ‘groovy’ than others?

A research team led by Benjamin Morillon, Inserm researcher at the Institute of Systems Neurosciences (Inserm/Aix-Marseille Université), looked at the neural dynamics (i.e. the interactions between neurons resulting from the electrical activity of the brain) of 30 participants when they listened to pieces of music whose rhythms were of greater or lesser complexity. This was to determine the brain mechanisms involved in the emergence of the groove sensation.

To do this, the team started by creating 12 short melodies comprised of a rhythm of 120 beats per minute – or 2 Hz, the average rhythm generally found in music. Each melody was then modified in order to obtain three variants with an increasing degree of syncopation[3] (low, medium, high) – i.e. with an increasingly complex rhythm, but without changing either the speed of the rhythm or the other musical characteristics of the melody.
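The tempo figure quoted above is a simple unit conversion: 120 beats per minute equals 2 beats per second, i.e. 2 Hz. A one-line sketch of that arithmetic:

```python
def bpm_to_hz(beats_per_minute: float) -> float:
    """One beat per second is 1 Hz, so divide the tempo by 60."""
    return beats_per_minute / 60.0

print(bpm_to_hz(120))  # 2.0, the average tempo used for the melodies
```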

The researchers then asked the participants to listen to these melodies while recording their brain activity in real time using a magnetoencephalography (MEG) device. At the end of each melody, the participants were asked to score the level of groove felt.

They also created a so-called ‘neurodynamic’ mathematical model of the neural network that describes in a simple way the brain calculations required for the emergence of the groove.

The experience of the groove as reported by the participants – and reproduced by the neurodynamic model – appeared to be correlated with the degree of syncopation. As observed in previous studies, the desire to move to music was highest for a rhythm with an intermediate level of syncopation, i.e. neither too simple nor too complex.

‘These findings show that the motor engagement linked to the groove is materialised by a temporal anticipation of the tempo. At brain level, this is based on a dynamic balance between the temporal predictability of the rhythm (the less complex the rhythm, the better it is) and the listener’s temporal prediction errors (the more complex the rhythm, the more errors they make),’ explains Arnaud Zalta, first author of the study and post-doctoral fellow at ENS-PSL.

Analysis of the participants’ brain activity then enabled the researchers to highlight the role of the left sensorimotor cortex as coordinator of the neural dynamics involved in both auditory temporal prediction and the planning and execution of movement.

‘The left sensorimotor cortex is currently considered to be the potential cornerstone of sensorimotor integration, essential for the perception of both music and speech. The fact that it appears in our study as necessary for “cooperation” between the auditory and motor systems reinforces this hypothesis, especially as we are using natural stimuli here,’ concludes Morillon.

 

[1]In the brain, the sensorimotor cortex consists of the motor cortex and the sensory cortex (postcentral gyrus, at the front of the parietal lobe), separated by the central fissure. Involved in the coordination of movements, it receives sensory information from the different parts of the body and integrates it to adjust and refine the movements generated by the motor cortex.

[2]The motor cortex consists of the regions of the cerebral cortex that participate in the planning, control and execution of voluntary muscle movements. It is located in the posterior part of the brain’s frontal lobe, in the precentral gyrus.

[3]In rhythmic solfège, if we consider the 4/4 measure, beats 1 and 3 are ‘strong’ and beats 2 and 4 are ‘weak’. Syncopation is a rhythm in which a note (or a chord) is started on a weak beat and prolonged over the next strong beat. For the listener, this creates a shift in the expected accent, perceived as a kind of musical ‘hiccup’ that disrupts the regularity of the rhythm. These musical motifs are particularly present in funk or jazz.
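The accent pattern described in this footnote can be sketched in code. This is an illustrative toy, not part of the study (the function and its beat encoding are assumptions): it flags a note as syncopated when it starts on a weak beat of a 4/4 bar and is held through the next strong beat.

```python
# In 4/4 time, beats 1 and 3 are strong, beats 2 and 4 are weak.
STRONG_BEATS = {1, 3}

def is_syncopated(onset_beat: int, duration_beats: int) -> bool:
    """True if a note starting on a weak beat sustains over a following strong beat."""
    if onset_beat in STRONG_BEATS:
        return False
    # Beats cycle 1..4; list the beats the held note crosses after its onset.
    covered = {((onset_beat - 1 + i) % 4) + 1 for i in range(1, duration_beats + 1)}
    return bool(covered & STRONG_BEATS)

print(is_syncopated(2, 2))  # True: starts on weak beat 2, held through strong beat 3
print(is_syncopated(1, 2))  # False: starts on strong beat 1
```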

Improving the treatment of anaemia thanks to a new discovery in iron metabolism

An essential component of the haemoglobin in red blood cells, iron is crucial to many biological processes – including the transport and storage of oxygen in the body. © Inserm/Claude Féo

Anaemia is a major public health problem worldwide, affecting around one third of the population. Its causes are multiple, but the most common are a lack of red blood cell production, a lack of iron in the blood, and genetic diseases such as thalassaemia. A better understanding of iron metabolism is essential to improve the care of the many patients affected. In a new study, Inserm researchers at the Digestive Health Research Institute (Inserm/INRAE/Université Toulouse III – Paul-Sabatier/Toulouse National Veterinary School) identified the major role of a protein called FGL1 in iron metabolism. Their discovery paves the way for new clinical possibilities in the treatment of anaemia. These findings have been published in the journal Blood.

Anaemia is a disease in which the number of red blood cells – or their haemoglobin content – is lower than normal. A significant factor in the morbidity and mortality of one third of the world’s population, anaemia is a major public health problem.

Anaemia can be caused by a deficit of iron in the blood resulting from dietary deficiencies, infections, chronic diseases, heavy menstruation, problems during pregnancy or by genetic diseases that affect the production of red blood cells (thalassaemia).

An essential component of the haemoglobin in red blood cells, iron is crucial to many biological processes – including the transport and storage of oxygen in the body. In other words, insufficient iron in the body means insufficient haemoglobin and red blood cells for transporting oxygen to the organs and tissues, which ultimately leads to organ failure.

For more information: C’est quoi l’hémoglobine ? (only available in French)

However, too much iron is also toxic to the body, meaning that its intake needs to be carefully regulated to avoid excessively high or low levels which are responsible for severe clinical complications.

Understanding iron metabolism

For several years, knowledge about anaemia and iron metabolism has been steadily increasing. It is now well known that iron levels in the body are regulated by a hormone called hepcidin.

We also now know that if the body needs more iron, as is the case with anaemia, a hormone called erythroferrone (ERFE) suppresses the expression of hepcidin in the liver. This process supplies the bone marrow with iron to synthesise new red blood cells and increase haemoglobin levels.
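The regulatory chain described above (anaemia raises ERFE, ERFE suppresses hepcidin, lower hepcidin frees more iron) can be sketched as a toy model. This is purely illustrative, assuming a simple saturating suppression; it is not a quantitative model from the study.

```python
def hepcidin_level(baseline: float, erfe: float) -> float:
    """ERFE represses hepcidin expression (toy saturating suppression)."""
    return baseline / (1.0 + erfe)

def iron_available(hepcidin: float, supply: float) -> float:
    """Hepcidin limits the release of iron into the blood (toy relation)."""
    return supply / (1.0 + hepcidin)

# With anaemia, ERFE rises, hepcidin falls, and more iron reaches the marrow.
normal = iron_available(hepcidin_level(2.0, erfe=0.0), supply=10.0)
anaemic = iron_available(hepcidin_level(2.0, erfe=4.0), supply=10.0)
print(anaemic > normal)  # True
```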

The identification of ERFE in 2014 by Inserm researcher Léon Kautz and his colleagues represented an important step in this field of research. However, the data obtained ten years ago already suggested that ERFE was not the only hormone controlling this process. The scientists hypothesised that a second, previously unknown protein performed a similar function.

 

A new factor identified

This is what they have now confirmed by conducting new experiments in two mouse models of anaemia: mice in which the synthesis of red blood cells was increased in order to correct an induced anaemia, and mice with thalassaemia.

The scientists started by studying the molecular mechanisms activated in the animals’ liver to identify the genes whose expression was increased during the anaemia. They observed that the expression of the gene coding for protein FGL1 was increased in the liver when the oxygen concentration decreased.

The researchers then produced different forms of protein FGL1 to test its mode of action in vivo in mice and in vitro in human liver cells. They were able to show that its mode of action is similar to that of the hormone ERFE, because FGL1 also represses hepcidin expression.

‘In addition to the fundamental aspects of this research in understanding anaemia, we believe that identifying the role of FGL1 will lead to the development of new therapeutic strategies to treat anaemia of various causes and for which the current treatments are ineffective,’ emphasises Léon Kautz, Inserm staff scientist.

For the moment, the team will start by conducting additional research to verify that FGL1 levels are indeed increased in the blood of patients with different types of anaemia. But the scientists plan to go further, with Inserm Transfert having already filed two patent applications for this study.

On the one hand, the first patent aims to better treat anaemia resulting from chronic diseases such as cancer. The objective is to identify FGL1 analogues, or molecules that activate FGL1 synthesis, which would reduce hepcidin expression in these patients and increase their haemoglobin levels.

On the other hand, thalassaemia is characterised by very low levels of hepcidin, leading to excess iron that is harmful to the organs, causing high mortality. The team hypothesised that FGL1 is also involved in this process. The second patent therefore aims to achieve proof of concept that FGL1 inhibition could improve iron overloads in patients suffering from thalassaemia.
