
A Parkinson’s treatment could delay progression of one of the forms of AMD

Age-related macular degeneration (AMD) is the leading cause of visual disability in people over 50 years of age. © Adobe Stock

Age-related macular degeneration (AMD) is the leading cause of visual disability in people over 50 years of age. Improving the treatment options for patients is a major challenge for research. In a new study, a team of researchers from Inserm, CNRS and Sorbonne Université at the Vision Institute[1] in Paris describes the efficacy of dopaminergic drugs in slowing the progression of one of the forms of the disease, namely the neovascular or ‘wet’ form, which is characterised by the proliferation of dysfunctional blood vessels under the retina. These drugs are already used in the treatment of Parkinson’s disease. These findings have been published in The Journal of Clinical Investigation.

AMD is a multifactorial disease of the retina that affects people over 50 years of age. It occurs when part of the retina – the macula – degenerates, with the potential loss of central vision. Although highly incapacitating, it never causes complete blindness, since the peripheral part of the retina remains intact.

There are two forms of the disease, whose prevalence is roughly equivalent: the neovascular form – known as the ‘exudative’ or ‘wet’ form – and the atrophic or ‘advanced dry’ form (see box).

While there is currently no curative treatment for the dry form of the disease, the neovascular form can be slowed down by regular injections administered directly into the eye (intravitreal injections). Although necessary, these injections can represent a major therapeutic burden due to their frequency, which is monthly or bimonthly depending on the course of the disease. It is therefore useful to continue to identify new alternatives for patients.

 

Medicines for Parkinson’s

Previous epidemiological studies have already shown a possible association between Parkinson’s disease and a reduced risk of neovascular AMD[2]. In this new study, researchers from Inserm, CNRS and Sorbonne Université at the Vision Institute explored the underlying mechanisms that would explain this potential protection.

In cell and animal models, the scientists have shown that L-Dopa, a drug in the dopaminergic family[3] used to treat Parkinson’s disease, activates a specific receptor in the brain known as DRD2. This activation blocks the formation of new blood vessels in the eye, which is a key process in the development of neovascular AMD.

To go further, the team then analysed the health data of over 200 000 patients with neovascular AMD in France[4]. They showed that those patients who took L-Dopa or other drugs that stimulate the DRD2 receptor (DRD2 agonists) to treat their Parkinson’s disease developed neovascular AMD later in life and required fewer intravitreal injections. Indeed, such patients developed the disease at 83 years of age instead of 79 years of age for the other patients.

‘These findings open up new perspectives for patients with wet AMD. We now have a serious avenue for delaying the progression of this disease and reducing the burden of current treatments’, explains Florian Sennlaub, Inserm Research Director at the Vision Institute (CNRS/Sorbonne Université/Inserm).

Thibaud Mathis, university professor and hospital practitioner in the ophthalmology department of Croix-Rousse Hospital-Hospices civils de Lyon, and researcher at Université Lyon 1, as well as at the Vision Institute, agrees: ‘These findings suggest that dopaminergic drugs, beyond their role in Parkinson’s disease, could have a beneficial effect in the prevention and treatment of neovascular AMD.’

While more in-depth clinical studies will be needed to confirm these findings and evaluate the efficacy and safety of these drugs in the treatment of AMD, this discovery opens up encouraging new perspectives for the fight against the neovascular form, offering hope of a more effective and less burdensome treatment for patients.

 

Two forms of AMD

Wet AMD is characterised by the proliferation of new, dysfunctional vessels under the retina. Fluid can leak through their walls, leading to the formation of macular oedema, and blood can sometimes escape, causing retinal haemorrhages.

The wet form of AMD progresses rapidly if not managed. Previously, loss of central vision could occur within weeks or even days. This process can now be stopped thanks to drugs (anti-VEGF) injected into the eye, which inhibit the growth of new vessels. However, after several years of treatment, the disease can progress to an atrophic form.

In atrophic or ‘advanced dry’ AMD, the photoreceptors in the macula gradually disappear, followed by the retinal pigment epithelial cells. This process generates holes of increasing size in the macula, visible by simply observing the retina (dilated eye exam). This process is slow and usually takes between five and ten years before the patient loses their central vision. Currently, no treatment for this form of AMD is approved in Europe.

Mixed forms of the disease may be observed, and each of these two forms may precede the appearance of the other.

 

[1] This research is the result of a collaboration with teams from Université de Lyon, Lyon University Hospital, Université de Bourgogne and the Brain Institute in Paris.

[2] Levodopa Is Associated with Reduced Development of Neovascular Age-Related Macular Degeneration, Max J. Hyman et al., Ophthalmology Retina, 2023.

[3] Dopaminergic drugs provide the dopamine needed for the brain to function. In Parkinson’s disease, dopamine reduces the intensity of tremor, rigidity and akinesia.

[4] Data from the French national health information database (Système National des Données de Santé [SNDS]).

A better understanding of Alzheimer’s disease: A study confirms the utility of caffeine as a treatment avenue

In red, the increased A2A receptors in the neurons of a mouse hippocampus. In blue, the cell nuclei (DAPI staining). © Émilie Faivre

In France, 900 000 people have Alzheimer’s disease or a related condition. The risk of developing Alzheimer’s depends on genetic and environmental factors. Among these factors, various epidemiological studies suggest that the regular consumption of moderate amounts of caffeine slows age-related cognitive decline and reduces the risk of developing the disease. In a new study[1], researchers from Inserm, Lille University Hospital and Université de Lille at the Lille Neuroscience and Cognition research centre have taken a step further in understanding the mechanisms underlying the disease’s development. They have shown that the pathological increase in certain receptors in the neurons at the time of disease onset promotes a loss of synapses, and as such the early onset of memory impairments in an animal model of the disease. Their findings also confirm the utility of conducting clinical trials to measure the effects of caffeine on the brains of patients at an early stage of the disease. These findings have been published in Brain.

Alzheimer’s disease is characterised by disorders of memory, executive function and orientation in space and time. It results from the slow degeneration of the neurons, which begins in the hippocampus (essential for memory) and then spreads to the rest of the brain. Sufferers present two types of microscopic brain lesions: senile plaques (or amyloid plaques) and neurofibrillary degeneration (or tau pathology), which contribute to neuron dysfunction and disappearance[2].

Studies had already shown that the expression of certain receptors, known as A2A receptors, is increased in the hippocampus of patients with Alzheimer’s disease. However, the impact of this dysregulation on the development of the disease and its associated cognitive disorders had remained unknown until now. In a new study, a team led by Inserm researcher David Blum took a closer look at this question.

The scientists were able to reproduce an early increase[3] in the expression of the adenosine A2A receptors, as observed in the brains of patients, in a mouse model of Alzheimer’s disease that develops amyloid plaques. The objective was to assess the consequences of this increase on the disease and to describe the mechanisms involved.

The results of their research show that the increase in A2A receptors promotes the loss of synapses[4] in the hippocampus of ‘Alzheimer’s mice’. This causes the early onset of memory impairments in the animals. The scientists then showed that a dysfunction of certain brain cells – the microglial cells, partly responsible for the brain inflammation seen in the disease – could be involved in the loss of synapses, in response to an increase in the A2A receptors.

Similar mechanisms had already previously been described by the team, this time in another model of the disease developing tau lesions[5].

‘These findings suggest that the increased expression of the A2A receptors alters the relationship between neurons and microglial cells. This alteration could be the cause of an escalation of effects leading to the development of the memory impairments observed,’ explains Émilie Faivre, co-last author of the study and researcher at the Lille Neuroscience and Cognition centre (Inserm/Université de Lille/Lille University Hospital).

 

Caffeine: An interesting treatment avenue for the early prevention of cognitive decline?

Several studies had already suggested that regular and moderate caffeine consumption (equivalent to 2 to 4 cups of coffee per day) could slow the cognitive decline associated with ageing and the risk of developing Alzheimer’s disease.

In 2016, the same research team had described one of the mechanisms by which caffeine could have this beneficial effect in animals, reducing the cognitive disorders associated with Alzheimer’s disease. The scientists then showed that the effects of caffeine were linked to its ability to block the activity of the adenosine A2A receptors – the same receptors whose expression is abnormally increased in the brains of people with Alzheimer’s[6].

‘By describing in our new study the mechanism by which the pathological increase in A2A receptor expression causes a cascade of effects leading to a worsening of memory impairment, we confirm the relevance of therapeutic avenues that could act on this target. We therefore once again emphasise the utility of testing caffeine in a clinical trial on patients with early forms of the disease. Indeed, we can imagine that by blocking these A2A receptors, whose activity is increased in patients, this molecule could prevent the development of memory impairment or even other cognitive and behavioural symptoms,’ continues Blum, Inserm Research Director and co-last author of the study.

A phase 3 clinical trial[7], run by Lille University Hospital, is ongoing. Its objective is to evaluate the effect of caffeine on the cognitive functions of patients with early to moderate forms of Alzheimer’s disease.

 

[1] This research was supported by Fondation Alzheimer, FRM, ANR, CoEN (LICEND), Inserm, Université de Lille, Lille University Hospital and labEx Distalz (Development of Innovative Strategies for a Transdisciplinary Approach to Alzheimer’s Disease) as part of France’s Investments for the Future programme.

[2] Read the Inserm Report on Alzheimer’s disease and consult its Graphic novel which explains the cellular and molecular mechanisms involved in its development. 

[3] At a stage during which the animals are not yet usually suffering from memory impairments.

[4] Zones that enable the transmission of information between neurons.

[5] Exacerbation of C1q dysregulation, synaptic loss and memory deficits in tau pathology linked to neuronal adenosine A2A receptor, Brain, Volume 142, Issue 11, November 2019, Pages 3636–3654, https://doi.org/10.1093/brain/awz288

[6] Read the press release

[7] The CAFCA phase 3 clinical trial is being conducted by neurologist Thibaud Lebouvier, in conjunction with the LilNCog laboratory and the Lille University Hospital Memory Centre. https://www.cafca-alzheimer.fr/

Immune cells that protect against post-stroke neurological damage

Central nervous system-associated macrophages (CAMs, in yellow) within the blood-brain barrier, at the interface between a blood vessel (magenta) and astrocytes (cyan), star-shaped cells that support neurons. © Dr Damien Levard

Ageing greatly increases the risk of ischaemic stroke. A team of researchers from Inserm, Caen-Normandy University Hospital and Université de Caen Normandie have looked at the role that immune cells known as central nervous system-associated macrophages (CAMs) could play in the neurological damage that occurs following a stroke. Their research shows that during the course of ageing these cells acquire a key role in regulating the immune response triggered in the wake of a stroke. This research, to be published in Nature Neuroscience, highlights the importance of the presence of these cells at the interface between the blood and the brain in maintaining brain integrity.

The most common type of stroke is ischaemic stroke, which is caused by a blood clot obstructing an artery in the brain. Age is a major risk factor, with the risk of ischaemic stroke doubling every 10 years from the age of 55.

Ischaemic stroke is followed by inflammatory processes in the brain that may aggravate neurological lesions. Central nervous system-associated macrophages (CAMs) are immune cells located within the blood-brain barrier[1], at the interface between the blood circulation and the brain parenchyma[2]. Normally, the role of the CAMs is to monitor their environment, to clean it of debris and other molecules from the brain parenchyma, as well as molecules from the blood that cross the blood-brain barrier, and to alert other immune cells to the presence of pathogens. Although little studied so far, they are ideally positioned anatomically to detect and react to external inflammatory signals and to protect the brain parenchyma.

A research team from the Physiopathology and Imaging of Neurological Disorders unit (Inserm/Université de Caen Normandie) led by Marina Rubio, Inserm researcher, and Denis Vivien, professor and hospital practitioner at Université de Caen and Caen-Normandy University Hospital and head of the unit, has studied in mice and in human brain tissues how the role of CAMs evolves during ageing and their potential involvement in regulating the inflammatory response occurring in the brain after an ischaemic stroke.

First, the scientists sought to characterise how the role of CAMs and their biological environment change during the course of ageing. They observed that while the number of CAMs did not change with age, their functions did, with the appearance on their surface of the MHC II receptor – a specific molecule that plays a major role in the communication between immune cells (e.g. to coordinate the immune response to a pathogen). At the same time, the blood-brain barrier, which is intact in young brains, becomes more porous, allowing certain immune cells to pass from the blood to the parenchyma.

‘These observations suggest that the CAMs are capable of adapting their activity to the individual’s stage of life, state of health and the brain region in which they are located,’ specifies Rubio. Thus, in order to compensate for the age-related increase in blood-brain barrier porosity, they would strengthen their ability to communicate with other immune cells through further expression of the MHC II receptor. ‘Following an ischaemic stroke, this could help prevent an excessive immune response that would have more serious neurological consequences,’ adds the researcher.

The team then looked at the impact of these functional changes on the immune response in the brain parenchyma following an ischaemic stroke. To do this, it compared what happened after a stroke in a normal elderly mouse brain with what happened in the absence of CAMs or when their MHC II receptor was inhibited.

In these last two models, the researchers observed that during the acute phase of the ischaemic stroke and also in the days that followed, more immune cells from the blood crossed the blood-brain barrier, indicating its increased permeability coupled with an exacerbated immune response. This phenomenon was accompanied by a worsening of the neurological damage caused by the stroke.

‘These findings suggest that the CAMs acquire, during the course of ageing, a central role in orchestrating immune cell traffic after an ischaemic stroke,’ explains Vivien. ‘And that, thanks to their capacity for adaptation, they ensure the ongoing close monitoring of the integrity of the blood-brain barrier and the intensity of the inflammatory response.’

The MHC II receptor located on the CAMs appears to be involved in this modulation as well as in the limitation of stroke-related neurological damage.

Further research for this team will aim to better understand the molecular mechanisms involved in the dialogue between the CAMs and the cells lining the internal wall of the brain’s blood vessels.

‘The objective will ultimately be to identify and develop new therapeutic targets that could enable the brain’s immune response to be modulated in a manner appropriate to each patient after a stroke,’ concludes Rubio.

 

[1] The blood-brain barrier separates the brain’s blood vessels from the brain parenchyma. It acts as a highly selective filter, allowing the passage of nutrients essential for the brain while protecting the parenchyma from pathogens, toxins and hormones that circulate in the blood and may exit the blood vessels.

[2] The parenchyma is the functional tissue of the brain, directly involved in neural activities and the transmission of nerve impulses. It is surrounded by the perivascular spaces and the meninges, where the CAMs reside.

Menstrual cycle regularity: a biological clock driven by the moon?

Menstrual cycle. © AdobeStock

Because of their cyclical rhythm and similar durations, the menstrual and lunar cycles have often been assumed to be linked, despite no solid evidence so far to support this. To gain a better understanding of the origin of the rhythmic regularity of the menstrual cycle, an international research team involving Inserm, CNRS and Université Claude Bernard Lyon 1 compared a large amount of data on cycles collected from studies conducted in Europe and North America. Its findings show that the menstrual cycle is finely regulated by an internal clock, which in turn is occasionally influenced by the lunar cycle. This research, to be published in Science Advances, argues in favour of further study of this link, to clarify the relevance of chronobiology in the treatment of fertility disorders.

The average length of a typical human ovarian cycle, or menstrual cycle, is 29.3 days, varying from one person to another and from one cycle to another within the same person. It begins on the first day of menstruation and consists of three phases, each dedicated to a specific process linked to ovulation, which occurs around day 14 of the cycle.

Some studies have suggested that each phase may be influenced by an internal clock, the disruption of whose rhythm is associated with irregularities in the menstrual cycle.

In humans, the best-known internal clock is the circadian clock, whose period is very close to 24 hours and which maintains the sleep-wake cycle and various physiological rhythms. It is synchronised with the day-night cycle under the influence of light. When the circadian clock is disrupted – such as with jet lag – it takes a few days to return to its normal rhythm, resynchronising with the new day-night cycle.

In the case of menstrual cycles, the involvement of an internal clock could manifest in a similar way: the cycle length should be highly stable within individuals and, if disrupted, its optimal rhythm should be restored by mechanisms of adaptation through synchronisation with external conditions.

So what could this ‘external synchroniser’ be? One recurring theory suggests that the lunar cycle could play this role, but scientific evidence is lacking to date.

An international research team led by Claude Gronfier, Inserm researcher at the Lyon Neuroscience Research Centre (Inserm/CNRS/Université Claude Bernard Lyon 1), has investigated the potential existence of an internal biological clock that would regulate the menstrual cycle and which may be synchronised with the lunar cycle. Using a large database of menstrual cycles collected in European and North American studies, the team was able to compare a total of approximately 27,000 menstrual cycles in 2,303 European women and approximately 4,800 cycles in 721 North American women.

The researchers began by examining the stability from one menstrual cycle to another at an individual level, by comparing the lengths of successive cycles. They observed that the average length of each participant’s cycle was stable overall, even though out of a series of successive cycles some were longer or shorter than that person’s ‘standard’ cycle.

‘These observations suggest the existence of a mechanism that corrects the difference between the length of the current cycle and that of a typical menstrual cycle in the person concerned,’ explains René Écochard, first author of the study, doctor at the Hospices Civils de Lyon and professor at Université Claude Bernard Lyon 1. ‘A few shorter cycles could therefore compensate for a series of a few longer cycles, so that the overall length remains around that person’s usual cycle length. The length of a cycle could therefore depend on the length of previous cycles.’

‘The observation of this phenomenon argues in favour of the existence of an internal clock that finely regulates menstrual cycles, itself synchronised by a cyclical environmental event,’ adds Gronfier.

Secondly, the research team looked at the potential relationships between the onset of menstruation in the cycles studied and the phases of the moon at the time the data was collected.

It was able to observe an occasional but significant association between the menstrual cycle and the lunar cycle with, however – and without this research being able to explain its cause – a major difference between the European cohorts and the North American cohort: in the European cohorts, the cycle began more often during the waxing phase of the moon, whereas in the North American cohort it began more often at full moon.

‘Despite this astonishing difference, which we are unable to explain at present, the links identified in this research between the lunar and menstrual cycles, through their proximity to certain phenomena that we observe in chronobiology, suggest that the periodicity of menstruation and ovulation could be influenced, in a modest but significant way, by the lunar cycle,’ adds Gronfier.

These findings therefore argue in favour of an internal clock system with a near-monthly rhythm, synchronised to a small extent by the lunar cycle. However, they need to be studied in more depth and confirmed by laboratory studies and larger epidemiological studies.

‘Thanks to smartphone apps for recording cycles, the emergence of large databases containing information on the cycles of several hundreds of thousands of women could provide new opportunities for studies,’ says Écochard.

‘Confirming the existence of an internal clock coordinating the menstrual cycle, as well as the mechanisms involved in its synchronisation, could make it possible to apply personalised “circadian” medicine approaches – which are already used in oncology and for the treatment of sleep disorders or depression, for example – to problems such as ovulation and fertility disorders,’ concludes Gronfier.

Amyotrophic lateral sclerosis: a new avenue for improving patient diagnosis and follow-up

Noradrenergic neurons in the mouse locus coeruleus, whose dysfunction contributes to cortical hyperexcitability in ALS. © Caroline Rouaux

Amyotrophic lateral sclerosis, or Charcot’s disease, is a neurodegenerative disease that results in progressive paralysis and subsequent death. Diagnosing it is difficult and no curative treatment exists to date, making both diagnosis and treatment major challenges for research. In a new study, Inserm researcher Caroline Rouaux and her team at the Strasbourg Biomedical Research Centre (Inserm-Université de Strasbourg), in collaboration with researchers from Ludwig Maximilian University in Munich, CNRS and Sorbonne Université, show that electroencephalography could become a diagnostic and prognostic tool for the disease. Thanks to this type of examination, the scientists were able to reveal an atypical brain wave profile that could prove to be specific to the disease. Through this research, published in Science Translational Medicine, a potential therapeutic target has also been discovered. These are fundamental advances that could ultimately benefit patients.

Amyotrophic lateral sclerosis (ALS), otherwise known as Charcot’s disease, remains a veritable challenge for clinicians. This neurodegenerative disease, which most often develops between the ages of 50 and 70, leads to progressive paralysis and death within just two to five years. It is caused by the death of the motor neurons – the nerve cells that control the muscles, both in the brain (central motor neurons) and in the spinal cord (peripheral motor neurons).

Diagnosing ALS is difficult because its initial signs vary from person to person: weakness or cramps in an arm or leg, trouble swallowing or slurred speech, etc. In addition, there is no biomarker specific to the disease. It is therefore diagnosed by ruling out other conditions that can lead to motor disorders, a process that usually takes one to two years after the onset of symptoms, delaying the deployment of therapeutic measures and reducing the chances of inclusion in clinical trials at an early stage.

It was with the aim of shortening this time frame that Caroline Rouaux’s team at the Strasbourg Biomedical Research Centre, in collaboration with the teams of Sabine Liebscher in Munich and Véronique Marchand-Pauvert, Inserm researcher in Paris, tested the use of electroencephalography[1]. This inexpensive and easy-to-use technique involves placing electrodes on the surface of the skull to record brain activity in the form of waves.

The examination performed in subjects with ALS and in corresponding animal models revealed an imbalance between two types of waves associated with excitatory and inhibitory neuron activity, respectively. This imbalance, in favour of greater excitatory neuron activity to the detriment of inhibitory neurons, reflects cortical hyperexcitability.

‘This phenomenon is no surprise and had already been described with other investigation methods, but these are rarely used because they are difficult to implement and only work at the very beginning of the disease. Electroencephalography, however, is minimally invasive, very inexpensive, and can be used at different times during the disease. In addition, the atypical brain wave profile revealed by electroencephalography could prove to be specific to the disease,’ explains Rouaux, Inserm researcher and last author of the study.

Indeed, analysis of the electroencephalographic recording of the brain’s electrical activity reveals various types of brain waves of differing amplitudes and frequencies. One of these, the theta wave, reflects the activity of the excitatory neurons that transmit stimulating messages to other neurons, while another, the gamma wave, reflects that of the inhibitory neurons that block the transmission of nerve messages.

The study reveals that in humans and animals with ALS, the interaction between these two wave types is atypical, revealing an imbalance between the excitatory and inhibitory activities. Not only was this imbalance found in all the subjects tested, but the scientists also showed that the more the symptoms of the disease progress, the greater the imbalance. In addition, this atypical wave pattern was detected in animals even before the onset of the first motor symptoms.

If these initial findings are confirmed, electroencephalography could in the future serve as a prognostic tool for already-diagnosed patients in order to evaluate, for example, the response to a medication, or even as a diagnostic tool in the event of symptoms suggestive of the disease.

In the second part of this research, the scientists were able to study, in patients and mice, the mechanisms behind the hyperexcitability observed. First, they measured the levels of the different neuromodulators produced by the neurons to communicate with each other, and found a deficiency in one of them: noradrenaline was present in smaller amounts in the brains of the patients and mice with ALS compared to healthy brains.

To verify the role of noradrenaline, they blocked the production of this neuromodulator in healthy animals, and showed that doing so causes cortical hyperexcitability, such as that observed in the disease. And conversely, by administering molecules that stimulate the action of noradrenaline in a mouse model of ALS, the scientists reduced the hyperexcitability and restored brain activity equivalent to that of healthy mice.

‘This discovery could mark the opening of a new therapeutic avenue in ALS, provided that cortical hyperexcitability is indeed associated with disease progression. Indeed, while we have seen an association between the two in our study, no causal link has been established for the moment. This is what we will be checking in the coming months,’ concludes Rouaux.

 

[1] Electroencephalography is commonly used for research purposes in neurology but also in clinical practice. The examination provides information on brain activity in the event of sleep disorders, after a stroke, or even in the case of coma. It can also be used to diagnose encephalitis or epilepsy, or to confirm brain death.

Physical and mental well-being of older adults: a positive impact of meditation and health education

Learning mindfulness meditation improves self-compassion, while health education promotes an increase in physical activity. © AdobeStock

A team from Inserm and Université de Caen Normandie, in collaboration with researchers from the University of Jena (Germany) and University College London (UK), has studied the potential benefits of meditation and health education interventions in people who feel that their memory is in decline. This research was performed as part of the European H2020 Silver Santé Study programme coordinated by Inserm[1]. It shows that learning mindfulness meditation improves self-compassion, while health education promotes an increase in physical activity. These findings, published in Alzheimer’s & Dementia: Diagnosis, Assessment & Disease Monitoring, propose new avenues to support healthier ageing.

‘Subjective cognitive decline’ is when people feel that their cognitive faculties have deteriorated without this being apparent in standard cognitive tests. Studies have shown that such people have a higher risk of developing actual cognitive decline.

Previous studies had concluded that mindfulness meditation and health education (a practice in which people implement preventive measures and actions that are beneficial for their health) had a positive impact, which was still present six months later, on anxiety in people reporting subjective cognitive decline.

More generally, self-compassion (feeling of kindness toward oneself, having a sense of common humanity, and having an awareness of negative thoughts and feelings without over-identification) and exercise have previously been associated with better mental health, itself associated with improved general health, well-being, and quality of life.

A European research group coordinated by Julie Gonneaud, Inserm researcher at the Physiopathology and Imaging of Neurological Disorders laboratory (Inserm/Université de Caen Normandie), Olga Klimecki, researcher at the University of Jena (Germany), and Nathalie Marchant, researcher at University College London (UK), studied the impact of eight weeks of mindfulness meditation and health education courses on self-compassion and physical activity in people reporting subjective cognitive decline.

The trial included 147 patients from memory clinics in France, Spain, Germany and the UK. One group took meditation classes for eight weeks, while the other took health education classes. The impact of the interventions was evaluated using blood tests, cognitive assessments and questionnaires.

The researchers observed that the participants who did the mindfulness meditation training showed an improvement in their self-compassion. The participants who did the health education training showed an increase in their physical activity. These changes were still present six months later.

These findings support complementary effects of mindfulness meditation and participation in health education programmes on certain factors contributing to improved mental well-being and lifestyle in older adults reporting subjective cognitive decline.

The fact that these improvements appear to be sustained after six months of follow-up suggests that these new skills and habits have been incorporated into the participants’ lives.

According to Marchant, who led the trial, ‘more and more people are living to an advanced age, and it is crucial that we find ways to support the mental and physical health of older adults.’

‘Self-compassion can be of great importance to the elderly. It could improve psychological well-being in order to promote healthy ageing,’ adds Klimecki. ‘Our findings are an encouraging first step towards a mindfulness-based intervention that could be used to strengthen self-compassion in older adults.’

Gonneaud adds: ‘Although physical activity has been scientifically associated with better physical, cognitive and mental health, how to promote it in everyday life remains a challenge. Given the particularly harmful effect of a sedentary lifestyle on the health of ageing populations, showing that health education intervention programmes can strengthen commitment to physical activity among the elderly is particularly promising for healthy ageing,’ she concludes.

 

[1] This research is funded by the European Union and forms part of the SCD-Well study of the H2020 Silver Santé programme (www.silversantestudy.eu), which has received 7 million euros in funding and is coordinated by Inserm. The Silver Santé study, funded for a five-year period, examines whether mental training techniques, such as mindfulness meditation, health education, or language learning, can help improve the mental health and the well-being of the ageing population.

The brain mechanisms behind our desire to dance

The ‘groove’, or the desire to dance. © AdobeStock

Why does some music make us want to dance more than other music? This is the question that a research team from Inserm and Aix-Marseille Université tried to answer by studying the desire to dance (also called the ‘groove’) and the brain activity of 30 participants who were asked to listen to music. Their findings show that the groove sensation peaks for a moderately complex rhythm and that the desire to move is reflected in the brain by an anticipation of the music’s rhythm. This research, to be published in Science Advances, also designates the left sensorimotor cortex[1] as the centre of coordination between the auditory and motor systems.

Dancing means action. But to dance to the sound of a melody, you still have to coordinate your actions with the rhythm of the music. Previous studies have already shown that the motor system (consisting of the motor cortex[2] and all the brain structures and nerve pathways which, under its control, participate in the execution of movement) plays a crucial role in the brain’s processing of musical rhythms.

‘Groove’ is the spontaneous desire to dance to music. But while some music has us immediately heading for the dance floor, other music leaves us indifferent. So what makes some music ‘groovier’ than the rest?

A research team led by Benjamin Morillon, Inserm researcher at the Institute of Systems Neurosciences (Inserm/Aix-Marseille Université), looked at the neural dynamics (i.e. the interactions between neurons resulting from the electrical activity of the brain) of 30 participants when they listened to pieces of music whose rhythms were of greater or lesser complexity. This was to determine the brain mechanisms involved in the emergence of the groove sensation.

To do this, the team started by creating 12 short melodies with a rhythm of 120 beats per minute – or 2 Hz, the average tempo generally found in music. Each melody was then modified to obtain three variants with an increasing degree of syncopation[3] (low, medium, high) – i.e. with an increasingly complex rhythm – without changing either the speed of the rhythm or the other musical characteristics of the melody.

The researchers then asked the participants to listen to these melodies while recording their brain activity in real time using a magnetoencephalography (MEG) device. At the end of each melody, the participants were asked to score the level of groove felt.

They also created a so-called ‘neurodynamic’ mathematical model of the neural network that describes in a simple way the brain calculations required for the emergence of the groove.

The experience of the groove as reported by the participants – and reproduced by the neurodynamic model – appeared to be correlated with the rate of syncopation. As observed in previous studies, the desire to move to music was highest for a rhythm with an intermediate level of syncopation, i.e. one neither too simple nor too complex.

‘These findings show that the motor engagement linked to the groove is materialised by a temporal anticipation of the tempo. At brain level, this is based on a dynamic balance between the temporal predictability of the rhythm (the less complex the rhythm, the better it is) and the listener’s temporal prediction errors (the more complex the rhythm, the more errors they make),’ explains Arnaud Zalta, first author of the study and post-doctoral fellow at ENS-PSL.

Analysis of the participants’ brain activity then enabled the researchers to highlight the role of the left sensorimotor cortex as coordinator of the neural dynamics involved in both auditory temporal prediction and the planning and execution of movement.

‘The left sensorimotor cortex is currently considered to be a potential cornerstone of sensorimotor integration, essential for the perception of both music and speech. The fact that it appears in our study as necessary for “cooperation” between the auditory and motor systems reinforces this hypothesis, especially as we are using natural stimuli here,’ concludes Morillon.

 

[1] In the brain, the sensorimotor cortex consists of the motor cortex and the sensory cortex (postcentral gyrus, at the front of the parietal lobe), separated by the central fissure. Involved in the coordination of movements, it receives sensory information from the different parts of the body and integrates it to adjust and refine the movements generated by the motor cortex.

[2] The motor cortex consists of the regions of the cerebral cortex that participate in the planning, control and execution of voluntary muscle movements. It is located in the posterior part of the brain’s frontal lobe, in the precentral gyrus.

[3] In rhythmic solfège, if we consider the 4/4 measure, beats 1 and 3 are ‘strong’ and beats 2 and 4 are ‘weak’. Syncopation is a rhythm in which a note (or a chord) is started on a weak beat and prolonged over the next strong beat. For the listener, this creates a shift in the expected accent, perceived as a kind of musical ‘hiccup’ that disrupts the regularity of the rhythm. These musical motifs are particularly present in funk and jazz.

Prefer natural light to avoid age-related sleep disorders

© Adobe stock

One in three French adults is thought to have a sleep disorder. While the prevalence of these disorders increases with age, the biological mechanisms at play are relatively unknown, leaving scientists in doubt as to their origin. In a new study, Inserm researcher Claude Gronfier and his team at the Lyon Neuroscience Research Center (Inserm/CNRS/Université Claude-Bernard Lyon 1) hypothesised that their onset during ageing was linked to a desynchronisation of the biological clock caused by decreased light perception. In the course of their research, they identified a new adaptive mechanism of the retina during ageing that enables older individuals[1] to remain sensitive to light. These findings are also of clinical relevance in encouraging older people to have more exposure to daylight, rather than artificial light, to avoid developing sleep disorders. These results have been published in the Journal of Pineal Research.

Almost all biological functions are subject to the circadian rhythm, which is a 24-hour cycle. The secretion of the night hormone melatonin is typically circadian. Its production increases at the end of the day shortly before bedtime, helping us to fall asleep, and falls before we wake up.

Previous studies have shown that its secretion by the brain is blocked by light, to which it is very sensitive. As a result, ill-timed light exposure can desynchronise the circadian clock, which can lead to sleep disorders. Other studies have also revealed the important role, in the control of melatonin production, of melanopsin – a photoreceptor present in certain cells of the retina which, being highly sensitive to light (mainly blue light), regulates the pupillary reflex and the circadian rhythm. When exposed to light, melanopsin therefore becomes a driver of melatonin suppression and biological clock synchronisation.

While sleep disorders are already common in adults, they increase with age: nearly one third of people over 65 chronically use sleeping pills[2]. Yet no previous studies have specifically focused on the biological mechanism at work in age-related sleep disorders. Could these disorders be the consequence of a problem of light perception? If so, at what level? And what role does melanopsin play in this specific case?

A team at the Lyon Neuroscience Research Center tried to elucidate this mystery. The scientists observed the effects of light on melatonin secretion in a group of adults. The participants were all exposed to 9 different coloured lights (corresponding to 9 very precise wavelengths) to enable the scientists to identify the mechanisms involved via the photoreceptors concerned.

The participants were divided into two distinct groups, with mean ages of 25 and 59. This experiment was performed in the middle of the night, when the body normally releases the most melatonin.

The results show that, out of the lights tested, blue light (with a wavelength of approximately 480 nm) is very effective in suppressing melatonin production in the youngest individuals. More specifically, the scientists observed that in the young subjects exposed to blue light, melanopsin was the only photoreceptor driving melatonin suppression. Conversely, in the older participants, photoreceptors other than melanopsin appear to be involved, such as the S and M cones – photoreceptors that enable the world to be perceived in colour, and which are located in the outer retina.

These data suggest that while ageing is accompanied by decreased melanopsin involvement in visual perception, the retina is able to compensate for this loss through an increase in the sensitivity of other photoreceptors that were previously not known to be involved in melatonin suppression.

These observations enable the scientists to conclude that light perception – and light requirements – change with age.

While for young people, in whom only the melanopsin receptor is involved, exposure to blue light[3] is sufficient to synchronise their circadian clock over a 24-hour day, older people require exposure to light that is richer in wavelengths (colours) – a light whose characteristics are those of sunlight.

‘This is the discovery of a new adaptive mechanism of the retina during ageing – enabling older subjects to remain sensitive to light despite yellowing of the lens. These findings are also clinically relevant, encouraging older people to have more exposure to daylight, which is richer in wavelengths, rather than artificial light, in order to avoid developing sleep disturbances or mood or metabolism disorders, for example. Finally, they offer new possibilities for the optimal personalisation of phototherapies/light therapies for older people’, explains Claude Gronfier, Inserm researcher and last author of the study.

Regarding this last aspect, the research team is now looking at the quantity and quality of light necessary for each individual, and the best time for light exposure during the day, to prevent the development of sleep disorders and health problems more generally.

The research is being conducted in healthy subjects (children and adults), day and night workers, and patients (with sleep and biological rhythm disorders, genetic diseases, mood disorders and neurodegeneration)[4].

 

[1] In this study, the average age of the participants in the ‘older’ group was 59 years.

[2] https://www.has-sante.fr/jcms/c_1299994/troubles-du-sommeil-stop-a-la-prescription-systematique-de-somniferes-chez-les-personnes-agees

[3] The LED lights used are rich in blue light.

[4] https://www.crnl.fr/fr/page-base/groupe-sommeil-rythmicite-circadienne-lhumain-epidemiologie-populationnelle-recherche

Discovery of the role of a brain regulator involved in psychiatric illnesses

It was widely accepted that some families of synaptic receptors transmitted excitatory messages to neurons, and others inhibitory ones. © Adobe Stock

Contrary to all expectations, GluD1 – a receptor considered to be excitatory – has been shown to play a major role in the brain in controlling neuron inhibition. Given that alterations in the GluD1 gene are encountered in a number of neurodevelopmental and psychiatric disorders, such as autism spectrum disorder (ASD) and schizophrenia, this discovery opens up new therapeutic avenues to combat the imbalances between excitatory and inhibitory neurotransmission associated with these disorders. Published in Science, this research is the result of a collaboration between researchers from Inserm, CNRS and ENS at the ENS Institute of Biology (IBENS, Paris, France) and their colleagues at the MRC Laboratory of Molecular Biology in Cambridge, UK.

The complexity of the brain’s function holds many surprises. While it was widely accepted that some families of synaptic receptors (situated at the extremity of a neuron) transmitted excitatory messages to neurons, and others inhibitory ones, a study co-led by Inserm researchers Pierre Paoletti and Laetitia Mony at the ENS Institute of Biology has shed new light on this.

To understand what this is all about, we need to go back to basics. An ‘excitatory’ synapse triggers the creation of a nerve message in the form of an electrical current when a receptor on its surface binds an excitatory neurotransmitter present in the interneuronal space, most often glutamate. This is called ‘neuronal excitation’. An ‘inhibitory’ synapse, by contrast, prevents this neuronal excitation by releasing an inhibitory neurotransmitter, often GABA. This is called ‘neuronal inhibition’. The families of glutamate receptors (iGluR) and GABA receptors (GABAAR) are thus considered to have opposite roles.

However, a glutamate receptor subtype called GluD1 intrigued the scientists. Although it is meant to have an excitatory role, it is preferentially found at the inhibitory synapses. This observation, made by the team of Inserm researcher Cécile Charrier at the ENS Institute of Biology in 2019, attracted the interest of the scientific community because the GluD1 gene is often associated with neurodevelopmental disorders (e.g. autism) or psychiatric conditions (e.g. bipolar disorders or schizophrenia) in human population genetic studies. Understanding the role of this receptor is therefore a major challenge. To find out more, Paoletti’s team used mouse brains to study its molecular properties and function in the hippocampus where it is strongly expressed.

 

An atypical role

The researchers already knew that, contrary to its name, the GluD1 receptor is unable to bind glutamate. But in this study they were surprised to find that it binds GABA. Radu Aricescu’s team in Cambridge even described in the publication the fine atomic structure of the site where GluD1 interacts with GABA, using a technique called X-ray crystallography[1].

In principle, its role in the brain is therefore not to excite neuronal activity but to inhibit it. Taking this finding into account, can we still say that this receptor belongs to the glutamate receptor family?

‘While the question remains open, the analyses of phylogeny (relationships between genes and proteins) and the structural data do all show that it belongs to this family. However, it is possible that certain mutations acquired during the course of evolution have profoundly modified its functional properties’, explains Paoletti.

Another source of curiosity is that this receptor functions neither as a ‘conventional’ glutamate receptor nor as a GABA receptor. Both of those cause the opening of channels in the cell membrane, enabling the passage of ions responsible for the excitation or inhibition of the neuron. The GluD1 receptor, however, does not open any channel. Its activity results from other mechanisms within the cell, which remain to be clarified.

Finally, this research suggests a major regulatory role for GluD1 at inhibitory synapses. Indeed, when it is activated by the presence of GABA, the inhibitory synapse is more effective. This manifests as a greater inhibitory response that lasts for several tens of minutes.

‘In other words, GluD1 reinforces the inhibition signal – perhaps by promoting the recruitment of new GABA receptors at the synapse? In any case, we are talking about a key regulator’, explains Mony.

For the scientists who contributed to this research, this discovery marks a real step forward.

‘These findings pave the way for a better understanding of the imbalances between excitatory and inhibitory messages in the brain in neurodevelopmental and psychiatric disorders, such as ASD and schizophrenia, or in conditions characterised by neuronal hyperexcitability, such as epilepsy. Following that, it will be important to study the potential of GluD1 as a therapeutic target for restoring a better balance and reducing symptoms in these disorders’, they conclude.

 

[1] A physicochemical analysis technique based on the diffraction of X-rays by matter to determine its molecular composition and 3D structure.

The very first 3D map of the embryonic human head enables new insights into its development

3D light-sheet microscope image of a lacrimal gland of a tissue-cleared 12-week-old human embryo. The different elements of the gland were coloured using virtual reality software. © Raphael Blain/Alain Chédotal, Vision Institute (Inserm/CNRS/Sorbonne Université)

Improving our knowledge of the development of the complex structures of the human head to shed new light on the congenital abnormalities that cause malformations: this is the challenge that a team of researchers from Inserm, CNRS and Sorbonne Université at the Vision Institute, Université Claude Bernard Lyon 1 and Hospices civils de Lyon is well on its way to fulfilling. Thanks to an innovative technique in which the skull structures are made transparent and 3D photos are taken of their component cells, this team has been able to establish the very first 3D atlas of the embryonic human head. These findings, to be published in Cell, have already provided deeper insights into how certain complex structures of the head are formed, such as the lacrimal and salivary glands or the arteries of the head and neck. They pave the way for new tools to study embryonic development.

The head is the most complex structure in the human body. In addition to the muscles and skin that protect it, and the brain encased within the skull, it contains blood vessels, nerves, endocrine glands – which secrete hormones directly into the bloodstream – such as the pituitary glands, and exocrine glands – which secrete substances to the outside environment – such as the salivary glands, which produce saliva, or the lacrimal glands, which secrete tears.

Our current knowledge about the development of the human head and its complex structures is rudimentary and comes from studies mostly carried out in the first half of the 20th century, using simple histological sections. As such, despite head malformations occurring in around one third of newborns with congenital defects, the mechanisms that control the development of the human head remain poorly understood.

A team led by Alain Chédotal, Inserm research director at the Vision Institute (Inserm/CNRS/Sorbonne Université) and professor at the MéLiS laboratory of Mechanisms in Integrated Life Sciences (Inserm/CNRS/Université Claude Bernard Lyon 1/Hospices civils de Lyon), and Yorick Gitton, CNRS staff scientist also at the Vision Institute, used an innovative microscopy method to shed new light on the development of the human head.

The team had previously used the same technology in the embryo to study the development of other human organs[1]. This technology is called ‘tissue clearing’ because it makes organs transparent to light. The cleared sample is then imaged in 3D using a special microscope that scans it with a fine sheet of laser light. This makes it possible to locate in situ the cells that make up the embryonic tissues.

The researchers were able to apply this technique to embryos at different stages of development, obtained from the human tissue biobank created as part of the Human Developmental Cell Atlas (HuDeCA) programme coordinated by Inserm[2]. Thanks to the images obtained, they established the first 3D map of the embryonic human head[3].

Next, the team used virtual reality to analyse the 3D images and thus ‘navigate’ within the embryos.

‘This enabled us to discover previously unknown characteristics of the development of the cranial muscles, nerves, blood vessels and exocrine glands,’ states Chédotal. ‘For example, it had never been possible to study the very early stages of development of the human salivary and lacrimal glands. Our research has enabled us to begin to visualise and better understand the mechanisms behind the establishment of these anatomically extremely complex structures’, he adds.

 

3D light-sheet microscope image of the eye of a tissue-cleared 12-week-old human embryo. The 6 oculomotor muscles responsible for eye movement and the 3 motor nerves (in white, green and red) were coloured using virtual reality software. © Raphael Blain/Alain Chédotal, Vision Institute (Inserm/CNRS/Sorbonne Université)

The scientists have also set up a web interface (hudeca.com) to access not only the images obtained in this research, but also models for 3D printing and interactive 3D reconstructions of human embryos. This platform provides valuable resources that can also contribute to the training of medical students.

In future research, the team will attempt to map the various cells of certain organs, such as the retina.

‘At this stage, it is kind of as if we have mapped the continents and countries but still have to position the cities and their inhabitants’, explains Chédotal, whose team will also collaborate with physicians to apply the technology to pathological samples.

‘The new knowledge of human embryology provided by this research, as well as the new tools developed, has major implications for understanding craniofacial malformations and neurological disorders, as well as for improving diagnostic and therapeutic strategies’, concludes the researcher.

 

[1] See our press release of 23 March 2017: https://presse.inserm.fr/en/the-human-embryo-as-you-have-never-seen-it/57363/

[2] Launched in 2019, the objective of the cross-cutting HuDeCa programme coordinated by Inserm is to build the first atlas of human embryonic and foetal cells. It also aims to structure human embryology research at French level and develop databases. In the longer term, HuDeCa is expected to serve as a basis for understanding the origin of chronic diseases or congenital malformations.

[3] With the specific exception of the brain, a structure that was not covered by this research.
