<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Cortivision</title>
	<atom:link href="https://www.cortivision.com/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.cortivision.com/</link>
	<description>Fully portable and wireless fNIRS devices to measure brain activity. The highest standards for psychology and cognitive neuroscience research!</description>
	<lastBuildDate>Tue, 24 Mar 2026 14:52:14 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.8.2</generator>

<image>
	<url>https://www.cortivision.com/app/uploads/2024/12/cropped-corti-logo-32x32.png</url>
	<title>Cortivision</title>
	<link>https://www.cortivision.com/</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>MagyAR in Microgravity: Investigating Brain Function in Space</title>
		<link>https://www.cortivision.com/magyar-in-microgravity-investigating-brain-function-in-space/</link>
					<comments>https://www.cortivision.com/magyar-in-microgravity-investigating-brain-function-in-space/#respond</comments>
		
		<dc:creator><![CDATA[Hanna Babijew]]></dc:creator>
		<pubDate>Tue, 24 Mar 2026 07:11:38 +0000</pubDate>
				<category><![CDATA[Science]]></category>
		<category><![CDATA[Study]]></category>
		<category><![CDATA[brain imaging]]></category>
		<category><![CDATA[Cortivision]]></category>
		<category><![CDATA[fNIRS]]></category>
		<category><![CDATA[HUNOR project]]></category>
		<category><![CDATA[ISS research]]></category>
		<category><![CDATA[neuroimaging]]></category>
		<category><![CDATA[non-invasive monitoring]]></category>
		<category><![CDATA[PhotonGrav]]></category>
		<category><![CDATA[VR in neuroscience]]></category>
		<guid isPermaLink="false">https://www.cortivision.com/?p=6824</guid>

					<description><![CDATA[<p>Explore how fNIRS was used in the MagyAR ISS experiment to study brain function in microgravity, and how PhotonGrav extends this technology toward brain-computer interface applications during Axiom Mission 4.</p>
<p>The post <a href="https://www.cortivision.com/magyar-in-microgravity-investigating-brain-function-in-space/">MagyAR in Microgravity: Investigating Brain Function in Space</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></description>
										<content:encoded><![CDATA[<div class="block-wrapper">
    <p class="block-lead m:b1-bold-150 b2-bold-150">
        As part of the Hungarian HUNOR program, researchers from the University of Pécs conducted a series of experiments aboard the International Space Station (ISS) during Axiom Mission 4 (June 25 &#8211; July 15, 2025). The mission has now been completed, with analysis of the collected data currently in progress.
One of the key research initiatives, the MagyAR (Neuromotion VR) project, focused on how brain function, perception, and cognitive performance adapt to microgravity.    </p>
</div>




<div class="block-wrapper">
    <div class="h- bg-grey-300 col-start-1 col-end-13 m:col-start-3 m:col-end-11"></div>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h2 class="block-header">Monitoring Brain Activity in Space</h2>
                <p>At the core of the MagyAR study was functional near-infrared spectroscopy (fNIRS), used to monitor changes in cerebral oxygenation and brain activity in orbit. The system deployed on the ISS was provided by Cortivision, enabling stable, non-invasive neuroimaging in microgravity conditions.<br />
fNIRS proved particularly suitable for this environment due to its robustness against motion and external interference. Within the experiment, it was used to capture how fluid shifts and altered oxygen delivery, both well-documented effects of microgravity, affect neural processes associated with cognition.<br />
The study combined fNIRS with virtual reality (VR) and eye-tracking, forming a multimodal framework for assessing mental performance. Astronauts performed cognitive and motor tasks in a controlled VR setting while brain activity and physiological responses were recorded simultaneously.</p>
    </div>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h2 class="block-header">Experimental Scope</h2>
                <p>Measurements were conducted at multiple stages of the mission, enabling observation of both immediate and progressive adaptations to space conditions. The experiment focused on:</p>
<ul>
<li>cerebral blood flow and changes in blood oxygenation,</li>
<li>attention and executive functions,</li>
<li>planning and executing movements.</li>
</ul>
<p>Complementary biological data, including saliva samples, were collected to track metabolic changes.</p>
    </div>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h2 class="block-header">Mission Status and Next Steps</h2>
                <p>The in-orbit phase of the MagyAR experiment has been successfully completed. Detailed scientific results are expected following full post-mission data analysis.</p>
    </div>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h2 class="block-header">Broader Research Context</h2>
                <p>The MagyAR project was part of a wider set of experiments conducted by the University of Pécs, including:</p>
<ul>
<li>ESEL3D, examining the behavior of 3D-printed materials in space conditions,</li>
<li>Step in Space (SiS), a virtual reality initiative presenting the mission and its experiments,</li>
<li>additional biomedical and plant-based studies conducted within consortium projects.</li>
</ul>
    </div>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h2 class="block-header">See the MagyAR Experiment in Action</h2>
                    <img decoding="async" class="mb-5 w-full" src="https://www.cortivision.com/app/uploads/2026/03/magyar.jpg" alt="" />
                <p>A short video prepared by the HUNOR team presents the MagyAR experiment, showcasing the integration of virtual reality and fNIRS to study cognitive and motor processes in microgravity.</p>
<p><a href="https://www.linkedin.com/posts/hungarian-to-orbit_a-magyar-k%C3%ADs%C3%A9rlet-ami-virtu%C3%A1lis-val%C3%B3s%C3%A1g-activity-7344405334234431488-HU1T?utm_source=share&amp;utm_medium=member_desktop&amp;rcm=ACoAADoF7IIBxLaW8S5xpwHHnrHO-8GTRAb9614">Check out the video on LinkedIn</a></p>
    </div>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
<p>To explore how this technology is used toward brain-computer interface applications in space, <strong>read more about PhotonGrav</strong> in our dedicated article: <a href="https://www.cortivision.com/photongrav-pioneering-bci-in-space/">PhotonGrav: Pioneering BCI in Space</a><a href="https://www.cortivision.com/photongrav-pioneering-bci-in-space/"><img decoding="async" class="wp-image-6318 alignright" src="https://www.cortivision.com/app/uploads/2025/04/PhotonGrav-Patch-300x300.png" alt="" width="115" height="115" srcset="https://www.cortivision.com/app/uploads/2025/04/PhotonGrav-Patch-300x300.png 300w, https://www.cortivision.com/app/uploads/2025/04/PhotonGrav-Patch-150x150.png 150w, https://www.cortivision.com/app/uploads/2025/04/PhotonGrav-Patch-768x768.png 768w, https://www.cortivision.com/app/uploads/2025/04/PhotonGrav-Patch.png 851w" sizes="(max-width: 115px) 100vw, 115px" /><br />
</a></p>
    </div>
</div>

<div class="block-wrapper">
    <div class="h- bg-grey-300 col-start-1 col-end-13 m:col-start-3 m:col-end-11"></div>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h2 class="block-header">Source</h2>
                <p>This blog post is based on information from the official website of the University of Pécs and the official page of the Hungarian Astronaut Programme (HUNOR):</p>
    </div>
</div>


<div class="wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex">
<div class="wp-block-button"><a class="wp-block-button__link has-text-align-left wp-element-button" href="https://aok.pte.hu/en/hirek/hir/17755">&gt;&gt;&gt; Cutting-edge brain research, 3D printing, and virtual reality in the HUNOR Mission</a></div>



<div class="wp-block-button"><a class="wp-block-button__link has-text-align-left wp-element-button" href="https://aok.pte.hu/en/hirek/hir/17830">&gt;&gt;&gt; Hungarian experiments outperformed NASA&#8217;s success rate on the International Space Station</a></div>



<div class="wp-block-button"><a class="wp-block-button__link has-text-align-left wp-element-button" href="https://www.linkedin.com/posts/hungarian-to-orbit_a-magyar-k%C3%ADs%C3%A9rlet-ami-virtu%C3%A1lis-val%C3%B3s%C3%A1g-activity-7344405334234431488-HU1T?utm_source=share&amp;utm_medium=member_desktop&amp;rcm=ACoAADoF7IIBxLaW8S5xpwHHnrHO-8GTRAb9614">&gt;&gt;&gt; HUNOR (Hungarian to Orbit): about the project</a></div>
</div>





<div class="block-wrapper">
    <div class="h- bg-grey-300 col-start-1 col-end-13 m:col-start-3 m:col-end-11"></div>
</div>


<p>The post <a href="https://www.cortivision.com/magyar-in-microgravity-investigating-brain-function-in-space/">MagyAR in Microgravity: Investigating Brain Function in Space</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.cortivision.com/magyar-in-microgravity-investigating-brain-function-in-space/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Photon Cap in On-going Research of Pathways Behind Mental Health (PARAM Project)</title>
		<link>https://www.cortivision.com/photon-cap-in-param-project/</link>
					<comments>https://www.cortivision.com/photon-cap-in-param-project/#respond</comments>
		
		<dc:creator><![CDATA[Hanna Babijew]]></dc:creator>
		<pubDate>Mon, 02 Mar 2026 09:19:20 +0000</pubDate>
				<category><![CDATA[Science]]></category>
		<category><![CDATA[Study]]></category>
		<category><![CDATA[cohort studies]]></category>
		<category><![CDATA[neurodevelopmental studies]]></category>
		<category><![CDATA[PARAM project]]></category>
		<category><![CDATA[PhotonCap]]></category>
		<guid isPermaLink="false">https://www.cortivision.com/?p=6791</guid>

					<description><![CDATA[<p>Introduction Due to socio-cultural and environmental diversity, rapid urbanization and digitalization, India has a potential for neurodevelopmental studies. Some previous cohort studies have shown that factors such as “socioeconomic status, childhood adversity, maternal mental health, caregiving practices, and environmental toxins all influence cognitive and emotional development” (Holla et al., 2025). However, many cohort studies target &#8230; <a href="https://www.cortivision.com/photon-cap-in-param-project/">Continued</a></p>
<p>The post <a href="https://www.cortivision.com/photon-cap-in-param-project/">Photon Cap in On-going Research of Pathways Behind Mental Health (PARAM Project)</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></description>
										<content:encoded><![CDATA[<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h3>To understand how genetic, environmental, and life-course factors interact to shape brain, behavioural, and psychological development, the Indian Council of Medical Research (ICMR) launched a major <em>longitudinal cohort study</em> titled <strong>the PAthways to Resilience And Mental health (PARAM) project.</strong></h3>
    </div>
</div>


<h2 class="wp-block-heading">Introduction</h2>



<p>Owing to its socio-cultural and environmental diversity, rapid urbanization, and digitalization, India holds strong potential for neurodevelopmental studies. Previous cohort studies have shown that factors such as “socioeconomic status, childhood adversity, maternal mental health, caregiving practices, and environmental toxins all influence cognitive and emotional development” (Holla et al., 2025). However, many cohort studies target late childhood, are small in scale, and capture only limited combined exposure-outcome information.</p>



<p>As reported by Holla et al. (2025),<strong> ICMR</strong> in coordination <strong>with the National Institute of Mental Health and Neurosciences (NIMHANS)</strong> from Bengaluru, launched a large PARAM project, focusing on <em>individuals from the prenatal period into early adulthood</em> across diverse social and ecological settings.</p>



<p>By <em>tracking neurodevelopmental trajectories</em>, linking them to life exposures, and identifying risk and resilience factors, PARAM aims to reveal what shapes mental well-being early in life.</p>



<p>Because of the longitudinal design and national scale, the results of PARAM study will emerge over time. However, the project’s goals, study design, methods, and planned analyses are detailed in the study protocol published in <a href="https://link.springer.com/article/10.1186/s12888-025-07492-x"><em>BMC Psychiatry Journal</em></a> by Holla et al. (2025).</p>



<p>We follow up on the project’s outcomes and findings, as our <strong>Photon Cap technology</strong> <strong>is used as one of the methods</strong> to measure cerebral hemodynamic signals from the prefrontal cortex during resting-state sessions as part of the research process.</p>



<h2 class="wp-block-heading">Objectives</h2>



<p>According to Holla et al. (2025), PARAM is primarily designed to track <em>how development unfolds over time</em>, and identify when and why certain pathways diverge, particularly in ways linked to mental health outcomes.</p>



<p>To manage and combine data from multiple sites while accounting for site differences in the analyses, the study protocol sets out the following goals:</p>



<ol class="wp-block-list">
<li>tracing the development of the brain, cognition, behaviour, and mental health from before birth through early adulthood;</li>



<li>examining how maternal stress, nutrition, inflammation, toxins, digital media engagement, and socio-environmental adversity influence development, with focus on critical and sensitive periods;</li>



<li>applying gene-to-environment (G×E) approach and normative models to integrate genetic and environmental data and generate individual-level scores for personalized insights;</li>



<li>studying the biological, brain imaging, cognitive, and social factors that increase risk or provide protection in the development and progression of psychiatric symptoms;</li>



<li>developing an open platform for multimodal data to enable reproducible research and global data sharing.</li>
</ol>



<h2 class="wp-block-heading">Timeframe</h2>



<p><em>The PARAM study began in May 2023</em> with a nine-month preparation phase focused on hiring staff, adapting assessment tools into seven Indian languages, setting up digital systems, and training teams across study sites.</p>



<p><em>Participant recruitment began in February 2024 and will continue until April 2026</em>, being carried out in phases across study sites.</p>



<p>Baseline assessments are planned to run over 18 months at each site and be completed within three months of a participant’s enrollment, with all assessments scheduled to finish by July 2026.</p>



<h2 class="wp-block-heading">Participants</h2>



<p>The PARAM project <em>aims to enroll around 9,000 participants</em>, from the prenatal period and early childhood (before birth up to age 6) through young adulthood (23-30 years), to track early-life exposures (Holla et al., 2025).</p>



<h3 class="wp-block-heading"><strong>Participants are divided into two cohorts:</strong></h3>



<ul class="wp-block-list">
<li><em>pregnant mothers and infants</em> followed from enrolment or birth to age two, with six-monthly assessments to capture changes during the first 1000 days of life;</li>



<li><em>ages of 2-30 years</em> with overlapping age groups to enable reconstruction of age-related development within a shorter timeframe.</li>
</ul>



<p>The study was approved by the Institutional Ethics Committees at all participating sites. Informed consent was obtained from all participants, with written consent provided by a parent or legal guardian on minors’ behalf (Holla et al., 2025).</p>



<p>Recruitment takes place in both urban and rural areas, including clinics, community groups, schools, and high-risk settings, to study developmental differences and mental health outcomes.</p>



<h2 class="wp-block-heading">Methods and Procedures</h2>



<p>Participants undergo a combination of assessments including questionnaires (interviewer- and self-administered), performance-based neurocognitive tasks, neurophysiological tests, and neuroimaging protocols.</p>



<p>Assessments are organized into <strong>core measures for all participants</strong> and <strong>deep measures</strong> (with fNIRS and MRI) <strong>for a subset</strong>. Standardized procedures, quality control, and centralized training ensure consistency across sites and visits (Holla et al., 2025).</p>



<p>During the <em>core assessment</em>, all participants complete one or two assessment visits lasting about 4-5 hours with breaks. Follow-up visits occur within one month before to three months after the planned date. Any unfinished assessments are completed within three months of recruitment.</p>



<p>The assessment <em>begins with documentation of prenatal growth from </em><strong>ultrasound records</strong> and <em>continues with </em><strong>longitudinal developmental assessments</strong> spanning infancy to adulthood, including cognitive, social, pubertal, and caregiver-child interaction measures (Holla et al., 2025).</p>



<p>Participants and parents complete <strong>age-appropriate questionnaires </strong>covering <em>temperament, personality, psychopathology, family history, caregiving context, adversity, and environmental exposures</em>, alongside <em>standardized neuropsychological tasks</em> assessing <em>attention, memory, executive function, and social cognition</em> (Holla et al., 2025).</p>



<p><strong>Anthropometry</strong>, <strong>nutritional biomarkers</strong>, and multi-matrix<strong> biospecimens</strong> (e.g., blood, saliva, hair, urine) are collected to support genetic, metabolic, inflammatory, hormonal, and environmental exposure analyses (Holla et al., 2025).</p>



<p><strong>Neurophysiological measures</strong> include <em>postural balance testing</em> and <em>heart rate variability</em> during rest and isometric stress challenge.</p>



<p>As part of a <em>deep assessment subset</em>, the participants additionally undergo<strong> MRI</strong> and <strong>resting-state fNIRS</strong> acquisition, with centralized data management and cross-site quality control procedures.</p>



<h2 class="wp-block-heading">Data Analysis Approach</h2>



<p>Holla et al. (2025) emphasize that PARAM handles missing data using likelihood-based methods, multiple imputation, and inverse probability weighting, while accounting for non-random missingness and differences in mode, language, and site.</p>



<p>To detect meaningful patterns, <em>PARAM uses a strategic sampling approach.</em> It intentionally includes more individuals at developmental extremes and across important exposures &#8211; such as levels of urbanisation, pollution, and adversity. This boosts the study’s ability <em>to model complex, nonlinear developmental trajectories and interactions</em>.</p>



<p>Developmental trajectories across neurocognition, symptoms, imaging, and anthropometry are modelled using <em>generalized additive mixed models (GAMM)</em>, <em>generalized additive models for location, scale, and shape (GAMLSS)</em>, and <em>latent growth or mixture models</em> (Holla et al., 2025).</p>



<p>By emphasizing generalized transdiagnostic mechanisms instead of discrete diagnostic categories, PARAM enables individual-level inference.</p>



<h2 class="wp-block-heading">fNIRS Application</h2>



<p><strong>Photon Cap C20</strong> is used to collect <em>resting-state</em> fNIRS data. Activity from the prefrontal cortex is recorded for up to 15 minutes at baseline and follow-up visits, with at least 10 minutes of usable, high-quality data retained after processing.</p>



<p>The Photon Cap system uses 16 LED sources and 10 detectors arranged <em>over the bilateral prefrontal cortex</em>. It incorporates both long- and short-separation channels to distinguish cortical signals from physiological artifacts.</p>
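<p>The role of the short-separation channels can be made concrete with a minimal sketch: because a short channel samples mostly scalp and systemic physiology, its signal can be regressed out of a long channel by least squares. This is an illustrative simplification, not the study's actual pipeline; the signal shapes, sampling rate, and function names below are invented.</p>

```python
import numpy as np

def short_channel_regression(long_ch, short_ch):
    """Remove the scalp/systemic component captured by a short-separation
    channel from a long-separation channel via a least-squares fit."""
    long_c = long_ch - long_ch.mean()
    short_c = short_ch - short_ch.mean()
    beta = np.dot(short_c, long_c) / np.dot(short_c, short_c)
    return long_c - beta * short_c

# Illustrative signals: a slow "cortical" wave plus a faster systemic wave
t = np.arange(0, 60, 0.1)                      # 60 s at an assumed 10 Hz
cortical = np.sin(2 * np.pi * 0.02 * t)        # deep, slow component
systemic = 0.8 * np.sin(2 * np.pi * 0.25 * t)  # scalp/systemic component
long_ch = cortical + systemic                  # long channel sees both
short_ch = systemic                            # short channel sees mostly scalp
corrected = short_channel_regression(long_ch, short_ch)
```

After regression, the corrected long-channel signal tracks the slow "cortical" component much more closely than the raw mixture did.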



<p>The recorded data are processed with standardized channel-level quality checks, calibration, and physiological artifact removal, followed by estimation of functional connectivity measures. Sessions are repeated if signal quality is insufficient, ensuring reliable, consistent measurements throughout the study.</p>
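<p>The final step of such a pipeline, estimating resting-state functional connectivity from cleaned channel time series, is commonly summarized as a channel-by-channel correlation matrix. The toy sketch below assumes that convention; the channel count, sampling rate, and quality threshold are made up and do not reflect the project's actual processing.</p>

```python
import numpy as np

def flag_good_channels(signals, min_std=1e-3):
    """Crude channel-level quality check: reject flat (dead) channels."""
    return signals.std(axis=1) > min_std

def functional_connectivity(signals):
    """Channel-by-channel Pearson correlation matrix; `signals` has
    shape (n_channels, n_samples) of preprocessed HbO time series."""
    return np.corrcoef(signals)

# Toy data: 4 channels, 10 minutes at an assumed 10 Hz sampling rate
rng = np.random.default_rng(0)
hbo = rng.standard_normal((4, 6000))
hbo[3] = 0.0                        # simulate a dead channel
good = flag_good_channels(hbo)      # drops the flat channel
fc = functional_connectivity(hbo[good])
```

The resulting matrix is symmetric with a unit diagonal, and only channels that pass the quality check contribute to it.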



<p>This implementation of the Photon Cap allows PARAM to integrate <em>non-invasive brain activity</em> measurements alongside MRI, helping chart neurodevelopmental and functional trajectories from early life into adulthood.</p>



<h2 class="wp-block-heading">Potential Implications</h2>



<p>Holla et al. (2025) aim to provide important insights into how nutrition, caregiving, education, and environmental conditions influence development from childhood through adolescence. The focus on prenatal and early childhood stages helps identify sensitive periods for prevention, while follow-up into adolescence and young adulthood clarifies risks during key life transitions.</p>



<p>Potential outputs include developmental reference charts, exposure-trajectory maps, and risk stratification tools to support maternal-child health programs, environmental policies, and digital wellbeing initiatives (Holla et al., 2025).</p>



<p>An open, governed-access database and biobank will also support further research and international collaboration.</p>


<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <pre class="wp-block-preformatted"><em>Based on original publication: Holla, B., Sharma, E., Venkataramanan, S. et al. The PAthways to Resilience And Mental health (PARAM) project: protocol for a multi-site developmental cohort in India. BMC Psychiatry 25, 1051 (2025). https://doi.org/10.1186/s12888-025-07492-x</em></pre>
    </div>
</div><p>The post <a href="https://www.cortivision.com/photon-cap-in-param-project/">Photon Cap in On-going Research of Pathways Behind Mental Health (PARAM Project)</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.cortivision.com/photon-cap-in-param-project/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Hyperscanning with the Photon Cap: The Role of Neural Synchrony During Social Interaction</title>
		<link>https://www.cortivision.com/hyperscanning-with-the-photon-cap-the-role-of-neural-synchrony-during-social-interaction/</link>
					<comments>https://www.cortivision.com/hyperscanning-with-the-photon-cap-the-role-of-neural-synchrony-during-social-interaction/#respond</comments>
		
		<dc:creator><![CDATA[Hanna Babijew]]></dc:creator>
		<pubDate>Thu, 05 Feb 2026 16:05:48 +0000</pubDate>
				<category><![CDATA[Hyperscanning]]></category>
		<category><![CDATA[Study]]></category>
		<category><![CDATA[fNIRS]]></category>
		<category><![CDATA[hyperscanning]]></category>
		<category><![CDATA[INS]]></category>
		<category><![CDATA[intergenerational community programs]]></category>
		<category><![CDATA[PhotonCap]]></category>
		<guid isPermaLink="false">https://www.cortivision.com/?p=6769</guid>

					<description><![CDATA[<p>Introduction In their study, Moffat et al. (2025) raised the question of increasing health risks caused by loneliness. Particularly, the study focused on the efficiency of intergenerational community programs bringing younger and older people together. It helps older adults stay physically, socially, and cognitively healthier, while also reducing younger adults’ negative stereotypes about aging.&#160; The &#8230; <a href="https://www.cortivision.com/hyperscanning-with-the-photon-cap-the-role-of-neural-synchrony-during-social-interaction/">Continued</a></p>
<p>The post <a href="https://www.cortivision.com/hyperscanning-with-the-photon-cap-the-role-of-neural-synchrony-during-social-interaction/">Hyperscanning with the Photon Cap: The Role of Neural Synchrony During Social Interaction</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></description>
										<content:encoded><![CDATA[<div class="block-wrapper">
    <p class="block-lead m:b1-bold-150 b2-bold-150">
        Loneliness is becoming a serious health issue worldwide, pushing researchers to look for new ways to help people feel more connected.     </p>
</div>


<h2 class="wp-block-heading">Introduction</h2>



<p>In their study, Moffat et al. (2025) raised <em>the question of increasing health risks</em> <em>caused by loneliness</em>. In particular, the study focused on the efficacy of <strong>intergenerational community programs</strong> that bring younger and older people together. Such programs help older adults stay physically, socially, and cognitively healthier, while also reducing younger adults’ negative stereotypes about aging.</p>



<p>The study notes that, because of these benefits, doctors increasingly recommend participation in community programs, reflecting the growing recognition of social connection as part of healthcare. However, the mechanisms behind these benefits were unclear, since most evidence of improvement came from participants’ self-evaluation and earlier behavioral studies, not physiological data.</p>



<p>To shed light on the physiological side of relationship building and explain how these relationships produce positive effects, Moffat et al. (2025) conducted a <em>longitudinal hyperscanning study using two Photon Cap devices provided through the Cortivision <a href="https://www.cortivision.com/?post_type=post&amp;s=Pathfinder">Pathfinder program.</a></em></p>



<h2 class="wp-block-heading">Methods and Materials</h2>



<h3 class="wp-block-heading"><em>Participants</em></h3>



<p>Using a six-week art program, Moffat et al. (2025) followed 122 participants &#8211; 31 older adults (69+ years) and 91 younger adults (18-35 years) &#8211; from Zurich, Switzerland, recruited through universities, senior community groups, and social media.</p>



<p>Participants were paired into either intergenerational or same-generation teams based on their availability to attend the sessions. Gender matching was not prioritised due to scheduling differences. All pairs joined the study as strangers.</p>



<h3 class="wp-block-heading"><em>Procedure</em></h3>



<p>Using a hyperscanning approach, the research team tracked changes in social wellbeing, how well the pairs worked together, and how closely their brain activity aligned during interactions. This allowed the researchers to examine <em>interpersonal neural synchrony (INS)</em>: the degree to which people’s brain activity aligns during interaction.</p>
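<p>To make the INS idea concrete, here is a minimal sketch that quantifies synchrony between two participants' signals from the same channel as a sliding-window correlation. Hyperscanning studies often use wavelet transform coherence instead; the windowed correlation below is a deliberate simplification, and every signal parameter is invented for illustration.</p>

```python
import numpy as np

def windowed_ins(x, y, win, step):
    """Sliding-window Pearson correlation between two participants'
    signals -- one simple proxy for interpersonal neural synchrony."""
    scores = []
    for start in range(0, len(x) - win + 1, step):
        xs, ys = x[start:start + win], y[start:start + win]
        scores.append(np.corrcoef(xs, ys)[0, 1])
    return np.array(scores)

# Toy signals: both participants share a task-driven component plus noise
rng = np.random.default_rng(1)
shared = rng.standard_normal(1000)            # shared interaction component
a = shared + 0.5 * rng.standard_normal(1000)  # participant A's channel
b = shared + 0.5 * rng.standard_normal(1000)  # participant B's channel
ins = windowed_ins(a, b, win=200, step=100)   # one synchrony score per window
```

With a strong shared component, every window shows high synchrony; for independent signals the scores would hover near zero.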



<h4 class="wp-block-heading has-text-align-center">Self-evaluation</h4>



<p><strong>At the start</strong> of each session, participants completed a short self-report questionnaire measuring overall, emotional, and social loneliness, and <strong>at the end</strong> of each session, participants rated how close they felt to their partner using a simple visual scale showing different levels of overlap between two circles.&nbsp;</p>



<p>Participants also reported their attitudes toward people of different age groups, rating people their own age as well as people from another generation (older or younger). This allowed the researchers to track changes in attitudes toward other generations over time.</p>



<h4 class="wp-block-heading has-text-align-center">Photon Cap use</h4>



<p>Moffat et al. (2025) measured brain activity using two <em>Photon Cap</em> devices and <em>Cortiview</em> software, which allowed participants to move and interact naturally, much as in real-world social interactions. Each Photon Cap measured brain activity over key social brain areas (IFG and TPJ) using 16 light emitters and 10 light detectors, placed according to the standard 10-5 EEG system based on brain mapping from the <a href="https://neurosynth.org/">Neurosynth database</a>. After the caps were placed, a 3D scan of each participant’s head was created and processed using MATLAB and the FieldTrip toolbox to verify that the sensors were positioned accurately over the regions of interest (ROIs), allowing precise measurement of brain activity during interactions (Moffat et al., 2025).</p>



<h2 class="wp-block-heading">Results</h2>



<p>According to Moffat et al. (2025), the <em>participants in intergenerational pairs</em> <em>reported feeling less lonely and more socially close than those in same-generation pairs</em>. These benefits increased over repeated sessions for both groups.</p>



<p>Attitudes toward other generations remained stable overall. However, intergenerational pairs felt equally positive toward their own and the other generation, whereas same-generation pairs favored their own.&nbsp;</p>



<p><em>Brain data revealed robust INS </em>especially during collaborative drawing, with stronger and more widespread synchrony emerging over time. These <em>effects were most pronounced in brain regions linked to social cognition and coordination</em> and differed by task and pair type, highlighting distinct neural dynamics for intergenerational versus same-generation interactions.</p>



<p>INS data were meaningfully related to self-reported experience: greater synchrony was associated with lower loneliness, higher social closeness, and more positive intergenerational attitudes in specific contexts, suggesting that <em>INS captures both the social and emotional quality of interaction and holds promise as a neural marker of changes in loneliness</em>.</p>



<pre class="wp-block-preformatted"><em>Based on preprint publication: Moffat, R., Dumas, G., &amp; Cross, E. S. (2025). Longitudinal intergenerational hyperscanning reveals indices of relationship formation and loneliness (Preprint). bioRxiv.</em><a href="https://doi.org/10.1101/2025.10.14.682029"><em> https://doi.org/10.1101/2025.10.14.682029</em></a></pre>


<div class="block-wrapper">
    <div class="h- bg-grey-300 col-start-1 col-end-13 m:col-start-3 m:col-end-11"></div>
</div><p>The post <a href="https://www.cortivision.com/hyperscanning-with-the-photon-cap-the-role-of-neural-synchrony-during-social-interaction/">Hyperscanning with the Photon Cap: The Role of Neural Synchrony During Social Interaction</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.cortivision.com/hyperscanning-with-the-photon-cap-the-role-of-neural-synchrony-during-social-interaction/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Effects of Cognitive Load on the Complex Decision-Making</title>
		<link>https://www.cortivision.com/effects-of-cognitive-load-on-the-complex-decision-making/</link>
					<comments>https://www.cortivision.com/effects-of-cognitive-load-on-the-complex-decision-making/#respond</comments>
		
		<dc:creator><![CDATA[Hanna Babijew]]></dc:creator>
		<pubDate>Fri, 28 Nov 2025 18:29:47 +0000</pubDate>
				<category><![CDATA[Experimental]]></category>
		<category><![CDATA[Science]]></category>
		<category><![CDATA[Study]]></category>
		<category><![CDATA[cognitive load]]></category>
		<category><![CDATA[decision-making]]></category>
		<category><![CDATA[fNIRS]]></category>
		<category><![CDATA[PhotonCap]]></category>
		<guid isPermaLink="false">https://www.cortivision.com/?p=6731</guid>

					<description><![CDATA[<p>Published in Frontiers (Volume 19, 2025), Yang et al. (2025) presented a study examining how information-processing load influences the accuracy of decisions and conclusions in complex decision-making contexts, using a Photon Cap C20 fNIRS device to measure prefrontal-cortex blood oxygenation. Introduction The experiment Participants Procedure Data acquisition Outcome</p>
<p>The post <a href="https://www.cortivision.com/effects-of-cognitive-load-on-the-complex-decision-making/">Effects of Cognitive Load on the Complex Decision-Making</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<h2 class="wp-block-heading"><em>Published in Frontiers (Volume 19, 2025), Yang et al. (2025) presented a study examining how information-processing load influences the accuracy of decisions and conclusions in complex decision-making contexts, using a Photon Cap C20 fNIRS device to measure prefrontal-cortex blood oxygenation.</em></h2>



<h2 class="wp-block-heading">Introduction</h2>


<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <p><!-- wp:paragraph --></p>
<p>Yang et al. (2025) acknowledge that in engineering design, decision-making tasks often require collaboration among multidisciplinary teams. Differences in the knowledge structures of various disciplines can produce ambiguities or misunderstandings of specialized terminology and concepts &#8211; a <em>meaning discrepancy</em> &#8211; creating <em>communication barriers</em> that may ultimately affect the quality of final decisions.</p>
<p>The research team suggests that such communication failures <em>occur due to cognitive conflict</em> <em>induced by increased cognitive load</em>.</p>
<p>Previous studies of decision-making paradigms have explored managerial influence theories and behavioral analyses; however, <em>the cognitive and neural mechanisms</em> of interdisciplinary group decision-making remained largely <em>unexplored</em>.</p>
<p>To fill this gap, Yang et al. (2025) developed an <em>experimental Multi-Attribute Decision-Making with Layered Group Dynamics (MADM-LGD) task</em> &#8211; a decision path model that connects diversity of meaning interpretations with cognitive conflict, increased cognitive load, and reduced communication efficiency.</p>
<p>This study uses fNIRS to track prefrontal oxygenation, integrating semantic and linguistic analyses to explore how cognitive level and interdisciplinary communication shape the quality of group decision-making.</p>
<!-- /wp:paragraph -->    </div>
</div>


<h2 class="wp-block-heading">The experiment</h2>



<h3 class="wp-block-heading">Participants</h3>


<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <p><span style="font-weight: 400">The study involved </span><i><span style="font-weight: 400">54 participants</span></i><span style="font-weight: 400">, including 25 undergraduates, 21 master’s students, and 8 doctoral students from universities in Shanghai. Participants were </span><i><span style="font-weight: 400">organized into 18 groups of three members each</span></i><span style="font-weight: 400">. The final selection consisted of 34 males and 20 females, all of whom had recent experience working or interning in engineering companies.</span></p>
    </div>
</div>


<h3 class="wp-block-heading">Procedure</h3>


<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <p><span style="font-weight: 400">To understand how different decision-making factors affect the quality of choices, Yang et al. (2025) used the MADM-LGD task, designed to </span><i><span style="font-weight: 400">imitate a cruise ship cabin crew interaction</span></i><span style="font-weight: 400"> and, with it, group decision-making dynamics.</span></p>
<p><span style="font-weight: 400">The experiment </span><i><span style="font-weight: 400">lasted for a month </span></i><span style="font-weight: 400">in the summer of 2024 and was carried out in a quiet, independent laboratory environment, where the researchers collected and analyzed Oxy-Hb data using Photon Cap C20 fNIRS technology.</span></p>
<p><span style="font-weight: 400">The procedure consisted of </span><i><span style="font-weight: 400">two decision-making phases</span></i><span style="font-weight: 400"> (individual and group) and </span><i><span style="font-weight: 400">a total of seven stages</span></i><span style="font-weight: 400"> with brief breaks, as illustrated in the overall flowchart (Yang et al., 2025) below.</span></p>
    </div>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h2 class="block-header"></h2>
                    <img decoding="async" class="mb-5 w-full" src="https://www.cortivision.com/app/uploads/2025/11/Figure-2.-Stimulus-sequences.jpg" alt="" />
                <p style="text-align: center"><em><span style="font-weight: 400">The full sequence of procedural steps. </span></em>(<a href="https://www.frontiersin.org/files/Articles/1594111/fnins-19-1594111-HTML-r1/image_m/fnins-19-1594111-g002.jpg">Figure 2 in Yang et al., 2025</a>)</p>
    </div>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <ul>
<li style="text-align: left"><strong>an individual phase (I1–I2)</strong> &#8211; making individual decisions independently, in accordance with the previously briefed instructions, within three-minute periods (Individual phases P1 and P2).</li>
<li style="text-align: left"><strong>a group phase (G1–G5)</strong> &#8211; conducting several rounds of group discussion to compare and select the most effective scheme among those provided, followed by individual scoring of all selected schemes.</li>
</ul>
    </div>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <p><span style="font-weight: 400">The fluctuations of Oxy-Hb levels were measured by the Photon Cap across all </span><i><span style="font-weight: 400">seven stages</span></i><span style="font-weight: 400">, divided into individual (I1–I2) and group (G1–G5) stages. The test session lasted </span><i><span style="font-weight: 400">approximately 45 minutes </span></i><span style="font-weight: 400">in total, divided into alternating intervals of cognitive stimulation and rest.</span></p>
    </div>
</div>


<h2 class="wp-block-heading">Data acquisition</h2>


<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <p><span style="font-weight: 400">As mentioned earlier, Yang et al. (2025) used the Photon Cap C20 along with Cortivision Pathfinder to measure cortical hemodynamic activity and monitor changes in Oxy-Hb levels during both resting and decision-making conditions; during resting phases, participants sat quietly with eyes closed to allow Oxy-Hb levels to return to baseline. </span></p>
<p><span style="font-weight: 400">For greater precision, an enhanced “10–5 system” was used to place optodes over prefrontal regions of interest (ROIs) such as the frontopolar area (FOA), pars triangularis Broca (PTBA), dorsolateral prefrontal cortex (DLPFC), and inferior prefrontal gyrus (IPFG).</span></p>
<p><span style="font-weight: 400">Overall, the montage consisted of 22 channels with 8 light sources and 8 detectors.</span></p>
    </div>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h2 class="block-header"></h2>
                    <img decoding="async" class="mb-5 w-full" src="https://www.cortivision.com/app/uploads/2025/11/Figure-4.-Experimental-environment-and-layout-of-channels.-Layout-of-brai.jpg" alt="" />
                <p style="text-align: center"><em>Photon Cap channel layout over the frontal lobe during the procedure.</em></p>
<p style="text-align: center">(<a href="https://www.frontiersin.org/files/Articles/1594111/fnins-19-1594111-HTML-r1/image_m/fnins-19-1594111-g004.jpg">Figure 4 in Yang et al., 2025)</a></p>
    </div>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <p><span style="font-weight: 400">Collected data included participants’ profile information (gender, age, handedness), dialogue texts from interdisciplinary group discussions, and cortical Oxy-Hb concentrations during the MADM-LGD task.</span></p>
    </div>
</div>


<h2 class="wp-block-heading">Outcome</h2>


<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <p><span style="font-weight: 400">Using the Photon Cap, the researchers identified that activation was mainly concentrated in the frontopolar area (FOA), </span><i><span style="font-weight: 400">especially during transitions between individual and group decision-making</span></i><span style="font-weight: 400">. Significant differences in averaged Oxy-Hb levels were observed across all </span><i><span style="font-weight: 400">seven stages</span></i><span style="font-weight: 400"> (</span><i><span style="font-weight: 400">I1–I2</span></i><span style="font-weight: 400"> and </span><i><span style="font-weight: 400">G1–G5</span></i><span style="font-weight: 400">). Brain activity increased from individual stage I1 to I2, then dropped as participants moved into the group stages (G1–G5), suggesting a shift in cognitive effort and strategy use. </span></p>
<p><span style="font-weight: 400">According to Yang et al. (2025), these findings indicate that individual decision-making demanded a higher cognitive load, while group collaboration tended to reduce it &#8211; highlighting the neural basis by which collaboration can ease mental strain, improve performance, and promote clearer understanding and better overall decision quality.</span></p>
<p><span style="font-weight: 400">The Photon Cap proved effective for examining cognitive load, and Yang et al. (2025) suggest that future studies combine fNIRS and eye-tracking to gain deeper insights into cognitive control and visual attention in interdisciplinary group decision-making.</span></p>
    </div>
</div>

<div class="block-wrapper">
    <div class="h- bg-grey-300 col-start-1 col-end-13 m:col-start-3 m:col-end-11"></div>
</div>

<div class="block-wrapper">
    <p class="block-lead m:b1-bold-150 b2-bold-150">
        This blog post is based on open-access publication: Yang, J., Jiang, Z., Cheng, K., &amp; Wu, L. (2025). Disciplinary barriers need communication: A behavioral and fNIRS study under group decision-making paradigm shift based on cabin design. Frontiers in Neuroscience, 19, 1594111. https://doi.org/10.3389/fnins.2025.1594111    </p>
</div><p>The post <a href="https://www.cortivision.com/effects-of-cognitive-load-on-the-complex-decision-making/">Effects of Cognitive Load on the Complex Decision-Making</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.cortivision.com/effects-of-cognitive-load-on-the-complex-decision-making/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Frontal Cortex Activation During CLFT: A Photon Cap C20 Data Analysis</title>
		<link>https://www.cortivision.com/frontal-cortex-activation-during-clft-a-photon-cap-c20-data-analysis/</link>
					<comments>https://www.cortivision.com/frontal-cortex-activation-during-clft-a-photon-cap-c20-data-analysis/#respond</comments>
		
		<dc:creator><![CDATA[Hanna Babijew]]></dc:creator>
		<pubDate>Thu, 30 Oct 2025 10:37:14 +0000</pubDate>
				<category><![CDATA[Science]]></category>
		<category><![CDATA[CLFT]]></category>
		<category><![CDATA[fNIRS]]></category>
		<category><![CDATA[ILFT]]></category>
		<category><![CDATA[PhotonCap]]></category>
		<category><![CDATA[Prefrontal cortex]]></category>
		<category><![CDATA[Study]]></category>
		<category><![CDATA[VFT]]></category>
		<guid isPermaLink="false">https://www.cortivision.com/?p=6697</guid>

					<description><![CDATA[<p>A recent study by Krukow et al. (2025), published in Scientific Reports, presented a comparative analysis of a Combined Letter Fluency Task (CLFT)—a newly introduced version of the verbal fluency test (VFT)—and a typical Initial Letter Fluency Task (ILFT), using fNIRS technology for data acquisition and analysis. Introduction The verbal fluency tests (VFT) are &#8230; <a href="https://www.cortivision.com/frontal-cortex-activation-during-clft-a-photon-cap-c20-data-analysis/">Continued</a></p>
<p>The post <a href="https://www.cortivision.com/frontal-cortex-activation-during-clft-a-photon-cap-c20-data-analysis/">Frontal Cortex Activation During CLFT: A Photon Cap C20 Data Analysis</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<h2 class="wp-block-heading">A recent study by Krukow et al. (2025), published in <a href="https://www.nature.com/articles/s41598-025-12558-7">Scientific Reports</a><sup data-fn="7346e539-bbbd-424e-a6e2-bde4a29a0cfa" class="fn"><a id="7346e539-bbbd-424e-a6e2-bde4a29a0cfa-link" href="#7346e539-bbbd-424e-a6e2-bde4a29a0cfa">1</a></sup>, presented a comparative analysis of a Combined Letter Fluency Task (CLFT)—a newly introduced version of the verbal fluency test (VFT)—and a typical Initial Letter Fluency Task (ILFT), using fNIRS technology for data acquisition and analysis.</h2>



<h2 class="wp-block-heading">Introduction</h2>



<p><strong>The verbal fluency tests (VFT)</strong> are widely used <strong>to study frontal cortex activity</strong> in clinical groups and cognitive neuroscience research. Traditionally, these tasks are analyzed using functional magnetic resonance imaging (fMRI), but there is growing interest in<strong> using fNIRS to measure oxygenated (oxy-Hb) and deoxygenated hemoglobin (deoxy-Hb) in brain tissue</strong>. </p>



<figure class="wp-block-pullquote has-text-align-center has-violet-900-color has-text-color has-link-color has-base-font-size wp-elements-5e0c1dad2e9c8e424352cedbd9fbd22a" style="padding-top:0;padding-right:var(--wp--preset--spacing--20);padding-bottom:0;padding-left:var(--wp--preset--spacing--20)"><blockquote><p><em><strong>Krukow et al. (2025) suggest that the interest in fNIRS is driven by ease of use, cost-effectiveness, and minimal requirements for specialized equipment or dedicated space. Portable fNIRS systems have shown the ability to produce data that correlates well with fMRI’s blood oxygenation level-dependent (BOLD) signals.</strong></em></p></blockquote></figure>



<p>Commonly, VFTs are combined with fNIRS to study individuals with schizophrenia, depression, and bipolar disorder, as well as in research on aging and dementia. These tests usually last up to 1 minute and are divided into <em>phonemic fluency</em> &#8211; producing words beginning with a particular letter (or letters) &#8211; and <em>semantic fluency</em> &#8211; naming words within a particular category (animals, fruits, etc.). Each variant elicits hemodynamic responses in the brain regions involved in task performance.</p>



<figure class="wp-block-pullquote has-violet-900-color has-text-color has-link-color has-base-font-size wp-elements-b3a9b77ceff3e7f0f3e36bdbc7110f4e" style="padding-top:0;padding-right:var(--wp--preset--spacing--50);padding-bottom:0;padding-left:var(--wp--preset--spacing--50)"><blockquote><p><strong><em>Phonemic</em> tasks engage <em>the dorsolateral prefrontal cortex (DLPFC)</em> for controlled retrieval, while <em>semantic tasks</em> involve <em>the ventrolateral PFC</em> and <em>temporal regions</em> for associative processing.</strong></p></blockquote></figure>



<p>However, Krukow et al. (2025) consider that recent research has highlighted the limitations of standard VFTs. The cognitive and linguistic specificity of VFT remains unclear. Studies have shown that phonemic VFT activates the frontal cortex more strongly than semantic VFT. Moreover, factors such as processing speed, verbal intelligence, executive functioning, and education level significantly influence performance.</p>



<h2 class="wp-block-heading">Combined Letter Fluency Task</h2>



<p>According to Krukow et al. (2025), some alterations of VFTs have already been considered, such as picking words that did not contain a particular letter from an assigned category, or switching between two or more word categories, which relies on cognitive flexibility and demands greater involvement of working memory.<br><br>Since none of those modified verbal fluency tasks (VFTs) had examined whether such modifications reveal any difference in patterns of frontal cortex activation compared to standard initial letter fluency tests (ILFT), <strong>Krukow et al. (2025) proposed a new task—the CLFT.<br><br></strong>The purpose of the study was to observe <strong>dynamic changes in frontal cortex activity using fNIRS</strong>, aiming to detect behavioral and hemodynamic differences during two tasks—<em>the CLFT </em>and the traditional <em>ILFT</em>—and to compare their activation patterns. More about the study <a href="https://www.nature.com/articles/s41598-025-12558-7#citeas">here</a>.</p>



<figure class="wp-block-pullquote has-violet-900-color has-text-color has-link-color has-base-font-size wp-elements-bcfd7dc52f943862292599e534d918df" style="padding-top:0;padding-right:var(--wp--preset--spacing--50);padding-bottom:0;padding-left:var(--wp--preset--spacing--50)"><blockquote><p><strong><em>Krukow et al. (2025) anticipated that CLFT would result in a <strong>lower number of correct responses</strong>, a <strong>higher error rate</strong>, and more <strong>irregular clustering and switching</strong> behavior. They also expected the <strong>time-on-task curve</strong> (which tracks the number of words generated over successive time intervals) to flatten for CLFT, reflecting diminished reliance on automatic verbal associations, especially early in the task.</em></strong></p></blockquote></figure>



<h2 class="wp-block-heading">The procedure</h2>



<h3 class="wp-block-heading">Participants</h3>



<p>The study involved 35 participants, but only <strong>32 were successfully evaluated</strong>: three participants were excluded from the analysis due to signal artifacts in their fNIRS data.</p>



<p>All participants were adults, with an average age of 22.6 ± 2.8 years for females and 25.0 ± 6.1 years for males; all were Caucasian, had no recorded neurological disorders or chronic pain; all were based in Lublin (Poland, EU), where the study was conducted. Participation was voluntary and included established financial compensation.</p>



<h3 class="wp-block-heading">Methods &amp; Materials</h3>



<p>According to Krukow et al. (2025), the procedure was conducted using a computer with pre-installed PsychoPy software (version 2023.2.3v79), specifically designed for this test, and a Photon Cap C20 fNIRS system, to measure brain activity.</p>



<p>Instructions and tasks were displayed on a monitor positioned at eye level to minimize downward head tilt during the examination. Spoken responses were digitally recorded and manually analyzed according to linguistic standards.&nbsp;</p>



<p>Participants completed two tasks:&nbsp;</p>



<ol class="wp-block-list">
<li><strong>the traditional initial letter fluency task (ILFT)</strong> (5 rounds, 60 seconds each)—<em>verbally generating as many words as possible beginning with a given letter, presented in a randomized order for each participant.</em></li>
</ol>



<ol start="2" class="wp-block-list">
<li><strong>the combined letter fluency task (CLFT)</strong>—similar to ILFT, except participants <em>had to generate as many words as possible that began with a specific consonant, while avoiding a specific vowel.</em></li>
</ol>



<p>Initially, half of the participants began with the ILFT procedure, while the other half started with the CLFT procedure. Each task session lasted less than 15 minutes.</p>



<p>The study was intentionally divided into two separate sessions, spaced 7–8 days apart, to limit cognitive load and to control for transfer effects.&nbsp;</p>



<p>Optical signals were recorded using a two-wavelength (760 and 850 nm) continuous-wave system, equipped with a 32-channel montage covering the forehead regions of each participant. The collected data, sampled at 5 Hz, was analyzed using <strong>CortiPrism software</strong> (v. 06.2023). Photon Cap recordings were first converted from raw intensity to optical density using the modified Beer-Lambert Law. Motion artifacts were subsequently corrected using the TDDR—Temporal Derivative Distribution Repair algorithm (as shown in Fig. 4 of the study report, based on CortiPrism data).&nbsp;</p>


<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h2 class="block-header">Fig. 4: Channel distribution and regions of interest (ROIs) analyzed in the frontal cortex, as visualized using CortiPrism software (v.1.2.2).</h2>
                    <img decoding="async" class="mb-5 w-full" src="https://www.cortivision.com/app/uploads/2025/10/Channel-distribution-and-Regions-of-Interest-considered-in-the-frontal-cortex.-image-generated-using-CortiPrism-software-v.1.2.2.webp" alt="" />
                <p style="text-align: center">From: <a href="https://www.nature.com/articles/s41598-025-12558-7" data-test="subtitle" data-article-title="true" data-track="click" data-track-action="back to article" data-track-label="link" data-track-category="figure">Effects of the combined letter fluency task on frontal cortex regional and dynamic oxygenation patterns</a></p>
    </div>
</div>
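The conversion chain described above can be sketched in a few lines. The following is a minimal, illustrative NumPy implementation of the two steps: raw intensity to optical density, then a two-wavelength modified Beer-Lambert inversion to oxy-/deoxy-Hb concentration changes. The extinction coefficients, DPF, and source-detector distance are generic placeholder values, not CortiPrism's calibration, and the TDDR motion-correction step is omitted.

```python
import numpy as np

fs = 5.0  # Hz, sampling rate reported in the study

def intensity_to_od(intensity):
    """Raw light intensity -> optical density change, relative to the
    temporal mean (first step of the modified Beer-Lambert law)."""
    baseline = intensity.mean(axis=-1, keepdims=True)
    return -np.log(intensity / baseline)

def od_to_hb(od_760, od_850, dpf=6.0, distance_cm=3.0):
    """Two-wavelength MBLL inversion to oxy-/deoxy-Hb concentration
    changes. The extinction coefficients below are illustrative,
    textbook-order values, NOT those used by CortiPrism."""
    # rows: wavelengths (760, 850 nm); columns: [HbO2, HbR]
    ext = np.array([[1.4866, 3.8437],
                    [2.5264, 1.7986]])
    pathlength = dpf * distance_cm            # effective optical path
    inv = np.linalg.inv(ext * pathlength)     # solve ext * L * c = OD
    hbo, hbr = inv @ np.vstack([od_760, od_850])
    return hbo, hbr

# 60 s of synthetic intensity with a slow absorption oscillation
t = np.arange(0, 60, 1 / fs)
i760 = 1.0 - 0.01 * np.sin(2 * np.pi * t / 60)
i850 = 1.0 - 0.02 * np.sin(2 * np.pi * t / 60)
hbo, hbr = od_to_hb(intensity_to_od(i760), intensity_to_od(i850))
```

In a real pipeline, TDDR (or another motion-artifact correction) would be applied to the optical-density traces before the hemoglobin inversion.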


<h3 class="wp-block-heading">Evaluation criteria</h3>



<p>Krukow et al. (2025) considered responses correct if the participants followed the task instructions and if the produced words contained none of the following errors: repetitions, incorrect initial letters, use of proper nouns, or CLFT-specific vowel-rule violations. Additionally, speech was examined for groups of words related by sound or meaning, and for shifts between such groups, using phonetic and semantic criteria.</p>
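As a rough sketch of how the CLFT-specific rules above could be scored automatically (the function name and interface are hypothetical, and the proper-noun check is omitted because it would require a lexicon):

```python
def score_clft(words, target_letter, banned_vowel):
    """Count correct CLFT responses: a word scores only if it starts
    with the target consonant, contains no instance of the banned
    vowel, and does not repeat an earlier response."""
    seen = set()
    correct = 0
    for word in (w.strip().lower() for w in words):
        if (word.startswith(target_letter)
                and banned_vowel not in word
                and word not in seen):
            correct += 1
        seen.add(word)
    return correct

# "milk" and "mist" are valid; the second "milk" repeats, "mat" breaks
# the vowel rule, and "nest" has the wrong initial letter
n_correct = score_clft(["milk", "mist", "milk", "mat", "nest"], "m", "a")
```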



<h2 class="wp-block-heading">Results</h2>



<h3 class="wp-block-heading">Photon Cap C20 data analysis</h3>



<p>Krukow et al. (2025) stated that oxy-Hb concentration outcomes were analyzed with respect to<strong> task, time and lateralization</strong>, as well as their interactions <strong>across three distinct Regions of Interest (ROIs): </strong><em>orbital/frontopolar cortex </em>(<strong>OFC</strong>),<em> lateral frontal cortex</em> (<strong>LF</strong>), and <em>middle/superior frontal cortex</em> (<strong>pre-SMA</strong>).</p>



<p>The changes in hemoglobin concentration within the three ROIs were analyzed across four time segments &#8211; 0–15, 15–30, 30–45, and 45–60 seconds &#8211; using a mixed-model repeated-measures ANOVA (2 task types × 4 consecutive 15-second intervals), as shown in Fig. 1 of the study report.</p>


<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h2 class="block-header">Fig. 1: Task × Time-on-Task ANOVA results for CLFT and ILFT.</h2>
                    <img decoding="async" class="mb-5 w-full" src="https://www.cortivision.com/app/uploads/2025/10/41598_2025_12558_Fig1_HTML.webp" alt="" />
                <p style="text-align: center">From: <a href="https://www.nature.com/articles/s41598-025-12558-7" data-test="subtitle" data-article-title="true" data-track="click" data-track-action="back to article" data-track-label="link" data-track-category="figure">Effects of the combined letter fluency task on frontal cortex regional and dynamic oxygenation patterns</a></p>
    </div>
</div>
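The time-windowing behind this analysis is simple to sketch. Assuming the 5 Hz sampling rate reported earlier, each 60-second trial yields four 75-sample windows whose per-window means would enter the ANOVA; the variable names and the synthetic ramp below are illustrative only, not the study's actual data.

```python
import numpy as np

fs = 5                 # Hz, sampling rate reported in the study
seg_len = 15 * fs      # samples per 15-second window

def segment_means(oxy_hb):
    """Average a 60-s oxy-Hb trace over the four consecutive 15-s
    windows (0-15, 15-30, 30-45, 45-60 s)."""
    trial = np.asarray(oxy_hb)[: 4 * seg_len]       # keep one 60-s block
    return trial.reshape(4, seg_len).mean(axis=1)   # one mean per window

# Hypothetical trace: oxy-Hb ramping up over the trial, as reported
# for the OFC and LF regions
t = np.arange(4 * seg_len) / fs
means = segment_means(0.002 * t)
```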


<p>The repeated-measures ANOVA revealed (as shown in Fig. 2 of the study report):&nbsp;</p>



<ul class="wp-block-list">
<li><strong>in the OFC region</strong>—oxy-Hb concentration was strongly time-dependent, showing a significant increase in the final 15 seconds of the task (45–60 sec) compared to the start (0–15 sec). Higher oxy-Hb concentration on the left side of the brain indicated a significant lateralization effect. No other effects or interactions reached significance.</li>
</ul>



<ul class="wp-block-list">
<li><strong>in the LF region</strong>—similar to the OFC, oxy-Hb levels were affected by time: oxy-Hb significantly increased during both the 15–30 sec and 45–60 sec intervals compared to the initial 0–15 sec segment.&nbsp;Additionally, oxy-Hb levels during the 30–45 sec interval were significantly higher than during the 0–15 sec interval. A main effect of lateralization was also found, with higher oxy-Hb levels on the left side compared to the right; no other main effects or interactions were significant.</li>
</ul>



<ul class="wp-block-list">
<li><strong>in the pre-SMA region</strong>—a significant interaction between time, task type, and brain lateralization was detected. Further contrastive analysis indicated a notable drop in oxy-Hb concentration in the right hemisphere during the 0–15 sec interval of the CLFT compared to the ILFT.</li>
</ul>


<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h2 class="block-header">Fig. 2: Brain activity (oxy-Hb changes) over time and between left and right hemispheres in different frontal regions. Panels A–D show timing and lateralization effects; panels E–F compare two types of word fluency tasks (ILFT vs. CLFT). Results are statistically corrected for multiple tests.</h2>
                    <img decoding="async" class="mb-5 w-full" src="https://www.cortivision.com/app/uploads/2025/10/Fig2.webp" alt="" />
                <p style="text-align: center">From: <a href="https://www.nature.com/articles/s41598-025-12558-7" data-test="subtitle" data-article-title="true" data-track="click" data-track-action="back to article" data-track-label="link" data-track-category="figure">Effects of the combined letter fluency task on frontal cortex regional and dynamic oxygenation patterns</a></p>
    </div>
</div>


<h2 class="wp-block-heading">Conclusion</h2>



<p>The study aimed to examine how CLFT affects brain activity by comparing it to a standard phonemic fluency task. The analysis focused on differences in oxy-Hb response patterns between the two tasks, rather than comparing them to a baseline state.</p>



<p>While assessing the clinical utility of CLFT was not the primary objective, the results suggest that it may be a valuable tool for exploring cognitive selection processes and merits further investigation.</p>



<p>This demonstrates the reliability of our devices &#8211; such as the Photon Cap &#8211; for accurate data acquisition and processing in experimental clinical studies. Ultimately, the data collected can support the ongoing development of improved methods for studying and visualizing brain activity.</p>


<ol class="wp-elements-18878f78889e3c5df6c6a616b6eb4620 wp-block-footnotes has-text-color has-stone-700-color"><li id="7346e539-bbbd-424e-a6e2-bde4a29a0cfa"><strong><em>This blog post is based on open-access publication:</em></strong><em> Krukow, P., Kopiś-Posiej, N., Rodríguez-González, V. et al. Effects of the combined letter fluency task on frontal cortex regional and dynamic oxygenation patterns. Sci Rep 15, 26468 (2025). https://doi.org/10.1038/s41598-025-12558-7</em>.  <a href="#7346e539-bbbd-424e-a6e2-bde4a29a0cfa-link" aria-label="Jump to footnote reference 1"><img src="https://s.w.org/images/core/emoji/16.0.1/72x72/21a9.png" alt="↩" class="wp-smiley" style="height: 1em; max-height: 1em;" />︎</a></li></ol>


<p></p>
<p>The post <a href="https://www.cortivision.com/frontal-cortex-activation-during-clft-a-photon-cap-c20-data-analysis/">Frontal Cortex Activation During CLFT: A Photon Cap C20 Data Analysis</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.cortivision.com/frontal-cortex-activation-during-clft-a-photon-cap-c20-data-analysis/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Pathfinder project regarding generalization in ASD</title>
		<link>https://www.cortivision.com/pathfinder-project-regarding-generalization-in-asd/</link>
					<comments>https://www.cortivision.com/pathfinder-project-regarding-generalization-in-asd/#respond</comments>
		
		<dc:creator><![CDATA[Robert]]></dc:creator>
		<pubDate>Fri, 25 Jul 2025 07:41:01 +0000</pubDate>
				<category><![CDATA[Science]]></category>
		<guid isPermaLink="false">https://www.cortivision.com/?p=6477</guid>

					<description><![CDATA[<p>Introduction How does autism spectrum disorder (ASD) shape the way we learn and generalize new information? Stacy Moppert has just wrapped up a Pathfinder-funded investigation that tackles this question head-on, pairing classic conditioning paradigms with monitoring the prefrontal cortex activity (PFC) via our Photon Cap functional-near-infrared-spectroscopy (fNIRS). Generalization is the ability to apply what you &#8230; <a href="https://www.cortivision.com/pathfinder-project-regarding-generalization-in-asd/">Continued</a></p>
<p>The post <a href="https://www.cortivision.com/pathfinder-project-regarding-generalization-in-asd/">Pathfinder project regarding generalization in ASD</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<h2 class="wp-block-heading">Introduction</h2>



<p>How does autism spectrum disorder (ASD) shape the way we learn and generalize new information? Stacy Moppert has just wrapped up a Pathfinder-funded investigation that tackles this question head-on, pairing classic conditioning paradigms with monitoring of prefrontal cortex (PFC) activity via our <a href="https://www.cortivision.com/products/photon/">Photon Cap</a> functional near-infrared spectroscopy (fNIRS) system.</p>



<p>Generalization is the ability to apply what you have learned in one context to new but related situations. It is essential for flexible behavior. Findings on learning in ASD have been mixed, with ASD linked to both benefits and deficits. Stacy Moppert aims to clarify the behavioral and neuronal signatures of learning-based generalization in relation to ASD.</p>



<h2 class="wp-block-heading">Method</h2>



<h3 class="wp-block-heading">Hypothesis</h3>



<p>Stacy Moppert hypothesized that participants with higher autistic traits, as determined by the Autism Spectrum Quotient (AQ) questionnaire, would show greater individual differences in their ability to generalize and increased prefrontal cortex activity, whereas participants with lower autistic traits would show more similar patterns of generalization and lower prefrontal cortex activity.</p>



<h3 class="wp-block-heading">Participants</h3>



<p>The research was conducted on 60 psychology students. The subjects completed the Autism Spectrum Quotient (AQ) questionnaire, which was used to assign them to high-autistic-trait and low-autistic-trait groups.</p>



<h3 class="wp-block-heading">Procedure</h3>



<p>In the first experiment, subjects were trained to learn the pairing between a prototype shape, referred to as the Sea Ghost prototype, and the image of a dog (CS+), and the unpairing of a prototype shape from another family, referred to as the Non-Sea Ghost prototype (CS-). After the initial training, subjects were given a generalization task in which they had to identify whether a shape was a Sea Ghost or not. In total, 10 shapes were used in the task: 5 distortions of the Sea Ghost and 5 distortions of the Non-Sea Ghost. Participants were then asked to identify which stimuli evoked conditioned responses in recognition trials.</p>



<p>In the second experiment, the same students were trained on the same paradigm using a spectrum of colors ranging from blue to green. The blue color (CS+) was paired with a dog picture, while the green color (CS-) was left unpaired. Afterwards, participants completed a generalization task with 14 stimuli spanning the range from the blue patch (CS+) to the green patch (CS-). Participants were then asked to identify which stimulus corresponded to the CS+ in recognition trials.</p>



<h3 class="wp-block-heading">Summary</h3>



<p>Researchers used our Photon Cap to investigate the relationship between activity in the prefrontal cortex (PFC), performance in a generalisation task, and autistic traits. The results of this study will expand our knowledge of the heterogeneous learning outcomes and atypical brain activity associated with autistic traits. Examining autistic traits in a diverse population, which mirrors the heterogeneity observed in the disorder itself, can help us better understand the neural mechanisms behind specific traits and learning.</p>



<p>The post <a href="https://www.cortivision.com/pathfinder-project-regarding-generalization-in-asd/">Pathfinder project regarding generalization in ASD</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.cortivision.com/pathfinder-project-regarding-generalization-in-asd/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Diffuse Optical Tomography (DOT) in fNIRS: How It Works and Why It Matters</title>
		<link>https://www.cortivision.com/dot-fnirs/</link>
					<comments>https://www.cortivision.com/dot-fnirs/#respond</comments>
		
		<dc:creator><![CDATA[Wojtek]]></dc:creator>
		<pubDate>Wed, 21 May 2025 13:52:00 +0000</pubDate>
				<category><![CDATA[Science]]></category>
		<guid isPermaLink="false">https://www.cortivision.com/?p=6374</guid>

					<description><![CDATA[<p>Discover how Diffuse Optical Tomography (DOT) transforms traditional fNIRS into 3D brain imaging—bringing richer spatial resolution and new research possibilities.</p>
<p>The post <a href="https://www.cortivision.com/dot-fnirs/">Diffuse Optical Tomography (DOT) in fNIRS: How It Works and Why It Matters</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></description>
										<content:encoded><![CDATA[<div class="block-wrapper">
    <p class="block-lead m:b1-bold-150 b2-bold-150">
        Discover how Diffuse Optical Tomography (DOT) transforms traditional fNIRS into 3D brain imaging—bringing richer spatial resolution and new research possibilities.    </p>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <p><strong>Diffuse Optical Tomography (DOT)</strong> is quietly revolutionizing how we approach non-invasive brain imaging. For years, researchers using functional Near-Infrared Spectroscopy (fNIRS) have been able to observe cortical activity through light-based measurements of hemodynamics. But traditional fNIRS has always faced a spatial limitation: the method collects signals from narrow, isolated channels, with each channel essentially viewing a thin column of tissue between an emitter and detector. While powerful in its simplicity and portability, this setup leaves a great deal of the &#8220;bigger picture&#8221; of brain activity unobserved.</p>
<p>DOT is changing that.</p>
<h3 data-start="953" data-end="997">The Limitation of Single-Channel Views</h3>
<p data-start="999" data-end="1454">To understand what DOT improves, it helps to first appreciate how conventional fNIRS works. A single source-detector pair tells us about hemodynamic changes occurring along a restricted path beneath the scalp—effectively capturing the activity of a small cortical volume located between those two points. Channels are spatially distinct and non-overlapping, which means that brain activity is sampled in discrete patches rather than as a continuous map.</p>
<p data-start="1456" data-end="1647">This is similar to trying to understand a city’s traffic by only observing isolated intersections—you may get local information, but you won’t see how traffic flows across the entire network.</p>
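<p>The hemodynamic quantities each of those isolated channels reports are typically derived with the modified Beer&#8211;Lambert law. The sketch below is illustrative only: the extinction coefficients are approximate values for two common wavelengths (not calibrated constants for any particular device), and the pathlength factor is a typical literature value.</p>

```python
import numpy as np

# Approximate extinction coefficients [1/(mM*cm)] for [HbO, HbR] at two
# typical fNIRS wavelengths (~760 nm and ~850 nm). Illustrative values;
# real analyses use tabulated spectra for the device's exact wavelengths.
E = np.array([[1.4866, 3.8437],   # 760 nm
              [2.5264, 1.7986]])  # 850 nm

def mbll(delta_od, distance_cm=3.0, dpf=6.0):
    """Modified Beer-Lambert law: convert optical-density changes at two
    wavelengths (rows of delta_od, shape (2, n_samples)) into HbO and HbR
    concentration changes for one source-detector channel."""
    path_cm = distance_cm * dpf               # effective photon path length
    conc = np.linalg.solve(E, delta_od) / path_cm
    return conc[0], conc[1]                   # delta-HbO, delta-HbR in mM
```

<p>With a ~3&nbsp;cm source&#8211;detector separation and a differential pathlength factor of about 6, each channel yields one HbO and one HbR time course for its thin column of tissue.</p>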
<h3 data-start="1649" data-end="1677">How DOT Breaks Through</h3>
<p data-start="1679" data-end="2251">Diffuse Optical Tomography addresses this limitation by creating <strong data-start="1744" data-end="1767">high-density arrays</strong> of overlapping fNIRS channels. The first key change involves how the optodes—light sources and detectors—are arranged. DOT moves beyond the constraints of standard EEG-based placement systems (such as 10–10 or 10–5 layouts) because these cannot provide the small inter-optode distances required for effective DOT. Instead, DOT systems use custom-designed caps—such as <em data-start="2136" data-end="2147">ninjacaps</em>—that support a range of source-detector distances, typically from about <strong data-start="2220" data-end="2232">15–17 mm</strong> up to <strong data-start="2239" data-end="2248">40 mm</strong>.</p>
<p data-start="2253" data-end="2559">Why is this range of distances important? When multiple source-detector pairs overlap spatially at different depths and angles, it becomes possible to infer the underlying distribution of optical absorption within the tissue—in other words, to reconstruct a 3D image of where brain activity is occurring.</p>
<p data-start="2561" data-end="3081">In technical terms, the process involves running <strong data-start="2610" data-end="2637">photon migration models</strong>—advanced simulations (often based on Monte Carlo methods) that describe how light scatters and is absorbed as it moves through biological tissue. By comparing actual measurements from the dense array to the predictions of these models, researchers can solve what’s known as an <strong data-start="2915" data-end="2934">inverse problem</strong>: estimating where in the brain the observed hemodynamic changes originated. The result is a volumetric, depth-resolved map of cortical activation.</p>
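<p>Conceptually, once the photon-migration model has produced a sensitivity (Jacobian) matrix, the linearized inverse problem can be attacked with standard regularized least squares. A minimal sketch, assuming a precomputed Jacobian <code>J</code> (real DOT pipelines use far more sophisticated regularization, depth weighting, and constraints):</p>

```python
import numpy as np

def reconstruct(J, y, alpha=0.01):
    """Solve the linearized DOT inverse problem y ~= J @ x with Tikhonov
    regularization: x minimizes ||J x - y||^2 + alpha * ||x||^2.

    J : (n_channels, n_voxels) sensitivity matrix from a photon-migration
        model (e.g. a Monte Carlo simulation of light transport).
    y : (n_channels,) measured optical-density changes.
    Returns x, the voxel-wise image of absorption changes.
    """
    n_voxels = J.shape[1]
    # Normal equations of the regularized least-squares problem
    return np.linalg.solve(J.T @ J + alpha * np.eye(n_voxels), J.T @ y)
```

<p>The regularization weight <code>alpha</code> trades off fidelity to the measurements against smoothness of the reconstructed image, which matters because the inverse problem is ill-posed.</p>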
    </div>
</div>

<div class="block-wrapper">
    <blockquote class="col-start-1 col-end-13 m:col-start-3 m:col-end-9">
        <div class="w- bg-theme h- mb-5"></div>
        <span class="b1-reg-150 m:h3-light-125 text-theme">
            If traditional fNIRS is like peeking through keyholes into the brain, DOT is like stepping into the room with a panoramic camera.        </span>
    </blockquote>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h3 data-start="3083" data-end="3121">From Keyholes to Panoramic Views</h3>
<p data-start="3123" data-end="3423">If traditional fNIRS is like peeking through keyholes into the brain, DOT is like stepping into the room with a panoramic camera. The spatial resolution improves dramatically because multiple overlapping measurements allow the reconstruction algorithm to sharpen and localize the observed activity.</p>
<p data-start="3425" data-end="3735">Moreover, by combining short-separation channels (which measure superficial scalp signals) with longer channels (which penetrate deeper), DOT can better isolate true cortical responses from confounding superficial artifacts. This capability is crucial for producing clean and accurate images of brain function.</p>
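<p>The short-separation idea can be illustrated with a simple regression: treat the short channel as a measurement of the superficial signal and subtract its best-fitting scaled copy from the long channel. A toy sketch (production pipelines typically use GLM-based or adaptive-filtering variants of this):</p>

```python
import numpy as np

def short_channel_regress(long_sig, short_sig):
    """Estimate the cortical component of a long-separation channel by
    regressing out the short-separation channel, which predominantly
    carries superficial scalp/skin hemodynamics.

    Returns the mean-centered residual of the long channel after
    subtracting the best-fitting scaled copy of the short channel."""
    s = short_sig - short_sig.mean()
    x = long_sig - long_sig.mean()
    beta = (s @ x) / (s @ s)      # least-squares scalp contribution
    return x - beta * s           # residual = cortical estimate
```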
<h3 data-start="3737" data-end="3780">Practical Gains and New Possibilities</h3>
<p data-start="3782" data-end="4065">What does this mean for neuroscience and clinical research? DOT enables scientists to <strong data-start="3868" data-end="3939">observe brain activity with significantly greater spatial precision</strong>—approaching the resolution of functional MRI (fMRI) in some contexts, but in a much more portable and wearable form factor.</p>
<p data-start="4067" data-end="4430">It allows researchers to go beyond detecting whether <em data-start="4120" data-end="4126">some</em> activation occurred and instead begin mapping <em data-start="4173" data-end="4206">which specific cortical regions</em> are involved in a given task. For example, studies using DOT have demonstrated its ability to resolve fine-grained retinotopic maps of the visual cortex or to track functional connectivity across different brain networks.</p>
<p data-start="4432" data-end="4706">DOT also extends the reach of neuroimaging into environments where MRI is impractical or impossible: bedside monitoring of newborns and stroke patients, cognitive testing in VR environments, and studies of real-world interactions—all become viable with wearable DOT systems.</p>
    </div>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h3>Looking ahead</h3>
<p>DOT bridges the gap between the <strong data-start="5382" data-end="5408">accessibility of fNIRS</strong> and the <strong data-start="5417" data-end="5444">spatial insight of fMRI</strong>, delivering <strong data-start="5457" data-end="5486">wearable 3D brain imaging</strong>. It promises richer neuroscientific insight, from lab experiments to clinical applications—and we’re excited to bring this to our community.</p>
<p>At Cortivision, we are closely following the exciting developments around DOT and actively exploring how this powerful method can complement and extend the capabilities of fNIRS. As the technology continues to mature, we look forward to seeing how the global research community will apply DOT to new scientific questions and real-world challenges.</p>
    </div>
</div><p>The post <a href="https://www.cortivision.com/dot-fnirs/">Diffuse Optical Tomography (DOT) in fNIRS: How It Works and Why It Matters</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.cortivision.com/dot-fnirs/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>PhotonGrav: Pioneering BCI in Space</title>
		<link>https://www.cortivision.com/photongrav-pioneering-bci-in-space/</link>
					<comments>https://www.cortivision.com/photongrav-pioneering-bci-in-space/#respond</comments>
		
		<dc:creator><![CDATA[Wojtek]]></dc:creator>
		<pubDate>Wed, 30 Apr 2025 11:12:41 +0000</pubDate>
				<category><![CDATA[Events]]></category>
		<category><![CDATA[Science]]></category>
		<guid isPermaLink="false">https://www.cortivision.com/?p=6316</guid>

					<description><![CDATA[<p>In the annals of space exploration, certain milestones redefine our understanding of human potential. Cortivision's upcoming PhotonGrav project is poised to be one such milestone: the first-ever deployment of a fully operational brain-computer interface (BCI) in the microgravity environment of space.</p>
<p>The post <a href="https://www.cortivision.com/photongrav-pioneering-bci-in-space/">PhotonGrav: Pioneering BCI in Space</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></description>
										<content:encoded><![CDATA[<div class="block-wrapper">
    <p class="block-lead m:b1-bold-150 b2-bold-150">
        In the annals of space exploration, certain milestones redefine our understanding of human potential. Cortivision&#039;s upcoming PhotonGrav project is poised to be one such milestone: the first-ever deployment of a fully operational brain-computer interface (BCI) in the microgravity environment of space.    </p>
</div>


<div style="height:50px" aria-hidden="true" class="wp-block-spacer"></div>



<div class="wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex">
<div class="wp-block-column is-layout-flow wp-block-column-is-layout-flow" style="flex-basis:100%"><div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h1 data-start="705" data-end="750"><strong data-start="709" data-end="750">The Scientific Leap: BCI Beyond Earth</strong></h1>
<p class="" data-start="752" data-end="1016">BCI systems have captured imaginations and enabled new forms of communication — primarily through EEG (electroencephalography). However, EEG is notoriously vulnerable to motion, muscle activity, and environmental interference — all amplified in space.</p>
<p class="" data-start="1018" data-end="1326"><strong data-start="1018" data-end="1032">PhotonGrav</strong> is a project fully developed by Cortivision that will shift this paradigm by deploying a functional near-infrared spectroscopy <strong>(fNIRS)-based BCI system</strong> aboard the International Space Station (ISS). Unlike EEG, fNIRS is more robust against movement and environmental noise, making it a more viable solution for microgravity conditions.</p>
<p class="" data-start="1328" data-end="1801">The experiment is designed to determine whether cognitive intent — such as performing a mental calculation or entering a state of relaxation — can be decoded from brain activity in space. The targeted brain regions, Dorsolateral Prefrontal Cortex (DLPFC) and Middle Frontal Gyrus (MFG), are key to focus and mental workload. A machine learning algorithm will interpret these signals in real time, enabling participants to control feedback with their thoughts alone.</p>
<p class="" data-start="1803" data-end="1907">In essence: astronauts will communicate with machines using their brains — not their hands — from orbit.</p>
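<p>As a rough illustration of the decoding step (a sketch, not the actual PhotonGrav algorithm), a real-time state classifier can be as simple as comparing window-level features of the HbO signal from prefrontal channels against templates learned during calibration:</p>

```python
import numpy as np

def extract_features(window_hbo):
    """Simple per-window features of an HbO time course: mean level,
    variability, and a start-to-end slope proxy."""
    return np.array([window_hbo.mean(),
                     window_hbo.std(),
                     window_hbo[-1] - window_hbo[0]])

def decode_state(window_hbo, template_task, template_rest):
    """Nearest-template decoder: label a window 'task' (e.g. mental
    arithmetic) or 'rest' (relaxation) by whichever calibration
    template its feature vector is closer to."""
    feat = extract_features(window_hbo)
    if np.linalg.norm(feat - template_task) < np.linalg.norm(feat - template_rest):
        return "task"
    return "rest"
```

<p>In practice such templates would be learned per participant from labeled calibration runs, and the decoder's output would drive the real-time feedback described above.</p>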
    </div>
</div></div>
</div>


<div class="block-wrapper">
    <blockquote class="col-start-1 col-end-13 m:col-start-3 m:col-end-9">
        <svg xmlns="http://www.w3.org/2000/svg" width="42" height="37" viewBox="0 0 42 37" fill="none" class="mb-5">
            <line x1="1.13397" y1="36.1218" x2="21.134" y2="1.48081" stroke="#7822FF" stroke-width="2"/>
            <line x1="21.134" y1="36.1216" x2="41.134" y2="1.48057" stroke="#7822FF" stroke-width="2"/>
        </svg>
        <p class="b1-reg-150 m:h3-light-125 text-theme mb-5">
            Imagine diving into a swimming pool just to sign your name at the bottom. It may seem like a simple action you&#039;d normally do with your eyes closed, but in those conditions even basic tasks become unexpectedly complex. The same applies to microgravity, where typical muscle-based communication may no longer be so straightforward.        </p>
        <p class="block-text">
            Wojciech Broniatowski        </p>
    </blockquote>
</div>


<div style="height:20px" aria-hidden="true" class="wp-block-spacer"></div>



<div class="wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex">
<div class="wp-block-column is-layout-flow wp-block-column-is-layout-flow" style="flex-basis:100%">
<div class="wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex">
<div class="wp-block-column is-layout-flow wp-block-column-is-layout-flow" style="flex-basis:100%">
<div class="wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex">
<div class="wp-block-column is-layout-flow wp-block-column-is-layout-flow" style="flex-basis:100%"><div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h2 data-start="2211" data-end="2257"><strong data-start="2215" data-end="2257">fNIRS in space</strong></h2>
<p class="" data-start="2259" data-end="2667">The <em data-start="2263" data-end="2275">PhotonGrav</em> experiment will be conducted by both Polish and Indian astronauts, highlighting the project&#8217;s international scope. Additionally, our fNIRS device will be utilized in a separate experiment combining fNIRS with virtual reality (VR) technologies, involving Hungarian and Indian astronauts. These collaborative efforts exemplify the global commitment to advancing neuroscience research in space.</p>
<h3 data-start="4631" data-end="4673">Poland in Space: The Ignis Mission</h3>
<p class="" data-start="4675" data-end="5037">PhotonGrav is also deeply significant for <strong data-start="4717" data-end="4727">Poland</strong>, as part of a broader scientific return to space. The <strong data-start="4782" data-end="4801">“Ignis” mission</strong> represents Poland’s first crewed mission in cooperation with Axiom Space and will carry only the second Polish astronaut ever to reach space. It marks a symbolic and strategic moment for the country, placing Polish science and technology at the heart of human spaceflight for the first time in decades.</p>
<p class="" data-start="5039" data-end="5200">The inclusion of the PhotonGrav BCI experiment in Ignis is a powerful statement: Poland is not just participating in space exploration — it’s shaping its future.</p>
    </div>
</div></div>
</div>
</div>
</div>
</div>
</div>



<div class="wp-block-media-text is-stacked-on-mobile" style="grid-template-columns:32% auto"><figure class="wp-block-media-text__media"><img fetchpriority="high" decoding="async" width="851" height="851" src="https://www.cortivision.com/app/uploads/2025/04/PhotonGrav-Patch.png" alt="" class="wp-image-6318 size-full" srcset="https://www.cortivision.com/app/uploads/2025/04/PhotonGrav-Patch.png 851w, https://www.cortivision.com/app/uploads/2025/04/PhotonGrav-Patch-300x300.png 300w, https://www.cortivision.com/app/uploads/2025/04/PhotonGrav-Patch-150x150.png 150w, https://www.cortivision.com/app/uploads/2025/04/PhotonGrav-Patch-768x768.png 768w" sizes="(max-width: 851px) 100vw, 851px" /></figure><div class="wp-block-media-text__content">
<p><em>Patch of the experiment PhotonGrav &#8211; Thoughts over gravity; a test of using fNIRS-based Brain-Computer Interface in LEO conditions.</em></p>



<p><em>Author: J. Marzoch</em></p>
</div></div>



<div style="height:25px" aria-hidden="true" class="wp-block-spacer"></div>



<div class="wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex">
<div class="wp-block-column is-layout-flow wp-block-column-is-layout-flow" style="flex-basis:100%"><div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h1 data-start="485" data-end="522"><strong data-start="489" data-end="522">Cortivision&#8217;s Legacy in Space</strong></h1>
<p class="" data-start="211" data-end="527">Cortivision’s involvement in space neuroscience is not only pioneering — it is now also consistent. After making history during <strong data-start="339" data-end="365">Axiom Mission 2 (Ax-2)</strong> in May 2023 by delivering the first-ever functional fNIRS system to the International Space Station (ISS), we have continued to strengthen our presence in orbit.</p>
<p class="" data-start="529" data-end="876">The Ax-2 mission marked the first time human brain activity was recorded in microgravity using <strong data-start="624" data-end="673">functional near-infrared spectroscopy (fNIRS)</strong>. This achievement proved not only the resilience of our technology but also Cortivision’s ability to deliver high-quality, high-integrity neuroimaging in one of the most extreme environments imaginable. <a href="https://shop.elsevier.com/books/neuroscience-research-in-short-duration-human-spaceflight/shirah/978-0-443-33918-9" target="_blank" rel="noopener">See the publication by Bader Shirah, PhD.</a></p>
<p class="" data-start="878" data-end="1391">Our momentum continued with <strong data-start="906" data-end="932">Axiom Mission 3 (Ax-3)</strong> in January 2024. During this mission, Swedish astronaut Marcus Wandt used Cortivision’s fNIRS system as part of a cognitive monitoring experiment. The data collected contributed to understanding mental workload, stress regulation, and the brain’s adaptation to space conditions. The success of this mission further validated the robustness of Cortivision’s platform, reinforcing its value in both basic research and applied cognitive monitoring in space.</p>
    </div>
</div></div>
</div>



<div class="wp-block-media-text is-stacked-on-mobile"><figure class="wp-block-media-text__media"><img decoding="async" width="1024" height="683" src="https://www.cortivision.com/app/uploads/2024/02/ax03e005066_alt-1024x683.jpg" alt="" class="wp-image-5570 size-full"/></figure><div class="wp-block-media-text__content">
<p><em>Swedish astronaut Marcus Wandt using Cortivision’s fNIRS in the Orbital Architecture Activity, from KTH Royal Institute of Technology, Ax-3 mission on the ISS, January 2024</em></p>
</div></div>



<div class="wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex">
<div class="wp-block-column is-layout-flow wp-block-column-is-layout-flow" style="flex-basis:100%"><div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <p data-start="485" data-end="522">With two successful missions, <strong>Ax-2</strong> and <strong>Ax-3</strong>, Cortivision is now recognized as a trusted provider of neurotechnology solutions for space research. We partner with Axiom Space, and our growing heritage in orbit is a testament to our rigorous engineering, scientific partnerships, and readiness for future challenges. Whether in collaboration with national agencies, commercial partners, or interdisciplinary academic teams, we bring not only technology but also experience and reliability to space science projects.</p>
    </div>
</div></div>
</div>


<div class="block-wrapper">
    <blockquote class="col-start-1 col-end-13 m:col-start-3 m:col-end-9">
        <div class="w- bg-theme h- mb-5"></div>
        <span class="b1-reg-150 m:h3-light-125 text-theme">
            Now, with the PhotonGrav project aboard Axiom Mission 4, Cortivision is preparing to take the next bold step: enabling the first fully functional brain-computer interface (BCI) experiment in space.        </span>
    </blockquote>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h3 class="" data-start="3955" data-end="3999"><strong data-start="3959" data-end="3999">Mission Ax-4: Launching June 8, 2025</strong></h3>
<p class="" data-start="4001" data-end="4203">Axiom Mission 4 (Ax-4) is led by <strong data-start="4034" data-end="4049">Axiom Space</strong> in cooperation with <strong data-start="4070" data-end="4078">NASA</strong>. It features a multinational crew of astronauts conducting research in health, life sciences, materials science, and beyond.</p>
<p class="" data-start="4205" data-end="4534">Cortivision’s PhotonGrav experiment is scheduled to be presented <strong data-start="4270" data-end="4303">live during the rocket launch</strong> — a rare public glimpse into the frontier of brain research in space. As the rocket climbs toward orbit, it will carry with it not only four astronauts but also the hopes of pushing neural science to the edge of known possibility.</p>
<blockquote data-start="4536" data-end="4624">
<p class="" data-start="4538" data-end="4624">For more about Ax-4: <a class="" href="https://www.axiomspace.com/missions/ax4" target="_new" rel="noopener" data-start="4559" data-end="4624">Axiom Mission Overview</a></p>
</blockquote>
<h3 class="" data-start="5207" data-end="5260"><strong data-start="5211" data-end="5260">Cross-Industry Potential: From Orbit to Earth</strong></h3>
<p class="" data-start="5262" data-end="5340">While PhotonGrav focuses on space, its implications extend far beyond the ISS.</p>
<p class="" data-start="5342" data-end="5733">The same Cortivision systems used in orbit can also be applied to fields on Earth — including <strong data-start="5436" data-end="5543">defense, aviation, medicine, elite performance, rehabilitation, and human–machine interface development</strong>. The ability to read and interpret brain signals in real-time — especially under stress or isolation — has applications ranging from fighter pilot monitoring to surgical support and beyond.</p>
<p class="" data-start="5735" data-end="5845">We welcome partnerships across <strong data-start="5766" data-end="5819">aerospace, neurotechnology, human factors, and AI</strong> — on Earth and beyond it.</p>
    </div>
</div><p>The post <a href="https://www.cortivision.com/photongrav-pioneering-bci-in-space/">PhotonGrav: Pioneering BCI in Space</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.cortivision.com/photongrav-pioneering-bci-in-space/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>fNIRS Brain Holter Project</title>
		<link>https://www.cortivision.com/projekt-fnirs-brain-holter/</link>
					<comments>https://www.cortivision.com/projekt-fnirs-brain-holter/#respond</comments>
		
		<dc:creator><![CDATA[Wojtek]]></dc:creator>
		<pubDate>Tue, 11 Mar 2025 12:02:00 +0000</pubDate>
				<category><![CDATA[Public Projects EU]]></category>
		<guid isPermaLink="false">https://www.cortivision.com/?p=6488</guid>

					<description><![CDATA[<p>We are pleased to announce that Cortivision Sp. z o.o. is carrying out the project “BrainHolter – An innovative autonomous fNIRS system for long-term monitoring of brain activity” with the support of European Union funds.</p>
<p>The post <a href="https://www.cortivision.com/projekt-fnirs-brain-holter/">fNIRS Brain Holter Project</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></description>
										<content:encoded><![CDATA[<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h2 class="block-header"></h2>
                    <img decoding="async" class="mb-5 w-full" src="https://www.cortivision.com/app/uploads/2025/03/FEL_logotyp_kolor_poziom-scaled.jpg" alt="" />
                    </div>
</div>

<div class="block-wrapper">
    <p class="block-lead m:b1-bold-150 b2-bold-150">
        We are pleased to announce that Cortivision Sp. z o.o. is carrying out the project “BrainHolter – An innovative autonomous fNIRS system for long-term monitoring of brain activity” with the support of European Union funds.    </p>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h3 data-start="39" data-end="56">Project description</h3>
<p data-start="58" data-end="550">Cortivision is carrying out an ambitious research and development project called <strong data-start="125" data-end="140">BrainHolter</strong>, whose goal is to develop an autonomous system for long-term monitoring of brain activity using functional near-infrared spectroscopy (fNIRS). The project focuses on <strong data-start="352" data-end="379">industrial research</strong> aimed at developing automated processing of fNIRS data, and on <strong data-start="458" data-end="481">development work</strong> aimed at creating a fully autonomous fNIRS system.</p>
<p data-start="552" data-end="916">What makes the BrainHolter project groundbreaking is that it introduces fNIRS as a new method of continuous brain monitoring, analogous to how EEG, EMG, and ECG are used today. The system will operate independently, without constant researcher supervision, offering automated signal analysis and eliminating the need for manual data processing.</p>
<hr data-start="918" data-end="921" />
<h3 data-start="923" data-end="941">Target group</h3>
<p data-start="943" data-end="1155">The BrainHolter system is intended for <strong data-start="984" data-end="1048">scientific and research institutions, research centres, and companies</strong>, including those operating in the Lublin Voivodeship. The project’s target users work in areas such as:</p>
<ul data-start="1157" data-end="1284">
<li data-start="1157" data-end="1202">
<p data-start="1159" data-end="1202">Scientific research with clinical applications</p>
</li>
<li data-start="1203" data-end="1241">
<p data-start="1205" data-end="1241">Monitoring of athletic performance</p>
</li>
<li data-start="1242" data-end="1284">
<p data-start="1244" data-end="1284">Applications in extreme conditions</p>
</li>
</ul>
<hr data-start="1286" data-end="1289" />
<h3 data-start="1291" data-end="1313">Key results</h3>
<p data-start="1315" data-end="1640" data-is-last-node="" data-is-only-node="">The main result of the project will be an <strong data-start="1362" data-end="1414">automated signal-processing system</strong> tailored to long-term fNIRS monitoring. This innovation is key to creating the fNIRS counterpart of an “EEG Holter”, enabling users to track changes in brain activity over extended periods.</p>
    </div>
</div>

<div class="block-wrapper">
    <blockquote class="col-start-1 col-end-13 m:col-start-3 m:col-end-9">
        <div class="w- bg-theme h- mb-5"></div>
        <span class="b1-reg-150 m:h3-light-125 text-theme">
            The BrainHolter system will be unique on a global scale, as no currently available fNIRS solution enables uninterrupted recording of the brain’s hemodynamic activity for at least 12 hours.
        </span>
    </blockquote>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h3 data-start="46" data-end="75">Project stages</h3>
<p data-start="77" data-end="139">The <strong data-start="85" data-end="100">BrainHolter</strong> project is divided into two main phases:</p>
<h4 data-start="141" data-end="169">1. Industrial research:</h4>
<ul data-start="170" data-end="1168">
<li data-start="170" data-end="380">
<p data-start="172" data-end="380"><strong data-start="172" data-end="183">Stage 1:</strong> Development of multiple automated signal-processing algorithms using modern machine-learning techniques, to determine the most effective approach to data analysis.</p>
</li>
<li data-start="381" data-end="575">
<p data-start="383" data-end="575"><strong data-start="383" data-end="394">Stage 2:</strong> Creation of a device prototype capable of long-term recording of brain activity, needed to acquire data for testing the automated processing algorithms.</p>
</li>
<li data-start="576" data-end="791">
<p data-start="578" data-end="791"><strong data-start="578" data-end="589">Stage 3:</strong> Development of a dedicated fNIRS headband that is comfortable during long-term monitoring and quick to put on, based on ergonomic studies with testers and operators.</p>
</li>
<li data-start="792" data-end="902">
<p data-start="794" data-end="902"><strong data-start="794" data-end="805">Stage 4:</strong> Laboratory testing of the developed device prototype to verify the operation of the algorithms.</p>
</li>
<li data-start="903" data-end="1023">
<p data-start="905" data-end="1023"><strong data-start="905" data-end="916">Stage 5:</strong> Creation of a researcher interface and a cloud service for collecting, storing, and processing data.</p>
</li>
<li data-start="1024" data-end="1168">
<p data-start="1026" data-end="1168"><strong data-start="1026" data-end="1037">Stage 6:</strong> Development of the first version of a mobile application for monitoring brain activity during long-term recordings.</p>
</li>
</ul>
<h4 data-start="1170" data-end="1194">2. Development work:</h4>
<ul data-start="1195" data-end="1494">
<li data-start="1195" data-end="1361">
<p data-start="1197" data-end="1361"><strong data-start="1197" data-end="1208">Stage 7:</strong> Integration of all components into the final prototype, and refinement of the hardware and software based on the results of the earlier research stages.</p>
</li>
<li data-start="1362" data-end="1494">
<p data-start="1364" data-end="1494"><strong data-start="1364" data-end="1375">Stage 8:</strong> Validation of the complete system under near-real-world conditions to assess its effectiveness and usability.</p>
</li>
</ul>
<hr data-start="1496" data-end="1499" />
<h3 data-start="1501" data-end="1528">Finansowanie i wsparcie</h3>
<p data-start="1530" data-end="1738">Projekt <strong data-start="1538" data-end="1553">BrainHolter</strong> jest współfinansowany ze środków Unii Europejskiej w ramach programu <strong data-start="1623" data-end="1673">Fundusze Europejskie dla Lubelskiego 2021–2027</strong>, <strong data-start="1675" data-end="1735">Działanie 1.3 – Badania i innowacje w przedsiębiorstwach</strong>.</p>
<ul data-start="1739" data-end="1886">
<li data-start="1739" data-end="1791">
<p data-start="1741" data-end="1791"><strong data-start="1741" data-end="1772">Całkowita wartość projektu:</strong> 2 729 014,06 PLN</p>
</li>
<li data-start="1792" data-end="1837">
<p data-start="1794" data-end="1837"><strong data-start="1794" data-end="1818">Dofinansowanie z UE:</strong> 1 982 615,45 PLN</p>
</li>
<li data-start="1838" data-end="1886">
<p data-start="1840" data-end="1886"><strong data-start="1840" data-end="1856">Numer umowy:</strong> FELU.01.03-IP.01-0107/24-00</p>
</li>
</ul>
<p data-start="1888" data-end="1932" data-is-last-node="" data-is-only-node="">Projekt realizowany jest w Lublinie.</p>
    </div>
</div>

<div class="block-wrapper">
    <blockquote class="col-start-1 col-end-13 m:col-start-3 m:col-end-9">
        <svg xmlns="http://www.w3.org/2000/svg" width="42" height="37" viewBox="0 0 42 37" fill="none" class="mb-5">
            <line x1="1.13397" y1="36.1218" x2="21.134" y2="1.48081" stroke="#7822FF" stroke-width="2"/>
            <line x1="21.134" y1="36.1216" x2="41.134" y2="1.48057" stroke="#7822FF" stroke-width="2"/>
        </svg>
        <p class="b1-reg-150 m:h3-light-125 text-theme mb-5">
            BrainHolter represents a significant step forward in neurotechnology, paving the way for non-invasive, long-term brain monitoring solutions. With its autonomous operation and automated data processing, the system has the potential to revolutionize how brain activity is studied, supporting applications in neuroscience, medicine, and human performance research.
        </p>
        <p class="block-text">
            Wojciech Broniatowski, CEO of Cortivision        </p>
    </blockquote>
</div>

<div class="block-wrapper">
    <div class="h- bg-grey-300 col-start-1 col-end-13 m:col-start-3 m:col-end-11"></div>
</div>


<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
            </div>
</div><p>The post <a href="https://www.cortivision.com/projekt-fnirs-brain-holter/">Projekt fNIRS Brain Holter</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.cortivision.com/projekt-fnirs-brain-holter/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>fNIRS Brain Holter project</title>
		<link>https://www.cortivision.com/brain-holter/</link>
					<comments>https://www.cortivision.com/brain-holter/#respond</comments>
		
		<dc:creator><![CDATA[Wojtek]]></dc:creator>
		<pubDate>Tue, 11 Mar 2025 08:12:41 +0000</pubDate>
				<category><![CDATA[Public Projects EU]]></category>
		<guid isPermaLink="false">https://www.cortivision.com/?p=6273</guid>

					<description><![CDATA[<p>We kindly inform you that Cortivision Sp. z o.o. is conducting a project titled "BrainHolter – An Innovative Autonomous fNIRS System for Long-Term Brain Activity Monitoring," with the support of European Union funds.</p>
<p>The post <a href="https://www.cortivision.com/brain-holter/">fNIRS Brain Holter project</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></description>
										<content:encoded><![CDATA[<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h2 class="block-header"></h2>
                    </div>
</div>

<div class="block-wrapper">
    <p class="block-lead m:b1-bold-150 b2-bold-150">
        We kindly inform you that Cortivision Sp. z o.o. is conducting a project titled &quot;BrainHolter – An Innovative Autonomous fNIRS System for Long-Term Brain Activity Monitoring,&quot; with the support of European Union funds.    </p>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h2>Project overview</h2>
<p>Cortivision is undertaking an ambitious R&amp;D initiative titled <strong>BrainHolter</strong>, aimed at developing an autonomous functional near-infrared spectroscopy (fNIRS) system for long-term brain activity monitoring. The project focuses on industrial research to advance automated fNIRS data processing and experimental development to create a fully autonomous fNIRS system.</p>
<p>The breakthrough innovation of BrainHolter lies in introducing fNIRS as a new modality for continuous brain monitoring, similar to how EEG, EMG, and ECG are used today. The system is designed to function independently—without constant supervision from researchers—and provide automated signal analysis, eliminating the need for manual data processing.</p>
<h3 data-pm-slice="1 3 []">Target Group</h3>
<p>The BrainHolter system is designed for research organizations, institutes, and businesses, including those from the Lublin region. The target users operate in fields such as:</p>
<ul data-spread="false">
<li>Scientific research in clinical applications</li>
<li>Sports performance monitoring</li>
<li>Extreme environmental applications</li>
</ul>
<h2>Key Outcomes</h2>
<p>The core outcome of the project is the development of an automated signal processing framework tailored for long-term fNIRS monitoring. This innovation is crucial for the creation of an &#8220;EEG Holter equivalent&#8221; for fNIRS, offering users insight into changes in brain activity over extended periods.</p>
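<p>As a purely illustrative sketch (the post does not describe Cortivision's actual algorithms): one standard building block of automated fNIRS processing is the modified Beer-Lambert law, which converts optical-density changes measured at two wavelengths into oxy- and deoxy-hemoglobin concentration changes. All coefficients and parameter values below are hypothetical placeholders.</p>

```python
import numpy as np

# Sketch of the modified Beer-Lambert law (MBLL) step common to fNIRS
# pipelines. The extinction coefficients, source-detector separation,
# and DPF values are placeholders, not calibrated device parameters.

def mbll(delta_od, ext_coeffs, distance_cm, dpf):
    """Convert optical-density changes at two wavelengths into
    concentration changes of oxy- (HbO) and deoxy-hemoglobin (HbR).

    delta_od    : (2, n_samples) array, one row per wavelength
    ext_coeffs  : 2x2 matrix [[eps_HbO(l1), eps_HbR(l1)],
                              [eps_HbO(l2), eps_HbR(l2)]]
    distance_cm : source-detector separation
    dpf         : differential pathlength factor per wavelength, (2,)
    """
    # Divide out the effective optical path length, then solve the
    # 2x2 linear system for [dHbO; dHbR] at every time sample.
    scaled = np.asarray(delta_od) / (distance_cm * np.asarray(dpf)[:, None])
    return np.linalg.solve(np.asarray(ext_coeffs), scaled)

# Placeholder example: two wavelengths, one time sample.
ext = [[0.2, 1.8], [1.0, 0.8]]    # hypothetical extinction matrix
od = np.array([[0.01], [0.02]])   # delta-OD per wavelength
conc = mbll(od, ext, distance_cm=3.0, dpf=[6.0, 6.0])
```

<p>A long-term autonomous pipeline would apply a step like this continuously to streamed channels, together with motion-artifact rejection and filtering; the specific processing chain developed in BrainHolter is not detailed in the post.</p>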
    </div>
</div>

<div class="block-wrapper">
    <blockquote class="col-start-1 col-end-13 m:col-start-3 m:col-end-9">
        <div class="w- bg-theme h- mb-5"></div>
        <span class="b1-reg-150 m:h3-light-125 text-theme">
            The BrainHolter system will be a globally unique product, as no current fNIRS solutions provide uninterrupted hemodynamic brain activity recording for at least 12 hours.        </span>
    </blockquote>
</div>

<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
        <h2>Project Stages</h2>
<p>The project is divided into two main phases:</p>
<p>1. Industrial Research:</p>
<ul>
<li>Stage 1: Development of multiple automated signal processing algorithms using state-of-the-art machine learning techniques to determine the most effective data processing approach.</li>
<li>Stage 2: Creation of a device prototype capable of long-term brain activity recording, essential for collecting data to test automated processing algorithms.</li>
<li>Stage 3: Development of a special fNIRS cap ensuring comfort during prolonged monitoring and rapid application, based on ergonomic research involving test subjects and operators.</li>
<li>Stage 4: Laboratory testing of the developed device prototype to validate the performance of the algorithms.</li>
<li>Stage 5: Development of the researcher interface and cloud-based service for data collection, storage, and processing.</li>
<li>Stage 6: Creation of the first version of a mobile application for monitoring brain activity during long-term recordings.</li>
</ul>
<p>2. Experimental Development:</p>
<ul>
<li>Stage 7: Integration of all components into a final prototype, refining both hardware and software based on findings from the research phase.</li>
<li>Stage 8: Validation of the complete system under real-world conditions to ensure effectiveness and usability.</li>
</ul>
<h2>Funding and Support</h2>
<p>The BrainHolter project is co-financed by the European Union under the European Funds for Lublin 2021-2027 program, Action 1.3 &#8211; Research and Innovation in the Business Sector.</p>
<ul>
<li>Total project value: PLN 2,729,014.06</li>
<li>EU funding: PLN 1,982,615.45</li>
<li>Contract number: FELU.01.03-IP.01-0107/24-00</li>
</ul>
<p>The project is carried out in Lublin, Poland.</p>
    </div>
</div>

<div class="block-wrapper">
    <blockquote class="col-start-1 col-end-13 m:col-start-3 m:col-end-9">
        <svg xmlns="http://www.w3.org/2000/svg" width="42" height="37" viewBox="0 0 42 37" fill="none" class="mb-5">
            <line x1="1.13397" y1="36.1218" x2="21.134" y2="1.48081" stroke="#7822FF" stroke-width="2"/>
            <line x1="21.134" y1="36.1216" x2="41.134" y2="1.48057" stroke="#7822FF" stroke-width="2"/>
        </svg>
        <p class="b1-reg-150 m:h3-light-125 text-theme mb-5">
            BrainHolter represents a significant advancement in neurotechnology, paving the way for non-invasive, long-term brain monitoring solutions. With its autonomous functionality and automated data processing, it has the potential to revolutionize how we study brain activity, supporting applications in neuroscience, medicine, and human performance research.        </p>
        <p class="block-text">
            Wojciech Broniatowski, CEO of Cortivision        </p>
    </blockquote>
</div>

<div class="block-wrapper">
    <div class="h- bg-grey-300 col-start-1 col-end-13 m:col-start-3 m:col-end-11"></div>
</div>


<div class="block-wrapper">
    <div class="col-start-1 col-end-13 m:col-start-3 m:col-end-11 block-textarea">
            </div>
</div><p>The post <a href="https://www.cortivision.com/brain-holter/">fNIRS Brain Holter project</a> appeared first on <a href="https://www.cortivision.com">Cortivision</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.cortivision.com/brain-holter/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
