Publication: Canonical maximization of coherence: a novel tool for investigation of neuronal interactions between two datasets
Date
Authors
Director
Publisher
Project identifier
Abstract
Synchronization between oscillatory signals is considered to be one of the main mechanisms through which neuronal populations interact with each other. It is conventionally studied with mass-bivariate measures utilizing either sensor-to-sensor or voxel-to-voxel signals. However, none of these approaches aims at maximizing synchronization, especially when two multichannel datasets are present. Examples include cortico-muscular coherence (CMC), cortico-subcortical interactions or hyperscanning (where electroencephalographic (EEG)/magnetoencephalographic (MEG) activity is recorded simultaneously from two or more subjects). For all of these cases, a method that finds two spatial projections maximizing the strength of synchronization would be desirable. Here we present such a method for the maximization of coherence between two sets of EEG/MEG/EMG (electromyographic)/LFP (local field potential) recordings. We refer to it as canonical Coherence (caCOH). caCOH maximizes the absolute value of the coherence between the two multivariate spaces in the frequency domain. This allows very fast optimization for many frequency bins. Apart from presenting details of the caCOH algorithm, we test its efficacy with simulations using realistic head modelling and focus on the application of caCOH to the detection of cortico-muscular coherence. For this, we use diverse multichannel EEG and EMG recordings and demonstrate the ability of caCOH to extract complex patterns of CMC distributed across spatial and frequency domains. Finally, we indicate other scenarios where caCOH can be used for the extraction of neuronal interactions.
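As a rough illustration of the objective stated in the abstract (not the authors' published caCOH implementation), the sketch below assumes that complex cross-spectral matrices Cxx, Cyy and Cxy have already been estimated at a single frequency bin, and searches for real-valued spatial filters a and b that maximize the magnitude of coherence between the projected signals. The function name cacoh_sketch, the phase-grid strategy and the helper inv_sqrt are illustrative choices made here for clarity, not details taken from the paper.

    # Illustrative sketch of the caCOH objective at one frequency bin:
    # find real spatial filters a, b maximizing |coherence| between
    # a^T x(t) and b^T y(t), given cross-spectral matrices of the two
    # multichannel datasets. Not the authors' published algorithm.
    import numpy as np

    def inv_sqrt(A):
        # Inverse matrix square root of a symmetric positive definite matrix.
        vals, vecs = np.linalg.eigh(A)
        return vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T

    def cacoh_sketch(Cxx, Cyy, Cxy, n_phase=180):
        # Cxx: (n_x, n_x) complex cross-spectrum of dataset X
        # Cyy: (n_y, n_y) complex cross-spectrum of dataset Y
        # Cxy: (n_x, n_y) complex cross-spectrum between X and Y
        # For real filters only the real parts of Cxx and Cyy contribute
        # to the projected power, so whitening uses Re(Cxx) and Re(Cyy).
        Wx = inv_sqrt(np.real(Cxx))
        Wy = inv_sqrt(np.real(Cyy))

        best_coh, best_a, best_b = -np.inf, None, None
        for phi in np.linspace(0.0, np.pi, n_phase, endpoint=False):
            # Rotating the cross-spectrum by phi and taking the real part
            # turns the complex numerator into a real bilinear form; the
            # maximum over phi of the top singular value approximates the
            # maximal absolute coherence achievable with real filters.
            M = Wx @ np.real(np.exp(1j * phi) * Cxy) @ Wy
            U, s, Vt = np.linalg.svd(M)
            if s[0] > best_coh:
                best_coh = s[0]
                best_a = Wx @ U[:, 0]   # spatial filter for dataset X
                best_b = Wy @ Vt[0, :]  # spatial filter for dataset Y
        return best_coh, best_a, best_b

In practice such a computation would be repeated independently for each frequency bin of interest, which is consistent with the abstract's remark that the optimization can be run quickly over many frequency bins.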
Description
Keywords
Department
Faculty/School
Degree
Doctorate program
Citation
Rights
©2019 Published by Elsevier Inc. This manuscript version is made available under the CC-BY-NC-ND 4.0 license.
Documents in Academica-e are protected by copyright, with all rights reserved, unless otherwise indicated.