Show simple item record

dc.creator	Vidaurre, Carmen	es_ES
dc.creator	Nolte, Guido	es_ES
dc.creator	Vries, I. E. J. de	es_ES
dc.creator	Gómez Fernández, Marisol	es_ES
dc.creator	Boonstra, Tjeerd W.	es_ES
dc.creator	Müller, Klaus Robert	es_ES
dc.creator	Villringer, Arno	es_ES
dc.creator	Nikulin, Vadim V.	es_ES
dc.description.abstract	Synchronization between oscillatory signals is considered to be one of the main mechanisms through which neuronal populations interact with each other. It is conventionally studied with mass-bivariate measures utilizing either sensor-to-sensor or voxel-to-voxel signals. However, none of these approaches aims at maximizing synchronization, especially when two multichannel datasets are present. Examples include cortico-muscular coherence (CMC), cortico-subcortical interactions or hyperscanning (where electroencephalographic (EEG)/magnetoencephalographic (MEG) activity is recorded simultaneously from two or more subjects). For all of these cases, a method which could find two spatial projections maximizing the strength of synchronization would be desirable. Here we present such a method for the maximization of coherence between two sets of EEG/MEG/EMG (electromyographic)/LFP (local field potential) recordings. We refer to it as canonical Coherence (caCOH). caCOH maximizes the absolute value of the coherence between the two multivariate spaces in the frequency domain. This allows very fast optimization for many frequency bins. Apart from presenting details of the caCOH algorithm, we test its efficacy with simulations using realistic head modelling and focus on the application of caCOH to the detection of cortico-muscular coherence. For this, we used diverse multichannel EEG and EMG recordings and demonstrate the ability of caCOH to extract complex patterns of CMC distributed across spatial and frequency domains. Finally, we indicate other scenarios where caCOH can be used for the extraction of neuronal interactions.	en
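The abstract describes finding two spatial projections that maximize the absolute coherence between two multichannel recordings at each frequency bin. As an illustration only, a minimal sketch of that idea is a complex-valued CCA on cross-spectral matrices, solved by whitening and an SVD. This is a hypothetical simplification for intuition, not the published caCOH algorithm (which, among other details, constrains the spatial filters to be real-valued); the function name and `reg` parameter are illustrative.

```python
import numpy as np

def max_coherence_sketch(Sxx, Syy, Sxy, reg=1e-9):
    """Illustrative sketch (NOT the published caCOH algorithm):
    find projections w, v maximizing
        |w^H Sxy v| / sqrt((w^H Sxx w)(v^H Syy v))
    at a single frequency bin.
    Sxx, Syy: Hermitian (cross-)spectral matrices of each dataset;
    Sxy: cross-spectral matrix between the two datasets;
    reg: small ridge term (hypothetical) for numerical stability."""
    def inv_sqrt(S):
        # S^{-1/2} via eigendecomposition of a Hermitian matrix
        d, U = np.linalg.eigh(S)
        return U @ np.diag(1.0 / np.sqrt(d + reg)) @ U.conj().T

    Wx, Wy = inv_sqrt(Sxx), inv_sqrt(Syy)
    # SVD of the whitened cross-spectrum: the top singular value
    # equals the maximal absolute coherence in this formulation
    U, s, Vh = np.linalg.svd(Wx @ Sxy @ Wy)
    w = Wx @ U[:, 0]          # spatial filter for dataset 1 (e.g. EEG)
    v = Wy @ Vh[0].conj()     # spatial filter for dataset 2 (e.g. EMG)
    return w, v, s[0]
```

Because the whitening reduces the problem to a single SVD per frequency bin, this formulation can be evaluated quickly across many bins, which matches the fast per-bin optimization the abstract mentions.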
dc.description.sponsorship	C.V. was supported by the Spanish Ministry of Economy with Grant RyC 2014-15671. G.N. was partially funded by the German Research Foundation (DFG, SFB936 Z3 and TRR169, B4). The work of K.-R.M. was supported by the German Ministry for Education and Research (BMBF) under Grants 01IS14013A-E, 01GQ1115 and 01GQ0850; the German Research Foundation (DFG) under Grant Math+, EXC 2046/1, Project ID 390685689; and by the Institute for Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (No. 2017-0-00451, No. 2017-0-01779). T.W.B. was supported by a Future Fellowship from the Australian Research Council (FT180100622). V.V.N. was partially supported by the Center for Bioelectric Interfaces NRU HSE, RF Government grant, ag. No. 14.641.31.0003. The authors thank Katherina von Carlowitz-Ghori for her support with rCMC code and results.	en
dc.format.extent	30 p.
dc.relation.ispartof	NeuroImage 201 (2019) 116009	en
dc.rights	© 2019 Published by Elsevier Inc. This manuscript version is made available under the CC-BY-NC-ND 4.0 license.	en
dc.subject	Coherence optimization	en
dc.subject	Multivariate methods	en
dc.subject	Multimodal methods	en
dc.subject	Cortico-muscular coherence (CMC)	en
dc.subject	Electroencephalography (EEG)	en
dc.subject	Electromyography (EMG)	en
dc.subject	High density electromyography (HDsEMG)	en
dc.subject	Magnetoencephalography (MEG)	en
dc.subject	Local field potentials (LFP)	en
dc.title	Canonical maximization of coherence: a novel tool for investigation of neuronal interactions between two datasets	en
dc.type	Artículo / Artikulua	es
dc.contributor.department	Universidad Pública de Navarra. Departamento de Estadística, Informática y Matemáticas	es_ES
dc.contributor.department	Nafarroako Unibertsitate Publikoa. Estatistika, Informatika eta Matematika Saila	eu
dc.rights.accessRights	Acceso abierto / Sarbide irekia	es
dc.type.version	Versión aceptada / Onetsi den bertsioa	es
