EEG negativity in fixations used for gaze-based control: Toward converting intentions into actions with an eye-brain-computer interface. SL Shishkin, YO Nuzhdin, EP Svirin, AG Trofimov, AA Fedorova, et al. Frontiers in Neuroscience 10, 528, 2016. Cited by: 76.
A passive BCI for monitoring the intentionality of the gaze-based moving object selection. DG Zhao, AN Vasilyev, BL Kozyrskiy, EV Melnichuk, AV Isachenko, et al. Journal of Neural Engineering 18 (2), 026001, 2021. Cited by: 11.
A greedy feature selection algorithm for brain-computer interface classification committees. AG Trofimov, SL Shishkin, BL Kozyrskiy, BM Velichkovsky. Procedia Computer Science 123, 488-493, 2018. Cited by: 10.
Improving eye-brain-computer interface performance by using electroencephalogram frequency components. S Shishkin, BL Kozyrskiy, AG Trofimov, YO Nuzhdin, AA Fedorova, et al. Bulletin of Russian State Medical University, 36-41, 2016. Cited by: 10.
MEG-based detection of voluntary eye fixations used to control a computer. AO Ovchinnikova, AN Vasilyev, IP Zubarev, BL Kozyrskiy, SL Shishkin. Frontiers in Neuroscience 15, 619591, 2021. Cited by: 9.
Passive detection of feedback expectation: Towards fluent hybrid eye-brain-computer interfaces. YO Nuzhdin, SL Shishkin, AA Fedorova, BL Kozyrskiy, AA Medyntsev, et al. GBCIC, 2017. Cited by: 9.
The expectation based eye-brain-computer interface: an attempt of online test. YO Nuzhdin, SL Shishkin, AA Fedorova, AG Trofimov, EP Svirin, et al. Proceedings of the 2017 ACM Workshop on An Application-oriented Approach to …, 2017. Cited by: 8.
Locally Smoothed Gaussian Process Regression. D Gogolashvili, B Kozyrskiy, M Filippone. Procedia Computer Science 207, 2717-2726, 2022. Cited by: 7.
Classification of the gaze fixations in the eye-brain-computer interface paradigm with a compact convolutional neural network. BL Kozyrskiy, AO Ovchinnikova, AD Moskalenko, BM Velichkovsky, et al. Procedia Computer Science 145, 293-299, 2018. Cited by: 5.
EEG potentials related to moving object selection with gaze: A possible basis for more flexible eye-brain-computer interfaces. DG Zhao, AV Isachenko, EV Melnichuk, BL Kozyrskiy, SL Shishkin. Opera Medica et Physiologica 4 (Suppl. 1), 109-110, 2018. Cited by: 4.
Imposing Functional Priors on Bayesian Neural Networks. B Kozyrskiy, D Milios, M Filippone. ICPRAM 2023, 12th International Conference on Pattern Recognition …, 2023. Cited by: 3.
Bayesian opportunities for brain–computer interfaces: Enhancement of the existing classification algorithms and out-of-domain detection. EI Chetkin, SL Shishkin, BL Kozyrskiy. Algorithms 16 (9), 429, 2023. Cited by: 2.
Variational Bootstrap for Classification. B Kozyrskiy, D Milios, M Filippone. Procedia Computer Science 207, 1222-1231, 2022. Cited by: 2.
Unconditional EEG Synthesis Based on Diffusion Models for Sound Generation. EI Chetkin, BL Kozyrsky, SL Shishkin. 2024 IEEE International Multi-Conference on Engineering, Computer and …, 2024. Cited by: 1.
Meta-Optimization of Initial Weights for More Effective Few- and Zero-Shot Learning in BCI Classification. DA Berdyshev, AM Grachev, SL Shishkin, BL Kozyrskiy. 2023 IEEE Ural-Siberian Conference on Computational Technologies in …, 2023. Cited by: 1.
An expectation-based EEG marker for the selection of moving objects with gaze. DG Zhao, AN Vasilyev, BL Kozyrskiy, AV Isachenko, EV Melnichuk, et al. GBCIC, 2019. Cited by: 1.
EEG-based classification of the intentional and spontaneous selection of moving objects with gaze. D Zhao, A Vasilyev, B Kozyrsky, E Melnichuk, S Shishkin. The 5th International Conference BCI: Science and Practice, Samara, 46-47, 2019. Cited by: 1.
Binarization for Optical Processing Units via REINFORCE. B Kozyrskiy, I Poli, R Ohana, L Daudet, I Carron, M Filippone. Proceedings of the 3rd International Conference on Advances in Signal …. Cited by: 1.
EEG-Reptile: An Automatized Reptile-Based Meta-Learning Library for BCIs. DA Berdyshev, AM Grachev, SL Shishkin, BL Kozyrskiy. arXiv preprint arXiv:2412.19725, 2024.
Exploring the Intersection of Bayesian Deep Learning and Gaussian Processes. B Kozyrskiy. Sorbonne Université, 2023.