%0 Conference Proceedings
%T Brain Computer Interface, Visual Tracker and Artificial Intelligence for a Music Polyphony Generation System
%+ Polytechnic University of Bari / Politecnico di Bari
%A Ardito, Carmelo
%A Colafiglio, Tommaso
%A Di Noia, Tommaso
%A Di Sciascio, Eugenio
%Z Part 7: Posters
%< Peer-reviewed
%@ 978-3-030-85606-9
%( Lecture Notes in Computer Science
%B 18th IFIP Conference on Human-Computer Interaction (INTERACT)
%C Bari, Italy
%Y Carmelo Ardito
%Y Rosa Lanzilotti
%Y Alessio Malizia
%Y Helen Petrie
%Y Antonio Piccinno
%Y Giuseppe Desolda
%Y Kori Inkpen
%I Springer International Publishing
%3 Human-Computer Interaction – INTERACT 2021
%V LNCS-12936
%N Part V
%P 368-371
%8 2021-08-30
%D 2021
%R 10.1007/978-3-030-85607-6_39
%K EEG
%K Brain computer interface
%K Leap motion
%K Slonimsky
%Z Computer Science [cs]
%Z Conference papers
%X In the Brain Computer Interface domain, studies on EEG represent a broad field of interest. Interactive systems that exploit low-cost electroencephalographs to control machines are gaining momentum. Such technologies can be useful in music and assisted composition. This paper proposes a system that aims to generate four-part polyphonies. An artificial intelligence algorithm generates polyphonies based on N. Slonimsky’s theory by elaborating data coming from a Leap Motion device, which detects the user’s hand movements, and a five-channel EEG signal detection device.
%G English
%Z TC 13
%2 https://inria.hal.science/hal-04291195/document
%2 https://inria.hal.science/hal-04291195/file/520519_1_En_39_Chapter.pdf
%L hal-04291195
%U https://inria.hal.science/hal-04291195
%~ IFIP-LNCS
%~ IFIP
%~ IFIP-TC13
%~ IFIP-INTERACT
%~ IFIP-LNCS-12936