
MindDrawPlay

from Leipzig & Rostov with Love :)

MindDrawPlay (MDP) is a project of experimental, interactive, educational audio-visual art: a translation of brain waves and heart beat into visual and sound spaces, flows and controls. It has grown out of research work on Brain-Computer Interfaces and heart wave analysis. The project integrates several applications and biosignal devices.

The story


Originally, MDP started in 2017 as a hobby project during the years of my PhD in computational neuroscience at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig.

I couldn't imagine how far and deep it would go in my life when I got a NeuroSky MindWave mobile EEG device just to play with, curious how I could use it in an artistic way, connecting brain wave patterns to sound samples and drawing.

MindDrawPlay v.1 (2020)

This video is an overview of the 1st version of the MindDrawPlay project, made as a single Qt / C++ application that obtains only EEG signals from MindWave / MyndPlay / BrainLink Lite devices.

Currently, this application is not in much use (only some of its modules).

Combining technological advances, such as mobile EEG devices, with musical knowledge, such as pentatonic scales (tones from hang and tank drums), it allows everyone to see and to hear brain activity represented by a set of sounds, and to use brain waves as a brush for drawing, as parameters for image filtering and attention-modulated picture flows, and in “puzzle gathering”, “find the same” and “go through” games.

MindDrawPlay v.2 (2023)

This video is an overview of the 2nd version of MindDrawPlay.
Currently, the project includes several Qt / C++ applications dedicated to processing both brain wave (EEG) and heart beat (ECG) signals from BrainBit and Callibri devices. Most of the mathematics, as well as the audio translation of the biosignals, is done in these applications. The main visual interface and particular cases of controlled generative animations are implemented as TouchDesigner projects.

MindDrawPlay v.2 examples can be divided into 3 groups:

1) generative animations controlled by Attention / Meditation / sound levels: Examples 0-8, 12, 13, 15-17

2) animations based on neural network generated images ("Dreamflow" module from MDP v.1): Examples 9-11, 17

3) live performances with dancing artists: Example 14

Audio translation in all examples is implemented in the same way: there are 2 sound samples playing in loops, one for Attention and one for Meditation. The Attention level modulates the pitch of its sample, and the Meditation level modulates the volume of its sample; for the heart beat, the corresponding sound sample is played once on each detected beat.
Update: 4 sound samples for theta, alpha, beta and gamma waves, with their volumes modulated by the expression levels of these bands (Examples 16, 17).
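
Below is a minimal Python sketch of this mapping, assuming 0-100 Attention / Meditation levels as reported by the devices; the function names, value ranges and the play_sample callback are illustrative assumptions, not the project's actual code.

```python
# Sketch of the audio translation described above: two looped samples,
# Attention driving the pitch of its loop, Meditation the volume of its
# loop, and a one-shot sample on each detected heart beat.

def attention_to_pitch(attention, base_rate=1.0, span=0.5):
    """Map a 0-100 Attention level to a playback-rate multiplier
    around base_rate (0.75x to 1.25x with the defaults here)."""
    return base_rate + span * (attention / 100.0 - 0.5)

def meditation_to_volume(meditation):
    """Map a 0-100 Meditation level linearly to a 0..1 gain."""
    return max(0.0, min(1.0, meditation / 100.0))

def on_heart_beat(play_sample):
    """Play the heart-beat sample once (not looped) per detected beat."""
    play_sample("heart_beat.wav", loop=False)

# Per-frame update with example levels from the EEG device:
attention, meditation = 62, 48
print("Attention loop pitch:", attention_to_pitch(attention))
print("Meditation loop gain:", meditation_to_volume(meditation))
```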

Example 0: MindGlobe

Attention level modulates size and surface distortion of the globe.

Example 1: MindScapes

Attention and Meditation levels modulate the landscape structure, sound level - saturation of the landscape color.

Example 2: MindPhysarum

Physarum generative particle model: Attention level modulates the speed and length of the particle movement, Meditation - the angle of the movement.
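
As a hedged illustration of how such a binding can look in TouchDesigner, the Python sketch below feeds Attention / Meditation CHOP channels into a Physarum component through a CHOP Execute DAT; the operator name 'physarum', its parameter names and the value ranges are hypothetical placeholders, not the project's actual network.

```python
# CHOP Execute DAT callback (TouchDesigner Python). The 'physarum' COMP
# and its custom parameters (Speed, Steplength, Sensorangle) are
# hypothetical placeholders for whatever the real network exposes.

def onValueChange(channel, sampleIndex, val, prev):
    sim = op('physarum')                        # hypothetical Physarum COMP
    if channel.name == 'attention':             # 0-100 Attention level
        sim.par.Speed = 0.5 + val / 100.0       # speed of particle movement
        sim.par.Steplength = 1.0 + val / 50.0   # length of each step
    elif channel.name == 'meditation':          # 0-100 Meditation level
        sim.par.Sensorangle = 15 + 60 * val / 100.0  # angle of the movement
    return
```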

Example 3: MindBrushes

Attention and Meditation levels modulate brush size, brush style (blurring degree) and speed of the canvas rotation.

Example 4: MindTopus

Attention level modulates structure, Meditation and sound levels - colors.

Example 5: MindWaves

Attention and sound levels modulate amplitude of generative waves.

Example 6: MindFlex

These examples are based on the Nvidia Flex particle model.

MindFlex 1: Attention level modulates the cohesion parameter of the particle movement.

MindFlex 2 & 3: Attention level modulates the border area for the particle movement; several other parameters are linked to a MIDI controller.

Example 7: MindParticles

Based on the ParticlesGPU TouchDesigner node: Attention level modulates the size of the particles through a sound spectrum scaling parameter, Meditation level - the Drag parameter (~ particle spread).

Example 8: MindNeuron

Based on the ParticlesGPU TouchDesigner node: Attention level modulates the speed of the particle movement.

Example 9: DreamFlowMind

DreamFlowMind is an NFT collection with stable diffusion images, brain and heart waves translations:

https://rarible.com/dream-flow-mind

Every work in the collection contains 2 storylines. The 1st is an innovative animation based on 180 stable diffusion images. Each flow has a sequence of transitions between source images. The story starts with a short presentation of the core image, and then it unfolds in the flow with all the others.

The 2nd story is a visualization of brain waves and heart beat, and a translation of both biosignals to sounds. Attention and Meditation levels are estimated from the EEG signal and linked to sound samples whose pitch and volume change depending on the mental levels. The heart beat is detected from the ECG signal and translated to sound. The overall sound level impacts the saturation of the colors in the flow.

Therefore, at the same time, you see the visual flow and hear the music of the brain and heart recorded during the flow. Each item in the collection has a unique signature of a moment in the mindspace flow. There will be only 180 items in the collection.

The devices used for obtaining the biosignals:
the BrainBit neurointerface and the Callibri cardio device.

Example 10: DreamFlowMind (particles transitions)

A DreamFlowMind modification using ParticlesGPU for transitions: Attention level modulates the expression of the particle traces, Meditation level modulates the spread of the particles.

Example 11: DreamFlow 3.0 (emergence in particles)

The difference from the previous version is that here there is only a particle layer (with 10k points); a switch of images leads to a transformation of the particle space according to the geometry difference between the images. Attention modulates the particle life variance, Meditation - the particle life duration.
music: Klaada - Fyaka (ft. Ivan Judas)

Example 12: Particles Twist

Based on the Particles node in TouchDesigner: Attention modulates Axial force and Twist strength, Meditation - Vortex force.

Example 13: Particles Dance 2

A modification of the previous project based on ParticlesGPU. Audio- and Attention-reactive particles: sound and Attention levels modulate the size of the particles and the camera changes; Attention modulates the area of the particles, the length of their traces and their spread.
music: Klaada - Encounters With Other Levels Of Reality

Example 14: Heart Brain Dance

Audio-visual translation of brain waves and heart beat in a dance performance. The music of the heart and brain changes depending on the dance movements and the Attention level; projections of heart and brain images and waves are animated on heart beats. A collaboration with DomTanzTheater:
choreography: Tamara Ryazantseva,
brain: Alisa, wearing the neurointerface,
heart: Pavel, wearing the cardio device.

Example 15: Mind Blanket

Brain activity and sound control both the structure and the coloring of the form: the audio level modulates the amplitude of the waves; Attention modulates the size of the particles, the period of the waves, the bloom intensity level and the audio reactivity delay; Meditation - the glow level.
music: Suduaya - Flow

Example 16: Updated Audio-Visual translation

6 sounds: 4 are linked to the theta, alpha, beta and gamma waves, and their volumes are modulated by the expression levels of these waves; the same applies to the Meditation sound, while for the Attention sound the pitch is modulated by its level. The amplitudes of the visual generative waves are modulated by the relation between the waves' expression levels and their real amplitudes. The saturation of the image colors is modulated by the average sound level.
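
A minimal Python sketch of the band-volume part of this scheme, assuming the per-band expression levels arrive as relative band powers from the EEG processing application; the names and example values below are illustrative assumptions, not the project's actual code.

```python
# Sketch: normalize per-band expression levels into 0..1 gains for the
# four looped samples (theta, alpha, beta, gamma) described above.

BANDS = ('theta', 'alpha', 'beta', 'gamma')

def band_volumes(power):
    """Turn per-band powers into relative 0..1 gains for the band loops."""
    total = sum(power[b] for b in BANDS) or 1.0
    return {b: power[b] / total for b in BANDS}

# Example: band powers as they might come from the EEG application.
gains = band_volumes({'theta': 4.2, 'alpha': 7.5, 'beta': 3.1, 'gamma': 1.2})
for band, gain in gains.items():
    print(f"{band} loop volume: {gain:.2f}")
```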

Example 17: Wind of Particles

Audio-visual translation of brain waves and heart beat with ParticlesGPU. The flow integrates the biosignals, music modulated by them (6 sounds: 4 linked to the theta, alpha, beta and gamma waves, with their volumes modulated by the expression levels of these waves; the same for the Meditation sound, while for the Attention sound the pitch is modulated by its level), stable diffusion images, and generative animation in 2 ways: on the brain wave visualization and on the particles.
