Gesture Lab

The Gesture Lab is a specific attempt to generate a notation from choreomusical gestures. Within the scope of this lab is the implementation of quantitative methods that can be applied to the study of human gestures in a performative context.

Here is some of the output produced in this Lab:

Dore Hoyer Project
Movement and Computing Conference (MOCO'22)

Lab participants

👤 Leonhard Horstmeyer
👤 Maria Shurkhal
👤 Damián Federico Cortés Alberti
👤 Adrián Artacho
👤 Sara Glanzer

Dore Hoyer Project

In Cooperation with:

FAR – Fashion and Robotics (University of Art and Design Linz in cooperation with the Johannes Kepler University Linz).

The cooperation consists of the motion capture of a dance solo from a famous repertoire ("Angst" by Dore Hoyer).

The technology used was:

Vicon motion tracking with six Vero infrared cameras. The obtained data will be analyzed with the Nexus software (version 12).

Researchers involved:

Julio Andrés Escudero (FAR)

Damián Cortés Alberti, Adrián Artacho, Leonhard Horstmeyer

  •  Dance as an artistic practice is a multilayered phenomenon. It is a historically specific practice; it constitutes bodies; it depicts forms of expression; it develops specific skills in bodies that transgress the dance's realm. This project puts dance practices center stage to investigate the human body's multilayeredness as a place and source of agency, revealing its manifestation across multiple layers of rendering.

Sample of raw data from the motion capture:

  • Pelvis 1: Left Pelvis Front
  • Pelvis 2: Right Pelvis Front
  • Pelvis 3: Left Pelvis Back
  • Pelvis 4: Right Pelvis Back
  • LeftLeg2: Left Knee
  • LeftLowerLeg2: Left Ankle
  • RightLeg2: Right Knee
  • RightLowerLeg2: Right Ankle
  • Thorax1: Sternum Up
  • Thorax2: Sternum Down
  • Thorax3: Left Shoulder
  • Thorax4: Right Shoulder
  • Thorax6: Back, Right Side
  • LeftArm2: Left Elbow
  • LeftArm3: Middle Left Upper Arm
  • LeftLowerArm2: Left Wrist
  • RightArm2: Right Elbow
  • RightArm3: Middle Right Upper Arm
  • RightLowerArm2: Right Wrist
  • Head1: Left Forehead 
  • Head3: Left Head Back
  • Head4: Right Head Back
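
The marker labels above are essentially a lookup table from Vicon label to anatomical landmark. A minimal sketch of how such a table could be held in code (hypothetical helper name, subset of the labels listed above):

```python
# Mapping from Vicon marker labels to anatomical landmarks,
# taken from the raw-data sample above (subset shown).
MARKER_MAP = {
    "Pelvis 1": "Left Pelvis Front",
    "Pelvis 2": "Right Pelvis Front",
    "LeftLeg2": "Left Knee",
    "LeftLowerLeg2": "Left Ankle",
    "RightLeg2": "Right Knee",
    "RightLowerLeg2": "Right Ankle",
    "LeftArm2": "Left Elbow",
    "LeftLowerArm2": "Left Wrist",
    "RightArm2": "Right Elbow",
    "RightLowerArm2": "Right Wrist",
}


def landmark(label: str) -> str:
    """Return the anatomical landmark for a Vicon marker label."""
    return MARKER_MAP[label]
```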

Sample of cluster notations: 

Experimental notation resulting from the analysis of the motion capture.



Salta is a bundle of Python scripts conceived to process performance data captured under ecological conditions.

The characteristics...
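
Salta's own scripts are not reproduced here. As a hedged sketch of the kind of preprocessing such scripts typically need under ecological conditions, where occlusions cause marker dropout, one common step is gap-filling a marker's coordinate trace (hypothetical function name):

```python
import numpy as np


def fill_gaps(trace):
    """Linearly interpolate NaN gaps in a 1-D marker coordinate trace.

    Frames where the marker was occluded are assumed to be NaN;
    they are replaced by linear interpolation between valid frames.
    """
    trace = np.asarray(trace, dtype=float)
    idx = np.arange(len(trace))
    valid = ~np.isnan(trace)
    return np.interp(idx, idx[valid], trace[valid])
```

For example, `fill_gaps([0.0, nan, 2.0])` fills the occluded middle frame with the value halfway between its neighbours.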

Movement and Computing Conference (MOCO’22)

MOCO is an interdisciplinary conference that explores the use of computational technology to support and understand human movement practice (e.g. computational analysis) as well as movement as a means of interacting with computers (e.g. movement interfaces). This requires a wide range of computational tasks including modeling, representation, segmentation, recognition, classification, or generation of movement information but also an interdisciplinary understanding of movement that ranges from biomechanics to embodied cognition and the phenomenology of bodily experience.

The 8th International Conference on Movement and Computing (MOCO'22) took place 22–24 June 2022 in Chicago, Illinois (USA). Adrián Artacho and Leonhard Horstmeyer presented SmoothOperator: A Device for Characterizing Smoothness in Body Movement.
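
The metric SmoothOperator itself uses is described in the paper; purely as an illustration of how movement smoothness can be quantified from a sampled position trace, a widely used jerk-based measure (dimensionless jerk, hypothetical implementation) looks like this:

```python
import numpy as np


def dimensionless_jerk(pos, dt):
    """Jerk-based smoothness of a 1-D sampled position trace.

    Returns a negative, dimensionless number: values closer to zero
    indicate smoother movement. Jerk (third derivative of position)
    is squared, integrated over the movement, and normalized by
    duration and peak velocity to remove scale effects.
    """
    vel = np.gradient(pos, dt)
    acc = np.gradient(vel, dt)
    jerk = np.gradient(acc, dt)
    duration = dt * (len(pos) - 1)
    peak_vel = np.max(np.abs(vel))
    return -(duration**3 / peak_vel**2) * np.sum(jerk**2) * dt
```

A jittery trajectory yields a much more negative value than a smooth one sweeping the same path, which is what makes the measure useful for comparing movement qualities.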

Who: Adrián Artacho, Leonhard Horstmeyer, Maria Shurkhal