In yesterday's post we looked at a fashion collection characterized by pleats, folds and pockets built to guarantee the wearer freedom of movement. Some artists and choreographers, instead, celebrate movement by exploring the possibilities of mathematics and technology.
Inspired by Alan Turing, the Enigma codebreaker, Rhizomatiks, the studio led by Daito Manabe and Motoi Ishibashi, created a groundbreaking dance performance in collaboration with the ELEVENPLAY dance company, choreographer and art director Mikiko, and media artist Kyle McDonald, a contributor to open source arts-engineering toolkits.
Debuting in Japan four years ago, "Discrete Figures" was performed at the Osaka University of Arts earlier this year, on the occasion of Manabe's appointment there as a visiting professor. The piece is on stage tonight and tomorrow at the Tbilisi International Festival of Theater (Tbilisi, Georgia).
In "Discrete Figures" performing arts and mathematics are used to explore the relationship between the human body and computer-generated movement born from mathematical analysis. Numeric data and analytical results inform indeed the choreography.
The piece is based on the interplay between the real and the virtual world: dancers in white costumes comprising basic T-shirts, pants and a pleated half skirt form sketched geometries or interact with shapes and figures projected on screens, and with ghost-like dancers.
One of the most effective parts of the choreography is the section in which the dancers move behind frames with screens that reproduce stylized, drawing-like versions of their movements. The virtual dancers' movements, created through pose estimation by markerless motion capture and graphics generated using photogrammetry, intersect with the world beyond the screens, which turn into dynamically moving canvases, creating two different planes similar to the real/comics environments in A-ha's video for their 1985 hit "Take On Me".
As an additional layer of complexity, the piece also uses drones, AI and machine learning to create new movements and modes of expressive dance that go beyond the limits of conventional human subjectivity and emotional expression.
The research behind the piece was extremely time-consuming and started with a machine learning phase: researchers used OpenPose to estimate the poses in public domain stage footage and movie scenes, collected the resulting pose data, and developed a nearest-neighbor search system matching that archive's pose data with the dancers' live pose data, so that the imagery whose poses are closest to theirs can be projected into a rectangular on-stage frame.
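To give a rough idea of how such a matching step can work, here is a minimal Python sketch, assuming the poses have already been extracted as OpenPose keypoint arrays; the file name and the normalization scheme are illustrative assumptions, not Rhizomatiks' actual pipeline:

```python
import numpy as np
from scipy.spatial import cKDTree

# OpenPose's BODY_25 model returns 25 (x, y) keypoints per person;
# here each pose is flattened to a 50-dimensional vector.

def normalize_pose(keypoints: np.ndarray) -> np.ndarray:
    """Center a (25, 2) keypoint array on its centroid and scale it
    to unit size, so matching ignores position and body size."""
    centered = keypoints - keypoints.mean(axis=0)
    scale = np.linalg.norm(centered)
    return (centered / scale if scale > 0 else centered).ravel()

# Hypothetical archive: poses previously extracted from public domain
# stage footage and movie scenes, shape (n_frames, 25, 2).
archive_poses = np.load("archive_poses.npy")
index = cKDTree(np.stack([normalize_pose(p) for p in archive_poses]))

def closest_archive_frame(live_pose: np.ndarray) -> int:
    """Return the index of the archive frame whose pose is nearest
    (in Euclidean distance) to the dancer's current pose."""
    _, i = index.query(normalize_pose(live_pose))
    return int(i)
```

Normalizing each pose for position and scale means the search compares body configurations rather than where, or how large, a performer appears in the frame.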
A booth is also usually set up in the venue lobby to photograph visitors: until just before the start of the performance, the characteristics of the participants' clothing and their movements are analysed on multiple remote servers, so that members of the audience can appear as dancers on the on-stage screen.
The performance integrates different hardware pieces, including five palm-sized micro drones and the frames with screens mentioned earlier. The drones, more maneuverable and easier to use than ordinary ones, create balls of light floating over the stage. The frames incorporate seven infrared LEDs and a battery; the motion capture system recognizes the whole as a rigid body, responding to the dancers' movements based on various rules and algorithms to generate new bodily expressions.
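Recognizing a set of markers "as a rigid body" typically means solving for the single rotation and translation that best map the markers' known layout onto their observed positions. The sketch below shows the classic Kabsch least-squares fit for that step, assuming seven 3D marker coordinates; it illustrates the general technique, not Rhizomatiks' actual tracking code:

```python
import numpy as np

def rigid_body_pose(reference: np.ndarray, observed: np.ndarray):
    """Estimate the rotation R and translation t mapping the frame's
    seven reference LED positions onto their observed mocap positions
    (Kabsch algorithm, least-squares rigid fit).

    reference, observed: (7, 3) arrays of 3D marker coordinates.
    """
    ref_c = reference - reference.mean(axis=0)
    obs_c = observed - observed.mean(axis=0)
    # The SVD of the cross-covariance matrix yields the optimal rotation.
    U, _, Vt = np.linalg.svd(ref_c.T @ obs_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = observed.mean(axis=0) - R @ reference.mean(axis=0)
    return R, t
```

Once R and t are known for each capture frame, the physical frame's full six-degree-of-freedom pose can drive the graphics rendered on its screen in real time.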
The result is a new vocabulary of movement, transcending conventional human subjectivity and emotional expression, that proves new technologies such as motion tracking and data-driven computer graphics can be used to create harmonious and elegant dance steps. Expect to see this technology integrated into a fashion show at some point, maybe by a futuristic designer à la Iris van Herpen.