Lisa Müller-Trede

 

Artificial Kinaesthetic Intelligence


Synchronicities, 2023

14:39 min 6-channel 4k video, bodies, windows, algorithm, camera, wearable digital stethoscopes, sound recorders, timecode, attention, breaths, kinaesthesia, silence

Solo Exhibition, Zemeckis Center for Digital Arts, Los Angeles, USA, 2023

 

Relational Intelligence

Through the electromyography and audio signal processing of bodily sounds, I create mechanisms and situations that aid communication without relying on classifications such as words, and that make accessible the subtle negotiations which bodies perform on a somatic level in relation to one another.

For a more thorough discussion, see “Artificial Relational Intelligence: Breath as Interface” in Culture Machine and “Discerning Relational Data in Breath Patterns: Gilbert Simondon’s Philosophy in the Context of Sequence Transduction” in MATTER: Journal of New Materialist Research.

 
 

Artificial Kinaesthetic Intelligence

I investigate kinaesthetic expression as a means of relational, shared intelligence. Simulating or predicting the direction and velocity of movement that is in the process of being formed is part of any encounter between bodies. It is a form of situated action and a foundation of intelligence itself. 

 
 

Breath Dataset | Synchronicities

The films below show the breath data I have analysed. Filmed outside a laboratory setting to show the impact of atmosphere and environment on encounters, the minute movements of the faces defy affective computing’s attempts to average out emotions based on discrete facial expressions. The films capture intensity and difference in non-dialectical expressivity.

Together with an interface for breath analysis that I designed, the analysis was exhibited at the 2022 International Society for Research on Emotion (ISRE) Pre-Conference.

 

(Headphones suggested.)

 
 

Incident 1 (01:20-01:22 min)

 

Incident 2 (01:37-01:39 min)


 

Incident 3 (03:20-03:22 min)

 
 

Breath as Interface

Breath informs an organism’s kinaesthetic awareness (i.e., the awareness of the body’s positioning), while kinaesthesia, in turn, informs its breathing. This feedback loop extends across multiple organisms, which organize their bodies by somatic means and coordinate their movement nonverbally. With the help of audio signal processing, this coordination can be deciphered from the organisms’ joint audible breaths. This joint creation reveals nuances of non-categorical cooperation between humans, as well as between human and computer. This relational knowledge, largely neglected in the predominantly positivist development of affective computing, reframes and counteracts the popular utilitarian mechanics of artificial intelligence that contribute to the algorithmic oppression we face.
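One way to picture this joint processing, purely as an illustrative sketch (the synthetic signals, sample rate, window size, and hop length below are my assumptions, not parameters used in the work), is to sum two breath recordings into one merged signal and compute a magnitude spectrogram of the result:

```python
import numpy as np

def merge_breaths(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Sum two breath recordings into one joint signal (equal weighting)."""
    n = min(len(a), len(b))
    return a[:n] + b[:n]

def spectrogram(x: np.ndarray, win: int = 256, hop: int = 128) -> np.ndarray:
    """Magnitude STFT: one row per analysis frame, one column per frequency bin."""
    frames = [x[i:i + win] * np.hanning(win)
              for i in range(0, len(x) - win + 1, hop)]
    return np.abs(np.fft.rfft(np.asarray(frames), axis=1))

# Two synthetic "breaths": slow amplitude envelopes at slightly different
# rates modulating soft tonal carriers.
sr = 8000                                  # assumed sample rate
t = np.arange(2 * sr) / sr
breath_a = np.sin(2 * np.pi * 0.25 * t) * np.sin(2 * np.pi * 300 * t)
breath_b = np.sin(2 * np.pi * 0.30 * t) * np.sin(2 * np.pi * 320 * t)

joint = merge_breaths(breath_a, breath_b)
S = spectrogram(joint)
print(S.shape)                             # (frames, frequency bins)
```

Analysing `S` rather than either breath alone is what makes the data relational: the spectrogram describes the merged rhythm, not either breather in isolation.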

Recurring Patterns in Joint Breathing

The films/spectrograms show three incidents of recurring patterns in the joint breath of the two people in the frame – one in front of and one behind the camera, visible in the reflection. While it is difficult for a human to hear two breaths as one rhythm, the algorithm employed here finds a recurring pattern in the merged breath signal. From the eight-minute video, the algorithm singled out three three-second events – all turned out to be jerks of the subject’s body.
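As a hedged sketch of this kind of pattern-finding (not the work’s actual algorithm – the three-second window, cosine-similarity measure, threshold, and synthetic "jerk" motif below are all assumptions for illustration), recurring events can be located by comparing normalized windows of the merged signal against one another:

```python
import numpy as np

def find_recurring(joint, sr, win_s=3.0, hop_s=0.5, thresh=0.9):
    """Return pairs of start times (seconds) of non-overlapping windows whose
    normalized waveforms match above `thresh` -- candidate recurring events."""
    win, hop = int(win_s * sr), int(hop_s * sr)
    windows = []
    for s in range(0, len(joint) - win + 1, hop):
        w = joint[s:s + win] - joint[s:s + win].mean()
        norm = np.linalg.norm(w)
        windows.append(w / norm if norm else w)
    W = np.asarray(windows)
    sim = W @ W.T                          # cosine similarity of window pairs
    pairs = []
    for i in range(len(W)):
        for j in range(i + 1, len(W)):
            if sim[i, j] > thresh and (j - i) * hop >= win:
                pairs.append((i * hop / sr, j * hop / sr))
    return pairs

# Synthetic demonstration: a quiet "breath" background in which the same
# three-second jerk-like burst occurs at 2 s and again at 12 s.
sr = 1000
rng = np.random.default_rng(0)
motif = rng.standard_normal(3 * sr)
joint = 0.05 * rng.standard_normal(20 * sr)
joint[2 * sr:5 * sr] += motif
joint[12 * sr:15 * sr] += motif

pairs = find_recurring(joint, sr)
print(pairs[:3])
```

In this toy case the detected pairs line up with the two placements of the burst; on real joint breath recordings, matching windows would mark candidate moments such as the three body jerks described above.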

These jerks reveal a form of

KINAESTHETIC INTERDEPENDENCE

as a form of intensity (shared context beyond signification) in the Simondonian reading of the term. Observing these particular forms of expressivity in the spectrograms of joint breath signals reveals a “language” that functions without signifiers, yet points to a subconscious “vocabulary” between bodies.

INTENSE EXPRESSIVITY

The patterns of the merged breaths produced during the head twitch (01:20-01:22 min) and the shoulder twitch (01:37-01:39 min) are similar to that of the final head twitch (03:20-03:22 min). The twitches are isolated as recurring events that disclose similar physical gestures only after the subject’s and the camera person’s breaths are merged. The individual breath signals are dissimilar and do not reveal the recurring pattern on their own. There is information in relational data that has so far been overlooked due to the pervasive focus on the individual.


 

Excerpts from the Breath Dataset. Presented at the International Symposium on Electronic Art (ISEA), Barcelona, 2022.

 
 
 
 
 
 

Relational Intelligence in the Automatic Pause –

Non-fiction Gone Wrong, 2022

live performance

At the 2022 International Society for Research on Emotion (ISRE) conference I asked an actor to present my paper to an unwitting audience. My paper, Relational Intelligence, was presented as part of the Affective Computing panel. Playing the fictional character of a sniper’s spotter, I blithely disrupted the presentation of “her” paper, saying that this research was beneficial to the profession of sniper-spotter duos, as they depend on instantaneous communication, and could hence allow them to shoot more accurately. Walking it back, I emphasised that I was only spotting, not pulling the trigger – much like the university often claims to merely facilitate research rather than employ the tools that are invented.

The piece contextualizes how research is funded, specifically, research involving artificial intelligence, which far too often lacks ethical review processes. Complicit in the computational capitalism practiced at academic institutions, I seek to negotiate its horrors and allures in their inconvenient immediacy while exposing, celebrating, and problematising the meta-narrative’s immorality and parasitic resistance (Watkins Fisher 2020).

 

Breathing down my neck, 2023

6:40 min 4k Video, four actors, three cameras, conference, algorithm, academic paper, affective computing panel, audience, synchronicities, punctured contemporary temporality, projector, breath of a spotter’s sniper, breath of a sniper’s spotter, breath of a victim, samosa

with Sophie Dia Pegrum, John Smith, Rahjul Young, and Amalia Mathewson

4k Video published in a Cambridge University Press special edition of TDR: The Drama Review, forthcoming 2024

 
 
 

Violet Eyelids ↓