Injecting an artificial being into the physical space

The complex algorithmic processes that make up the digital sphere are often masked by comforting user interfaces and speech synthesis that mimic the human form. This representation we identify as 'the internet' can quickly fall into the uncanny valley once we question what processes lurk beneath the pleasurable aesthetic users engage with.

Psyche contextualises the fear of digital exposure through a hyperrealist personification of the internet as a kind of 80s horror monster. Using a generative video manipulation method, artificial subjects are projected onto synthetic plastics and set foot in the physical space.

This piece utilises a generative adversarial network-turned-video manipulation tool developed as my thesis project during my master's at University of the Arts London.

To be exhibited at the CVPR Computer Vision Art Gallery in 2021.

Collaboration with Ryan Blackwell.

Imagining a future in which the role of the machine is that of a camera, paintbrush or chisel
VIDEO, 2020

4104 frames (2020) is an experimental video exploring nature, architecture and life through the lens of a machine learning algorithm trained only to understand faces. It attempts to reveal the inner workings of a traditionally 'black-box' machine learning model, resulting in scenes of human faces deteriorating into environments as part of a delirious exploration of the farthest edges of the model's latent space. The piece demonstrates a novel use of generative adversarial networks (GANs) for video manipulation.
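As an illustrative sketch only (not the actual thesis tool), exploring a GAN's latent space typically means interpolating between latent vectors and decoding each intermediate point with the generator. The spherical interpolation (slerp) below is a common choice for Gaussian latents; the generator itself is omitted so the example stays self-contained, and the 512-dimensional latent size is an assumption:

```python
import numpy as np

def slerp(z0, z1, t):
    """Spherical interpolation between two latent vectors z0 and z1."""
    omega = np.arccos(np.clip(
        np.dot(z0 / np.linalg.norm(z0), z1 / np.linalg.norm(z1)), -1.0, 1.0))
    if np.isclose(omega, 0.0):
        return (1 - t) * z0 + t * z1  # vectors nearly parallel: lerp
    return (np.sin((1 - t) * omega) * z0 + np.sin(t * omega) * z1) / np.sin(omega)

rng = np.random.default_rng(0)
z_start = rng.standard_normal(512)  # assumed latent dimensionality
z_end = rng.standard_normal(512)

# A real pipeline would pass each interpolated vector through the GAN's
# generator to render a frame; here we only collect the latent path.
path = [slerp(z_start, z_end, t) for t in np.linspace(0.0, 1.0, 24)]
```

Walking the path far beyond the training distribution (e.g. scaling the latents well outside the unit Gaussian) is one way such "farthest edges" imagery can emerge.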

Exhibited at the NeurIPS AI Art Gallery in December 2020.

Using a generative neural network to extract visual features of horror
VIDEO, 2020

This piece explores the visual aesthetics of horror with the help of a generative neural network. Horror film scenes were used as input in an attempt to capture how 'cold' computational tools interpret and reimagine the aesthetics of something intangible and subjective: fear.

A video study in the form of a triptych on cinematography and the ability to replicate a cinematic 'feeling' using a generative adversarial network.

Immersive, interactive digital forest exploring collaboration and consequence in nature

Collaborative project with MSc students and Phoenix Perry. Personally contributed to circuit design, electronics, Arduino programming and the construction of the installation.

Exhibited at Wellcome Collection, 2020.

Visualising true randomness
VIDEO, 2020

This generative video is a digital reimagination of Sir Arthur Eddington's photograph of the 1919 solar eclipse. This particular eclipse was used to test Albert Einstein's theory of general relativity, according to which objects of sufficiently large mass, such as the sun, warp space-time around them. On that day, the solar eclipse darkened the sky enough for scientists to measure the positions of stars close to the sun. Compared to the night sky, these stars were offset by the amount Einstein had predicted, confirming his theory.

This piece explores another fascinating property of light, namely that it produces fundamentally unpredictable measurements at the quantum level. The random motion of the rays in this piece is generated from quantum random number data collected through light measurements, provided by Humboldt University Berlin.
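The piece itself draws on quantum random number data, whose delivery format is not described here. As a minimal sketch of the idea, the walk below derives each ray's step from a stream of random bytes; Python's `secrets` module stands in for the quantum source, and the two-bit direction encoding is an assumption for illustration:

```python
import secrets

def random_walk(n_steps):
    """2-D walk whose direction at each step is drawn from random bytes.

    The artwork used quantum random numbers from a light-based source;
    secrets.token_bytes is only a stand-in entropy source here.
    """
    x = y = 0
    points = [(0, 0)]
    for b in secrets.token_bytes(n_steps):
        # Use the low two bits of each byte to pick one of four directions.
        dx, dy = [(1, 0), (-1, 0), (0, 1), (0, -1)][b & 0b11]
        x, y = x + dx, y + dy
        points.append((x, y))
    return points

path = random_walk(100)  # 100 steps -> 101 points including the origin
```

Swapping the byte stream for the quantum data would make the motion genuinely unpredictable rather than merely pseudorandom.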

Exhibited at Tate Exchange, Tate Modern, 2020.

Collaboration with Mark Kvetny.