
//behind the lens; 


Key words:


Project Name;

behind the lens; immersions beyond data


working with ACMI



Adam Nash 


Digital and physical data


ACMI Melbourne 

Technology used;
C4D, PS, ID, AI

Model making





Interpreting the interior as a device and as a system enables the presentation of recorded and live data and, more importantly, addresses how data can inform the production of the interior, while the interior, in turn, can also be a producer of data. The proposal is designed with three layered interior relations: techniques of activation, disruption and intensification enable an immersion beyond the representation of numerical data. Recorded, live and produced data correspond with past, present and future. Each responds to a different immersion of the material, the immaterial and the senses; that is, reflection, projection and interconnection with sight, touch and sound.

live data:
tapping vs location
recorded data:
tap location vs duration
producing data:
immersion and interaction

Seeing beyond

The site is chosen at the Flinders St entrance of ACMI, next to the entrance to the exhibition The Story of the Moving Image. As an interior designer, I am curious about the experience of seeing beyond the restricted sight of an individual, as the Lens enables a personal collection of one's journey and interests. In Ways of Seeing, John Berger addresses how our sight is highly influenced by our knowledge and beliefs: seeing is another way of filtering, interpreting and searching for meaning. He also implies that our experience of seeing is embedded with layers of multiplicity, defining a visual image as a record of a particular way of seeing by others. He further elaborates on seeing as in-between moments, constituted by situations and relations between the viewer and the memory of a specific sight of others. 1 This raises the question: how can we see through the Lens as a collective encounter?

//activating; recorded data

A kinetic ceiling sculpture is designed in response to activations of recorded data. Visitors can scan their Lens on the sensor next to the staircase to activate the sculpture, which performs a choreography of their exhibition journey and an unfolding of memory through their own Lens. The sculpture is arranged according to the locations of tapping points within the exhibition, developed through a series of networking studies. This enables the system to recognise data from the Lens and respond in relation to the duration between tap locations. The structure is proposed in reflective sheets, which allow both sunlight and artificial light to cast their own footprints on the side walls, creating a dynamic encounter that is temporal and ephemeral.

networking recorded data: tap location vs duration

reflectivity of materials, form and flow to achieve atmospheric performances
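The tap-location-versus-duration relation could be sketched as follows. This is an illustrative assumption only, not the built system: each recorded tap carries a location and a timestamp, and the dwell time between consecutive taps drives how far the sculpture element above that location moves, so the ceiling performs the visitor's journey.

```python
def choreography(taps, max_drop=1.0):
    """Map a visitor's tap record to per-location drop depths (0..max_drop).

    taps: list of (location, timestamp_seconds) in visit order.
    Longer dwell between taps = deeper drop of the panel at that location.
    (Hypothetical mapping for illustration.)
    """
    if len(taps) < 2:
        return {loc: 0.0 for loc, _ in taps}
    # dwell at a location = time until the next tap
    dwells = [(taps[i][0], taps[i + 1][1] - taps[i][1])
              for i in range(len(taps) - 1)]
    longest = max(d for _, d in dwells) or 1
    # normalise so the longest dwell produces the full drop
    return {loc: max_drop * d / longest for loc, d in dwells}
```

A two-hour visit that lingers at one tapping point would then lower that panel fully, while brief stops barely ripple the ceiling.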

//disrupting; producing data

A coloured light installation continues this logic through an interactive mobile browser. Beyond the performance driven by existing Lens carriers, this interactive program is designed to create dialogues between visitors. With smartphones and tablets, visitors can paint vivid beams of light across the sculpture at a massive scale: small moments of touch made on phones become vital projections onto the surface of the sculpture. The location of each device is referenced through the location of its light, while the coloured light itself is generated from photos of the exhibition, The Story of the Moving Image. This gives those who do not have, or are yet to have, their Lens of the exhibition a different interaction with these digital cultures. Light and flow are choreographed by the public in real time through the collaboration of Lenses, whether a scanned Lens or simply one's immediate sight. Through their mobile devices, the surface, the sculpture and the whole interior become a crowd-controlled visual on a massive canvas.
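The touch-to-beam logic might work along these lines. This is a hypothetical sketch: a touch on a phone screen is normalised and projected onto the sculpture surface, and its colour is sampled from a stand-in palette representing the exhibition photographs (the palette and all names here are assumptions for illustration).

```python
# Stand-in colours sampled from exhibition photos (hypothetical values).
PALETTE = [(255, 64, 0), (0, 128, 255), (240, 200, 40)]

def touch_to_beam(touch_x, touch_y, screen_w, screen_h,
                  surface_w, surface_h, photo_index):
    """Project a phone touch onto the sculpture surface with a sampled colour.

    The touch coordinates are normalised to 0..1 on the phone screen,
    then scaled to the sculpture's surface dimensions.
    """
    u = touch_x / screen_w
    v = touch_y / screen_h
    position = (u * surface_w, v * surface_h)
    colour = PALETTE[photo_index % len(PALETTE)]
    return position, colour
```

Many devices running this in parallel would each contribute one beam, giving the crowd-controlled canvas described above.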

//interconnecting; live data

A live sound system is proposed to enable encounters with live data through sonification. Sound here is interpreted as a present connection with the activities within the exhibition. Sound is temporal; it is measured by time, but it can travel beyond numerical time into the immersion of duration. In this case, the sound is emotional, spatial and virtual. It enables people who are not in the exhibition to 'see' others' Lenses through a sonic input: listening to spaces where you are absent but virtually present in the atmosphere. The system is proposed in close reference to the simultaneity of visitors scanning their Lens: the more people who scan at the same time, the more intense the ambient sound output to the interior.
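The scan-simultaneity rule can be expressed as a simple intensity function. This is an assumption about how such a system might be tuned, not the proposal's actual implementation: ambient volume scales with how many visitors scan their Lens within the same short window.

```python
def ambient_level(scan_times, now, window=5.0, full_at=10):
    """Return 0..1 ambient intensity from Lens scans in the last `window` seconds.

    scan_times: timestamps (seconds) of recent Lens scans.
    full_at: number of simultaneous scans at which output saturates.
    (Window length and saturation point are illustrative assumptions.)
    """
    simultaneous = sum(1 for t in scan_times if now - window <= t <= now)
    return min(1.0, simultaneous / full_at)
```

A quiet afternoon with one or two scans would produce a faint ambience, while a crowd scanning together would saturate the interior with sound.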

Related tools and curiosities;


Data visualisations