ART AND SCIENCE
Transforming advanced nanoscience data into interactive art.

It is hard to imagine just how small one nanometer (one-billionth of a meter) really is. Ten hydrogen atoms in a row span one nanometer. For perspective, a sheet of paper is 75,000 nanometers thick, a red blood cell is 7,000 nanometers across, a typical virus is about 100 nanometers wide, and a strand of DNA is two nanometers wide.

To see at the atomic and molecular scale, scientists use instruments such as atomic force microscopes, which “feel” surfaces with a mechanical probe; electron microscopes, which scan a highly focused beam of electrons across a sample; and x-ray scattering instruments, which direct x-rays at a sample surface. With these instruments, scientists can probe the crystal structure, chemical composition, and electronic nature of materials. Understanding these properties is key to designing and optimizing materials with the desired functions for particular applications. However, modern experiments produce data that are highly complex and abstract, so interpretation can be difficult.

“We have some exquisite methods for reconstructing three-dimensional (3-D) structures at the nanoscale,” said physical chemist Kevin Yager, leader of the Electronic Nanomaterials Group at the Center for Functional Nanomaterials (CFN), a U.S. Department of Energy (DOE) Office of Science User Facility at Brookhaven National Laboratory. “But even if you’ve measured the structure perfectly, you haven’t learned anything until you understand how the components are organized. New visualization and sonification methods can really help provide this understanding.”

Multimedia artist Melissa Clarke (center) made more than a dozen 3-D printed glass-like sculptures based on nanoscience data collected by scientists at Brookhaven Lab's Center for Functional Nanomaterials (CFN) and National Synchrotron Light Source II (NSLS-II), including CFN physicist Kevin Yager (right). For the virtual reality (VR) component of the project, viewers can walk through and interact with the sculptures by wearing a VR headset. During the immersive experience, different sonifications created by Margaret Schedel (left), a professor of computer music at Stony Brook University, play as the user performs various actions.
Different materials produce vastly different sounds. From top to bottom: a commercial plastic, a misaligned sample (the beam missed the sample), and a composite of carbon nanotubes dispersed in an elastic polymer. For example, the commercial plastic shows many striations (scattering rings, which arise from the diffraction of x-rays from the sample's internal structure) because of its semi-crystalline packing, leading to several distinct tones over time.
The fast Fourier transform (FFT) is a mathematical tool used to represent any periodic function as a sum of sine waves. In Yager’s case, the periodic function is the arrangement of atoms in the material; for Schedel, it is sounds produced over time. “In computer music, we use a technique called additive synthesis to combine sound waves together,” said Schedel. “In a similar way, x-ray scattering data can easily be represented as sound. Kevin provides me with x and y coordinates and brightness information from the x-ray scattering detector, and I turn that information into sine waves, with the pitches (frequency of sound wave) based on the positions and the loudness (amplitude of sound wave) based on the intensity.”

In this mapping, different qualities of the structures correspond to different sound qualities, or tones (“timbre”). For example, metallic samples with tightly packed atoms have a higher-pitched sound than materials with spaced-out atoms. Similarly, well-ordered crystalline structures produce clear and distinct pitches, while polymers such as plastics generate chaotic noise.

At first, Yager was skeptical that sonified data could be useful. But after he heard nuances that correlated with a misaligned sample, he changed his mind. As described in the paper they published in 2012 (“Hearing Nano-Structures: A Case Study in Timbral Sonification”), data sonification could not only help scientists identify instrument errors during data collection but also enable them to better understand highly abstract datasets, perform other tasks simultaneously, and extract insights that had been overlooked during visual analysis.

Schedel and Yager are also trying to apply machine learning algorithms for computer vision and music information retrieval. These algorithms could automatically detect similarities and differences between new materials, guiding potential applications and uses.
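The mapping Schedel describes above amounts to a pitch-and-loudness assignment followed by additive synthesis. The following Python sketch illustrates the idea under stated assumptions: the function name, frequency range, and linear position-to-pitch mapping are illustrative choices, not the pair's actual implementation, and the detector data are assumed to arrive as arrays of pixel coordinates (relative to the beam center) and intensities.

```python
import numpy as np

SAMPLE_RATE = 44100  # audio samples per second

def sonify_scattering(x, y, intensity, duration=2.0,
                      f_min=200.0, f_max=2000.0):
    """Additive synthesis of a tone from x-ray scattering detector data.

    Each detector pixel contributes one sine wave: pitch is mapped from
    the pixel's distance to the beam center (larger scattering angles,
    i.e., more tightly packed structures, give higher pitches) and
    loudness from the measured intensity.
    """
    t = np.linspace(0.0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    q = np.hypot(x, y)                              # radial detector position
    freqs = f_min + (f_max - f_min) * q / q.max()   # position -> pitch
    amps = intensity / intensity.sum()              # brightness -> loudness
    signal = np.zeros_like(t)
    for f, a in zip(freqs, amps):                   # sum the sine partials
        signal += a * np.sin(2.0 * np.pi * f * t)
    return signal / np.abs(signal).max()            # normalize to [-1, 1]
```

With this mapping, a crystalline sample whose scattering concentrates in a few sharp rings produces a few clean partials, while the diffuse scattering of a disordered polymer sums many weak partials into noise, consistent with the timbres described above.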
“Melissa’s work is not only aesthetically beautiful but also based on actual data,” said Schedel.

Clarke’s dive into scientific data visualization began in 2009, when she stumbled upon acoustic images that geophysicists had created to map the topography of the bottom of the Hudson River. She ended up turning these images into a video installation on an old ship. “I began my career primarily in video and sound, but data visualization was a natural progression as I sought to provide a narrative and more meaning for my work,” explained Clarke. “The seismic images of the sub-bottom profiles were stunning, and I was fascinated by the process of sending sound down to the river floor and using the sounds that bounce back to create a drawing of the landscape. I come from upstate New York, the landscape of which was created by glaciers. It hit me that data visualization is something I needed to pursue seriously in my work.”

Soon after being introduced, Clarke and Schedel co-founded arts.codes, a collective focusing on different art forms with computational underpinnings. Through Schedel, Clarke met Yager and learned of the x-ray scattering data sonification project. Their discussions led to the idea of creating a VR experience that combined sonification and 3-D installations based on experimental nanoscience data.

“Kevin started showing me nanostructure visualizations, and I thought they were so beautiful that I was inspired to draw them,” said Clarke. “I really got interested in how variable the structures looked. During tours of the CFN and NSLS-II, I learned about the equipment that Kevin and other scientists use to take nanoscale 3-D photographs capturing the interaction of a beam of electrons or x-rays with a material. I was very much intrigued.”

Through scanning probe microscopy, scientists can generate 3-D height maps of the features on a sample surface at atomic resolution: an ultrasharp mechanical probe (tip) is brought very close to the surface of interest, and the local tip-surface interaction is recorded as the tip is scanned back and forth across the sample.
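As a toy illustration of that measurement loop, the sketch below simulates a raster scan over a synthetic test surface. The grid size, noise model, and surface function are invented for the example and stand in for a real probe and sample.

```python
import numpy as np

def raster_scan(surface_height, nx=64, ny=64, noise=0.01):
    """Simulate a scanning-probe raster scan.

    `surface_height(x, y)` is any callable returning the local surface
    height. The simulated tip visits an nx-by-ny grid of points, recording
    one height (plus instrument noise) per point, which yields the kind of
    2-D height map a scanning probe microscope produces.
    """
    xs = np.linspace(0.0, 1.0, nx)
    ys = np.linspace(0.0, 1.0, ny)
    height_map = np.empty((ny, nx))
    for j, yv in enumerate(ys):        # slow scan axis
        for i, xv in enumerate(xs):    # fast scan axis ("back and forth")
            height_map[j, i] = surface_height(xv, yv) + np.random.normal(0.0, noise)
    return height_map

# Synthetic test surface standing in for a real nanostructured sample.
demo_map = raster_scan(lambda x, y: np.sin(6 * np.pi * x) * np.cos(6 * np.pi * y))
```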
The resulting installation, Glass Menagerie, incorporates a VR component that enables headset-wearing viewers to walk up to and into the sculptures as they hear the sonifications. Melissa Clarke holds a nanostar (left) and DNA octahedron (right) sculpture that she 3-D printed based on data collected through electron tomography with a transmission electron microscope. In electron tomography, a beam of electrons is passed through a sample at many different angles, and the resulting series of projection images is combined to generate 3-D structural information.
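To see the tomographic principle in miniature, here is a deliberately simplified 2-D back-projection sketch: each 1-D projection is smeared back across the image plane along its acquisition angle, and the smears are summed. Real electron tomography reconstructs 3-D volumes with more sophisticated (filtered or iterative) algorithms; the unfiltered version below, and the array shapes it assumes, are illustrative only.

```python
import numpy as np
from scipy.ndimage import rotate

def backproject(sinogram, angles):
    """Unfiltered back-projection of a 2-D slice.

    `sinogram[k]` is the 1-D projection acquired at `angles[k]` (in
    degrees). Each projection is tiled into a square image, rotated to
    its acquisition angle, and accumulated; the sum approximates the
    original slice, though blurrier than a filtered reconstruction.
    """
    n = sinogram.shape[1]
    recon = np.zeros((n, n))
    for proj, theta in zip(sinogram, angles):
        smear = np.tile(proj, (n, 1))                  # smear projection across the plane
        recon += rotate(smear, theta, reshape=False)   # align with acquisition angle
    return recon / len(angles)
```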
Glass Menagerie was first displayed at the U.S. Library of Congress for an event in November 2018, and its unveiling at Brookhaven followed during the CFN 10-year anniversary event in December 2018. It has since been on display at various venues, including the Knockdown Center for the arts and performances, the Pioneer Works cultural center for the arts and sciences, and Creative Technology Week.

The trio is currently working on a more portable “traveling” edition of the installation that uses the standalone Oculus Go VR headset, which does not require a computer connection. They are also creating a second, “procedural” VR piece that is more interactive. “It is procedural in the sense that every time you interact with it, data are being processed in the VR software program in real time,” explained Clarke. “The initial piece that we made was based on images that Kevin had already created; the second piece will use numerical scientific data. What’s exciting about the procedural piece is that scientists will be able to put their data into VR themselves. That’s the end goal: for them to make their own sculptures and VR pieces for personal use and demonstrations. It isn’t just art for art’s sake.”
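The article does not spell out how scientists would load their numbers into the VR program, but one plausible ingredient is a converter from raw data to a standard mesh format. The sketch below, with an invented function name and the assumption that the data form a 2-D height map, writes a Wavefront OBJ surface that most VR and 3-D-printing toolchains can import.

```python
def heightmap_to_obj(height_map, path="sculpture.obj", z_scale=1.0):
    """Write a 2-D height map as a Wavefront OBJ triangle mesh.

    Each grid point becomes a vertex; each grid cell becomes two
    triangles. `height_map` is any 2-D array-like of heights.
    """
    ny, nx = len(height_map), len(height_map[0])
    idx = lambda i, j: j * nx + i + 1  # OBJ vertex indices are 1-based
    with open(path, "w") as f:
        for j in range(ny):
            for i in range(nx):
                f.write(f"v {i} {j} {z_scale * height_map[j][i]}\n")
        for j in range(ny - 1):
            for i in range(nx - 1):
                f.write(f"f {idx(i, j)} {idx(i + 1, j)} {idx(i + 1, j + 1)}\n")
                f.write(f"f {idx(i, j)} {idx(i + 1, j + 1)} {idx(i, j + 1)}\n")
```

Feeding in a grid such as the simulated `demo_map` above would yield a file a headset application could render as a walkable sculpture.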
The second VR piece has a new look and an improved user experience, giving users more control over which parts of the data they see and interact with. Users can pick up the sculptures and rotate them to view them from different angles.