The Chukchi Sea, north of Alaska, is one of the most inaccessible places on Earth. The six seasons of the Arctic, according to the Inuit, are demarcated not by a fixed calendar but by what we hear in the changing environment. Hydrophones placed about 300 meters below the sea surface, at a seafloor recording location 160 km north of Utqiagvik, Alaska, captured the sound of sea ice, marine mammals, and the underwater environment throughout an entire year. Our journey begins on October 29, 2015, just three days after new ice had started to form – the birth of ice.

---- By Lei Liang and Joshua Jones
"Six Seasons" is a collaborative research project directed by Lei Liang and Joshua Jones, where I (Mingyong Cheng) create a visual interpretation using generative AI and visual computing for an experimental and interdisciplinary music series in 2024. This work guides listeners through the six seasons of the Chukchi Sea, north of Alaska, where Liang and Jones's team integrate real-world recordings of sea ice, marine mammals, and underwater environments with speculative sounds crafted by performers.
"From Speculative Soundscape to Landscape: The Visual Interpretation of 'Six Seasons'" introduces a novel approach to environmental data visualization by transforming Arctic hydrophone recordings into dynamic, culturally informed landscapes. This research addresses the challenge of representing remote ecological data—particularly for regions inaccessible to people—through an artistic lens that bridges Western scientific inquiry with Eastern aesthetic traditions.
The project interprets recordings captured 300 meters below the Chukchi Sea surface, merging them with musical compositions to create audio-reactive visual experiences. The visualization methodology combines three essential components: the processing of NASA satellite imagery, the fine-tuning of generative AI models to synthesize art that blends Arctic icescapes with stylistic elements of traditional Chinese Shan-shui painting, and real-time audio-visual synthesis through StreamDiffusion and TouchDesigner. These methods translate environmental data into contemplative forms that respond dynamically to the acoustic structure of the composition.
This work advances the field of environmental storytelling by demonstrating how AI-driven visualizations can enhance scientific-artistic collaboration while preserving the integrity of both scientific data and cultural traditions. By offering new strategies for representing ecosystems in regions inaccessible to traditional observation methods, this project reveals the potential of blending computational techniques with Shan-shui-inspired aesthetics. The resulting framework transforms raw environmental data into meaningful, culturally resonant visual narratives, illustrating how technology can mediate between empirical measurement and creative interpretation.
The Chukchi Sea, north of Alaska, is one of Earth's most inaccessible regions. The Inuit recognize six Arctic seasons defined by environmental changes, not fixed dates.[^1] Hydrophones placed 300 meters below the sea surface, 160 km north of Point Barrow, captured year-round recordings of sea ice, marine mammals, and the underwater soundscape.[^2]
In Six Seasons, composer Dr. Lei Liang and oceanographer Dr. Joshua Jones integrate these recordings with live performance by Marco Fusi (violin and viola d'amore). Using improvisation and specific instrumental techniques, performers create sonic gestures inspired by bowhead whale calls, sea ice formation, and beluga songs, bringing this remote underwater world into concert spaces.[^2] This artistic interpretation allows audiences to both hear and feel the powerful forces at work in this extreme environment.
In 2024, new media artist Mingyong Cheng joined Dr. Liang's team to develop an audio-visual experience. The original visual component featured high-resolution NASA satellite images showing seasonal changes across six Arctic sites, documenting ice fracturing, ocean currents, and coastal erosion. While scientifically accurate, these fixed perspectives limited the artistic interpretation Dr. Liang envisioned. He noted: "These satellite views mirror Shan-shui-hua—not just landscapes, but nature's own brushwork."
This observation inspired Cheng to incorporate "Shan-shui" principles, but it raised a challenge: how could they apply these principles to a place none of them had physically visited? Traditional Chinese landscape painting draws from artists' lived experiences and memories of mountains and rivers, yet the Arctic existed for the team only as data: hydrophone recordings, satellite images, and scientific interpretations.
Cheng recognized parallels with her earlier project Fusion: Landscape and Beyond,[^4] where she used generative AI to synthesize “cultural memory” by fine-tuning models on Shan-shui paintings to explore a speculative vision of the past, present, and future. Could generative AI similarly reimagine environmental data as speculative Shan-shui, creating imagined landscapes of this remote place? How might these speculative landscapes preserve Arctic features while embracing Shan-shui aesthetics? Additionally, how could visual and audio elements interact to represent dynamic ecosystem changes across seasons?
To address these research questions, the team designed three key features for this visual experience.
Using Generative AI and Creative Computing
The Six Seasons composition comprises seven musical pieces: Seasons 1 through 6 based on environmental sonic changes, plus a Coda that embeds Dr. Liang's narrative about a beluga whale separated from its group, wandering the ocean in search of reunion.
For Seasons 1-6, the generative AI implementation followed a systematic process: for each season, a LoRA model was fine-tuned to synthesize icescapes blending satellite-derived Arctic imagery with Shan-shui aesthetics, and the resulting landscape paintings were curated for the audio-reactive system described below.
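As a minimal, illustrative sketch only (the base checkpoint, LoRA path, and prompt below are assumptions, not the project's published code), applying one of these season-specific LoRAs to a Stable Diffusion pipeline with Hugging Face diffusers might look like this:

```python
# Hedged sketch: loading a hypothetical season-specific LoRA onto a Stable
# Diffusion base model and sampling one Shan-shui-style icescape.
import torch
from diffusers import StableDiffusionPipeline

# Assumed base checkpoint; the project's actual base model is not specified here.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# Hypothetical path to a LoRA fine-tuned on one season's Arctic/Shan-shui imagery.
pipe.load_lora_weights("./loras/season_1")

image = pipe(
    prompt="Arctic sea ice at freeze-up rendered as a Shan-shui ink landscape",
    num_inference_steps=30,
    guidance_scale=7.0,
).images[0]
image.save("season_1_icescape.png")
```

Each season's LoRA can be swapped in the same way while the base model stays fixed, which keeps the generated imagery stylistically consistent across the six pieces.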
The audio-reactive system, designed in TouchDesigner,[^8] layered these generated landscape paintings with satellite imagery collections from each season. Cheng analyzed each season's musical composition by isolating "mid," "high," and "snare" signals alongside linear timestamps to drive specific visual effect parameters, including ice formations, ocean-land transformations, and particle systems.
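A rough offline equivalent of this analysis, assuming the librosa library, a hypothetical audio file, and assumed band boundaries (the project routes these signals inside TouchDesigner rather than through a script like this), could extract time-stamped band envelopes as follows:

```python
# Hedged sketch: per-frame "mid", "high", and percussive ("snare"-like) envelopes
# with timestamps, exported as control signals for visual-effect parameters.
import numpy as np
import librosa

y, sr = librosa.load("season_1.wav", sr=44100, mono=True)  # hypothetical file

S = np.abs(librosa.stft(y, n_fft=2048, hop_length=512))    # magnitude spectrogram
freqs = librosa.fft_frequencies(sr=sr, n_fft=2048)

def band_energy(spec, lo, hi):
    """Mean magnitude per frame within a frequency band (Hz)."""
    mask = (freqs >= lo) & (freqs < hi)
    return spec[mask].mean(axis=0)

mid = band_energy(S, 250, 2000)     # e.g. could drive ice-formation effects
high = band_energy(S, 2000, 8000)   # e.g. could drive particle systems
snare = librosa.onset.onset_strength(y=y, sr=sr, hop_length=512)  # percussive proxy

n = min(len(mid), len(snare))
times = librosa.frames_to_time(np.arange(n), sr=sr, hop_length=512)
np.savetxt(
    "season_1_controls.csv",
    np.column_stack([times, mid[:n], high[:n], snare[:n]]),
    delimiter=",", header="time,mid,high,snare", comments="",
)
```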
This process involved close collaboration with Dr. Liang and Dr. Jones. Dr. Liang suggested giving the visual experience a rhythmic cadence that reflects the natural pace of environmental change, avoiding the rapid visual blooms that typically accompany reactive audio-visual work. Dr. Jones contributed oceanographic expertise, recommending color and brightness changes based on seasonal time series of sun angle and daylight patterns.[^1]
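As a hedged illustration of how such a time series can modulate the visuals, the sketch below uses a textbook solar-declination approximation at roughly the recording site's latitude to map solar elevation onto a 0 to 1 brightness value; the latitude, twilight thresholds, and mapping are assumptions, and the project drew on Dr. Jones's actual sun-angle data rather than this formula.

```python
# Hedged sketch: approximate solar elevation at ~72 N, mapped to a brightness
# parameter so polar night reads dark and the midnight-sun season reads bright.
import math

LATITUDE = 72.0  # approximate latitude of the recording site (assumption)

def solar_elevation(day_of_year, solar_hour, lat_deg=LATITUDE):
    """Approximate solar elevation (degrees) at local solar time."""
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    hour_angle = 15.0 * (solar_hour - 12.0)          # degrees from solar noon
    lat, dec, ha = (math.radians(v) for v in (lat_deg, decl, hour_angle))
    sin_elev = (math.sin(lat) * math.sin(dec)
                + math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(math.asin(sin_elev))

def brightness(day_of_year, solar_hour):
    """Clamp elevation into a 0..1 brightness value (thresholds are assumptions)."""
    elev = solar_elevation(day_of_year, solar_hour)
    return min(1.0, max(0.0, (elev + 6.0) / 16.0))   # dark below -6 deg, bright above +10 deg

print(brightness(355, 12.0))  # late December noon: near zero (polar night)
print(brightness(172, 12.0))  # late June noon: 1.0 (midnight-sun season)
```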
Each season features distinct audio-visual design elements based on its thematic exploration. The aesthetics of Chinese landscape painting were brought out through creative coding in TouchDesigner, establishing spiritual connections between Arctic landscapes and Shan-shui traditions.
For the Coda, which represents an imagined space without specific image data, Cheng and Dr. Liang decided to emphasize "Shan-shui" aesthetics more prominently, revealing the conceptual connections embedded throughout the previous seasons. Cheng merged all six seasonal LoRAs and implemented real-time generative AI using StreamDiffusion-TD in TouchDesigner.[^9] This system generates icescapes guided by Simplex noise inputs (simulating topography) that react to musical elements. The final composition transitions between different aesthetic states, concluding with blurred, spectral imagery of the wandering whale that connects directly to the narrative arc of the music.
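To make the noise-guidance idea concrete, here is a simplified, stand-alone sketch (not StreamDiffusion-TD itself; the opensimplex package, frame size, and amplitude mapping are assumptions) that renders an evolving Simplex-noise frame whose spatial frequency rises with a musical amplitude value:

```python
# Hedged sketch: an evolving Simplex-noise "topography" frame that could serve
# as a guiding input image, with louder musical moments yielding rougher terrain.
import numpy as np
import opensimplex
from PIL import Image

opensimplex.seed(2015)  # arbitrary seed

def noise_frame(t, audio_level, size=128):
    """Grayscale frame: `t` scrolls the field over time; `audio_level` in 0..1
    controls the spatial frequency of the noise."""
    scale = 0.02 + 0.06 * audio_level
    img = np.empty((size, size), dtype=np.float32)
    for y in range(size):
        for x in range(size):
            img[y, x] = opensimplex.noise3(x * scale, y * scale, t)
    img = (img - img.min()) / (img.max() - img.min() + 1e-8)  # normalize to 0..1
    return Image.fromarray((img * 255).astype(np.uint8), mode="L")

# Example: one frame three seconds in, during a moderately loud passage.
noise_frame(3.0, audio_level=0.6).save("coda_noise_frame.png")
```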
Six Seasons reimagines environmental storytelling by merging generative AI, Chinese Shan-shui art, and Arctic science to create an immersive encounter with one of Earth's most inaccessible regions. Like astronauts sharing Earth's first orbital portrait, we stand alongside bowhead whales and shifting ice through hydrophone data and painterly imagination. Audiences witness the Arctic transformed through Shan-shui principles: satellite imagery reshaped into mist-clad peaks, beluga migrations rendered as calligraphic gestures.
Here, scientists supply ecological data, while artists shape and curate generative AI outputs within scientific and cultural constraints. This interdisciplinary “echolocation” rejects passive documentation: ice sings, landscapes breathe, and migrations unfold across inhuman timescales. Crucially, AI serves as a bridge, not a replacement, for artistic interpretation—constrained by satellite data and brushwork traditions to avoid aesthetic dilution. Six Seasons becomes both a gentle and urgent love letter to Earth, confronting us with the fragility and tenacity of polar ecosystems.
See the full collection below:
If the embedded showcase does not display properly on your device, please visit it at https://vimeo.com/showcase/11366903.
This project is led by Composer and Artistic Director Lei Liang, with guidance from Oceanographer and Principal Scientific Advisor Joshua Jones. Violin and viola d'amore are performed by Marco Fusi; the audio systems are designed, and their software developed, by Charles Deluga, with additional audio software development by Zachary Seldess. New media artist Mingyong Cheng responds to the "Six Seasons" soundscape by reimagining the Arctic through creative computing and generative AI, merging the serene motifs of traditional Chinese painting with Arctic landscapes.
This series is accessible via a QR code printed in Lei Liang's Six Seasons album booklet, published by KAIROS.
Performance at Experimental Theater, UC San Diego.
Season 4 is included on the DVD accompanying the book 走向新山水：梁雷音乐文论与作品评析 (Toward a New Shan-shui: Lei Liang's Writings on Music and Analyses of His Works).