Artist uses AI to create an ever-changing musical score shaped by the skies above an NYC hotel

People can’t yet be in two places at the same time. But Los Angeles-based musician Julianna Barwick is getting around that little problem to treat guests at a new hotel in New York City to a never-ending, ever-changing live performance that funnels the sound and mood of the skies above Manhattan down into the lobby.

A generative music program created with artificial intelligence (AI) has been trained to keep a constant watch skyward through a camera on the Sister City hotel rooftop, which boasts a sweeping view from the Empire State Building down to the Statue of Liberty. It will take the live inputs it sees — maybe a sudden sunburst, a passing airplane, a full moon or a flock of pigeons — and spontaneously create a musical score, drawn from Barwick’s corresponding compositions, that will play all day and night and never repeat.

“It will be a magical, delightful thing to experience that will pique your sense of wonder,” Barwick says. “But the most exciting thing, to me, is that I’ll be going home to LA and the AI will still be there helping me perform in New York.”

Barwick, an experimental musician, has been working on the project with Sister City and Microsoft for the past year, but in an unusual twist for a musician, she won’t hear the composition she created until the public does, when the hotel officially opens on May 16.

Once the AI performance starts that day, it won’t stop.


Experimental musician Julianna Barwick composed dozens of soundscapes to be played by a generative music program that will reflect the skies above Manhattan.

Barwick composed five movements within an overall soundscape that reflect the constantly changing nature of the sky throughout the day, each with its own background of bass, synthesizer and vocal lines that weave in and out. For each “event” identified by Microsoft AI, she then created six synthesized and six vocal sounds per movement for the generative audio program to choose from – 12 sounds in each of the five movements, or 60 different musical options a day for every time an airplane passes above. The sounds are an expression of Barwick’s emotions in response to each stimulus.

“I didn’t want it to be too literal,” she says. “I could have made it sound ‘raindroppy,’ but it’s more about the attitude of the event. An airplane is a lot different than the moon, so it has more of a metallic sound than a warm sun sound or a quiet ‘moony’ kind of feeling. I wanted people who listen to it to be curious and wonder what that sound meant, what’s going across the sky right now.”
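As a rough illustration of the structure described above, here is a minimal Python sketch of a sound library holding six synthesized and six vocal variations per sky event in each of five movements. The movement names, event labels and file names are invented for the example, not drawn from the actual installation.

```python
import random

# Hypothetical organization of the sound library: five daily movements,
# each holding six synthesized and six vocal responses per sky event.
MOVEMENTS = ["dawn", "morning", "afternoon", "evening", "night"]
EVENTS = ["airplane", "clouds", "sun", "moon", "birds"]

sound_library = {
    movement: {
        event: {
            "synth": [f"{movement}_{event}_synth_{i}.wav" for i in range(6)],
            "vocal": [f"{movement}_{event}_vocal_{i}.wav" for i in range(6)],
        }
        for event in EVENTS
    }
    for movement in MOVEMENTS
}

def pick_sound(movement: str, event: str) -> str:
    """Choose one of the 12 recorded responses to a detected sky event."""
    bank = sound_library[movement][event]
    layer = random.choice(["synth", "vocal"])
    return random.choice(bank[layer])
```

With 12 sounds per movement across five movements, a recurring event such as a passing airplane can draw on 60 distinct options over the course of a day.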

Barwick has never been afraid of technology, even if she didn’t have access to it. She recorded her first album in 2007 using a guitar pedal to form vocal loops on a cassette tape. “I didn’t even have a computer then,” she remembers. “I took my bag of tapes in somewhere to get mastered to produce the CD.”

Now she relies on technology to compose, record and perform her multilayered, ambient music. She uses effects on everything, including her voice. There’s no such thing as an unplugged Julianna Barwick set. Still, she says, “Before I was approached to do this project, the only thing I knew about artificial intelligence was from the movies. I’d never seen an application of it in my daily life.”

So as she began exploring sounds, Barwick grappled not only with what AI was and could do, but also with what her role would be compared to the program’s. Who was the actual composer – she or the program? Was AI a partner or a tool?

“I contemplated how the project would play out in my absence and realized that I can make all the sounds, but I’m not going to be there to detect all the events — you have to rely on the AI to do that,” Barwick says. “And that’s such an important part of the score; it’s almost like it’s a 50-50 deal. And that’s what makes this project interesting. It almost brings in another collaborator, and the possibilities are endless. It’s opened up a new world of thinking and approaching future compositions and scores.”


Barwick’s wordless music features her own ethereal vocals and synthesized sounds. She composes spontaneously as she performs or records her pieces, never knowing just what the end result will sound like. And the AI program’s adaptability mirrors her own.

“My music is very abstract and interpretive, and I’m a filter for stimuli,” she says. “Whether I’m in a different country or it’s raining or something really sad just happened, that comes out in the music. And this is kind of like the AI is the filter for the stimuli now, because I can’t sit there and watch the sky and perform 24/7 like it can.”

Barwick relied on a team of Microsoft AI technologists to train the AI in how to respond to events in the sky, help organize the tracks and create the generative program. The camera sends live images to a Microsoft Azure computer vision tool, which assigns tags such as “clouds” or “sun.” Those tags are fed into a system the technologists programmed after analyzing Barwick’s compositions and distilling them into an algorithm, which then chooses which tracks to play together, depending on what types of events are happening.
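The flow described here, camera frames tagged by a vision service and tags mapped into a track selection, can be sketched roughly as follows. The `tag_image` stub stands in for the Azure computer vision call, and the event names, thresholds and track names are illustrative assumptions rather than details of the real system.

```python
# Sky events the score knows how to respond to (illustrative subset).
KNOWN_EVENTS = {"clouds", "sun", "moon", "airplane"}

def tag_image(frame) -> list[tuple[str, float]]:
    # Placeholder for the Azure computer vision call, which labels each
    # rooftop frame with (tag, confidence) pairs such as ("clouds", 0.91).
    return [("clouds", 0.91), ("sky", 0.99)]

def events_from_tags(tags, threshold=0.5):
    """Keep only confident tags that map to a musical event."""
    return [t for t, score in tags if score >= threshold and t in KNOWN_EVENTS]

def choose_tracks(events, movement_tracks, event_sounds):
    """Layer the current movement's background with one sound per event."""
    playlist = list(movement_tracks)
    for event in events:
        playlist.append(event_sounds[event])
    return playlist
```

In the real installation the layering decision comes from the algorithm distilled from Barwick’s compositions; here `choose_tracks` simply appends one sound per detected event on top of the movement’s background tracks.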

“The only thing I knew about artificial intelligence was from the movies. I’d never seen an application of it in my daily life.”

All the sounds within each of the five movements throughout the day are harmonious, even though the key changes from one movement to the next. Barwick says she chose chord progressions that made sense to her for the time of day, such as calm but energetic in the morning and “starting to chill out a little bit” by nighttime. She dubbed the project “Circumstance Synthesis.”

“Creating something that will change from bar to bar, from minute to minute, hour to hour, from morning to afternoon and then evening, and different seasons, too – it’s been both challenging and fascinating,” says Luisa Pereira, a technologist working with the Microsoft team. “It has to work within all of these different types of constraints.”

Music is steeped in the background of Atelier Ace, the creative agency and management company behind Sister City and Ace Hotels — the company got its start in the late ’90s as a concert promotion firm. Ryan Bukstein, vice president of brand for Atelier Ace, remembers when he was an intern for the company in 2000 and first heard Brian Eno’s “Music for Airports,” an album of ambient music that Barwick’s work is reminiscent of. “We’ve had this idea to do an audio score like this for many years, but haven’t had the right place to do it,” Bukstein says. “But Sister City is different than any of our other hotels, and it’s a place where we wanted to be experimental and try new things that we’ve never done anywhere else.”

The 14-story, 200-room boutique hotel on Bowery, a wide avenue on the Lower East Side, has a minimalist aesthetic drawing from Japanese and Scandinavian influences to offer a clean and calm ambiance.

As she composed, Barwick kept in mind the aesthetic of Sister City, which draws from Japanese and Scandinavian influences to offer a clean and calm ambiance.

“I picture people who want to enjoy the serenity and aesthetic of Sister City, because it’s very sparse, clean, pretty, light woods, white walls, very airy and light,” Barwick says. “So I want to give them a peaceful, serene feeling.”

Sister City guests can have as much or as little human interaction as they want; if they’re overwhelmed by the crush of humanity in Manhattan and crave solitude, they can check in via a self-serve kiosk, tap their room key to get in and out of the building and use the hotel’s app to request services, without having to speak to anyone.

“We’re including technology in a smooth, intuitive and additive way so it will create more space for us to enjoy life,” Bukstein says. “So we wanted to take music and tie it to AI in a way that could create something special and filter what’s going on outside into Sister City. Julianna’s music is unique, and even though it’s created with technology, it comes out sounding very organic and very human.”

“(AI) almost brings in another collaborator, and the possibilities are endless.”

Although Barwick lived in New York City for 16 years before moving to LA, she was born in Louisiana and grew up in Missouri and Oklahoma. Her album “The Magic Place” was named after a tree on her family’s farm with a canopy that was big enough to crawl into. Her lifelong love of nature influenced her compositions as she focused on bringing the skies above Manhattan down into the lobby of the hotel.

“This is almost like a living score, because it’s interacting with nature and what’s going on outside at the moment,” Barwick says. “It’s like a live synthesis. And I love creating something that will live in New York, since I don’t anymore.

“I can’t wait to hear it in the space and see what people think about it. Maybe AI will seem more tangible and not this far-off sci-fi thing that people only see in movies, but rather something they can use in their own compositions and projects.”