Research & Reference
Robert Hodgin- Magnetosphere (iTunes Visualiser)
Magnetosphere was originally a Processing sketch that visualised music. Apple later commissioned The Barbarian Group to port it to C++ so it could run inside iTunes.
From observing the video, it looks as though there are two spheres (positive and negative) that are attracted to one another. Specific sounds in the music, whether by amplitude or pitch, force the spheres apart.
It also seems the program can detect significant changes in the music, shifting colour schemes and behaviour in response.
The number of magnetospheres can increase: positive spheres appear to attract one another and can merge to form a larger, brighter sphere.
The camera appears to hurtle around the spheres, possibly driven by noise.
I find it interesting that Magnetosphere manages to follow the narrative of a song. Most audio-vis is driven directly by the music, but in Magnetosphere the music seems to steer the behaviour of a simulation, creating a more natural effect.
I assume this approach is procedural.
A 3D wireframe sits in the centre of the screen. Its vertices rearrange periodically with the tempo of the song; the movement appears calculated and precise, very digital.
A curved line often flows through the object, corrupting the digital quality. This is often accompanied by a loading icon and stuttering.
This phenomenon keeps reappearing in my research. Sound to sight/touch is among the most common forms of synaesthesia (if I remember correctly). The sounds of some words can feel jagged and others smooth.
New York-based artist whose work focuses on synaesthesia.
Run Off In Front
“This is based on an especially colourful photism that occurred while I listened to Santana’s version of a song called Adouma. The colours I see are the colours of light, not the colours of pigment, and I played this song over and over again as I painted the moving colours. The advantage of sound visions, or photisms as the researchers call what we synesthetes see, is that I don’t have to rely on my memory. I can replay the song as often as I want to watch the colours.
These moving colours will swirl around, one seemingly chasing the others, and any previously seen blackness will be pushed all the way to the edge until the colours just explode in their brilliance like fireworks.
The colours, for me, are triggered by the sounds of the instruments, including voices, not the sound of individual notes, with the exception of the Shakuhachi flute I heard that winter day. (I am hoping that one day I will know what notes are what colour, and if that ever happens then I will have perfect pitch, something I’d love to have.)”
Clouds Rise Up
“I made this painting last winter after I heard a musician play an untitled piece on his Shakuhachi flute. Unlike the fast-tempo songs I usually work to because I like to watch the colours change quickly, the song he played had a very slow tempo. I call this Clouds Rise Up because this is exactly what I saw as I listened to him play his flute. Each note he played had two sounds and two colours: red and orange, which is why the two colours you see move together as one shape on the slightly metallic green surface.”
This approach to audio visualisation is an embodiment of music in a visual format rather than an accompaniment to the music.
Texture and colour play a role in creating the image of music. It can suggest tempo, mood and timbre.
Seven Sirens – ISO?
I have had a lot of trouble trying to locate any footage of ‘Seven Sirens’, a VR piece shown at ISO. The room would crumble away and the audience would be taken into an abstract, digital, psychedelic environment where geometric heads fly past. It isn’t exactly an audio visualisation, but a similar experience could be created in VR with Unity.
Square waves and sine waves: square waves have a harsh timbre, whereas sine waves are smooth. A perfect square wave is difficult to synthesise because it is built up from many sine waves (the odd harmonics of the fundamental).
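This can be demonstrated with the Fourier series of a square wave: summing the odd sine harmonics, each scaled down by its harmonic number, gradually flattens the smooth sine into a harsh square. A quick sketch (function names are my own):

```python
import math

def square_wave_approx(t, f, n_harmonics):
    """Fourier-series approximation of a square wave at time t (seconds)
    and frequency f (Hz): sum of odd sine harmonics, each weighted by 1/k."""
    total = 0.0
    for i in range(n_harmonics):
        k = 2 * i + 1  # odd harmonics only: 1, 3, 5, ...
        total += math.sin(2 * math.pi * k * f * t) / k
    return (4 / math.pi) * total

# With one harmonic this is just a sine; with many, the sum
# flattens towards +1 / -1 over each half of the cycle.
smooth = square_wave_approx(t=0.1, f=1.0, n_harmonics=1)
harsh = square_wave_approx(t=0.1, f=1.0, n_harmonics=200)
```

With 200 harmonics the value at any point away from the edges sits very close to plus or minus one, which is why truncating the harmonics (as any real synthesiser must) only ever approximates a true square wave.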
I like this one…
Although these can be quite pleasant visually, I have seen hundreds of these circular audio visualisations.
Perhaps I could plug in a keyboard (or use the laptop keyboard) and link it with a VR headset within a Unity sketch. Key presses could trigger events, or create a sort of reverse Guitar Hero game.
Maybe every note you play is out of tune and you get booed off the stage.
I liked the idea of a virtual environment in which events are triggered by certain sounds, like a museum where artefacts light up with the music. It would create an interesting experience without taking too much time.
I ran into the issue of finding meaningful artefacts that tie in with sound.
After some thinking, the song ‘Whiplash’ could work well. I remember it having distinct sounds that would work well at triggering events.
I could recreate a scene or objects from the film. Or I could choose my own artefacts that represent the themes of the film e.g. the dichotomy between ‘genius’ and life.
I liked the idea of industrial visuals: things like molten steel, hammers and sparks in a dark room. The issue would be finding sounds that work with the visuals. The soundscape of a steel mill is full of white noise, and it would be difficult to have sounds trigger the corresponding visuals without manually animating everything.
For the visuals to have a meaningful grounding, it may be interesting to fit the piece within the theme of digital or automated production. That opens it up to audio that isn’t as literal as production sounds and can instead be music that is relevant to the theme.
The more I think about the piece, the more Modernist it becomes. My understanding of Modernism is almost entirely based on ‘The Shock of the New’, but I recall that Modernist artists were often concerned with production and industrialisation.
Artists would depict natural phenomena from a mechanical perspective, as seen in the work of the Futurist Giacomo Balla and of Marcel Duchamp.
When I thought of industrial music, or rather music with the theme of production at its heart, I thought of Kraftwerk. Kraftwerk produced the bulk of their discography in the seventies and eighties, half a century after Futurism. Their work is, however, very futurist, at least from the information I have been exposed to: Kraftwerk's admiration for cars and transport is very futurist.
Kraftwerk were pioneers of electronic music, meaning their music was produced in the context of computer-based production.
I decided to use the song “Computer Love” in my piece, as it isn’t an overly complex or crowded song, and, despite being so literal, it is very relevant to the themes I wish to explore.
While looking into computer production I briefly looked into AI-generated art. There has been a lot of buzz around AI painting recently, but I wanted to look into AI music. The best I could find was AIVA. It is very difficult to find out how the AI actually works, as I suspect the creators want investors to believe it is much more sophisticated than it is. I think it uses machine learning on basic tunes to generate its own, which musicians then develop into full pieces before slapping “composed by AIVA” on the front.
Audio Visuals in Unity
I have not yet decided on an idea, but they all seem to exist in 3D space so I have decided to work in Unity.
I worked through an audio-visualisation Unity tutorial to understand how to access properties of the audio in order to visualise them.
Starting simple, I thought I would make a dark room full of spheres that light up depending on their audio band.
The first thing I needed was a mesh light whose intensity I could manipulate. Unity doesn’t offer mesh lights in the free version, so I created a sphere with a point light component and an emissive material, then wrote a script that increases the emission value and light intensity in step.
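The coupling between emission and light intensity can be sketched as follows. This is plain Python rather than the Unity C# script itself, and the base, gain and clamp values are my own illustrative numbers, not from the project:

```python
def light_settings(amplitude, base=0.5, gain=4.0, max_intensity=8.0):
    """Map an audio-band amplitude (roughly 0..1) to a point-light
    intensity and an emission multiplier, kept in lockstep so the
    sphere reads as a single mesh light. All ranges are illustrative."""
    intensity = min(base + gain * amplitude, max_intensity)
    emission = intensity / max_intensity  # normalised emission strength
    return intensity, emission

# Quiet band -> dim sphere; very loud band -> clamped at full brightness.
print(light_settings(0.0))   # (0.5, 0.0625)
print(light_settings(10.0))  # (8.0, 1.0)
```

Driving both values from one function keeps the glow of the material and the light it casts from ever drifting out of sync.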
Using the script from the tutorial, I managed to access the amplitude of one of the eight audio bands and use it to control the intensity of the light.
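The exact band layout depends on the tutorial, but a common scheme (and my assumption here) is to collapse a 512-bin FFT spectrum into eight bands of doubling width, so the low frequencies get fine resolution and the highs get lumped together. A minimal Python sketch of that idea:

```python
def to_eight_bands(spectrum):
    """Collapse a 512-bin magnitude spectrum into 8 bands of doubling
    width (2, 4, 8, ..., 256 bins), averaging the bins in each band."""
    bands = []
    bin_index = 0
    for band in range(8):
        width = 2 ** (band + 1)  # 2, 4, 8, ..., 256 bins per band
        chunk = spectrum[bin_index:bin_index + width]
        bands.append(sum(chunk) / len(chunk))
        bin_index += width
    return bands  # uses 510 of the 512 bins

flat = [1.0] * 512  # dummy spectrum; a real one comes from the FFT
print(to_eight_bands(flat))  # [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
```

Each sphere in the dark room then just reads the average of its own band.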
I created a particle effect that is supposed to look like sparks. I also made the camera first person.
I managed to link the particle system to the audio. I wrote a script that essentially says: “if this audio band is this loud, play the particle system; else, stop it.”
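That gating logic amounts to a single threshold test. A Python stand-in (the threshold value is an assumed tuning number; in the Unity script the two branches would call the particle system's play and stop methods):

```python
def particle_gate(band_amplitude, threshold=0.5):
    """Return True when the particle system should be playing:
    play while the chosen band is above the threshold, stop otherwise."""
    return band_amplitude > threshold

print(particle_gate(0.8))  # True  -> play the sparks
print(particle_gate(0.1))  # False -> stop them
```

A fixed threshold is the simplest approach; the trade-off is that a band hovering around the threshold will flicker the sparks on and off.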
I didn’t want to spend too much time in Maya building assets, as that’s not what the project is for and it’s where I drew a lot of criticism in last year’s AR project.
I decided on three artefacts: a piston/press, a hammer, and a bucket pouring molten steel. These assets will all use hot metal to emit light into the scene and follow the music.
I will also use particle effects to keep in time with the music.
Bucket Pouring Steel
All materials are built in Unity. The molten steel is a funny shape as it was originally supposed to fill a mould. If I have the time, I will fix this at the end of the project.
I am going to animate the piston to keep the tempo of the song. “Computer Love” is around 125 bpm, which means a beat every 0.48 seconds. At 60 frames per second, the piston needs to strike every 28.8 frames to stay in time with the song.
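A quick sanity check of that arithmetic (frame rate taken as 60 fps, as stated above):

```python
BPM = 125  # approximate tempo of "Computer Love"
FPS = 60   # animation frame rate

seconds_per_beat = 60 / BPM           # 60 s per minute / beats per minute
frames_per_beat = seconds_per_beat * FPS

# 0.48 s per beat, i.e. roughly 28.8 frames per piston strike
print(seconds_per_beat, frames_per_beat)
```

Since 28.8 isn't a whole number of frames, the animation will drift slightly unless the strike times are accumulated in seconds rather than rounded to frames.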
I added a slight shake to the base of the piston, and squashed the molten steel and made it brighter when it is struck. This was all animated within Unity.
I built a cube in Maya, as it comes with all its UVs built in, then brought it into Substance Painter. I applied the default concrete material, exported the maps, and applied them to the model in Unity. It may seem like a minor detail, but in the final version you can see light reflect off the concrete, creating an illusion of depth.
I built a girder in Maya, built its UVs, brought it into Substance Painter and applied the steel material. I then imported the materials into Unity.