a special rhythmic pattern, is an utterly useless, beatless rhythm (hardly a rhythm at all).
A number of proto-rhythms are used in Walking on Four:
These combine to form the following rhythms:
Each of those rhythms may be used in different parts, visual or musical, each with a different underlying metronome.
We will nevertheless notice that what is missing from this system is a way to emphasise each beat differently: there is no notion of accents, so accents must be emulated by hand by layering multiple rhythms together, hence the presence of M4J4E5 together with m4j4e5, or A5j9A5r7 together with a5j9a5t7.
Generating notes at random, as I did previously, produces feelings of irresolution and aimlessness, but done completely at random it fails to have any harmonic character: no atmosphere, except that of a sterile, if somewhat dissonant, mechanism.
To gain a degree of control over this, I wanted to restrict the set of notes selected by the generation algorithms, that is, to work in a given mode. The relative, average interval between notes enables or disables certain consonances or dissonances.
The intro would still feature a random generator as a base for note generation, but this time only selecting among notes from certain sets, while also being tweaked in favour of certain preferred sequences.
Modes are specified as a set of notes indexed by their semitone interval from a given root:
An example would be : which would represent the following set of notes: , and .
The octave is not taken into account here.
For reference:
semitone | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 |
note (from C) | C | C# | D | D# | E | F | F# | G | G# | A | A# | B |
In Walking on Four I picked
which is otherwise known as C Phrygian[7]. It comprises minor chords, giving a slightly dissonant atmosphere.
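As a rough illustration of such a mode specification, the pitch classes of C Phrygian can be derived from its interval steps. The note names and the helper below are my own, not the intro's actual code:

```python
# Pitch classes of C Phrygian, indexed by semitone interval from the root C.
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

# Phrygian mode: semitone steps 1-2-2-2-1-2-2 from the root.
PHRYGIAN_STEPS = [1, 2, 2, 2, 1, 2, 2]

def mode_pitch_classes(steps):
    """Accumulate interval steps into a set of semitone offsets from the root."""
    offsets, total = [0], 0
    for step in steps[:-1]:          # the last step returns to the octave
        total += step
        offsets.append(total)
    return offsets

C_PHRYGIAN = mode_pitch_classes(PHRYGIAN_STEPS)   # [0, 1, 3, 5, 7, 8, 10]
print([NOTE_NAMES[i] for i in C_PHRYGIAN])
# ['C', 'C#', 'D#', 'F', 'G', 'G#', 'A#']
```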
The note generator was specified in terms of note probabilities, but it also evolves over time, going up or down that scale following the () rhythm (at frequency: ).
Where the parameters are defined as follows:
one alternates between two values along a () rhythm;
one increases with each beat of the underlying rhythm;
one produces an exponential distribution of random numbers;
and one returns a value with a given probability.
The idea is to generate a pattern of notes with a globally rising (but sometimes descending) motif, the notes wrapped around an alternating register of 7 to 21 semitones.
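A minimal sketch of such a generator, with illustrative constants only (the intro's actual probabilities, rhythms and register values differ): a mostly ascending random walk over the mode, with exponentially distributed step sizes, wrapped around a fixed register.

```python
import random

C_PHRYGIAN = [0, 1, 3, 5, 7, 8, 10]   # semitone offsets of the chosen mode

def note_generator(register=14, up_bias=0.7, rate=1.0, seed=1):
    """Random walk over the mode: mostly ascending steps of exponentially
    distributed size, wrapped around a fixed register (the intro alternates
    between registers of 7 and 21 semitones)."""
    rng = random.Random(seed)
    position = 0
    while True:
        step = 1 + int(rng.expovariate(rate))      # exponential step size
        if rng.random() > up_bias:                 # occasionally descend
            step = -step
        position = (position + step) % register
        # snap down to the nearest pitch class of the mode
        pitch_class = max(pc for pc in C_PHRYGIAN if pc <= position % 12)
        yield (position // 12) * 12 + pitch_class

gen = note_generator()
print([next(gen) for _ in range(8)])
```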
The atonal instruments, like the click and bass drum, are controlled directly by the following rhythms:
clicks | t3ppp |
bassdrum | a5j9a5r7, accented via A5j9A5r7 |
I've always loved feedback-based effects, and the patterns that emerge from their often chaotic behaviour. I tried to capture this by implementing an IIR filter on video.
This is a transposition to video of the simple audio low-pass filter I use in the synthesiser, but with one problem: the sampling rate of video is quite low (50-60 Hz) and non-homogeneous as well. Both factors make the filter's results quite difficult to control. Nevertheless, we get interesting results.
Frames are saved and rendered into six different textures: three capture the input frames, while the other three capture the output frames. In turn, if one texture stands for the current frame at one point in time, during the next frame that same texture will represent the past; when the texture is too old, it becomes the present-time frame again.
Using OpenGL, this comes down to calling glCopyTexSubImage2D to copy the currently rendered picture into a texture.
Figure: Video low-pass filter. delays a video frame by one frame.
Then multiplication and addition of frames are achieved through blending, using the filter's coefficients as alpha channels for each frame, with:
For the output stage, an OpenGL extension, EXT_blend_subtract, is used to apply negative alpha channels:
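The filter arithmetic can be emulated on the CPU. The sketch below applies a low-order IIR filter to one pixel's luminance over successive frames, with illustrative coefficients only; on the GPU, the positive terms correspond to additive alpha blends and the negative terms to EXT_blend_subtract.

```python
def video_iir(pixels, b, a):
    """IIR low-pass over a pixel's luminance across successive frames:
    y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2].
    On the GPU the b terms are additive blends weighted by alpha and the
    a terms are applied via EXT_blend_subtract; plain arithmetic stands
    in for both here.  The coefficients are illustrative, not the intro's."""
    x_hist = [0.0, 0.0, 0.0]
    y_hist = [0.0, 0.0]
    out = []
    for x in pixels:
        x_hist = [x] + x_hist[:2]
        y = sum(bi * xi for bi, xi in zip(b, x_hist)) \
            - sum(ai * yi for ai, yi in zip(a, y_hist))
        y_hist = [y] + y_hist[:1]
        out.append(min(max(y, 0.0), 1.0))   # clamp like a framebuffer would
    return out

# A constant input settles at the filter's DC gain, sum(b) / (1 + sum(a)).
smoothed = video_iir([1.0] * 50, b=(1.0, 0.0, 0.0), a=(0.5, 0.0))
```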
The filter is also combined with a rotating and zooming stage to give it a small twist: the most dramatic effect is achieved towards the end of the intro, when the filter produces patterns resembling a tunnel.
I am always following many different things at a time, and I especially like to look around to see what others are doing. So, keeping an eye out for pieces made with the Processing[3] software package, I stumbled on Quasimondo's portfolio[4], and one example especially caught my eye: his caustics simulator, Raycoaster[2]. Caustics are the subtle light effects one can experience in a swimming pool, when refracted light rays accumulate in certain areas and thus produce almost geometrical patterns.
The original idea was kept: a simulation where light is represented by particles, grains projected inside a chamber with varying properties, whose resulting trajectories form a final image. This would form the source material for the actual visual effects.
My simulation consists of two parts: the chamber, where the simulation is performed and which ends with a projection screen; and the particle emitter, which defines how grains of light are injected into the chamber, along with their initial conditions.
The chamber was designed as a stack of different media, each with its own properties. Each medium is composed of elements that pull, push, or otherwise affect the trajectory of grains inside it, until the grain exits to the next medium.
Let , , be three axes. , form planes parallel to the final projection screen. The corresponding coordinates of a grain of light are , and .
One medium is defined by:
The force fields are of two main kinds:
Let be the position in the field's referential. The force field is here centered around with
Which tends to rotate the grain around its center.
When f is below zero, the forces are attractive; when above zero, repulsive.
By now, anyone with elementary knowledge of physics will have realised this is not a sound physical simulation of a known, real phenomenon. Indeed, and rightly so: modelling photons as if they were balls on a pool table would not be acceptable. But we are not interested here in modelling reality; we want to produce material of a certain degree of complexity that is easy to manipulate and stays continuous.
It is very easy to create different force fields, and even animate them by animating the sources of rotational and attractive forces, or changing their magnitudes.
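The general scheme can be sketched with an assumed radial-plus-tangential force and explicit Euler integration. The field shapes below are assumptions for illustration; the intro's actual formulas are not reproduced here.

```python
def step_grain(pos, vel, sources, f, dt=0.01):
    """Advance one grain of light by explicit Euler integration.
    Each source applies a radial force, attractive when f < 0 and
    repulsive when f > 0, plus a tangential term that swirls the grain
    around the source.  These field shapes are illustrative assumptions."""
    x, y = pos
    ax = ay = 0.0
    for cx, cy in sources:
        dx, dy = x - cx, y - cy
        r2 = dx * dx + dy * dy + 1e-6    # soften the singularity at the source
        ax += f * dx / r2 - dy / r2      # radial + tangential component
        ay += f * dy / r2 + dx / r2
    vx, vy = vel
    vx, vy = vx + ax * dt, vy + ay * dt
    return (x + vx * dt, y + vy * dt), (vx, vy)
```

Animating the sources, or the sign and magnitude of f, over time is then enough to animate the whole field.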
Unlike the other two scenes, this one displays the inside of the simulation. Instead of taking its result, we follow the trajectory of grains sent inside the chamber. Here, their inertia / speed is tweaked so that each force field affects their trajectory much more strongly than in the actual simulation, but the principles and parameters of each force field are the same.
Splines aren't even used: we get second-order continuity for free, by virtue of integrating the Newtonian equations with continuously varying forces.
Particles are injected in sequence from starting points around the screen via a simple sine-based equation.
Every effect starts with a sine after all.
As they are modified by the simulation, the grains record their own trajectory: the history of their position, speed and acceleration.
To display them, we sample the path of each grain. At each step in the trajectory, the path is drawn by going through this history, connecting each point with a quadrilateral, slightly slanted from the horizontal to give the drawing a calligraphic look. The width of each quad of light is also mapped to the velocity of the particle at that position: the faster it was, the thinner the ribbon appears around that part.
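The velocity-to-width mapping can be sketched as follows; the base width and the exact falloff are illustrative assumptions, not the intro's values.

```python
import math

def ribbon_widths(trajectory, base=4.0):
    """Width of each quad along a grain's recorded path, mapped inversely
    to the grain's speed over that segment: fast segments give thin
    ribbons, slow ones wide calligraphic strokes."""
    widths = []
    for (x0, y0), (x1, y1) in zip(trajectory, trajectory[1:]):
        speed = math.hypot(x1 - x0, y1 - y0)
        widths.append(base / (1.0 + speed))
    return widths
```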
Throwing all my grains of light through a weird simulation of medium/light interaction was initially devised as a way to generate an animated texture, one that would later be used in other effects throughout the intro.
At the origin of every grain is a particle emitter. It generates grains over a whole area, spaced out regularly. But covering the screen in a natural way is not so straightforward: I first used a regularly spaced grid, but the grid would show up in the resulting texture as persistently rectangular features. To fix this, a Fibonacci lattice[8] was used instead.
A Fibonacci lattice of rank 1 is a two-dimensional sequence defined from the Fibonacci sequence F as:
With F:
Choosing here means choosing the number of points generated inside the plane. I chose , for points.
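A sketch of the construction, assuming the standard Fibonacci-lattice formula with points (i/F(m), {i·F(m-1)/F(m)}); the rank chosen here is illustrative:

```python
def fib(n):
    """n-th Fibonacci number, F(0) = 0, F(1) = 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def fibonacci_lattice(m):
    """F(m) points in the unit square: x advances evenly while y steps by
    F(m-1)/F(m) modulo 1.  That ratio approximates the golden ratio, which
    spreads the points out without grid-like rectangular alignments."""
    n = fib(m)
    k = fib(m - 1)
    return [(i / n, (i * k / n) % 1.0) for i in range(n)]

points = fibonacci_lattice(10)   # fib(10) = 55 points
```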
Now, each grain is injected into the chamber, and its movements are calculated as it goes through it. As grains encounter medium changes or the final screen, they release a new particle, called a ParticleImpact, which shows up as an additive Gaussian grain of red/pink light.
And as we change the simulation parameters with time (for example by moving force fields around) we can produce an animation.
Now we have to capture the resulting frames to standard OpenGL textures.
An extension enables precisely that: EXT_framebuffer_object, also abbreviated as the FBO extension. It was introduced in 2005 to replace the platform-specific pbuffer extensions, which had a crufty, slightly different interface for each operating system. This extension appeared rather late in OpenGL drivers, first for NVIDIA cards and later in the year for ATI cards. Support for older cards from other manufacturers is unfortunately absent.
So, running the simulation multiple times with slightly varying forces and capturing the results produced an animation of 6 textures, which I would later use in the visual effects.
Of course, a great deal of time was lost tracking down a bug that led me to define non-power-of-two textures, silently accepted by one driver while grinding the other drivers to a halt. I also experimented a bit with texture compression, but it would reliably crash as the set of frames in the animated texture grew. Scaling the animation down to a reasonable number of frames was, in the end, more compatible.
This is one of the big classics of demo effects: it is the basis of tunnel effects, and of most of the 2D transformations seen in demos of the mid-1990s.
The design involves only one component, but remains very flexible:
A grid of cells is laid out over the screen. Cells are fixed on the screen and composed of four vertices. Each vertex holds a set of input coordinates; a typical meaning for these is coordinates in an input texture. Thus, each quadrilateral maps a certain area of the input texture onto the screen. The whole grid enables us, by varying the coordinates in space and time, to deform an input texture.
This grid forms a sampling of a more general, pre-calculated function.
This texture can be animated, as I did here: the input texture comes from a number of frames rendered from the simulation.
Animating the grid itself is a useful tool. The method I adopted is a very simple cycle:
The deformation is customisable; here it corresponds to an expansion of the grid's input coordinates around a continuously moving center. To avoid too great a change, the deformation is only applied at certain points in time, here with a frequency of . Deformations can be composed rather freely, and another step was added, zooming the input texture in and out in alternation, at a frequency of .
The relaxation gradually brings the grid to its resting state: a plain grid mapping each point of the screen to its corresponding point in the texture.
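A minimal sketch of the grid cycle, with the expansion and relaxation steps modelled as I understand them; the constants and function shapes are illustrative, not the intro's code.

```python
def make_grid(n):
    """n x n grid of input (u, v) coordinates, initially the identity
    mapping: each screen point samples its own point in the texture."""
    return [[(x / (n - 1), y / (n - 1)) for x in range(n)] for y in range(n)]

def expand(grid, cx, cy, k):
    """Push input coordinates away from (cx, cy): the on-screen grid then
    samples a wider area of the input texture around that point."""
    return [[(u + (u - cx) * k, v + (v - cy) * k) for (u, v) in row]
            for row in grid]

def relax(grid, rest, k):
    """Move each vertex a fraction k of the way back toward its resting
    coordinates, gradually returning the grid to the identity mapping."""
    return [[(u + (ru - u) * k, v + (rv - v) * k)
             for (u, v), (ru, rv) in zip(row, rrow)]
            for row, rrow in zip(grid, rest)]
```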
All of this results in two turgescences that appear to rotate over the screen, as we apply two expansion deformations around two different, rotating centers. The deformations rotate in opposite directions around the center, about away from the screen's center; each full rotation takes about 12 seconds to go around the screen.
This effect aims to imitate moving lights as they would appear through a dense, fog-like atmosphere. It uses the animation created during the simulation in a different way, this time in three dimensions.
At the center of the screen, a small mapped cube is drawn. This cube moves on its own, pushed regularly by forces in opposite directions, as if somebody were punching it from side to side. The rhythm was already mentioned: (), with a frequency of . Another rhythm is used to give the "punch" a greater impact each time the accenting rhythm () produces its beats.
The cube is mapped via OpenGL's spherical mapping, which is normally used for environment mapping. It is then redrawn many times at greater and greater scales, transparently.
The bigger and bigger cubes, accumulated over the screen and combined with a low-pass filter (), give the illusion of rays of light coming from the origin of the initial cube towards the extremities of the screen.
The transparent clones of the base cube are themselves expanded and contracted as the accenting rhythm kicks in.
Delivering the intro in Barcelona was difficult, but a great pleasure, and I learned a bit while creating it. Physical simulations remain a very interesting model for creating complex content.
The colour palette seems to have irked quite a few people, though that might not be surprising, as I intended it to be slightly aggressive. Nevertheless, I'd like to experiment a bit more with representing colours and their relations, a fascinating topic in itself.
Connections between Walking on Four and the old 90s intros also didn't go unnoticed, which was nice to read.
I took a few months off coding to concentrate on making music. In contrast with my work on representing timelines algorithmically, I also became interested in the counterpoint: interactive control of a predefined set of effects.[12]
Why? Why do I hate talking about myself? Sincerely I hate it. It's like conceptualising one's desires. Don't you find one must be authentically pretentious to employ those words! Conceptualising one's desires! I mean that I hate all those guys who pretend to be authors. A sequence of images, a bit of music, and there, here we are. Nothing's cruder than this fashion of using and showing one's very own desires all by oneself. And I'm not talking to you about masturbation. The most incredible thing is, people have the utmost respect for these guys.
TPOLM lazy sunday radio was born in the year 7000, as live music sessions broadcast over the internet by electronic musicians in Helsinki, Finland. Over the years the lazy sunday sessions have evolved into an internet festival, in which musicians from several countries and continents stream music to a global audience from their homes. This session, on 19 March, is the first time visual artists will also stream live video to accompany the music. http://www.tpolm.com/archive/lsv200603/index.html