redFingerprint

a different plot for supercollider. though it is actually more useful as an 'artistic' visualizer than as a serious way to represent data. it works with collections like arrays, envelopes and wavetables. the technique is to translate them to length/angle pairs and then draw a shape from that.
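
in rough terms the idea looks something like this. a minimal sketch assuming values between 0 and 1 - not the actual redFingerprint code - that reads a collection two values at a time as length/angle pairs and traces a shape with Pen...

(
var data= {1.0.rand}.dup(32);   /*any collection: an array, a sampled envelope, a wavetable...*/
var width= 300, height= 300;
var w= Window("fingerprint sketch", Rect(100, 100, width, height)).front;
var u= UserView(w, Rect(0, 0, width, height)).background_(Color.black);
u.drawFunc= {
        var pnt= Point(width/2, height/2);
        Pen.strokeColor= Color.white;
        Pen.moveTo(pnt);
        data.clump(2).do{|pair|
                var length= pair[0]*40, angle= pair[1]*2pi;
                pnt= pnt+Point(length*cos(angle), length*sin(angle));   /*each pair becomes one step*/
                Pen.lineTo(pnt);
        };
        Pen.stroke;
};
)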



it's distributed via supercollider's package system quarks. all open source.
(update 090511: redFingerprint is now part of the redUniverse quark)
how to install:
download supercollider
Quarks.checkoutAll
Quarks.install("redUniverse")
+recompile

there's also some older code here that does similar drawings...
http://swiki.hfbk-hamburg.de:8888/MusicTechnology/833

data?

and of course it's always more fun when things break. here's the same thing with a nice bug...

work with mark: RedUniverse - a simple toolkit

i gave a short demo/poster session at the LAM conference on 19 dec in london. see livealgorithms.org
below is the handout describing the toolkit.


this toolkit is now distributed via supercollider's package system quarks. all open source.
how to install:
download supercollider
Quarks.checkoutAll
Quarks.install("redUniverse")
(if you run OSX and prefer to use SwingOSC over Cocoa gui, you'll need to move the file RedJWindow.sc to the osx folder and recompile.)

RedUniverse - a simple toolkit

Mark d'Inverno & Fredrik Olofsson

This is basically a set of tools for sonification and visualisation of dynamic systems. It lets us build and experiment with systems as they are running. With the help of these tools we can quickly try out ideas around simple audiovisual mappings, as well as code very complex agents with strange behaviours.

The toolkit consists of three basic things... Objects, Worlds and a Universe. Supporting these are additional classes for things like particle systems, genetic algorithms, plotting, audio analysis etc., though as a user you will preferably want to code many of these functions yourself.

We have chosen to work in the programming language SuperCollider (www.audiosynth.com) as it provides tight integration between realtime sound synthesis and graphics. It also allows for minimal classes that are easy to customise and extend. SuperCollider is also open for communication with other programs and it runs cross-platform.

So to take full advantage of our toolkit, good knowledge of this programming language is required. We do provide helpfiles and examples as templates for exploration, but the more interesting features, like the ability to live-code agents, are hard to fully utilise without knowing this language.

Detailed overview

In SuperCollider we have the three base classes: RedObject, RedWorld and RedUniverse.

RedObject - things like particles, boids, agents, rocks, food etc.
RedWorld - provides an environment for objects.
RedUniverse - a global collection of all available worlds.

Objects all live in a world of some sort. There they obey a simplified set of physical laws. They have a location, velocity, acceleration, size and mass. They know a little about forces and can collide nicely with other objects.
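
As a rough illustration of this simplified physics (not the actual RedObject code), with locations and velocities kept as arrays so that any number of dimensions works:

(
var loc= [0, 0], vel= [0, 0], acc= [0, 0], mass= 2;     /*a two-dimensional object*/
var gravity= [0, 0.01];
var addForce= {|force| acc= acc+(force/mass)};          /*heavier objects accelerate less*/
10.do{
        addForce.(gravity*mass);                        /*gravity pulls on the mass*/
        vel= vel+acc;                                   /*integrate acceleration into velocity...*/
        loc= loc+vel;                                   /*...and velocity into location*/
        acc= [0, 0];                                    /*forces are cleared each step*/
};
loc.postln;
)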

Pendulums are objects that oscillate. They have an internal oscillation or resonance of some sort.

Particles are objects that age with time. They keep track of how long they have existed.

Boids are slightly more advanced particles. They have a desire and they can wander around independently seeking it.

Agents are boids that can sense and act. They also carry a state 'dictionary' where basically anything can be stored (sensory data, urges, genome, phenome, likes, dislikes, etc). Both the sense and act functions, as well as the state dictionary, can be manipulated on the fly, either by the system itself or by the user at runtime.
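
The flavour of this can be sketched with plain events (generic SuperCollider, not the actual RedUniverse classes). Sense and act are just functions stored next to a state dictionary, so any of them can be replaced while the system runs:

(
~agent= (
        state: (energy: 1.0, urge: \seek),
        sense: {|self, world| self[\state][\food]= world[\food]},      /*read from the world into the state*/
        act: {|self| if(self[\state][\food].notNil, {self[\state][\energy]= self[\state][\energy]+0.1})}
);
~world= (food: Point(10, 20));
~agent.sense(~world);
~agent.act;
~agent[\state].postln;

/*live recoding: swap out the act function while everything keeps running*/
~agent[\act]= {|self| self[\state][\energy]= self[\state][\energy]-0.01};
)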

Worlds provide an environment for the objects. They have properties like size, dimensions, gravity etc. and they also keep a list of all objects currently in that world.
For now there are three world classes, sketched below:
RedWorld - endless in the sense that objects wrap around its borders.
RedWorld2 - a world with soft walls. Objects can go through but at a cost. How soft these walls are and how great the cost is depends on gravity and world damping.
RedWorld3 - a world with hard walls. Objects bounce off the borders - how hard depends on gravity and world damping.
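
In one dimension the difference between the borders can be sketched like this (an illustration only, using just a damping factor, not the actual world classes):

(
var size= 100, damping= 0.25;
~wrap= {|loc, vel| [loc.wrap(0, size), vel]};           /*RedWorld: positions wrap around*/
~soft= {|loc, vel| if((loc<0)or:{loc>size}, {vel= vel*(1-damping)}); [loc, vel]};      /*RedWorld2: pass through but lose speed*/
~hard= {|loc, vel| if((loc<0)or:{loc>size}, {vel= vel.neg*(1-damping)}); [loc.clip(0, size), vel]};    /*RedWorld3: bounce off the wall*/
~hard.(105, 2).postln;                                  /*-> [100, -1.5]*/
)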

The Universe is there to keep track of worlds. It can interpolate between different worlds. It can sequence worlds, swap and replace, and also migrate objects between worlds. All this while the system is running.
The RedUniverse class also does complete system store/recall to disk of all objects and worlds.
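
Interpolation between two worlds can be pictured as blending their properties (a generic sketch, not the actual RedUniverse implementation):

(
var worldA= (gravity: 0.1, size: [300, 300]);
var worldB= (gravity: 0.9, size: [600, 200]);
var interpolate= {|a, b, t|
        (
                gravity: a[\gravity]+((b[\gravity]-a[\gravity])*t),
                size: a[\size]+((b[\size]-a[\size])*t)
        )
};
interpolate.(worldA, worldB, 0.25).postln;              /*a quarter of the way from worldA to worldB*/
)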

So the above are the basic tools. They should be flexible enough to work with: for example, objects can live in worlds of any number of dimensions. But as noted, one can easily extend the functionality of these classes by subclassing.

Conclusion

How the objects and worlds behave, sound and look is open for experimentation. That is, this is left for the user to code. So while there is great potential for customisation, it also requires more work from its users.
The RedUniverse as a whole tries not to enforce a particular type of system. E.g. one can use it without any visual output at all, or vice versa.
We see it both as a playground for agent experiments and as a serious tool for music composition and performance. We hope it is simple and straightforward, and while there is nothing particularly novel about it, we have certainly had fun with it so far. Foremost it makes it easy to come up with interesting mappings between sound and graphics. In a way we just joyride these simple dynamic systems to create interesting sounds.

The software and examples will be available online on the LAM site. Of course as open source.

(note: and also in the supercollider package system quarks)

default synth hack

i recently implemented something nick collins and i discussed a long time ago (sc2 era - custom event class). it is a 'hack' of the default synth in supercollider, ie the one that many of the help and example files use. so when you install my class, the default file will be overwritten and all the slightly daft pattern examples will from there on spring into new life.

distributed via supercollider's package system quarks.
how to install:
download supercollider
Quarks.checkoutAll
Quarks.install("redDefault")
+recompile

update 111116: redDefault is no longer a quark. it's available here.

then run some examples. most of the ones in Streams-Patterns-Events5.help.rtf and Streams-Patterns-Events6.help.rtf work very well. see RedDefault.help.rtf for more info.

(and yes, it is easy to uninstall and get back to the boring default synth)

just to compare - here's first an example taken from a helpfile playing on the default synth...

and this is the exact same example with my hack installed...

not only does it create a new synthesiser, it also changes duration, attack/release times, amplitude etc. the pitches are mapped to a diminished chord in a somewhat strange way: the longer the duration, the greater the leap between the notes to quantise to. eg with half or whole notes, only octaves will be heard.
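
the mapping is roughly of this kind (my sketch of the idea, not the actual RedDefault code) - pitches snap to a grid of minor thirds, and the grid widens with note duration until only octaves remain...

(
~quantisePitch= {|midinote, dur|
        var step= #[3, 3, 6, 12, 12].clipAt((dur*4).asInteger);        /*short notes: minor thirds, long notes: octaves*/
        midinote.round(step)
};
~quantisePitch.(67, 0.25).postln;       /*a short note stays close: 66*/
~quantisePitch.(67, 1).postln;          /*a whole note snaps to the nearest octave: 72*/
)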

a tiny little white one

this chunk of sc code will create a tiny but not so well behaved audiovisual creature.
(i must admit i stole the title from a.berthling's album on mitek)


a_tiny_little_white_one.mov (32,7mb)

/*a tiny little white one  /redFrik 061009*/
 
/*
GUI.cocoa;
GUI.swing;
*/

 
(
s.waitForBoot{
        n= 25;                                                          /*number of arms*/
        b= {Buffer.alloc(s, 32, 1)}.dup(n);             /*length must be power of 2*/
        SynthDef(\wormsnd, {|out= 0, bufnum, freq= 60, amp= 0.01, pan= 0|
                Out.ar(out, Pan2.ar(OscN.ar(bufnum, freq, 0, amp), pan));
        }).send(s);
})
 
(
        var width= 300, height= 300, freqSpread= 100.rrand(1000).postln, muckProb= 0.0008,
                muck= 0, i= 0, j= 0, shapes, synths, pnt, w, u, freq,
                centerX= width/2, centerY= height/2, o= 0.1, frict= 1, lfo= 1, lfoSpeed= 0;
        w= Window("a tiny little white one", Rect(128, 64, width, height), false);
        u= UserView(w, Rect(0, 0, width, height));
        u.background= Color.black;
        w.onClose_({synths.do{|x| x.free}});
        CmdPeriod.doOnce({w.close});
        w.front;
        shapes= {|x| {1.0.rand}.dup(b[x].numFrames)}.dup(n); /*init shapes*/
        synths= {|x| Synth(\wormsnd, [\bufnum, b[x].bufnum, \pan, x/(n-1)*2-1])}.dup(n);
        u.drawFunc= {
                shapes.do{|shape, x|            /*iterate shapes, x is index*/
                        var dist;
                        if((muckProb*0.1).coin, {muck= 4.rand});
                        if(muck>0, {
                                ([
                                        {pnt= Point(x/n*10, x/n*10)},
                                        {pnt= Point(x/n* -10, x/n*10); if(muckProb.coin, {muck= 0})},
                                        {pnt= Point(x.rand2, x.rand2); if(muckProb.coin, {muck= 0})}
                                ][muck-1]).value;
                                if(i%2000==0, {muck= 0});
                        }, {
                                pnt= Point(0, 0)
                        });
                        lfo= (lfo+lfoSpeed).fold(0.05, 1);
                        i= i+1;
                        j= (j+10.rand2).fold(0, shape.size-1);
                        shape.put(j, (shape[j]+o).fold(0.01, 1));
                        if(muckProb.coin, {
                                o= [0.15.rand2, -1, 1].wchoose(#[0.95, 0.025, 0.025]);
                                frict= [0.997.rrand(1), 0.95.rrand(1.5)].wchoose(#[0.95, 0.05]);
                                lfoSpeed= 0.0001.rand2;
                                [
                                        #[\o, \frict, \lfo, \lfoSpeed],
                                        [o, frict, lfo, lfoSpeed].round(0.0001)
                                ].lace(8).postln;
                        });
                        o= o*frict;
                        b[x].sine1(shape.clip(0.01, 1));
                        Pen.strokeColor= Color.grey(x+1/n);
                        Pen.moveTo(Point(centerX, centerY));
                        shape.clump(2).do{|ll, k|
                                var distance, angle, temp;
                                #distance, angle= ll;
                                pnt= Point(distance, distance).rotate(angle*2pi*lfo)+pnt;
                                Pen.lineTo(
                                        Point(
                                                (pnt.x*10+centerX).clip(0, width),
                                                (pnt.y*10+centerY).clip(0, height)
                                        )
                                );
                        };
                        Pen.stroke;
                        dist= pnt.dist(0).clip(0.1, 20);                /*distance from 0, 0*/
                        freq= dist/20+lfoSpeed+muck+(lfo*0.01.rand)*freqSpread+60;
                        synths[x].set(\freq, freq, \amp, (1/n)*dist/20);
                }
        };
        {while{w.isClosed.not} {u.refresh; (1/30).wait}}.fork(AppClock);
)
 
b.do{|x| x.free};

skare - new video online

i recently made my first short video for skare. we like to make things a little bit complicated for ourselves and we are also hooked on ice, snow and all other variations on cold water.
first - to get some cheap audiovisual correlation - i put an old cd in the freezer for two weeks. then one night i took it out and placed it over the bass element of a speaker. as the piece of plastic slowly adapted to room temperature, i let it vibrate to the deep fat bass found in the track 'To the Other Shore' (released on glacial movements). this was all filmed twice, close up and in nightshot mode.
i then wrote a little max/jitter patch that mixed the two takes, matched it with the audiofile and saved the whole thing to disk. the resulting video is here

screenshot of one of my uglier patches...


work with mark: genetics

i also spent time at UoW learning about genetic algorithms and genetic programming. mainly from john h holland's books and karl sims' papers. i found it all very interesting and inspiring and again i got great help and input from rob saunders.

one of our ideas was to construct synthesis networks from parts of our agents' genomes i.e. to have the phenomes be actual synths that would synthesise sound in realtime. the first problem to tackle was a really hard one: how to translate the genome - in the form of an array of floats - into a valid supercollider synth definition?
of course there are millions of ways to do this translation. i came up with the RedGAPhenome class which works with only binary operators, control and audio unit generators. unfortunately there can be no effects or modifier units. on the other hand the class is fairly flexible and it can deal with genomes of any length (>=4). one can customise which operators and generators to use and specify ranges for their arguments. one can also choose the topology of the synthesis network (more nested or more flat).
there is no randomness involved in the translation, so each genome should produce the exact same synthdef. of course generators involving noise, chaos and such might make the output sound slightly different each time, but the synthesis network should be the same.
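
just to give the flavour of such a mapping, here is a much simplified sketch with made-up rules - not what RedGAPhenome actually does. it reads the genome two floats at a time, lets the first float pick a generator and the second its frequency, and then folds everything together with a binary operator...

(
~genomeToSynthDef= {|genome, name= \geneSynth|          /*assumes an even-length genome of floats 0..1*/
        SynthDef(name, {|out= 0, amp= 0.1|
                var generators= [SinOsc, Saw, LFTri, Pulse];
                var sig= genome.clump(2).collect{|pair|
                        var gen= generators[(pair[0]*generators.size).asInteger.min(generators.size-1)];
                        gen.ar(pair[1].linexp(0, 1, 20, 2000))
                }.reduce('*');                          /*combine the generators with a binary operator*/
                Out.ar(out, Pan2.ar(sig*amp));
        })
};
~genomeToSynthDef.([0.1, 0.5, 0.7, 0.2, 0.9, 0.8]).play;        /*same genome, same synth - with the server booted*/
)
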
this class produces a fantastic range of weird synths with odd synthesis techniques, and it is useful just as a synth creation machine on its own. here are some generated synths... n_noises.rtf, n_fmsynths.rtf, and corresponding 5sec audio excerpts are attached below.

then, after the struggle with the phenome translation, the code for the actual genetic algorithms was easy to write. the genome and its fitness are kept in instances of a class called RedGAGenome, and the crossbreeding and mutation are performed by the class RedGA. there are a couple of different breeding methods, but i found the multi-point crossover one to generally give the best results. all the above classes and their respective helpfiles and examples are available here. and there are many more automatically generated synths in the attached krazysynths+gui.rtf example below.
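
the basic operations themselves are simple on float-array genomes. a sketch of multi-point crossover and mutation in general - not necessarily how RedGA implements them...

(
var genomeA= {1.0.rand}.dup(8), genomeB= {1.0.rand}.dup(8);
~crossover= {|parentA, parentB, numPoints= 3|
        var points= (1..parentA.size-1).scramble.keep(numPoints).sort; /*where to switch parent*/
        var child= [], useA= true, last= 0;
        (points++[parentA.size]).do{|p|
                child= child++(if(useA, parentA, parentB).copyRange(last, p-1));
                useA= useA.not;
                last= p;
        };
        child
};
~mutate= {|genome, prob= 0.05|
        genome.collect{|gene| if(prob.coin, {1.0.rand}, {gene})}        /*occasionally replace a gene*/
};
~mutate.(~crossover.(genomeA, genomeB)).postln;
)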

i also made a couple of fun example applications stemming from this. one is a six-voice sequencer where you can breed synths, patterns and envelopes. it is attached as 'growing soundsBreedPatternEnv.rtf' below. (note that the timing is a bit shaky. i really should rewrite it to run on the TempoClock instead of the AppClock.)


ref articles:

Frankensteinean Methods for Evolutionary Music Composition, Todd and Werner
Sounds Unheard of – Evolutionary algorithms as creative tools for the contemporary composer, Palle Dahlstedt
Evolutionary Design by Computers, Peter J. Bentley
Technical Papers, Karl Sims
Evolving Sonic Ecosystems, Jon McCormack

ref books:

John H. Holland - Hidden Order: How Adaptation Builds Complexity
Melanie Mitchell - An introduction to Genetic Algorithms
Richard Dawkins - The Blind Watchmaker

update 101128: growing_soundsBreedPatternEnv.rtf file updated, also see this post.

 

//--n_noises

 

//--n_fmsynths
