Granular App for Soundplane

Hi There,

I hacked together a granular app in SuperCollider during the July iteration of the Monthly Music Hackathon in NYC.

It took me a while, but I finally made a functional GUI and an experimental standalone version (OS X only), so you don't have to know how to use SuperCollider to mess around with it. I also cleaned up the code so I wouldn't be totally embarrassed ;-)

It's pretty simple, but also fun. It maps 5 samples across the 5 rows of keys on the Soundplane. Think of your finger as a playhead dragging across the sample: grain size and density are controlled by pressure (z), dragging your finger to the right plays the grains forward, and dragging to the left plays them in reverse.
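To make that mapping concrete, here's a rough Python sketch of how a touch drives the grains. The constants mirror the SynthDef in the code below ((30 * z) + 10 grains per second, grain duration 0.0001 + (0.1 * z)); the function names are just for illustration.

```python
def grain_params(z):
    """Map touch pressure z (0..1) to grain density (Hz) and grain size (s).

    Mirrors the SynthDef below: trigger rate is (30 * z) + 10,
    grain duration is 0.0001 + (0.1 * z).
    """
    density = (30 * z) + 10    # 10 grains/s at zero pressure, up to 40 at full pressure
    size = 0.0001 + (0.1 * z)  # tiny grains at a light touch, ~0.1 s when pressed hard
    return density, size

def grain_direction(x, old_x):
    """Grains play forward when the finger moves right, reversed when it moves left."""
    return 1 if x > old_x else -1
```

So a firm touch gives you denser, longer grains, and the playback direction flips purely on which way the finger is moving.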

To use it, your Soundplane client has to be transmitting on port 3123 (the default) and the layout must be set to 'Chromatic'.
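For the curious: the OSCdefs at the bottom of the code route incoming messages by address, one responder per touch. Here's a rough Python sketch of that dispatch, assuming the t3d message layout the code relies on (arguments after the address are x, y, z); the names here are mine, not part of the t3d spec.

```python
# Hypothetical dispatch table mirroring the OSCdefs in the code below:
# addresses /t3d/tch1 .. /t3d/tch6 arrive on port 3123, one per touch.
MAX_TOUCHES = 6
handlers = {}

def make_handler(i):
    def handle(args):
        x, y, z = args[0], args[1], args[2]  # t3d touch arguments
        return (i, x, y, z)
    return handle

for i in range(MAX_TOUCHES):
    handlers["/t3d/tch%d" % (i + 1)] = make_handler(i)

# A message arriving as ("/t3d/tch1", [0.25, 0.9, 0.5]) is routed by address:
touch = handlers["/t3d/tch1"]([0.25, 0.9, 0.5])
```

The SuperCollider version does the same thing with one OSCdef per touch address, so each synth row only ever hears about its own touch.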

The UI is just a window with 5 rectangles in it representing the rows. To load a file, drag a supported sound file (basically any uncompressed format; no .mp3 or .aac) onto the rectangle for the row you want to play it on. If the file loads successfully, its waveform is displayed.

Oh, and only mono buffers for now. You can drag multi-channel files into the window, but only the first channel will load. There's probably a limit on sample length, though I find anything over a minute gets hard to control predictably. Let me know if you find it.

Six touches are supported by default. The currently playing area is highlighted for each touch (for visibility, the size of the highlighted area is not to scale, but the start point is correct).
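Under the hood, a touch is assigned to a row by its y coordinate. This little Python sketch mirrors the `4 - (y * 5).floor` expression in the OSC handlers below, assuming y runs from 0 at the bottom of the playing surface to 1 at the top; the clamp at the top edge is my addition to handle y == 1.0 exactly.

```python
import math

def row_for_y(y, rows=5):
    """Pick which sample row a touch at normalized y (0..1) falls in.

    Row 0 is the top rectangle in the window, matching 4 - floor(y * 5)
    in the SuperCollider code; max() clamps the y == 1.0 edge case.
    """
    return max(0, (rows - 1) - math.floor(y * rows))
```

So a touch near the bottom of the surface lands in row 4 (the bottom rectangle), one near the top in row 0.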

You can download the experimental .app bundle here.

It's basically a shortcut to a script that opens Supercollider and loads the file automatically, wrapped in an application bundle.

Code is below.

Let me know if you like it, or have any questions or suggestions.

Cheers,
Nick

------------ copy and paste everything below this line into SuperCollider --------------

s.boot; // boot the server first

(
// boot your server, then double-click the line above this to select all, then shift-return to evaluate.

// INIT //////////////////////////////////////////////

~maxTouches = 6; // set this to the maximum number of touches you want to support
~oldX = Array.fill(~maxTouches, 0); // for keeping track of direction of movement in the x axis

fork({

SynthDef(\tapeGrains,
    { | out = 0, pan = 0, x = 0, y = 0, z = 0, buf, rate = 1, gate = 0 |

        var snd, env, trig;

        // short attack/release envelope, opened and closed by the touch gate
        env = EnvGen.ar(Env.asr(0.01, 1, 0.01), gate);

        // grain density scales with pressure: 10 grains/s up to 40 grains/s
        trig = Impulse.ar((30 * z) + 10);

        snd = GrainBuf.ar(
            numChannels: 1,
            trigger: trig,
            dur: 0.0001 + (0.1 * z), // grain size also grows with pressure
            sndbuf: buf,
            rate: rate,
            pos: x + WhiteNoise.kr(0.001), // the finger is the playhead, with a little jitter
            pan: WhiteNoise.kr * (1 - z) // lighter touches scatter grains across the stereo field
        );

        snd = snd * env;

        snd = Pan2.ar(snd, x - 0.5); // overall stereo position follows the finger

        Out.ar(out, snd);

}).add;


s.sync; // wait for synthdef to load to server fully before continuing


~samples = 5.collect({Buffer.new}); // init buffers

~players = Array.fill(~maxTouches, { Array.fill(5, {|i| // spawn synths

    Synth(\tapeGrains, [\buf, ~samples[i]]);

    });

});

});

// GUI //////////////////////////////////////////////

w = Window.new("Magnetophon", Rect(400, 400, 600, 510));

~soundViews = 5.collect({|i| SoundFileView(w, Rect(5, 5 + (100 * i), 590, 95))
.name_(i)
.background_(Color.grey)
.waveColors_([Color.black])
.canReceiveDragHandler_({ View.currentDrag.isKindOf(String)})
.receiveDragHandler_({
arg v;

    ~samples[v.name.asInt].allocReadChannel(View.currentDrag, channels: [0],
        completionMessage: { |buf| buf.loadToFloatArray(action: { |a| { v.setData(a) }.defer });
            fork({0.25.wait; ~samples.do({|i| i.query});}) } );

});

});

~soundViews.do({|view| ~maxTouches.do({|i| view.setSelectionColor(i, Color.yellow.vary(0.2, 0.2, 0.8))})});

w.view.onResize = {|v| ~soundViews.do({|view, i|
var width, height, x, y;

height = v.bounds.height/5 - 6;
width = v.bounds.width - 10;

x = 5;
y = 5 + (i * height) + (i * 5);

view.moveTo(x, y);
view.resizeTo(width, height);
})};

w.front;

// Generate OSCDefs //////////////////////////////////////////////

~maxTouches.do({ |i|

OSCdef(("touch" ++ i).asSymbol, { | msg, time, addr, recvPort|

    var x = msg[1] - 0.006, y = msg[2], z = msg[3];

    var rate = if(x > ~oldX[i]) { 1 } { -1 }; // set direction of grain playback
    var row = 4 - (y * 5).floor;

    ~oldX[i] = x;

    ~players[i].do({
        | player, j|

        if(row == j)
        {
            player.set(\gate, z, \x, x, \y, y, \z, z, \rate, rate);
        } {
            player.set(\gate,0);
        };
    });

    ~soundViews.do({
        | view, j |

        if(row == j && ~samples[row].numFrames.notNil)
        {
            { view.setSelection(i, [(~samples[row].numFrames * x), 2000 * z]) }.defer;
        }{
            { view.selectNone(i) }.defer;
        };

    });


}, ('/t3d/tch' ++ (i+1).asString).asSymbol, recvPort: 3123 ); // make sure recvPort is set to Soundplane port

}

);

)

Amazing, thanks for sharing! I don't have time to try it this week but I hope someone does and reports back.

This really makes me want to add better formatting for things like code to the forums here... we are always tweaking and have some more redesign in the works.

So I made a quick video to demonstrate the app.

https://vimeo.com/113115850

Here's a little improv I did with an earlier version of it (plus a monome white whale module that I'd just gotten):

https://soundcloud.com/nicholas-colvin/improv-on-a-declaration-of-war

The newer version doesn't map z to playback rate (pitch) anymore, but that would be easy to remap if you wanted to. It was kinda hard to control.

Looks interesting; I hope to have a play with this next week, when my Soundplane arrives.

I've not tried SC, but it looks good and very compact: a lot of functionality for a relatively small amount of code.