thetechnobear's Recent Posts

Not sure if anyone else has been having fun with Reaktor 6, and the new modular blocks

I certainly have, and there are lots of new user blocks :)

anyway, I thought I'd do my bit for the community, and publish two blocks which Soundplane owners might be interested in:

MPE Expression - a midi polyphonic expression block for 8 voices

T3D OSC - a block supporting T3D

both are pretty easy to use: create a set of voice chains as normal in Reaktor Blocks, and then link P/G/X/Y/Z where you want :)

If you have a Soundplane and haven't tried Reaktor 6 yet... why not :)

Blocks are really cool: you just build patches like a physical modular synth, there are loads of modules, and the user library is growing at an amazing rate... link this up with a Soundplane and it's a fantastic playground.

it's almost worth it alone to build an 8-voice poly Monark (takes about 5 minutes!) that is completely controlled by x/y/z... the oscillator and filters are lovely

Hi,

(I'd assume this also applies to Kaivo, but I've not checked)

Ableton Live (9.5) now supports loading VSTs/AU from the Push (yippee!)

so now my workflow is to use the Push to create new sets and tracks without having to go back to my computer/mouse...
this is working really nicely; as I use AUs I can save presets in Ableton and all is good, I can browse them and load them
(I don't need all the Aalto presets, just a few of my own, so I'm not too fussed that I cannot get to the Aalto 'factory presets', as they are not stored as aupresets)

EXCEPT... when I load an AU preset, it doesn't restore the input transport properly.
It correctly displays that I have it set to OSC and to offset 2, but it's not actually listening.
so I have to go back to my keyboard/mouse, bring up Aalto and change it back and forth.

please... can you fix this? with Push/Maschine and controllers like the P8 (and quite a few others) it's becoming more commonplace for people to work away from the computer... and they don't want to return to it just to 'set things up'... which is a real workflow killer.

also, it would be handy to have the transport and port as automation targets; this would mean I could adjust them from the Push, handy if I've put the Soundplane in VpC mode to use other instruments (e.g. u-he) and then need to switch back to Aalto.

BTW: I don't know if you're considering NKR support, but it might be worth it if this becomes a standard for preset browsing.

I'm not sure how much you've been considering this trend for musicians to (physically) move away from the computer and treat it as an instrument. I find it enjoyable, it allows me to focus more on music making...

I'm finding the Soundplane software also needs a bit of tweaking for this... e.g. due to limitations of software synths, some might require me to use VpC, but I use T3D for ML... and I also sometimes want to switch the SP to single channel midi to control hardware synths. again, these kinds of things I don't want to have to come back to the computer to do...

ideally I'd like to be able to do this 'switching' via midi, so I could either do it via the Push, or use a midi pedal.
but perhaps that's for a different discussion...

I've found another issue with Live/Aalto not going into T3D mode

I'm not sure if it's related...

due to the limitations above, as a workaround I decided to save an Aalto track (using the AU) into my default template (which is just a project) with OSC mode already enabled.

this nearly works... but not quite...
what happens is Aalto is correctly loaded, but it does not start processing OSC/T3D messages until you make the Aalto window visible

after some experimentation, I found this is because before the UI is displayed it is actually in midi mode, i.e. only when the UI becomes visible does it 'switch' over to T3D mode.

*note: the same happens with saved projects when I reload them :(

EDIT: I can also confirm both issues occur on VST as well as AU

hmm, but MPE mode should theoretically also be automatic... as MPE sends an MPE on/off message from the controller.
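(for reference: as I understand the current MPE spec, that on/off is the MPE Configuration Message, an RPN 6 sent on the zone's master channel, with the data value giving the number of member channels, 0 = off; the draft spec being discussed here may have used a different message. a rough sketch in Python with mido, the port name being purely hypothetical:)

    import mido

    def mpe_config_messages(master_channel=0, member_channels=7):
        """Build the MPE Configuration Message (RPN 6) for one zone.
        member_channels = 0 switches MPE off for that zone.
        Channels are 0-based (mido convention), so 0 = MIDI channel 1."""
        ch = master_channel
        return [
            mido.Message('control_change', channel=ch, control=101, value=0),              # RPN MSB
            mido.Message('control_change', channel=ch, control=100, value=6),              # RPN LSB = 6 (MPE config)
            mido.Message('control_change', channel=ch, control=6, value=member_channels),  # data entry MSB
        ]

    # e.g. to a hypothetical output port:
    # out = mido.open_output('Soundplane')
    # for m in mpe_config_messages(0, 7):
    #     out.send(m)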

but yeah, it's tricky, as if you're using multiple Kaivos/Aaltos you also need to specify the offset port.

I can understand your reluctance to save it in the patch... as you may not consider it a sound parameter, but being able to alter it (and also the offset) via automation would appear to be the best option, given there is no real alternative.

sorry, I'll be more explicit

  • start Live & the Soundplane app (in OSC mode)
  • add the Aalto AU to a track
  • change Aalto to OSC mode
  • save Aalto as an aupreset (save icon in the Live device); let's call it test.aupreset

so let's now try to reload it:

  • delete the previous Aalto track
  • locate test.aupreset
  • double-click it, or load it into a track

you will see the Aalto AU says it's in T3D mode (and also in the menus)
but the Soundplane will not play it...

this is because it's not really in T3D, it's in midi mode... i.e. the UI is not consistent with its real input protocol; you need to reselect OSC from the input menu

from what you're saying above, you view this as a UI bug.

however, the issue, and hence why I tried to explain my 'use-case', is that when you create a new instance of Aalto it's always in midi mode. I really just want to be able to create new instances of Aalto and have them be in OSC mode (or to be able to remotely put them into OSC mode, probably more useful), and not have to go back to my mac/mouse and go into the input menu to change it.

this was the advantage of the way the previous version of Aalto worked, which auto-detected OSC: it just worked, you didn't have to go and configure it each time you dropped a new instance into a DAW.

I definitely want to release a fully modular system in the future. Stay tuned (for a long while, maybe).

I'd love to see this, partly just so we can have some utility 'modules', e.g. so we can mix and attenuate signals when we have multiple inputs into a modulator.
(e.g. I'm having real issues balancing the mix of Y axis and pitch tracking into cutoff in Aalto... I want high modulation on Y, and much less on pitch)

on the other hand, I'd like a modular that allows mixing of other developers' modules, e.g. bringing in, say, Valhalla reverbs :)

worth musing over whilst I await Virta :)

cool, just be careful with MIDI Monitor, I know sometimes I forget I've filtered out certain messages when I've been looking for others.

I can't really think of any reason why the Soundplane app would stop sending, but if you can find a reproducible scenario let me know, and I will look into it.

greetings from sunny Spain :)

agreed, good software design is about clean interfaces which hide complexity and yet provide the functionality its users require.

with expressive controllers it's a tough area: they are a musical instrument, but at the same time they are required to interface to existing VSTs/hardware which take no heed of their requirements (e.g. they are much more sensitive than keyboards, and AT is not the same as continuous pressure)... so 'tuning' them in unsurprisingly takes a bit of customisation, either on the synth side or the controller.
(MPE is only going to make a small inroad into this issue)

imho, this is why Kaivo/Aalto are so good with the Soundplane: they are designed with it in mind, they work as 'a whole'
(it's the same reason EaganMatrix + Continuum... and perhaps Seaboard + Equator, make good pairings)

"Everything Should Be Made as Simple as Possible, But Not Simpler" A. Einstein (?)

The only thing is that it would be great to have pitch control on the x. Not sure why you did not put it in

not quite sure I understand you: in single mode PB is also sent, and it operates exactly as the official app does, i.e. single-channel mode sends out PB, note, and PP on channel 1 (assuming bend range > 0)

I guess we could have an option to send X on a CC (rather than PB), but usually you don't need this, since in the synth you turn the bend range to 0 and then use a mod matrix to route PB to a control (this has the advantage that PB is 14-bit).
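(to illustrate the resolution point, here's a rough sketch, with a function name of my own, of packing a normalized x in -1..1 into the 14-bit pitch-bend range:)

    def x_to_pitchbend(x):
        """Map normalized x in [-1.0, 1.0] to a 14-bit pitch-bend value
        (0..16383, centre 8192)."""
        x = max(-1.0, min(1.0, x))
        return int(round(8192 + x * 8191))

    # e.g. x_to_pitchbend(0.25) -> 10240, i.e. 128x finer resolution than a 7-bit CC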

could you describe exactly what you would like... as I'm a bit confused.

frankly, I don't use single channel mode much... I've got enough synths (+ Reaktor / Max / Axoloti) that support VpC or MPE that I just stick with them, and when I do (rarely) use single channel mode, it's usually for something like Pianoteq, or something with PP (Alchemy)

what exactly do you use x/dX for? personally I only use pitch, as I use it for key tracking; I never used dX, as modulating with it would also affect pitch, which is undesirable for me.

the only use I can see is, if you are not using continuous pitch (e.g. synth PB range = 0), then of course you could use X as an additional modulation source.

note: PB in midi is essentially dX.

This is the great thing about the SP: we all use it in different ways, but it does make it tricky when adding features, as obviously I add them for how I use it :)

x/dX... yeah, there are quite a few options once you get into this... (e.g. key position)
I suppose with dX we could send 0 to 1 rather than -1 to 1, where 0.5 = original position. this would essentially be the same approach as midi. the 'downside' is that things like Aalto currently have dY as -1 to 1...

frankly, this is what I was saying about the Eigenharp approach, it's much better, as it allows all aspects to be changed... because there are just so many possible combinations, depending upon the capabilities of the synth.
check this out: EigenD Matrix
note: this is an older version, the newer one has curves, so you can choose anything from exponential to linear (by varying degrees)

I think Randy is not in favour of this, due to the 'complexity', but I'm getting increasingly tempted to add something like it to my setup. probably a simplified form, as it's true, I tend to use only a few different combinations.

my pleasure, good to know it works for you.

adding toggles for x/dX and y/dY to the client app is pretty trivial
(I guess they could go up the top with vibrato etc, as they can apply to both midi and OSC)

the slight question is how they will be interpreted at the synth end, when using dX/dY

for midi it's straightforward: we use 0/63/127 as -1/0/+1

for OSC, well, it's only really Aalto/Kaivo (& my Reaktor macros etc).
will they be ok with x/y coming across as negative...?

and I think they will actually be fine, as I seem to remember when I added OSC T3D to the Eigenharps, I implemented this incorrectly initially and was sending -1 to 1, and it still worked... so as long as they have not changed I think it will be ok, but I would need to check again.
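(for the midi side, the -1/0/+1 to 0/63/127 mapping above would look something like this; a rough sketch, where the function name and the in-between scaling are my own choices:)

    def bipolar_to_cc(v):
        """Map a bipolar dX/dY value in [-1.0, 1.0] to a 7-bit CC value,
        so that -1 -> 0, 0 -> 63, +1 -> 127."""
        v = max(-1.0, min(1.0, v))
        if v < 0:
            return int(round(63 + v * 63))   # -1..0 maps onto 0..63
        return int(round(63 + v * 64))       # 0..+1 maps onto 63..127

    # bipolar_to_cc(-1.0), bipolar_to_cc(0.0), bipolar_to_cc(1.0) -> 0, 63, 127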

personally, I'm not sure this is really the correct solution... dx/dy are easy to calculate on the synth end, so whilst I think it was a good idea for Aalto/Kaivo to have OSC made compatible with midi, I think an extra output should have been added... so we would have dx/dy as well as x and y, like mod, +1, +2, +3 in midi.

anyway, that's not going to happen, so I can add dX/dY to the client.

it would be nice longer term if the soundplane client had more options for output.

on the Eigenharps not only can you select absolute or relative (for every input), but you can also apply curves, scaling etc... (you can also output to multiple CCs with different scalings etc)

It might seem overkill, but it means you can really 'dial in' how a particular vst/preset feels.

do you want it in Max/MSP? the issue is you still won't be able to use it in Kaivo/Aalto.

it's pretty easy to implement... (the following assumes OSC/T3D)

  • at the start of a touch, store x (orig_x = x)
  • on every touch update, dx = x - orig_x

notes:

  • you need to do this independently for each touch, so in Max you would store it in a coll, or similar
  • 'start of a touch' is a little 'tricky': you have to use the fact that when there is no touch, note = 0, so when you first see note != 0, it's a new touch.

you can actually combine the two, by storing 0.0 in the coll for a touch on note-off, and then only storing X when you get a touch and the coll value = 0.0
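(for anyone doing this outside Max, here is the same idea as a rough Python sketch; the names and the dict standing in for the coll are mine, not the actual T3D message layout:)

    # per-touch origin store (the 'coll' in a Max patch)
    orig_x = {}

    def process_touch(touch, x, note):
        """Return dX for this touch, or None while the touch is inactive."""
        if note == 0:                  # no touch: clear the stored origin (the 0.0-in-coll trick)
            orig_x[touch] = None
            return None
        if orig_x.get(touch) is None:  # first frame of a new touch: remember where it started
            orig_x[touch] = x
        return x - orig_x[touch]       # dX relative to where the touch began

    # e.g.
    # process_touch(1, 0.5, 60)    -> 0.0    (touch starts at x = 0.5)
    # process_touch(1, 0.625, 60)  -> 0.125  (slid to the right)
    # process_touch(1, 0.0, 0)     -> None   (touch released)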

I can post a patch if you want, but I'm not sure how you plan to use it... it will be of no use in Kaivo/Aalto, so it's only really useful if you are using it for another synth.

anyway, let me know if it's still useful to you.

ok, posted details on a separate topic, in case others are interested.

I've developed an 'extended' Soundplane app that allows this; if you're interested I could post it. it has a few other 'extras' and bug fixes too :)

I use it all the time so it is stable, and it is always based off the latest Soundplane app, so 1.4 at the moment.

obviously, it cannot be supported by Randy, though what I do is have both my version and the official version installed. Then if I find an issue with mine I test to see if it exists in the official release.

fyi: I asked Urs @ u-he when they would start supporting MPE, and he said once it is accepted by the MMA. he didn't want to do it whilst it was a draft.
so it could be quite a while yet. (the MMA are notoriously slow, like most standards committees)

interesting, a 5 by 4 grid... or is it just a small section for prototyping?

and the cells are independent; will the sensor stack be too, to avoid 'bleed' between cells? or are you thinking an independent top layer will reduce this, as it doesn't pull down adjacent cells?

does this fit onto the curved surface you showed previously? :)

looking forward to seeing how this goes.

The main problem I'm thinking of is pressing very hard and then sliding your finger, which tends to create an unpleasant squashed-finger drag feeling

no, you're not pressing that hard (it's configurable too), so that's not an issue... in fact, if anything it's slightly more pleasant than the Continuum, where you are sunk into the surface (not that that is bad either). as you say, this is probably down to the fact you are not pressing that hard, and the surface is easy to slide on.

(I can understand the concern though, as I've heard some reports that the Linnstrument's rubber surface is a bit 'sticky' for some... I've not tried it myself, so I don't know if this is true or not, or if it's personal preference)

haptic feedback...

hmm, wouldn't that cause vibration, which might interfere (or create noise) with the Soundplane surface? I'm sure Randy could provide details.

frankly though, I don't think it's necessary; the give on the surface + audio feedback is already sufficient for me to feel 'in touch' with the surface.

(I suspect I'd actually get fed up with a buzzing under my fingers before too long too :) )

it's tricky; the best way is to somehow get to try one hands-on... ideally, we could get in a room and try all the controllers side by side, and see which one works for us, as really they are all different, rather than better/worse.

where are you based?

I'm working on a project that will hopefully see the Soundplane 'untethered' from the computer :)

I'm going to be posting images/updates etc over at the Axoloti forum, as this will provide the synth engine, but I think the Soundplane connectivity will be interesting to those here:

AxoCube project

basic details so far are:

  • Soundplane and Eigenharps controlled via a BeagleBone Black (recently changed!)
  • 4 Axoloti boards for virtual modular synth voices, fx, and sequencers
  • Ableton Push for 'configuration', sequencing UI etc
  • STM32F7 Disco board - touch-sensitive UI, for control when the Push is not connected, patch loading etc.

(it may be I eventually move the Soundplane/Eigenharp 'drivers' to the STM32 boards)

the BeagleBone Black is using the libusb Soundplane driver developed by Per Eckerdal, the Eigenharp libusb driver I put together, and a new app that will utilise Randy's touch tracker code.

Early days, but it's looking promising...

more news/updates on the above thread... and I guess I'll cross-post a demo once things are developed a little further :)

I have Aalto (1.7) and Kaivo (1.2) running on El Capitan using Live (64-bit); so far there doesn't appear to be a problem (tested both VSTs and AUs).

(not on my main machine, but a machine I'm using for 10.11 testing, so I've not tried Numerology yet)

Does the Soundplane suffer issues during the summer with warmer temperatures?
(I've got about 29°C during the day in my studio, though no direct sun)

I ask as I play my Soundplane daily, and it's become noticeable over the past few weeks that its tracking seems to have really deteriorated... (and I've spent an unhealthy amount of time trying to recalibrate/tune parameters to sort it out :( )

I've always got some ghost notes, and dubious tracking around the edges, but now even rows 2 and 4 and the 3 outer columns all seem to be suffering... either the tracking being inaccurate in the X/Y direction by nearly half a cell, or additional touches being introduced.

I perhaps notice it more as I play in fourths, so the Y being out can really be heard when it is modulating things, and sometimes it's bad enough to think it's in the next cell, so it triggers the wrong note... I guess I also potentially notice the X tracking more, as I am playing un-quantized.

(of course, I'm only talking about issues with touches > 1" apart, as per the current spec)

I've tried both the official release of the Soundplane software and the current development branch; both behave the same :(

Is there any hope of seeing a new version of the Soundplane software soon that might improve the calibration routine, or the tracking software... I know it was mentioned many months back... perhaps this would help resolve the issues?

I really can't think of anything else to try to get its tracking more accurate. :(

I'm really disappointed, as for 6 months it's been the controller I always just turn to, it lets me relax and noodle, and just enjoy the flow - but this has turned to frustration as I hear it glitching, and I find myself again trying (in vain) to recalibrate it.

Looking really good here :)

a couple of questions:

  • is the canDo(MPE) now in place?
    (if so, it's still not working in Bitwig, so we can throw it back into their court)

  • "+/-" pitch, has this changed? for a patch I've recently created, it seems a bit more 'choppy' than before. could be I just need to tone it down a bit in my patches now.

looking interesting... the formant filter could be fun :)

hey, I'll be buying Virta too (have to have all ML products ;) ) and completely understand that new products (and upgrades, Aalto 2.0?) pay the bills, we all need to eat and make a living
(and as developers we all need to do new things too, to keep fresh!)

and I also completely understand that maintaining/improving existing software is hard given operating system changes etc (I just spent a day fixing EigenD for El Capitan),
then there are feature requests etc... it's never-ending (I know, I've been there too!)

as I said, no criticism intended... just a request.

perhaps Soundplane software 2.0 could be a paid upgrade? that could help fund its development? (though I suspect time is more the issue, but it's a thought)

@rastkopravi, cool music and video :)

interesting, I don't have air conditioning, and it's actually a very dry (low humidity) climate here... but I'd guess high humidity would be more of a problem than low.

I will say, I'm not 100% sure if my issues are common; perhaps it also partly comes down to the type of pieces played, or technique? unfortunately there is little feedback from other Soundplane owners (one way or the other)

anyway, it has improved, so I'm back to playing the Soundplane each day, so that's cool.

... and I'm looking forward to Virta :)

@andrewbird... I think we all have similar problems.

mine has recently improved slightly, I suspect partly due to reduced ambient temperatures... but I'm still getting stray touches. but I think we all know that these are due to limitations within the touch tracker.

(unfortunately, due to family/work pressures I've not had time to take apart the Soundplane yet to see if mechanical changes can help)

It's a bit frustrating; some days/times the Soundplane feels like the perfect instrument, you feel in touch with it... but then this is spoilt by its false touches/glitching and the moment is 'lost'... it would be a dream to have it behave perfectly all the time :)

(I'll point out that, for me, the ghost notes/glitches tend to make midi mode pretty pointless if you're on the edge of the board, or have multiple close touches... as it causes gate triggers. it's not so bad with T3D/Aalto/Kaivo, as I avoid gates... it's why I pretty much only use Aalto/Kaivo with the Soundplane, which is a real shame!)

Randy, is your plan to give the touch tracker some 'quality' time after Virta's release?

This is what I've been impressed with on Geert's development of the Linnstrument: they have not added many features, but over 6 months they have consistently been improving response, consistency and 'feel', and Roger has clearly stated this is his #1 priority, to ensure the feel is 100%, as that is why most users buy these instruments.

this is no 'criticism', I completely understand the competing time pressures; it's more just a plea to give the Soundplane software some priority over perhaps the more 'lucrative' plugin development... but improved consistency in the touch detection would be enormous for Soundplane users.
(to be clear, I only want consistency to the 1"/non-adjacent-cell spec, and over the whole board... I'd much prefer this over any attempt to 'add features', e.g. adjacent-cell detection.)

anyway, fingers crossed...

"appreciation pattern" = arpeggiated ?

could the synth not be clocked to do the arp?

but yeah, it's tricky if you're trying to arp manually... if the sound has strong enough transients, I'd have thought Live could do this even with an audio clip.

for sure though, it's easier to do with midi. doing the above in Live for one or two tracks is not really a problem... it's just a pain if you do it routinely, as you'll find every time you change the VST you have to reset the routing of the midi tracks. (I actually have a small M4L object that does this)

Bitwig have said they do support MPE for VSTs, I'm just waiting for a new version of Aalto which will support the 'canDo' operation (see separate post) to see if this fixes MPE within Bitwig.

Personally though, if midi editing of MPE/voice-per-channel data were important to me (and I did it a lot), I'd seriously consider Cubase. but for the odd time here and there, Live (or any other DAW) is fine once you know how to do it.

@yorke, if you are trying to record multi-channel midi (rather than just play it, i.e. hosting), there appear to be only 2 solutions that offer proper support:

  • Cubase (Artist and Pro), not the basic package
  • Bitwig 1.2, in beta, BUT this is currently not working properly with MPE in VSTs, only built-in instruments.

outside of these, the solution is either to:

  • record audio (this is what I do)
  • create separate midi tracks and route them to the VST; this is what you can see above, and is a technique that works in most hosts, Live and Logic included.

it's not a fault of MPE (as voice-per-channel midi suffers the same issues), nor of the Soundplane (it's the same with Linnstrument/Seaboard/Continuum/Eigenharp)... it's just that the DAWs have not yet 'caught up'.

Hopefully with the new Seaboard RISE, and ROLI's marketing, we might see more widespread use, and so the DAW developers will sit up and notice the demand :)

STM32F7 (and Axoloti) has both FS and HS support.

the advantage of using a hub is not only being able to connect multiple devices, but that I'm using it as a single power source... which is not only convenient but will also help when I add a USB battery pack to power the whole thing :)
I guess later I might put in a power rail, but for now this makes it easy to get on with the software side :)

it should be noted the Pi 2 and Axolotis can deliver 500mA to USB devices, so it's feasible to use these without a hub with the Soundplane; however the BBB can only deliver 100mA, so currently you need a powered hub for use with the Soundplane

(I'd need to check the STM32F7 Disco for what it can supply as a USB host)

(I wonder if there is a device that's a straight-through USB-USB, but can add external power? this might alleviate the need for the hub in some scenarios!)

I'm also assuming that once the Soundplane software is on-board, that will pretty much max out a CPU/board (perhaps excluding the Pi 2)... so you need another USB (or midi DIN) to get the data to another device... so devices with only one USB port will need a hub for that purpose

MTT: well, every (2.0/3.0) hub has a transaction translator in it to do 2.0 to 1.x... an MTT hub just has one per port, important IF you connect multiple 1.x devices.

does a translator create latency... I don't think any more than any hub will...
I'm pretty sure they are 'protocol' aware, so the translation function is not used HS to HS,
only HS to FS.
I'd have thought (may be wrong though) the added latency is 0.125ms (due to the HS leg), e.g.:

w/o hub computer < 1.0ms FS < device

w/ hub computer < 0.125ms HS < hub < 1.0ms FS < device

but honestly, I'd need to read the USB hub specification to see if this is true.

But in practice I've used my Eigenharp Alpha (HS) for quite some time through an MTT hub, and never noticed any latency difference compared to plugging it directly into the Mac. so technically perhaps some increase, but I've never felt it.
(and since last year I've had the Soundplane in the same hub, and have also not noticed any extra latency)

but perhaps because the latency is 'constant' we just cope; we are generally very good at dealing with constant delays... it's more jitter we 'feel'

hard to describe a feeling ... but here goes :)

if you quickly 'strike' the surface then I don't think you really 'feel' the initial surface give, but as you then apply pressure, you do feel it 'give' and provide resistance; this means you can grade the pressure quite easily. (the amount of force-to-pressure is controllable in software)

saying you cannot feel the 'strike' give is not a criticism; it's really not necessary, as your initial velocity is already determined by you before contact... so there is no need for a feedback element.

if, however, you slowly touch the surface, then you basically move straight to the second phase (pressure), so you can feel the give immediately. (this way you can play a slow-attack pad type sound)

ROLI have coined this idea of 5 expressions, which I think works reasonably well:

  • Strike - traditionally called velocity, initial force
  • Pressure (Z) / Glide (X) / Slide (Y)

  • Lift - traditionally 'release velocity', how fast you release a key

amount of give... it's a few millimeters... more like pressing a surface that gives but is rigid, like a plastic lid I suppose.
BUT you have to remember it's a musical instrument, so the feedback obviously is given by sound; the feel is highly correlated.

so, when the Soundplane is unplugged, you might think "oh, that doesn't give much feedback", but when you connect it to a sound source the feel takes on a different dimension; the sound means you can feel the give more (odd I know, but it's completely unlike playing, say, on an iPad)

then of course it also depends how/what you play...

I tend to think I play in two styles...

  • tapping - this is quite fun, almost like finger tapping. quick strikes, it kind of bounces.

  • deliberate touch - i.e. slow approach/softer, so you feel the pressure from the start.

(of course you can also kind of combine these, i.e. quick strike then play the pressure)

fatigue: I play with it for hours and never feel fatigued. the tapping would probably get tiring if you did it for a long time... but I think the give perhaps helps reduce the impact... if you have a medical complaint it's perhaps not advisable (e.g. rheumatism)

compared to the Continuum (I've tried one, there is a post here somewhere with my comparison), yeah, completely different... not better or worse, just different. I prefer the slide on the Soundplane, but the Continuum's dynamics are incredible (it's actually quite difficult to control initial velocity... but that's probably something you get used to)

sorry, lots of words, but probably inevitable when trying to describe how something feels.

summary: you can feel the give, and combined with a sound source it's plenty of feedback, both physically and 'emotionally'

awesome stuff, and good news all around :)

Dom from Bitwig is saying that BWS does support VSTs with MPE.

http://www.kvraudio.com/forum/viewtopic.php?f=259&t=445754

but they must support the canDo() operation specified at the end of the MPE spec;
can you confirm that Aalto 1.7 has implemented this?

Thanks
Mark

ok, if you could possibly get me a beta at some point, then I can re-test with BWS, and so help 'move the ball along' :)

I've not got one, but the manual implies it should work well...

it appears you can make the pad send x/y/pressure on different channels (using the editor), so that should give you independent control for each pad.
(pity it doesn't appear to allow pitchbend for x, but you should be able to use mod and mod+1)

then the other controls on the Neo you should be able to map to Aalto controls using automation in your DAW.

should be fun... has anyone actually tried it?