
The dishcloth we found was a type of "spongy" cloth that has certain properties while dry; once it gets soaked, those properties are pretty much gone for good. It's a cheap no-name brand that we had never heard of ourselves, go figure :)

Just found another video from the development phase where Jakob is playing a few random things:

Playing the G-Tar from Martin Nielsen on Vimeo.

Hello everyone


So, this post is going to describe a project named the G-Tar (a name originally chosen because it is so cheesy it actually hurts :)).
This is a 4th semester Medialogy project at Aalborg University, Denmark.
This project was our first venture into the world of sensors, DSP and general electronics. Also, we learned quite a bit about synthesis and how to get
stuff going on in Max/MSP/Jitter.


The basic principle of the project was to develop a NIME (New Interface for Musical Expression) based on the guitar metaphor. We were heavily inspired by Randall Jones's Soundplane 8x8, so we figured we would simply try to integrate it into the neck of a guitar, slap some other sensors on it, and explore the mapping possibilities of this NIME.


As it turns out, this entailed a world of technical problems, which I will try to describe in this post, along with links to videos showing the insides of the G-Tar.


The way the Soundplane works (roughly) is by sending signals into one layer of copper strips and reading them from another layer of
copper strips, through a dielectric in between. Randall Jones used a Fireface 800 audio interface to handle this I/O job.
If we wanted to build this into a guitar form factor, we needed something smaller, so we found some cheap no-name USB sound cards on a Chinese website.
Each sound card came with 2 input channels and 6 output channels. So we started out with 4 sound cards, giving us a total of 8 carriers and 24 receivers. That would give us pressure points in an 8 by 24 matrix; however, since we built this whole thing around a guitar, we figured we would go with 6 by 24 (6 strings and 24 frets). These 4 sound cards would then connect to a central USB hub, which would connect to a Mac running Max/MSP/Jitter.
The smart thing about using a Mac is the Aggregate Device feature, which let us fuse these 4 cards into a single device for Max/MSP/Jitter to recognize. Check this link for a visual of the physical layout with 4 sound cards, 1 USB hub and 1 Arduino (for the other sensors).
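For the curious, here is a minimal Python sketch of the sensing idea as we understood it (this is not our actual Max/MSP/Jitter patch): each row strip is driven with its own carrier frequency, every column strip picks up a mix of all the carriers, and correlating each column's signal against each carrier recovers a per-intersection coupling value, which changes with the pressure applied. The sketch arbitrarily treats the 6 string rows as carriers and the 24 fret columns as receivers, and all the numbers (window size, bin offsets) are made up for illustration.

```python
import numpy as np

N = 1024              # analysis window length in samples (made up)
CARRIERS = 6          # one carrier per string row
RECEIVERS = 24        # one receiver per fret column

# Put each carrier on its own FFT bin so the sines are exactly
# orthogonal over one window; then demodulation has no crosstalk.
bins = np.arange(CARRIERS) + 50        # arbitrary bin offsets
t = np.arange(N)
carriers = np.sin(2 * np.pi * bins[:, None] * t / N)   # shape (6, N)

def read_matrix(coupling):
    """coupling: (CARRIERS, RECEIVERS) pressure-dependent gains at
    each strip intersection. Returns the demodulated estimate."""
    # Each receiver column hears the sum of all carriers, each scaled
    # by the coupling at its crossing point with that carrier row.
    received = coupling.T @ carriers               # (RECEIVERS, N)
    # Correlate every receiver signal with every carrier and
    # normalize (the sum of sin^2 over the window is N/2).
    return (received @ carriers.T).T * (2.0 / N)   # (CARRIERS, RECEIVERS)

# Simulate pressing one spot: string row 2, fret column 10.
pressure = np.zeros((CARRIERS, RECEIVERS))
pressure[2, 10] = 0.8
estimate = read_matrix(pressure)
print(round(estimate[2, 10], 3))   # 0.8
```

Again, a toy simulation only; in the real build the signals go out and in through the USB sound cards and the DSP runs in Max/MSP/Jitter.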


Sorry, the commentary is in Danish, but it should give you an idea of how it looks
(I'm the one with the soldering iron):




This setup was never successful, though. It turned out that either the USB device layer of Mac OS X, or Max/MSP/Jitter, or something else, simply would not recognize 4 identical cards through one USB hub. We tested different types of hubs, and nothing seemed to work. Connecting the cards via their own USB plugs works perfectly fine, but as soon as a hub is introduced to the system, it gets
very jittery and unstable. Sometimes they all show up for a few seconds before they disappear again; sometimes nothing happens at all.


So unfortunately we had to compromise a lot by using only 2 sound cards (the MacBook Pro we were using has only 2 USB ports), resulting in a 4 by 12 touch matrix, meaning approximately half the area of the guitar neck is active. Each fret on the neck is connected to an output port on one of the USB cards via a standard 25-lead cable. We chose it because the cable is flat and therefore fits well into the neck design.




This video shows what the inside of the neck looks like, with copper strips
and all; again, the commentary is in Danish. The yellow dishcloth seen in the video gives us the "spongy", soft feel that allows us to read the pressure applied.
A solution to the USB issue would give a higher resolution and therefore more intimate control.


From here on out it was simply a matter of exploring different mapping opportunities and building synthesis in Max/MSP/Jitter. The final mapping
is far from perfect. For example, the accelerometer delivers both tilt data and "jerk" data, but only the tilt data is actually used.
And even the tilt data isn't properly employed. I don't remember off the top of my head, but I think tilting the G-Tar upwards (the Guitar Hero move)
controls a pulse-width modulation parameter, so it is only useful when you're using pulse waves. Tilting it downwards controls a parameter of a
flanger algorithm, so it is only active when flanging is turned on. But it is still a work in progress, and we're counting on developing it a lot further,
as it is already an extremely fun toy in its current form.
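To make the tilt mapping concrete, here is a tiny Python sketch of the logic described above. The angle ranges, parameter ranges, and function name are all hypothetical; the real mapping lives in the Max/MSP/Jitter patch.

```python
def map_tilt(tilt, pulse_on, flanger_on):
    """tilt: accelerometer tilt in degrees; positive = neck up
    (the Guitar Hero move), negative = neck down.
    Returns hypothetical (pwm_duty, flanger_depth) control values."""
    def clamp01(x):
        return max(0.0, min(1.0, x))

    # Neck up: scale 0..90 degrees onto a PWM duty cycle of 0.5..0.95.
    # Only meaningful while a pulse oscillator is active.
    pwm_duty = 0.5
    if pulse_on and tilt > 0:
        pwm_duty = 0.5 + 0.45 * clamp01(tilt / 90.0)

    # Neck down: scale 0..-90 degrees onto a flanger depth of 0..1.
    # Only has an audible effect while the flanger is switched on.
    flanger_depth = 0.0
    if flanger_on and tilt < 0:
        flanger_depth = clamp01(-tilt / 90.0)

    return pwm_duty, flanger_depth

print(map_tilt(45.0, True, False))   # neck up 45 degrees, pulse osc on
print(map_tilt(-90.0, False, True))  # neck fully down, flanger on
```

One design note: clamping and gating per effect means an idle effect just receives a neutral default instead of junk values, which is roughly how our patch behaves too.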


Finally, I'll show two short videos: an overview of the physical layout with a short introduction by one of the other developers, Peter, and a few examples of what the G-Tar can do.
They do not, by any stretch of the imagination, show the full range of sounds available.


Physical layout:




Intro + examples:




On behalf of my dev group, group 435 of Aalborg University, Denmark, I'd like to thank Randall Jones for inspiring us and helping us understand how the Soundplane works.
It's been a lot of fun figuring out how to get the Fireface 800 out of the equation and moving the Soundplane onto a guitar neck, and our
supervisor gave us all good grades :)


If anyone is interested in seeing the Max/MSP/Jitter patches we've constructed, or the report documenting this whole thing, feel free to post here
and I'll try to check in once in a while. Any other questions are welcome too, but we're all enjoying our summer break, so the response time might
be a few days.


-Martin
