Per previous posts, I am making some free software available here (although it’s somewhat niche): A Mac OS X Distributed Objects server for the Neurosky brain wave reading Mindset and a Quartz Composer plug-in client for the server. (If you have neither OS X nor the Mindset, you might want to wait for a future post where I talk more about how the brain wave art project is coming.)
This post will also serve as a brief introduction to what it would take to write your own Cocoa client for the server. But if you just want the software, you can get it here:
- Server Application (and source code / Xcode Project)
- Quartz Composer Plug-In Client (and source code / Xcode Project)
- To install the client for Quartz Composer, close QC and copy the .plugin file to: “/Library/Graphics/Quartz Composer Plugins”. When you next open QC, you should find it in your Patch Library listed as “MindSetQCClient”. Usage of the patch should be obvious.
- The server shouldn’t need to be started first as long as the client periodically checks for a vended object, but when troubleshooting it’s probably a good idea to start the server first, then the client.
- The server needs the ThinkGear bundle in the same directory as the server app. (I’m not including the ThinkGear bundle; it’s available for free from the Neurosky website as part of their developer tools.)
- Neurosky’s documentation has instructions for figuring out which serial port your Mindset is on, if I recall correctly. The default for the server is the one I use.
- I’ve borrowed so heavily from a hodge-podge of tutorials and examples that I’m not going to include a license for the code. Use it as you will.
So, onward to the tutorial/implementation details:
Distributed Object Mindset Server and Client
This server is intended to be a little easier to use than some of the connection methods Neurosky provides (at least in my mind). It grabs data from the Mindset and provides it to Cocoa client applications (such as my Quartz Composer plug-in) by using Objective-C / Cocoa’s Distributed Objects interprocess messaging capability.
To access the Mindset data, the client must create an NSConnection to “JacksMindsetServer”. This gives it access to a vended object that supports the following very simple protocol (which will have to be declared in your client’s header file):
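For reference, here is a sketch of what the PassingMindData protocol looks like, reconstructed from the three methods described in this post; the exact declaration (including return types) lives in the downloadable source, so treat the types here as my assumption:

```objc
// Sketch of the PassingMindData protocol, inferred from the methods
// described in this post. Return types are assumptions; check the
// downloadable server source for the authoritative declaration.
@protocol PassingMindData
- (int)getDataCount;         // number of data lines currently queued
- (NSArray *)getOldestData;  // oldest queued line, as ordered NSNumbers
- (void)removeOldestData;    // dequeue the line just read
@end
```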
Creating the connection to the vended object which uses that protocol is simple and requires only a short bit of code:
NSString *_host = nil; // nil looks for the server on the local host
sharedObject = (id <PassingMindData>)[[NSConnection rootProxyForConnectionWithRegisteredName:@"JacksMindsetServer" host:_host] retain];
You should now have an object called “sharedObject” which responds to all of the methods specified by the “PassingMindData” protocol above and which will pass the data from the Mindset server to your code. The primary method is “getOldestData”, which returns an array holding the oldest queued line of values from the Mindset; “getDataCount” returns the number of lines currently queued.
The returned array contains ordered NSNumbers representing each type of value available from the Mindset. The array elements can always be accessed in the following order:
- Attention (0)
- Meditation (1)
- Raw (2)
- Delta (3)
- Theta (4)
- Alpha1 (5)
- Alpha2 (6)
- Beta1 (7)
- Beta2 (8)
- Gamma (9)
- Gamma2 (10)
- SignalQuality (11)
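To keep those magic indices readable, a client could define constants mirroring the order above. The names below are mine, not part of the server’s API; the server just returns a plain NSArray:

```objc
// Hypothetical index names mirroring the documented array order.
// Purely for readability in client code.
enum MindDataIndex {
    kMindAttention = 0, kMindMeditation, kMindRaw,
    kMindDelta, kMindTheta, kMindAlpha1, kMindAlpha2,
    kMindBeta1, kMindBeta2, kMindGamma, kMindGamma2,
    kMindSignalQuality
};
// e.g.: [[mindDataLine objectAtIndex:kMindTheta] doubleValue]
```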
The client is left to access these elements as it pleases from the NSArray object returned by getOldestData. The server also relies on the client to remove each line of data from the server’s queue as soon as it grabs it, by calling “removeOldestData” on “sharedObject”. (If the client does not call this, the server performs no cleanup until it is stopped or exits, and the client will not be able to access new data.)
If multiple lines of data are queued, getOldestData and removeOldestData should be executed repeatedly. A simple example would be:

while ([sharedObject getDataCount] > 0) {
    mindDataLine = [NSArray arrayWithArray:[sharedObject getOldestData]];
    [sharedObject removeOldestData];
    [self setOutputAttention:[[mindDataLine objectAtIndex:0] doubleValue]];
}
That’s really it. How to write a server is beyond the scope of this post, but Neurosky has some great documentation and has provided examples from which I have –heavily– borrowed.
Let me know if you have questions or need further explanation. I’m going to continue to work on the art project with this stuff and will post more about that later.
Longer, more detailed post to follow – with free code and everything – but I wanted to post a video of art being made with my brainwaves:
In this demo (which is a significant step beyond my last), my project selects between a series of images, merges them, moves them, and adds various visual effects based only on input from my brain waves (as measured by a Neurosky Mindset). All images – both drawings and photos – were made by me. Depending on when I run this, the images selected and how they’re merged vary significantly. In this case, only a small subset were selected; other times, there is a wider variety. It’s important to note that, often, this has created pairings and mergings that are fantastically cool looking. The next step: creating a self-portrait video of me sleeping, with a curved screen over top of me projecting what my mind does with this while I sleep.
Via HacDC and as part of Digital Capital Week, I’ll be giving a lightning talk this Saturday the 19th on using your brainwaves to make art while you sleep. I’ll include either a video of the “first draft” of the art, or a live demo. This is a follow-up talk to one I gave this past February.
The talks start at 4:45 and go for an hour (or a little over). You can find us at:
Mount Vernon Place United Methodist Church
900 Massachusetts Ave NW, Washington DC
If you want to hear more about consumer-grade fun with using your brainwaves to manipulate the world around you, come check it out!
The current speaker lineup is:
- Look Ma, No Wires (Michael Panfield)
- Sysadmins: Have smartphone, will travel (Betsy Nichols and Andrei Tchijov)
- AI: Three most common reactions (Bradford Barr)
- ??? (Alan McCosh)
- Writ Large: scaling a Cartesian robot (Dan Barlow)
- Urban Data Access: How communication builds communities (Will Holcomb)
- Fast Creativity: Using the DNA of Improvisational Comedy to Foster Ideas Fast (Shawn Westfall)
- While you sleep: Making Art with your mind (and a little code) (Jack Whitsitt)
Not really appropriate for this blog, but I’m pretty lazy about updating my art-only one: Paivi and I (along with many other talented local photographers) were juried into the DCist Exposed show this year, and the opening is Saturday, March 6. Come see it if you’re in town and free. My selected photo was:
Official press release follows:
Washington, DC — DCist.com is pleased to announce its fourth annual DCist Exposed Photography Show, at Long View Gallery, running March 6 to 21, 2010. Out of over 1,000 individual entries submitted through Flickr.com, 47 winning images were selected by a panel of judges to be included in this year’s DCist Exposed exhibit. DCist.com prides itself on engaging and promoting emerging local photographers through its daily use of images from the popular, reader-generated DCist Flickr photo pool. Each day, DCist.com selects photos from the pool for use in its daily coverage of local news, arts and entertainment, food and sports.
This year’s opening reception will be bigger and better than ever, to be held Saturday, March 6, 2010 from 6 to 10 p.m. At the bar, mixologist Scott Palmer from Dino will have a special punch, Leopold Brothers will host a liquor tasting, and Pabst Blue Ribbon will hold down the fort with plenty of beer. Nage will provide hors d’oeuvres, while DJs v:shal kanwar and Sequoia spin tunes. Reception is $5 per guest at the door.
Long View Gallery is located at 1234 9th St. NW, just a few blocks from the Mt. Vernon/Convention Center Metro. The 2009 DCist Exposed event welcomed over 1,000 people on opening night, and with this even larger venue, we expect our biggest crowd ever. All photographs selected and displayed at DCist Exposed will be for sale at prices well below traditional gallery shows. Regular gallery hours are Wednesday-Saturday, 11 a.m. to 6 p.m., and Sunday, 12 to 5 p.m.
As promised in the previous post, here are demo videos of my three new Quartz Composer Webcam Audio Visualizer compositions. I’m being a bit silly in them, but that’s because I don’t have an external webcam or anything else more artistic to point it at tonight. In the future, I might do a real non-demo piece of art with one or more of these. No promises, though. Next post will be about security, though, I swear. :)
Well, the HacDC Hacker’s Lounge event/party got canceled – which was too bad. However, I did write some valuable code and make some pretty cool looking new compositions. The code isn’t ready for release, but I did put up the compositions and they’re available for free download here: https://sintixerr.wordpress.com/quartz-composer-downloads/
I don’t have video for them yet (maaaybe later today), so you’ll just have to try them out for yourself. I actually like all three of these much more than the original.
Remember, OS X / Quartz Composer only.
( Hmm. I guess I should write a viewer for these so you don’t need Quartz. Many projects, little time, but we’ll see… )