Per previous posts, I am making some free software available here (although it’s somewhat niche): A Mac OS X Distributed Objects server for the Neurosky brain wave reading Mindset and a Quartz Composer plug-in client for the server. (If you have neither OS X nor the Mindset, you might want to wait for a future post where I talk more about how the brain wave art project is coming.)
This post will also serve as a brief introduction to what it would take for you to write your own Cocoa client for the server. But if you just want the software, you can get it here:
- Server Application (and source code / Xcode Project)
- Quartz Composer Plug-In Client (and source code / Xcode Project)
- To install the client for Quartz Composer, close QC and copy the .plugin file to "/Library/Graphics/Quartz Composer Plugins". When you next open QC, you should find it in your Patch Library listed as "MindSetQCClient". Usage of the patch should be obvious.
- The server shouldn’t need to start first as long as the client periodically checks for a vended object, but when troubleshooting it’s probably a good idea to start the server, then the client.
- The server needs the Thinkgear bundle in the same directory as the server app. (I'm not including the Thinkgear bundle; it's available from the Neurosky website for free as part of their developer materials.)
- The Neurosky documentation has instructions for figuring out which serial port your Mindset is on, if I recall correctly. The server defaults to the port I use.
- I’ve borrowed so heavily from a hodge-podge of tutorials and examples, that I’m not going to include a license for the code. Use it as you will.
So, onward to the tutorial/implementation details:
Distributed Object Mindset Server and Client
This server is intended to be a little easier to use than some of the connection methods Neurosky provides (at least in my mind). It grabs data from the Mindset and provides it to Cocoa client applications (such as my Quartz Composer plug-in) by using Objective-C / Cocoa’s Distributed Objects interprocess messaging capability.
To access the Mindset data, the client must create an NSConnection to “JacksMindsetServer”. This gives it access to a vended object which supports the following very simple protocol (this protocol will have to be included in your client header file):
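Based on the three methods the client code below actually uses (getDataCount, getOldestData, removeOldestData), a minimal sketch of that protocol might look like the following; the exact declaration shipped with the server source may differ:

```objc
// Sketch of the PassingMindData protocol, reconstructed from the
// methods used below -- check the server source for the real one.
@protocol PassingMindData
- (int)getDataCount;        // number of data lines currently queued
- (NSArray *)getOldestData; // oldest queued line, as an NSArray of NSNumbers
- (void)removeOldestData;   // dequeue the line just read
@end
```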
Creating the connection to the vended object which uses that protocol is simple and requires only a short bit of code:
NSString *_host = nil;
sharedObject = (id <PassingMindData>)[[NSConnection rootProxyForConnectionWithRegisteredName:@"JacksMindsetServer" host:_host] retain];
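Since rootProxyForConnectionWithRegisteredName:host: returns nil when no server has registered under that name yet, the client can check for that and retry later rather than requiring the server to start first (a sketch; connectIfNeeded is my name, not part of the project):

```objc
// Hypothetical helper: try to (re)acquire the vended object, returning
// whether a proxy is now available. Safe to call periodically.
- (BOOL)connectIfNeeded {
    if (sharedObject != nil)
        return YES;
    sharedObject = (id <PassingMindData>)[[NSConnection
        rootProxyForConnectionWithRegisteredName:@"JacksMindsetServer"
                                            host:nil] retain];
    return (sharedObject != nil);
}
```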
You should now have an object called "sharedObject" which responds to all of the methods specified by the "PassingMindData" protocol created above and which will pass data from the Mindset server to your code. The primary method is "getOldestData": calling it returns an array containing the oldest line of values from the Mindset, and "getDataCount" returns the number of lines currently queued.
The returned array contains ordered NSNumbers representing each type of value available from the Mindset. The array elements can always be accessed in the following order:
- Attention (0)
- Meditation (1)
- Raw (2)
- Delta (3)
- Theta (4)
- Alpha1 (5)
- Alpha2 (6)
- Beta1 (7)
- Beta2 (8)
- Gamma (9)
- Gamma2 (10)
- SignalQuality (11)
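To keep those indices readable in client code, one option is a plain enum mirroring the order above (these constant names are mine, not part of the server's API):

```c
/* Hypothetical names for the documented array positions. */
enum {
    kMindAttention     = 0,
    kMindMeditation    = 1,
    kMindRaw           = 2,
    kMindDelta         = 3,
    kMindTheta         = 4,
    kMindAlpha1        = 5,
    kMindAlpha2        = 6,
    kMindBeta1         = 7,
    kMindBeta2         = 8,
    kMindGamma         = 9,
    kMindGamma2        = 10,
    kMindSignalQuality = 11
};
```

Then [[mindDataLine objectAtIndex:kMindAttention] doubleValue] reads a bit more clearly than a bare 0.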
The client is left to access these elements as it pleases from the NSArray object returned by getOldestData. The server also relies on the client to remove the original data from the server as soon as it grabs it, by calling "removeOldestData" on "sharedObject". (If the client does not call this, the server performs no auto-cleanup until it is stopped or exits, and the client will not be able to access new data.)
If multiple lines of data are queued, getOldestData and removeOldestData should be executed repeatedly, removing each line after reading it. A simple example would be:
while ([sharedObject getDataCount] > 0) {
    mindDataLine = [NSArray arrayWithArray:[sharedObject getOldestData]];
    [self setOutputAttention:[[mindDataLine objectAtIndex:0] doubleValue]];
    [sharedObject removeOldestData];
}
That’s really it. How to write a server is outside the scope of this post, but Neurosky has some great documentation and has provided examples from which I have –heavily– borrowed.
Let me know if you have questions or need further explanation. I’m going to continue to work on the art project with this stuff and will post more about that later.
Via HacDC and as part of Digital Capital Week, I’ll be giving a lightning talk this Saturday the 19th on using your brainwaves to make art while you sleep. I’ll include either a video of the “first draft” of the art, or a live demo. This is a follow-up talk to one I gave this past February.
The talks start at 4:45 and go for an hour (or a little over) and you can find us at:
Mount Vernon Place United Methodist Church
900 Massachusetts Ave NW, Washington DC
If you want to hear more about consumer-grade fun with using your brainwaves to manipulate the world around you, come check it out!
The current speaker lineup is:
- Look Ma, No Wires (Michael Panfield)
- Sysadmins: Have smartphone, will travel (Betsy Nichols and Andrei Tchijov)
- AI: Three most common reactions (Bradford Barr)
- ??? (Alan McCosh)
- Writ Large: scaling a Cartesian robot (Dan Barlow)
- Urban Data Access: How communication builds communities (Will Holcomb)
- Fast Creativity: Using the DNA of Improvisational Comedy to Foster Ideas Fast (Shawn Westfall)
- While you sleep: Making Art with your mind (and a little code) (Jack Whitsitt)
I’ve created a Google Code page for it HERE.
You can grab a standalone zip of the source/project HERE.
(I’ve never used SVN before, so what’s up at the Google Code page might periodically be fubared; you might want to start with the zip.)
Feel free to download, comment, and please -contribute-. This was my first Objective-C app and first Xcode project, so if it’s a mess…well…deal or help? :)
Just remember the Google Code page if you want to post some updates or questions.
I’ve also made some haphazard notes to help people understand the code:
The aquireData class handles reading the tcpdump text file. It uses Core Data to store the data. If I had to do it over, I wouldn’t have used Core Data…but it is what it is. You can find the data model by double-clicking pkviz_DataModel under the Models folder in the project in Xcode.
pkGraphView is a subclass of NSView that I use to handle the layers, which are done in Core Animation (easy enough to understand). The view has a delegate function (drawLayer) which I handle in the layerDelegate class to deal with drawing the paths for each layer.
Everything else is handled by transformData – it’s pretty much my controller.
The Load button tells aquireData to parse the tcpdump file and store the results in a Core Data context.
The Launch button kicks off transformData, which pulls in the data from the Core Data context, sticks it into an array, launches a thread to pop out individual packets, and then tells the view when it’s ready to display another packet. Everything else stops, starts, adjusts the current packet referenced, or aids this animation loop process.
The main array of packets in transformData is bytepakposSet. It is an array of packet arrays, and each packet array contains byte arrays holding two values: byte value and byte position.
So, if you wanted to access the third packet in bytepakposSet and see what the byte value of the first byte stored is, you’d do:
[[[[bytepakposSet objectAtIndex:2] objectAtIndex:0] objectAtIndex:0] intValue];
If you wanted the byte value and position returned together as an array:
[[bytepakposSet objectAtIndex:2] objectAtIndex:0]
Core Data doesn’t return objects in order, so you don’t know ahead of time what order the bytes in a packet will come back in; you’ll have to sort them by position in the packet first. You can find the position with:
[[[[bytepakposSet objectAtIndex:2] objectAtIndex:0] objectAtIndex:1] intValue];
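One way to do that sort (a sketch using a 10.6-style block comparator; on earlier SDKs you’d reach for sortedArrayUsingFunction:context: instead) is to compare the position element, index 1, of each byte array:

```objc
// Sort the bytes of the third packet by their position-in-packet value.
NSArray *packet = [bytepakposSet objectAtIndex:2];
NSArray *sortedPacket = [packet sortedArrayUsingComparator:
    ^NSComparisonResult(id a, id b) {
        NSNumber *posA = [a objectAtIndex:1]; // byte position of a
        NSNumber *posB = [b objectAtIndex:1]; // byte position of b
        return [posA compare:posB];
    }];
```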
All, I’ll be giving a quick (5-minute) introduction to using Neurosky’s Mindset API to do cool stuff with your brainwaves – like making art while you sleep :) – on 02/23/10 @7:30pm as part of HacDC’s Lightning Talks (featuring 12 speakers for 5 minutes each). For the introduction, I’ll be using the simple Objective-C server and a custom-written Quartz Composer plug-in client to display a visualization that responds to both your brainwaves and ambient noise/music together. Come out and see!
Check out the example proof-of-code video I did below (a longer post to come tomorrow):
As promised in the previous post, here are demo videos of my three new Quartz Composer Webcam Audio Visualizer compositions. I’m being a bit silly in them, but that’s because I don’t have an external webcam or anything else more artistic to point it at tonight. In the future, I might do a real non-demo piece of art with one or more of these. No promises, though. Next post will be about security, though, I swear. :)
Well, the HacDC Hacker’s Lounge event/party got canceled – which was too bad. However, I did write some valuable code and make some pretty cool looking new compositions. The code isn’t ready for release, but I did put up the compositions and they’re available for free download here: http://sintixerr.wordpress.com/quartz-composer-downloads/
I don’t have video for them yet (maaaybe later today), so you’ll just have to try them out for yourself. I actually like all three of these much more than the original.
Remember, OS X / Quartz Composer only.
( Hmm. I guess I should write a viewer for these so you don’t need Quartz. Many projects, little time, but we’ll see… )