Longer, more detailed post to follow – with free code and everything – but I wanted to post a video of art being made with my brainwaves:
In this demo (a significant step beyond my last), my project selects between a series of images, merges them, moves them, and adds various visual effects based only on input from my brain waves (as measured by a Neurosky Mindset). All images – both drawings and photos – were made by me. Depending on when I run this, the images selected and how they’re merged vary significantly. In this case, only a small subset was selected; other times, there is a wider variety. Often, this has created pairings and mergings that look fantastically cool. The next step: creating a self-portrait video of me sleeping, with a curved screen over top of me projecting what my mind does with this while I sleep.
Via HacDC and as part of Digital Capital Week, I’ll be giving a lightning talk this Saturday the 19th on using your brainwaves to make art while you sleep. I’ll include either a video of the “first draft” of the art, or a live demo. This is a follow-up talk to one I gave this past February.
The talks start at 4:45 and run for an hour (or a little over). You can find us at:
Mount Vernon Place United Methodist Church
900 Massachusetts Ave NW, Washington DC
If you want to hear more about consumer-grade fun with using your brainwaves to manipulate the world around you, come check it out!
The current speaker lineup is:
- Look Ma, No Wires (Michael Panfield)
- Sysadmins: Have smartphone, will travel (Betsy Nichols and Andrei Tchijov)
- AI: Three most common reactions (Bradford Barr)
- ??? (Alan McCosh)
- Writ Large: scaling a Cartesian robot (Dan Barlow)
- Urban Data Access: How communication builds communities (Will Holcomb)
- Fast Creativity: Using the DNA of Improvisational Comedy to Foster Ideas Fast (Shawn Westfall)
- While you sleep: Making Art with your mind (and a little code) (Jack Whitsitt)
I’ve created a Google Code page for it HERE.
You can grab a standalone zip of the source/project HERE.
(I’ve never used SVN before, so what’s up at the Google Code page might periodically be fubared, so you might want to start with the zip.)
Feel free to download, comment, and please -contribute-. This was my first Objective-C app and first Xcode project, so if it’s a mess…well…deal or help? :)
Just remember the Google Code page if you want to post some updates or questions.
I’ve also made some haphazard notes to help people understand the code:
The aquireData class handles reading the tcpdump text file. It uses Core Data to store the data. If I had to do it over, I wouldn’t have used Core Data…but it is what it is. You can find the data model by double-clicking pkviz_DataModel under the Models folder in the project in Xcode.
pkGraphView is a subclass of NSView that I use to handle the layers, which are done in Core Animation (easy enough to understand). The view has a delegate function (drawLayer) which I handle in the layerDelegate class to deal with drawing the paths for each layer.
Everything else is handled by transformData – it’s pretty much my controller.
The Load button tells aquireData to parse tcpdump output and store it in a Core Data context.
The Launch button kicks off transformData, which pulls in the data from the Core Data context, sticks it into an array, launches a thread to pop out individual packets, and then tells the view when it’s ready to display another packet. Everything else stops, starts, adjusts the current packet referenced, or aids this animation loop process.
The main array of packets in transformData is bytepakposSet. It is an array of packet arrays; each packet array contains byte arrays holding two values: bytevalue and byteposition.
So, if you wanted to access the third packet in bytepakposSet and see the byte value of the first byte stored, you’d do:
[[[[bytepakposSet objectAtIndex:2] objectAtIndex:0] objectAtIndex:0] intValue];
If you wanted the byte value and position returned together as an array:
[[bytepakposSet objectAtIndex:2] objectAtIndex:0]
Core Data doesn’t return objects in order, so you don’t know ahead of time what order the bytes are in within a packet; you’ll have to sort them by position first. You can find the position with:
[[[[bytepakposSet objectAtIndex:2] objectAtIndex:0] objectAtIndex:1] intValue];
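Putting the above together, here’s a minimal sketch of sorting one packet’s bytes into wire order. This isn’t from the project itself – it just assumes the structure described above (each byte entry is an NSArray of NSNumbers, value at index 0 and position at index 1) and uses a block-based comparator (OS X 10.6+):

```objc
// Hypothetical sketch, not from the pkviz source: sort the third packet's
// bytes by their in-packet position before reading values out.
NSArray *packet = [bytepakposSet objectAtIndex:2];
NSArray *sortedBytes = [packet sortedArrayUsingComparator:
    ^NSComparisonResult(id a, id b) {
        // Index 1 holds byteposition, per the layout described above.
        int posA = [[a objectAtIndex:1] intValue];
        int posB = [[b objectAtIndex:1] intValue];
        if (posA < posB) return NSOrderedAscending;
        if (posA > posB) return NSOrderedDescending;
        return NSOrderedSame;
    }];
// sortedBytes now holds the byte entries in packet order, so
// [[sortedBytes objectAtIndex:0] objectAtIndex:0] is the first byte's value.
```

An NSSortDescriptor would work too, but since the byte entries are plain nested NSArrays rather than keyed objects, a comparator block is the more direct fit.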
All, I’ll be giving a quick (5 minute) introduction to using Neurosky’s Mindset API to do cool stuff with your brainwaves – like making art while you sleep :) – on 02/23/10 @7:30pm as part of HacDC’s Lightning Talks (featuring 12 speakers for 5 minutes each). For the introduction, I’ll be using a simple Objective-C server and a custom-written Quartz Composer plug-in client to display a visualization that responds to both your brainwaves and ambient noise/music together. Come out and see!
Check out the example proof-of-code video I did below (a longer post to come tomorrow):
As promised in the previous post, here are demo videos of my three new Quartz Composer Webcam Audio Visualizer compositions. I’m being a bit silly in them, but that’s because I don’t have an external webcam or anything else more artistic to point it at tonight. In the future, I might do a real non-demo piece of art with one or more of these. No promises, though. Next post will be about security, though, I swear. :)
Well, the HacDC Hacker’s Lounge event/party got canceled – which was too bad. However, I did write some valuable code and make some pretty cool looking new compositions. The code isn’t ready for release, but I did put up the compositions and they’re available for free download here: http://sintixerr.wordpress.com/quartz-composer-downloads/
I don’t have video for them yet (maaaybe later today), so you’ll just have to try them out for yourself. I actually like all three of these much more than the original.
Remember, OS X / Quartz Composer only.
( Hmm. I guess I should write a viewer for these so you don’t need Quartz. Many projects, little time, but we’ll see… )