rebecca's WIP



Many weeks after the Spring Show, I've finally pulled together solid documentation for my final Readymades project, What in the world do you want to see? Following the show, I took some time to fix a few minor fabrication details and to document the piece with nicely lit stills, as well as a video that clearly illustrates the interaction. Documenting the experience of this piece has been especially hard, since you have to look into the device to experience the images in a very intimate way; that's something I'm still working on. The video edit is still in progress, but in the meantime here are some stills, along with the text that accompanied the piece in the ITP Spring Show. Within the next week, I'm planning to revisit and possibly revise the text, post my final video, and write a quick post-mortem on the project (what I learned, what I'd like to change or revise, etc.).

WHAT IN THE WORLD DO YOU WANT TO SEE?

What if you were able to travel to an office in Moscow or a parking lot in Nebraska? To watch people sitting poolside, folding their laundry, or eating lunch in the food court of a mall?

What in the world do you want to see? is an interactive video sculpture that reimagines the View Master, an old stereoscopic imaging toy, as a portal into other people's worlds, using surveillance camera video feeds taken from around the globe. Have a look inside the device and pull down on the lever to change what you see.






My plan for the final project is to continue working with the View Master and convert it into a stereoscopic video player using two tiny LCD screens. I learned a lot from prototyping on the Oculus last time and I think for now I’m going to move away from using the stock footage. My plan at the moment is to play webcam and security camera videos that are livestreamed on this website. I think there’s something really interesting in the idea of “seeing the world” through the View Master, but instead of going to exotic locations you might see a feed of a laundromat in Moscow or a parking lot in Nebraska. As a longer term project, I think it’d be amazing if you could play the livestream video feeds from the site, but for the moment I’m going to curate a selection of interesting videos and download them to my computer.


I ordered some vintage View Masters in the mail and was able to take one of them apart with the generous help of Ben Light. The older ones are difficult to open (as opposed to the new ones, which come apart with a bit of pulling), and I broke mine in the process. Luckily I have two more to try. There are a lot of technical challenges to keep in mind with this project, including fitting all of my electronics inside the device, preserving the switch mechanism (I'm going to use the little lever as a switch to change the videos), and getting the stereoscopic video right. I'll need two of the RCA converter cables, since each screen will be playing something different. I'm also worried about the resolution of the small screens, but they're what I have to work with right now, so fingers crossed it will look ok.


To make my project feel a bit more manageable, I've broken it down into smaller steps that I'm going to go through in the next couple of days:

– Make simple video playlist in Jitter
– Control video playlist with a simple switch from Arduino
– Make switch out of View Master mechanism
– Test cables with two small screens
– Scan still reels to figure out how to make the images stereoscopic (dimensions and placement)
– Figure out all the parts and how I want to create my enclosure
– Embed everything inside the View Master (eek)
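As a rough illustration of the first two steps, here's a minimal Python sketch of the lever-driven playlist logic I'm planning to build in Jitter and Arduino. The class name and video file names are placeholders of my own, not anything from the actual patch:

```python
# Rough logic sketch of the lever-driven playlist (the real version
# will live in a Jitter patch driven by an Arduino switch).
# Class name and video file names are placeholders.

class LeverPlaylist:
    """Cycles through a curated list of downloaded videos; each
    lever pull advances to the next feed, wrapping around."""

    def __init__(self, videos):
        if not videos:
            raise ValueError("playlist needs at least one video")
        self.videos = list(videos)
        self.index = 0

    @property
    def current(self):
        return self.videos[self.index]

    def pull_lever(self):
        # One lever pull == one debounced switch press from the Arduino.
        self.index = (self.index + 1) % len(self.videos)
        return self.current


playlist = LeverPlaylist([
    "moscow_office.mov",          # placeholder file names
    "nebraska_parking_lot.mov",
    "mall_food_court.mov",
])
print(playlist.current)       # moscow_office.mov
print(playlist.pull_lever())  # nebraska_parking_lot.mov
```

The wrap-around means the lever can be pulled forever, which matches how the original View Master reel behaves.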


My presentation for the Video Readymade assignment was a first prototype of what will eventually become my final project. Since my View Master hadn't arrived yet, I simulated the experience using an Oculus and the Oculus library for Jitter.

how dorky on a scale of 1-10

For this first version, I used cheesy POV stock footage of faraway places and exotic locales as a nod to the original purpose of the View Master, which was to transport you somewhere else. (Think scenic postcards in stereo.) I kept the watermark on the stock imagery, which I really like conceptually, but I'm not sure it works in execution. I got a lot of great feedback from my critique that helped push my thinking along, such as:

  • What about creating a virtual experience for experiences that aren’t particularly desirable? Really boring situations?
  • It feels like an architectural rendering of a place that already exists
  • How can this cheap plastic thing get you to another world?
  • Is there a way to play with nostalgia?
  • Am I interested in having a dialogue with VR’s use in simulation and training?
  • How can I bring out the uncanny?
  • What is it that interests me about the stock footage?
  • Could I make them into a series (if I get one working on time)?

In my mind, there are three contexts that I can leverage with this project and use to play with expectations:

  1. View Master (toy, history, cultural significance, stereo imaging, private space)
  2. VR: what are the expectations around VR? what’s it supposed to do? can it serve another purpose?
  3. Stock Footage: is it a database? a simulacrum? a blank version of life?

I’m going to look more into the history of the View Master, other related devices and stereoscopic imaging to see if there are some things in there that I can use as a starting point. Another thing I’d be interested to explore is the tension between the private (immersive space) of the device and a public space. (For this I also need to remember the scale of the View Master and that unlike the Oculus, it’s not totally immersive because the image is so small).

What if the View Master was a portal to a database, and you could type in anywhere you wanted to go or anything you wanted to see? What if it piped in a live stream?

I have two new (possibly promising) ideas for the video content:

  • Use live streams from different surveillance cameras, switching feeds every time you pull the lever (this could be in dialogue with my catcalling surveillance camera)
  • A stream of eyes looking back at you, like really intimate, forced eye contact

My proposed concept for my video readymade and eventual final project is called “What in the world do you want to see?” The idea is to use an old View Master to transport the viewer into a rudimentary virtual reality environment created from stock footage of faraway places.


A couple of weeks ago, I became interested in the idea of using stock footage with the watermark left on. Stock imagery interests me because it functions as an empty vessel into which people can insert meaning, buying it and recontextualizing it to signify whatever they want. The watermark visually signals this. Stock footage of an exotic beach is both no place and any place at the same time.

The View Master interested me initially because I like to think about devices for seeing and the ways they draw our attention to the act of looking. I'd been thinking a bit about virtual reality, and it occurred to me that I could hack the old View Master toy to function as a rudimentary VR headset, playing footage in a more immersive way. I created a system diagram for my proposed project to show the user flow and how all the parts will work together.


For my emotional object assignment, I'm planning to continue working on my catcalling surveillance camera. My goal is to flesh out my prototype by using a real camera to do motion tracking in Max / MSP, and to use that data to control the movement of the camera with a pan-tilt motor mount. From a technical standpoint, the flow of my project will be as follows:

camera feed —> Max / MSP —> Max detects person moving —> Max triggers sound AND Max talks to Arduino —> controls servo motor (pan / tilt mount) to look in direction of person
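The last step of that flow boils down to a small piece of math: mapping where the detected person is in the frame to an angle for the pan servo. Here's a Python sketch of that calculation; the frame width and servo range are my assumptions (a common camera resolution and a typical hobby-servo sweep), not measured values from my setup:

```python
# Sketch of the tracking math in the flow above: map the x position of
# a detected person in the camera frame to a pan-servo angle.
# FRAME_WIDTH and the servo range are assumptions, not measured values.

FRAME_WIDTH = 640          # pixels (assumed camera resolution)
PAN_MIN, PAN_MAX = 0, 180  # degrees (typical hobby servo sweep)

def pan_angle_for(x):
    """Return the servo angle that points the camera at pixel column x."""
    x = max(0, min(FRAME_WIDTH, x))  # clamp to the frame
    return PAN_MIN + (PAN_MAX - PAN_MIN) * x / FRAME_WIDTH

print(pan_angle_for(0))    # 0.0   -- person at the left edge
print(pan_angle_for(320))  # 90.0  -- person centered
print(pan_angle_for(640))  # 180.0 -- person at the right edge
```

In practice Max would compute something like this from the motion-tracking data and send the angle to the Arduino over serial.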

I'm still planning to use the whistling sound effect when somebody walks by the camera, though I'm not exactly sure how the sound and movement will work together. I'm also not sure whether I'll expand the range of sounds that are triggered. Will it say things in addition to whistling? That raises more questions about who is doing the talking. Is the piece gendered? Is a computer speaking? Is a person speaking? What is the personality of this thing?



Today I spent a bunch of time working with my IR sensor and tuning it to the level of sensitivity I wanted. I may have to tweak that once I hook it up to my Max patch, depending on exactly how it triggers the sound. I've decided to use an IR sensor with a decoy camera, since it has the look I'm going for and is clearly legible as a surveillance camera. I initially thought of working with camera data, but the ITP surveillance cameras are big and dome-like and don't visually register as such.

This particular IR sensor has a simple digital output that I can read with digitalRead(), which is actually perfect for what I'm doing since I just need it to function as a switch. I also set up my Max patch with the sound file I'm using (many thanks to Nick Bratton for all the help and explanation). Next step is to incorporate serial and connect my Max patch to my Arduino code.
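Since the sensor is just a switch, the triggering logic is really edge detection: the whistle should fire once when the output goes LOW to HIGH, not continuously while someone stands in front of the camera. Here's a small Python simulation of that logic (the readings list stands in for successive digitalRead() values; the function name is my own):

```python
# Edge-detection logic for the IR sensor acting as a switch:
# fire the whistle once when the output rises LOW -> HIGH, not on
# every loop iteration while someone is still in view.
# The readings list stands in for successive digitalRead() values.

def rising_edges(readings):
    """Return the indices where the sensor output rises 0 -> 1
    (each one would trigger the whistle sound in Max)."""
    triggers = []
    previous = 0
    for i, value in enumerate(readings):
        if value == 1 and previous == 0:
            triggers.append(i)
        previous = value
    return triggers

# Someone walks by twice; the sensor stays HIGH while they're in view.
print(rising_edges([0, 0, 1, 1, 1, 0, 0, 1, 1, 0]))  # [2, 7]
```

On the Arduino side the same idea is just comparing the current digitalRead() value to the one from the previous loop.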

In the meantime, here are some more reference images that I like.


STORYBOARD

Technical questions / issues to address

  • What is the best / most flexible surveillance camera that will be easiest to manipulate and hook up to Max (or potentially to Arduino)? Can I find a good option that also has the old CCTV look I'm going for (the big, clunky, rectangular kind)?
  • There are two things I want to do that respond to the user: trigger the whistling sound in Max / MSP when somebody walks by (with a slight delay) and have the camera pan left and right to follow / track the user’s movement. (It would also be great if the camera could then pause on them). If the camera panning seems tricky technically, I’ll need to get the sound part working first and then tackle the video stuff afterward.
  • Even if I find a surveillance camera with a built-in motor for pan / zoom, I have a hunch that the tracking part requires an external library for motion detection and tracking. Which library is best for this? OpenCV? Can I do it in Max?
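Before reaching for a full tracking library, it's worth noting that basic motion detection can be as simple as frame differencing: compare successive grayscale frames and count how many pixels changed. Here's a minimal Python sketch of that idea on synthetic frames (the thresholds are arbitrary values I picked for illustration):

```python
# Minimal frame-differencing motion detector, the idea underlying the
# simpler motion-detection approaches in Max/Jitter and OpenCV.
# Frames are flattened lists of grayscale pixel values (0-255);
# real frames would come from the camera. Thresholds are arbitrary.

def motion_detected(prev_frame, frame, pixel_threshold=25, count_threshold=3):
    """True if enough pixels changed by more than pixel_threshold."""
    changed = sum(
        1 for a, b in zip(prev_frame, frame)
        if abs(a - b) > pixel_threshold
    )
    return changed >= count_threshold

still = [10] * 16              # tiny 4x4 "frame", flattened
moved = [10] * 12 + [200] * 4  # a bright shape enters the corner
print(motion_detected(still, still))  # False
print(motion_detected(still, moved))  # True
```

Tracking (following the person, not just noticing them) is harder, which is why OpenCV or a dedicated Max external probably comes into play for the pan / tilt part.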
