Monthly Archives: February 2013

‘Reverse’ IBL

I watched ‘Beyond the Black Rainbow’ last weekend and put together this test video:

This weekend’s tests were all about brainstorming ways to improve Cambot and doing a bit of shooting in the studio. The studio time definitely helped me see what is working on Cambot and where it could use some improvements. The goal of my shoot was to experiment with lights.

I’ve spent some time before working with traditional image-based lighting, where I basically used a reference image from a location to digitally light a 3D model.


The reference image is typically a panoramic image unwrapped from a photo of a chrome sphere. The chrome sphere reflects nearly the full environment around it, and in doing so captures the light which would be cast upon an object at that position. Typically a gray sphere is photographed at the same time for comparison with the in-progress/finished model.


I want to begin the same way: background locations photographed with reference spheres, then practical scale models photographed to place in them. Typically this is achieved by attempting to mimic the lighting conditions, i.e. “the sun was here, so we place a key light here, with a fill light on this side and…”. Through this process of ‘match-lighting’ a cinematographer/director of photography can reproduce the location’s lighting.

Why ‘Reverse’?
Now I want to turn this all around. In theory, by using these reference images it should be possible to recreate the environment lighting on demand when photographing a practical model or actor. All that would need to happen is for directional light sources to project onto the surface of the model with the same hue and intensity as the reference.

This definitely isn’t a new idea: Paul Debevec developed this years ago at ICT with his Light Stage, pictured above, which I’ve linked here rather than attempting to summarize it further;

My approach is through the use of DMX stage lighting. There are multi-colored lights capable of mixing red, green, and blue in real time. These are also programmable via the DMX512 protocol, so they can be set up to run through pre-set lighting configurations.

My Stage
I’ve been using these SlimPAR 64 RGB LED lights from Chauvet. I’ve currently arranged five of these in a half ring around my stage, all pointing inward.

I plan to eventually upgrade this to a more automatic solution. There are a lot of software packages designed for stage techs; in fact, many concerts and night clubs use these systems. There are also computer soft/hardware solutions designed more for filmmakers and animators, like this card which fits in with a Kuper motion control system, or the DDMX-S2 from Dragonframe, which allows stop-motion playback control for incremental programs. For now I’m controlling these via the Chauvet Obey 10 mixing board, which lets me set sliders for each of the lights’ color channels independently.

What I’d really like is for it to be able to process a reference image or video and reproduce it automatically, or to take a video clip and essentially ‘play it back’. It makes sense that through software I could take a reference image, sample each quadrant’s hue and value, and route those values into a DMX controller to drive the lights.
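This sampling step can be sketched in a few lines. It’s only a sketch under my own assumptions: pixel data is already decoded into a grid of (R, G, B) tuples (in practice you’d load it with an imaging library like Pillow), each RGB fixture occupies three consecutive DMX channels, and the function names are my own.

```python
# Sketch: average each quadrant of a reference frame and map the result
# to 8-bit DMX channel values (one RGB fixture per quadrant).
# Assumes pixels are already decoded into a 2D grid of (R, G, B) tuples;
# the 3-channels-per-fixture layout is an assumption, not a fixture spec.

def quadrant_averages(pixels):
    """Return the mean (R, G, B) of each quadrant: TL, TR, BL, BR."""
    h, w = len(pixels), len(pixels[0])
    halves_y = [(0, h // 2), (h // 2, h)]
    halves_x = [(0, w // 2), (w // 2, w)]
    quads = []
    for y0, y1 in halves_y:
        for x0, x1 in halves_x:
            n = (y1 - y0) * (x1 - x0)
            totals = [0, 0, 0]
            for y in range(y0, y1):
                for x in range(x0, x1):
                    for c in range(3):
                        totals[c] += pixels[y][x][c]
            quads.append(tuple(t // n for t in totals))
    return quads

def to_dmx(quads):
    """Flatten quadrant colors into consecutive DMX channels (0-255 each)."""
    return [max(0, min(255, v)) for rgb in quads for v in rgb]

# Tiny 2x2 'image': one pixel per quadrant.
frame = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (128, 128, 128)]]
print(to_dmx(quadrant_averages(frame)))
```

The resulting channel list would then go out over whatever DMX interface the controller exposes; that part depends entirely on the hardware.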

The Science
The theory sounds great, but first I have to figure out the physical lighting limitations of this rig and of LEDs in general. I’ve often been warned about color temperature in photography: the difference between tungsten (3200K), daylight (5700K), and fluorescent (4000K). However, in attempting to get a clear answer on the temperature of LEDs I went down a rabbit hole. It seems this all goes out the window the moment you start color mixing; it is completely variable, which means it could be anywhere. On top of this, LEDs typically have a more limited spectrum; take a look at these graphs:

LEDs are often assigned a CRI (color rendering index), which as I understand it measures how well they reproduce the sun’s light, and thus how balanced a color will appear when illuminated by them. The other thing I’ve discovered about LEDs is the pulse width of the lights themselves. They aren’t actually constantly on; the light blinks on and off at a rate so fast we can’t detect it. For many lights this is slowed down for the dimming feature/effect. This pulse width modulation, which our naked eye cannot detect even at lower frequencies, will be picked up by the camera when set to a high shutter speed.

I found it really interesting in this test to see the way the lights’ flicker interacted with the shutter speed of the camera. It seems there are ways to work around it, selecting a lower shutter speed for example, but I haven’t quite figured out the science of it. Looking over forums, shooting around PWM seems to be an increasing problem, especially for venue/location photographers:
Shutterspeed and flickering hmis
PWM is not your friend
LED flicker on camera
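The shutter-speed interaction can at least be sanity-checked with rough arithmetic: if the exposure captures many whole PWM cycles, the flicker averages out; if it captures only a fraction of a cycle, you get banding. A back-of-envelope sketch (the 500 Hz PWM figure and the 10-cycle threshold are illustrative assumptions, not specs from any real fixture):

```python
# Back-of-envelope flicker check: banding shows up when the exposure
# captures too few PWM cycles. The 500 Hz figure and the 10-cycle
# threshold are illustrative assumptions; real fixtures vary widely.

def cycles_per_exposure(pwm_hz, shutter_denom):
    """How many PWM cycles fit in a 1/shutter_denom second exposure."""
    return pwm_hz / shutter_denom

def looks_safe(pwm_hz, shutter_denom, min_cycles=10.0):
    """Rough heuristic: enough whole cycles per frame averages out flicker."""
    return cycles_per_exposure(pwm_hz, shutter_denom) >= min_cycles

pwm = 500  # assumed dimmer PWM frequency in Hz
for denom in (50, 60, 250, 1000):
    c = cycles_per_exposure(pwm, denom)
    print(f"1/{denom}s -> {c:.1f} cycles, safe: {looks_safe(pwm, denom)}")
```

This matches the forum advice: at 1/50s the exposure spans plenty of cycles, while at 1/1000s it spans only a fraction of one, which is where the banding lives.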

Cambot BRAIN storming pt2

Continued from:

I’ve been talking with Chris Hassell from Co-Optic productions/TaskForce3D about my ideas for home-made Motion Control Rigs so I felt like it was time to update my notes on the project.


To begin, what are the goals of this project?

      Inexpensive (as much as I’d love to get a MrMoCo rig I just can’t)
      Wide range of movement
      Programmable; Move-shoot-move, Play-back and Real-time controls
      Self-contained Modular/lightweight/portable

I hesitate to put a number down because it’d be laughable and embarrassing, but suffice to say I have very little budget, so I’m looking for the best solution for the least cash.

ProAim $475
When you look for cheap video gear on eBay you’re sure to find ProAim. It’s not all bad; it works until it doesn’t. They offer some motorized tilt/pan heads with real-time-only control, so I’d need a remote operator, or I’d still need to buy/make a ‘brain’ for it.

So, it may be great for long crane shots, but for this project it’s just too expensive…

Wide range of movement
How many axes of motion?
Ideally six: dolly (back/forward, left/right), pan, tilt, jib/crane, jib rotate.

Yes, I want it all, but I need to try and keep myself in scope for this project. My current Cambot has 7 axes of motion and may be just a bit too complicated for a programmable version: dolly forward, incremental forward, incremental track to side, crane up, jib rotate, pan, tilt.

There seem to be a few schools of thought surrounding this. There are the hobby-type time-lapse/video solutions (pan/tilt units, pan/tilt sliders), and then there are the ‘pro’ solutions which can drive more channels but frequently connect to a computer: DitoGear Evolution, MrMoco, etc. These can become monster rigs, definitely not as portable, but capable of any possible movement.
For example MrMoCo’s Animoko;

C-MOCO is a German-developed motion control system based on an industrial robotics arm.

It features a record-and-playback model for repeated motions along 7 axes, as well as embedding camera tracking data into R3D files for use in 3D and compositing applications. This is exactly what I want, but the fact that there is no price on their website tells me ‘if you have to ask, you can’t afford it.’

It seems like the biggest limitation for this is the ‘brain’; how many channels can I simultaneously control?


Looking at “brains” to control a MoCo rig, so far these have looked like the best options;

Basic Motion Controller Bundle $480
This is a kit for converting a slider into a MoCo unit. It only controls a single axis and is primarily intended for time-lapse. Like many, its max speed is controlled by swapping motors into place, using the gearing of the motor to speed it up or slow it down.
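The motor-swap approach trades speed for torque through gearing, which is easy to see with a little arithmetic. A sketch with hypothetical numbers (none of these figures come from the kit’s actual specs):

```python
# Illustration of the motor-swap/gearing trade-off mentioned above.
# All figures are hypothetical; a real slider kit publishes its own specs.

def linear_speed_mm_s(motor_rpm, gear_ratio, pulley_circumference_mm):
    """Slider speed: output revs/sec (after gearing) times belt travel per rev."""
    output_rps = motor_rpm / gear_ratio / 60.0
    return output_rps * pulley_circumference_mm

# Same pulley and gearbox, two hypothetical motors: a fast video motor
# vs a slow high-torque time-lapse motor.
fast = linear_speed_mm_s(motor_rpm=600, gear_ratio=5, pulley_circumference_mm=60)
slow = linear_speed_mm_s(motor_rpm=30, gear_ratio=5, pulley_circumference_mm=60)
print(fast, slow)  # belt speed in mm per second
```

Swapping the motor changes the input RPM, and everything downstream scales linearly, which is why one kit can cover both video and time-lapse speeds.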

They also offer the SMART controller. $706

This offers the same single axis but allows for more control: live mode, speed ramping, stop-motion, limit switches, etc.

Timelapse Camera Slider Motion Controller $225

This is the brain behind many time-lapse rigs; it controls a single axis for slider movements and can also be made to hook up to a telescope head for pan/tilt movements.

Sadly, the model of telescope head typically used has since been updated, but the best ones are available from this site.

Kessler Crane
elektraDRIVE BUNDLE PACKAGE with ORACLE Controller $1,314.95 ($1k controller only)

This can control two axes, so either a slider or a P/T head. It appears to be set up for very complicated programming: ramping motions, real-time/time-lapse/stop motion, all that you’d expect.

DitoGear Evolution $976.47
DitoGear really has an exciting product here: up to 6-axis control, bezier controls for ramping and motions, and Wi-Fi enabled. It runs on Android devices, with an iOS version expected in 2013.

Ditogear has also been collaborating a lot with this next one…

Dragonframe software $295
IOTA controller $750

It seems there may be no option for live playback, since this is a software solution primarily for stop-motion, but through their IOTA (2-axis) controller or with an Arduino (up to 8 axes) you can interface with stepper motors for motion control. The interface seems really slick, with bezier ramping and many axes of potential control. Additionally, and this part is very cool to me, they have created an interface allowing for the control of DMX lighting, so that while controlling your camera’s move you can also have animated/programmed lighting. (I’ll be writing another post about lighting later.)

eMotimo TB3 $750

Programmable for video and stop-mo, and self-contained, it supports pan and tilt plus outputs to control a third axis. It is also firmware-updatable, with open-source architecture and a community.

CineMoco $415/$825

Slick, transportable, and self-contained, the CineMoco dolly/slider presents a good option for a single axis of motion. It is programmable for video/time-lapse/stop-motion. However, it is $425 for the dolly, and it does not appear to have a supported way to control/program additional axes of movement. I did a bit of research, though, and it sounds like they plan to allow for daisy-chaining, and the developer says it can support up to 32 axes.

Ideally this is something that can be easily broken down and brought on set, or out to a location to shoot background plates.

CamBlock $10k
The best in the ‘modularity’ category, and something I’d love to model off of, is CamBlock. For their brain they use a Pocket PC running custom software.

But at $10k for three-axis movement it’s out of my league.

Open Moco
This group of makers/hackers using the Arduino has come up with some great open-source solutions and appears very helpful; many solutions have come from here, including the MX2.

John Pilgrim
I’m sure there are a lot of people online doing similar home-made rigs, but John is one whose designs have stood out to me. His design for a pan/tilt head using stepper motors and pre-assembled gearing is pretty inspiring.

Phidgets motors

These motors are apparently very quiet and will fit in Dynamic Perception’s mounts;

Gini Slider $350

This slider looks very good, especially for the price: modular and expandable. I could definitely see hooking up a motor and belt to it very easily.


80/20 seems like a lot of fun. It’s T-slot framing: basically a bunch of extruded aluminum parts made to interlock, designed essentially like an ‘industrial erector set.’
Ebay store

Servo city
Pan/tilt $649.99

In addition to having a wide selection of stepper motors, DC motors, and real-time controllers, Servo City also offers a couple of pan/tilt heads. They look very well made and sturdy, but the price is a bit high for something which would still need a ‘brain’.

For realtime and playback they even have self-contained servo driver/controllers;

I’d hoped gathering these notes would provide me with a clear winner, but I’m still deep in the brainstorming phase. I’m tempted to start with the Gini slider to go with my current rig and piece together a modular arrangement, then work on finding a way to motorize it with Phidgets and Servo City motors, or one of the more expensive kits from DitoGear or Kessler ($629).

But deep down I realize the brain should be the starting point. The best brain sounds like it’d be a custom Arduino-based one, since the majority of the open-source options are built that way, and there look to be some good firmware starting points at Open Moco; but not being an engineer or skilled programmer, I’m hesitant to try and do that part myself. As far as prefabricated brains go, the best seem to be the DitoGear Evolution with its nearly $1000 price tag; the more affordable CineMoco, which would be great and self-contained, but most useful if I can confirm that it’ll support more than one axis of motion; or the eMotimo with its three axes and integrated pan/tilt, though I’m a little discouraged by the non-nodal pan and the need to separate the camera from the current rig.

Continued here;

Texture map tiling

Last week I brought my camera to the secret laboratory and shot a ton of texture reference of the VN armor, LINK so now I’ve been banging my head against this for the last three days.

I’m working on texture maps and I’m getting some weird edges and tiling; it seems like the model is repeating the diffuse map. I have played with the tiling, but it repeats and centers the map. I’m wondering if it’s something that can be fixed from your side: is it a normals thing or a UV map in the OBJ?

Right now I’m importing the texture in iClone and opening the texture in Photoshop: edit…update…edit… There has got to be a better way to paint this map, but I’m not sure what it is.

You can see the tiling more clearly here; it’s as if each face repeats the same diffuse map, centered on the face of the OBJ.

I’ve heard the new Photoshop can paint directly on OBJs, but I don’t have access to it right now. I tried using this UVMapper, but I’m not sure what I’m doing.

I know there must be a way to ‘unwrap’ a model to get the texture to stretch across the whole thing.
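For what it’s worth, the tiling behavior is what you’d expect if every face maps to the full 0–1 UV square, while an unwrapped model gives each face its own region of the texture. A hand-rolled sketch of what that looks like at the OBJ level (a toy example written by hand, not how iClone actually writes files):

```python
# Minimal OBJ with explicit UVs: two quads side by side, each mapped to
# its own half of the texture instead of both repeating the full 0-1 square.
# Hand-rolled for illustration; real unwrapping is done in a modeling tool.

verts = [(0, 0, 0), (1, 0, 0), (2, 0, 0), (0, 1, 0), (1, 1, 0), (2, 1, 0)]
uvs   = [(0.0, 0), (0.5, 0), (1.0, 0), (0.0, 1), (0.5, 1), (1.0, 1)]
# Faces reference v/vt index pairs (1-based in OBJ).
faces = [((1, 1), (2, 2), (5, 5), (4, 4)),   # left quad -> left half of texture
         ((2, 2), (3, 3), (6, 6), (5, 5))]   # right quad -> right half

lines = [f"v {x} {y} {z}" for x, y, z in verts]
lines += [f"vt {u} {v}" for u, v in uvs]
lines += ["f " + " ".join(f"{vi}/{ti}" for vi, ti in face) for face in faces]
obj_text = "\n".join(lines)
print(obj_text)
```

If the exporter replaces those per-face vt indices so every face points back at the full square, you get exactly the repeated, centered tiling described above.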

Minor update:

I’m pulling reference of the UV maps and it seems like that is the problem. I must be overwriting the UV data while I’m doing the textures;

You can see here that the end caps are placed over the unwrapped sides;

That is in comparison to this, which is the UV ref from a primitive generated from within iClone;

I’m still unsure how to fix this, but I feel like I must be getting closer…

Minor Update #2

I’ve spent a couple more hours putting together tests and think I’m getting closer to understanding what’s happening, but I still think there must be a better way to do this.


Here are three different tests. The first, on the left, is a primitive generated from inside iClone; the texture is wrapping properly and the UV ref seems to map correctly. However, I am NOT going to try and create the whole model with this method; the shapes are just too complicated to make without a real modeling tool set.

On the right is the first exported prop, generated from the OBJ. It seems like I must have replaced the OBJ’s UV and regenerated it through iClone, which appears to have overlaid the side and front faces, causing the tiling problem. As far as I can tell there is no way to revert this UV swap.
Using the OBJ’s UV in iClone
Replacing UV in iClone

In the center is the OBJ imported and using its native UV, which is supposed to map like this;

But as you can see it isn’t. There is a split happening in the middle and it seems to be grabbing from the ‘back’ to fill it…


Seems like I’m closer and further away at the same time.

VN Animatronic Robot (pt1)


It’s long been a dream of mine to have a real, working robot. I know a working autonomous animatronic robot with an AI is far out of my depth, but I want one that can be programmed to act in movies, run my booth at conventions, talk to people, and hand out flyers; even something like the animatronic band at Chuck E. Cheese. I have begun these videos as a behind-the-scenes look at some of the construction.

As usual I’m relying on as many found and prefab parts as I can, both to achieve the mechanical production look and to move through the construction as quickly as possible.

More detailed information on the Drab Future blog;

via Von Neuman Moderator Animatronic Robot (pt1).