
iPi Desktop Motion Capture In Review

Feb 06, 2012 at 12:00 am by Store Staff


Product Review: iPi Desktop Motion Capture Standard Edition

Although motion capture systems have been available for quite some time, I've never had hands-on experience with one. iPi Desktop Motion Capture is the first motion capture application I've ever used, so I lack experience with mocap systems in general and won't be able to make comparisons.

Unlike other motion capture setups where you get both cameras and software, iPi Desktop Motion Capture is a software-only solution, and you have to take care of the capturing hardware yourself. The software consists of two main applications: one to capture the video data, and the other to perform motion analysis and movement tracking. You perform your motion capture on video you have previously recorded, and iPi Desktop Motion Capture works by recognizing and analyzing your performer's movement in that video, so you don't need special suits, LED markers, and the like.

The motion capture process is pretty simple. First, you use the iPi Recorder application to shoot video with your cameras. You can either shoot the performer directly, or shoot a calibration pass first so the software knows where the floor is and what the capture volume will be. You can use several cameras for this (up to six, depending on the edition you're using), and the result is a single video containing all the different perspectives side by side.
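To give an idea of the kind of planning involved (this is my own rough sketch in Python, not anything included with iPi Recorder), evenly spacing the cameras around the performer is easy to work out ahead of time; the camera count, radius, and mounting height below are purely illustrative values:

import math

def camera_ring(num_cameras, radius_m, height_m):
    """Evenly space cameras on a circle around the performer at the origin.

    Returns (x, y, z) positions in meters, with y up. All values here are
    illustrative; adjust them to your own room and camera count.
    """
    positions = []
    for i in range(num_cameras):
        angle = 2.0 * math.pi * i / num_cameras
        positions.append((radius_m * math.cos(angle), height_m, radius_m * math.sin(angle)))
    return positions

# Example: six cameras, 3 m from the actor, mounted 1.5 m high.
for idx, (x, y, z) in enumerate(camera_ring(6, 3.0, 1.5), start=1):
    print("Camera %d: x=%.2f m, y=%.2f m, z=%.2f m" % (idx, x, y, z))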

When you're done, you import your video into iPi Studio to perform the motion capture. If you shot a calibration pass, you process it first so the software can determine the scene parameters, such as the floor plane and the camera positions (you can also set the camera locations manually, so it's a good idea to always take note of values such as camera height and distance from the actor). The scene calibration can be stored in a file for future use. When this step is done, you import your performance video and start capturing.
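Since those measurements are easy to forget between sessions, something as simple as the following note-keeping sketch is enough to reproduce a setup later (this is my own illustration in Python; the field names are not iPi Studio's calibration format):

import json

# Hypothetical note-keeping for a capture session; the field names are my own,
# not part of any iPi Studio file format.
setup_notes = {
    "session": "test_capture_01",
    "cameras": [
        {"id": 1, "height_m": 1.5, "distance_to_actor_m": 3.0},
        {"id": 2, "height_m": 1.5, "distance_to_actor_m": 3.0},
    ],
    "floor_marker": "taped X at the center of the room",
}

with open("capture_setup_notes.json", "w") as f:
    json.dump(setup_notes, f, indent=2)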

As I said before, iPi Studio uses image recognition to perform the motion capture, and this is what makes using the software tricky. It's better to perform in front of a light-colored, uncluttered background and to dress in flat-colored clothing that offers some contrast (for example, a long-sleeve red shirt with yellow or blue pants and green socks) so the software can easily pick out the performer's features. It's recommended to start your shoot with the performer in a T-pose, so you can match the software's dummy character to your performer in both pose and color. Luckily, the software has commands that help in this area, as it will very closely match your character's pose and wardrobe colors to those of your performer (you do have to manually align the character with the performer before doing this, though).
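Just to illustrate what I mean by contrast (iPi Studio's actual color recognition is certainly more involved than this), a plain RGB-distance check like the one below is enough to flag obviously poor wardrobe choices before you shoot; the colors and the threshold are arbitrary examples:

import math

def rgb_distance(c1, c2):
    """Euclidean distance between two RGB colors (0-255 per channel)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))

# Illustrative colors only; the 100.0 threshold is an arbitrary rule of thumb.
background = (230, 230, 230)   # light, clear background
wardrobe = {"shirt": (200, 30, 30), "pants": (40, 60, 200), "socks": (30, 160, 60)}

for name, color in wardrobe.items():
    d = rgb_distance(color, background)
    verdict = "OK" if d > 100.0 else "too close to the background"
    print("%s vs background: distance %.0f (%s)" % (name, d, verdict))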

Matching colors is really important, and it's also really tricky. When I first started working with the software, I ran into problems constantly, and most of the captured data was completely useless, because I had not matched my character to my performer.

Tracking the data usually takes a long time (on my system, each frame took from 2 to 4 seconds, so a 300-frame animation could take up to 1,200 seconds, or 20 minutes), and the resulting motion data is usually jittery, so you need to apply a softening filter (included as part of iPi Studio's toolset). Once capturing is done, you can use the motion data in MotionBuilder, 3ds Max, iClone, and others. I chose the MotionBuilder exporter, since that's what I now use for character animation, and inside MotionBuilder I could use the data in any way the application allows.
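For planning purposes, that per-frame tracking cost translates directly into total tracking time, so a quick estimate (just reproducing the arithmetic above in Python) tells you what to expect for longer shots:

def estimate_tracking_time(frame_count, seconds_per_frame):
    """Return total tracking time as (minutes, seconds)."""
    return divmod(int(frame_count * seconds_per_frame), 60)

# The numbers from my system: 300 frames at roughly 2 to 4 seconds per frame.
for spf in (2, 4):
    minutes, seconds = estimate_tracking_time(300, spf)
    print("300 frames at %d s/frame: about %d min %d s" % (spf, minutes, seconds))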

I said iPi Desktop Motion Capture is a software-only solution because it doesn't include any kind of capturing hardware. iPi Soft lists some cameras you can use for video capture with iPi Recorder, but there is no "official hardware." The recommended hardware is either the PlayStation 3 Eye camera or the Microsoft Kinect, and iPi Soft provides drivers for both devices, so you don't need to resort to any kind of hacks on your computer. I don't own a Kinect, so I can't say how good or bad its captured video is. The PS3 Eye, on the other hand, can shoot at high frame rates, which this kind of motion capture needs, but it's still a webcam, so it doesn't offer the crisp, high-resolution video you'd get from a dedicated camera. However, for markerless motion capture camera resolution is not critical, and the PS3 Eye yields pretty good results.

Motion-captured data always needs some tweaking afterwards, so if you have a 3D application it's always a good idea to go through the captured data after it's done, to fix issues that weren't captured correctly or to push the animation further (for example, maybe your actor performed a kick, but you want the digital character to kick higher or jump faster).
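As a toy example of that kind of tweak (plain Python on a list of angles, not tied to any particular 3D application's API), exaggerating a kick can be as simple as scaling a joint's rotation curve away from its rest value:

def exaggerate_curve(values, rest_value, factor):
    """Scale keyframed values away from a rest pose to exaggerate a motion.

    values: sampled joint rotation per frame, in degrees
    rest_value: the joint's neutral value
    factor: 1.0 leaves the motion unchanged, greater than 1.0 exaggerates it
    """
    return [rest_value + (v - rest_value) * factor for v in values]

# Hypothetical hip-rotation samples from a captured kick, in degrees.
kick_curve = [0.0, 15.0, 40.0, 70.0, 55.0, 20.0, 0.0]
higher_kick = exaggerate_curve(kick_curve, rest_value=0.0, factor=1.3)
print(higher_kick)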

As I said before, I can't compare iPi Desktop Motion Capture to any other motion capture system, because I've never had any other experience capturing motion. iPi Studio does a pretty good job, though, and can be a very affordable motion capture solution for anyone on a low budget, but setting up the entire system is not an easy task. You also need to keep in mind that you can only capture the larger body motions; smaller details, like fingers, will still need to be hand-animated. And while larger, more complex motion capture systems may be able to capture such fine details, they also cost far more. For this reason, iPi Desktop Motion Capture should be at the top of your list if you're an indie animator, filmmaker, or game developer, as it will surely be a great time saver, even if you need to tweak your animations afterwards. If you're still not convinced, you should definitely try the demo.

For more info, please visit:



Sergio Aris Rosa [nemirc] is Sr. Staff Writer for the Renderosity Front Page News. Sergio writes about computer graphics software, animation techniques, and technology. He also hosts interviews with professionals in the animation and cinematography fields. You can follow him on Twitter, and if you want to see what he's up to, you can visit his blog.


