iPi Recorder can also now load unsynchronised videos from multiple cameras and sync them automatically. Updated 11 December 2018: iPi Soft has released iPi Motion Capture 4.1. The update makes it possible to use the real-time preview with data captured simultaneously from multiple sensors, connected either to a single computer or to multiple computers.
iPi Recorder is a software program provided by iPi Soft LLC for capturing, playing back and processing video records from multiple cameras and depth sensors. Captured records can be used for motion tracking in iPi Mocap Studio. iPi Recorder supports a range of cameras and depth sensors, including the Sony PlayStation 3 Eye (also known as the PS Eye).
iPi Mocap Studio
iPi Mocap Studio is a software program provided by iPi Soft LLC for tracking an actor’s motion by analyzing multi-camera (or depth sensor) video recordings. Input videos are recorded with another tool — iPi Recorder.
Latest version: See iPi Studio Release Notes
Supported OS: Windows 10 / 8.1 / 8 / 7 (32- and 64-bit)
Output formats: See the full list in our wiki docs
Editions: The software comes in several editions, which differ in the kind and number of cameras supported for tracking.
Hardware: See our Accessories and Hardware page to select and order items you may need.
Free trial: The 30-day free trial has the full features of the Basic edition.
End User License Agreement
If you’ve decided to purchase, go to the Store page and follow the instructions. After purchase, you will get a license key, which enables you to upgrade the trial version to a selected edition of the software.
To remove the software, use regular software uninstall via Windows Control Panel.
The programs are from iPi Soft.
The iPi Recorder is free and can be installed on multiple machines.
The iPi Studio is not free, but it has a 30-day trial of the Basic edition (up to 2 depth sensors / 4 PS3 Eyes).
http://wiki.ipisoft.com/Main_Page contains all the different configuration option links.
This page http://wiki.ipisoft.com/User_Guide_for_Dual_Depth_Sensor_Configuration has all the information needed to set up with 2 Kinects, which is what I used. That page has a section on Valve Source Filmmaker explaining how to import DMX from iPi into SFM, copied/pasted here:
Valve Source Filmmaker
First, you need to import your character (or its skeleton) into iPi Mocap Studio, for motion transfer.
There are currently 3 ways of doing this:
- You can import an animation DMX (in the default pose) into iPi Mocap Studio. Since it has a skeleton, that should be enough for motion transfer. To create an animation DMX with the default pose, add your character to your scene in Source Filmmaker and export a DMX for the corresponding animation node:
- open 'Animation Set Editor Tab';
- click '+' -> 'Create Animation Set for New Model';
- choose a model and click 'Open';
- export animation for your model, in ASCII DMX format;
There is a checkbox named Ascii in the top area of the export dialog.
- Alternatively, you can just import an SMD file with your character into iPi Mocap Studio. For example, SMD files for all Team Fortress 2 characters can be found in your SDK in a location similar to the following (you need to have Source SDK installed): C:\Program Files (x86)\Steam\steamapps\<your steam name>\sourcesdk_content\tf\modelsrc\player\pyro\parts\smd\pyro_model.smd.
- If you created a custom character in Maya, you should be able to export it in DMX model format. (Please see Valve documentation on how to do this).
Then you can import your model DMX into iPi Mocap Studio. The current version of iPi Mocap Studio cannot display the character's skin, but it should display the skeleton, which is enough for motion transfer.
To export animation in DMX, just use the 'General...' export menu item in iPi Mocap Studio and choose DMX from the list of supported formats. You may also want to uncheck the 'Export T-pose in first frame' option on the 'Export' tab.
Now you can import your animation into Source Filmmaker. There will be some warnings about missing channels for face bones but you can safely ignore them.
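The export steps above mention the 'Ascii' checkbox; whether a given DMX actually came out as text is easy to verify programmatically. Below is a small Python sketch that checks the file's header line. The header layout shown (`<!-- dmx encoding keyvalues2 ... -->` for text, `encoding binary` for binary) is my assumption about standard DMX files, not something stated in the post; check it against your own exports.

```python
import tempfile

def is_ascii_dmx(path):
    """Return True if the DMX header line declares a text (keyvalues2) encoding.

    Assumes the file starts with a standard DMX header comment such as
    '<!-- dmx encoding keyvalues2 1 format dmx 1 -->' (an assumption).
    """
    with open(path, "rb") as f:
        header = f.readline().decode("ascii", errors="replace")
    return "encoding keyvalues2" in header

# Demo: write a minimal ASCII-style header and check it.
with tempfile.NamedTemporaryFile("wb", suffix=".dmx", delete=False) as f:
    f.write(b"<!-- dmx encoding keyvalues2 1 format dmx 1 -->\n")
    sample = f.name
print(is_ascii_dmx(sample))  # True
```

Running this on an export before carrying it over to iPi Mocap Studio saves a round trip when the Ascii checkbox was left unticked.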
Of the 3 options above I used option 1. Worked well for me.
In addition I watched a ton of tutorials as I attempted to use PS3 Eyes before I went to Kinects.
Jimmer Lins has a 7-part tutorial on YouTube (already linked on the class page) that was very helpful. (I think DMX support was added to iPi after Jimmer made that tutorial.)
The iPi community tutorials are very useful also.
You can also save a BVH and process/refine it in MotionBuilder/Maya, as you posted on the class webpage, using pre-captured animations. I used this quite a bit to remove unwanted motion from the animation.
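The "remove stuff from the animation" step doesn't strictly require MotionBuilder or Maya: a BVH is plain text, so a frame range can be cut with a short script. This is my own illustration, not part of the iPi toolchain, and it assumes a standard BVH layout (a HIERARCHY section, then MOTION, "Frames: N", "Frame Time: t", and one line of channel values per frame).

```python
import tempfile

def trim_bvh(src, dst, start, end):
    """Write dst containing only frames [start, end) of the BVH file src."""
    with open(src) as f:
        lines = f.readlines()
    # Locate the MOTION section that follows the skeleton hierarchy.
    m = next(i for i, ln in enumerate(lines) if ln.strip() == "MOTION")
    frames_idx = m + 1           # the "Frames: N" line
    first_frame = m + 3          # data starts after the "Frame Time: t" line
    kept = lines[first_frame + start : first_frame + end]
    with open(dst, "w") as f:
        f.writelines(lines[:frames_idx])                   # hierarchy + MOTION
        f.write("Frames: %d\n" % len(kept))                # updated frame count
        f.writelines(lines[frames_idx + 1 : first_frame])  # "Frame Time" line
        f.writelines(kept)                                 # surviving frames

# Demo on a toy 4-frame file.
bvh = ("HIERARCHY\nROOT Hips\n{\n OFFSET 0 0 0\n"
       " CHANNELS 3 Xposition Yposition Zposition\n"
       " End Site\n {\n  OFFSET 0 1 0\n }\n}\n"
       "MOTION\nFrames: 4\nFrame Time: 0.033333\n"
       "0 0 0\n1 0 0\n2 0 0\n3 0 0\n")
src = tempfile.mktemp(".bvh")
dst = tempfile.mktemp(".bvh")
with open(src, "w") as f:
    f.write(bvh)
trim_bvh(src, dst, 1, 3)   # keep frames 1 and 2 only
```

For anything beyond straight cuts (blending, retargeting, foot-plant cleanup) the MotionBuilder/Maya route above is still the right tool.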
I also used PS Move controllers for hands. During the semester they added support for head tracking with an additional PS Move strapped to the head. I didn’t try that. Apparently the software will support up to 6 motion controllers (PS Move and I think Wii controllers also) but I am unsure if that is for one actor or for their in-development 2 actor mocap.
I was unable to get a good calibration with 3 PS Eyes, and I gave up after a week and moved on to the Kinects. If someone wants to try the Eyes, I would suggest reading the docs on the iPi wiki thoroughly and searching the community forums before deciding. I couldn't get 4 PS Eyes to work on my laptop due to bandwidth issues.
Each Kinect requires its own USB controller. In theory two PS Eyes can run on a single controller, but if your system uses up too much of that controller's bandwidth you will have issues. My laptop's touchpad was consuming 20% of one USB controller's bandwidth, so I had to disable it and switch to a Bluetooth mouse to get the pointer off the USB bus.
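Some rough arithmetic shows why the margin is so thin. The figures below are my assumptions, not from the original post: a PS Eye streaming raw 8-bit frames at 640x480 and 60 fps, and roughly 35 MB/s of usable throughput on a USB 2.0 controller (out of its 480 Mbit/s signalling rate).

```python
# Back-of-envelope USB 2.0 bandwidth check for PS Eye cameras.
# All constants here are illustrative assumptions.
USB2_USABLE_BYTES_PER_S = 35e6  # rough practical USB 2.0 throughput

def camera_bandwidth(width, height, fps, bytes_per_pixel=1):
    """Raw video bandwidth in bytes/second for one camera (assumes 8-bit raw frames)."""
    return width * height * fps * bytes_per_pixel

one_eye = camera_bandwidth(640, 480, 60)   # one PS Eye at 640x480 @ 60 fps
print("one PS Eye:  %.1f MB/s" % (one_eye / 1e6))
print("two PS Eyes: %.1f MB/s (budget %.0f MB/s)"
      % (2 * one_eye / 1e6, USB2_USABLE_BYTES_PER_S / 1e6))
```

Under these assumptions two cameras at full resolution and frame rate already exceed the controller's budget, which is consistent with a touchpad stealing 20% being enough to break the setup.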
I had issues when I tried to set up the Kinects near 180 degrees apart. I ended up with a configuration of less than 90 degrees, which was fine for what I was doing, but not optimal for turning motions.
Because the Studio trial lasts only 30 days, I waited until the last possible day to activate it so it would cover the rest of the semester. An additional month to learn and use the software would have been helpful. The Studio is rather expensive, so I didn't buy it. It would be nice if iPi were willing to offer machinima students a 3-month trial.
The captured video files can be absolutely huge. When I switched from the PS Eyes to the Kinects, I turned off the video feed, which reduced the size of the captured files by a factor of 10: PS Eye or Kinect video capture files were about a gigabyte per minute, while depth-only data from the Kinects was about 100 MB per minute. This may or may not matter to you; it did to me because I was recording in the garage and running into my office to process. Transfer over my Wi-Fi was too slow, so I copied files to an SD card and loaded them onto the machine with Studio enabled.
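The numbers above make session planning easy to sketch. The calculation below uses the rates quoted in the post (~1 GB/min with video, ~100 MB/min depth-only); the Wi-Fi and SD-card speeds are illustrative assumptions, not measurements from the original setup.

```python
# Session-planning math for capture sizes and transfer times.
# Rates from the post: ~1000 MB/min with video, ~100 MB/min depth-only.
# The 3 MB/s Wi-Fi and 30 MB/s SD-card figures are assumptions.

def session_size_mb(minutes, mb_per_minute):
    """Total capture size in MB for a session of the given length."""
    return minutes * mb_per_minute

def transfer_seconds(size_mb, mb_per_second):
    """Time in seconds to move a capture at the given link speed."""
    return size_mb / mb_per_second

take = session_size_mb(5, 100)   # a 5-minute depth-only take
print("take size: %d MB" % take)
print("over ~3 MB/s Wi-Fi:   %.0f s" % transfer_seconds(take, 3))
print("via ~30 MB/s SD card: %.0f s" % transfer_seconds(take, 30))
```

The same 5-minute take with video enabled would be ~5 GB, which is where the sneaker-net SD card clearly wins over Wi-Fi.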
I really wish I had had more time with the iPi software to get the PS Eyes working. The capture area for the PS Eyes is 20'x20', whereas for the Kinects it is roughly 7'x7'. The PS Eyes also capture at up to 60 frames per second, while the Kinects are fixed at 30.
The processing takes quite a bit of time even on a fast computer, and it's not perfect. For simple animations it is not too bad and will do most of the work for you. I had issues with anything where my hands and arms crossed my body: the depth processing would lose them in the mass of my torso, and I would have to manually adjust and re-track.
The mocap was frustrating at first but once I figured out the basics it was really fun. The very first calibration and processing took a long time to figure out for me. Once I got it the first time though it was pretty easy to tear down and set up somewhere else.