Mocha tracks and animation of 3D models

Hi Guys,

I need to achieve the following, but I am a little confused as to how to pull it off using Mocha Hitfilm. I will have a handheld shot looking down a lake, taken from a bridge over the lake. Into that shot I want to have an aircraft fly down the lake and over the actor's head on the bridge.

From what I have seen of Mocha, it doesn't produce a camera animation when it tracks; rather, it creates a bunch of points that one can parent assets to. So my question is simple. I've seen how to put a 3D model that is fixed in place into tracked footage using Mocha. How could I track the handheld shot (with a pan) and then add the aircraft, whose own position will be animated (flying towards the camera), while also using the track of the handheld shot? I should probably also say I have PFHoe, if a point tracker would be a better option in this case.

I hope this makes sense. I have the flu at the moment so my mind is a little fuzzy. 

Thanks in anticipation

Comments

  • Actually, Mocha DOES animate the camera during a solve. If you have any Mocha-tracked solves floating around, take a look at the camera's transform data.

    So, in theory, you can take your Mocha-tracked footage into Hitfilm, just toss your model in, and animate it. In practice, since Mocha is doing a fantastic bit of fakery (attempting to generate a 3D world based on how a couple of rectangles are moving), there are potential jitter issues. The accuracy of a Mocha track is very high near any generated points but can fall off with distance (understandable, since not only is Mocha guessing based on rectangles, but motion is also limited, more or less, by the pixel grid).

    In general, try to get your tracking planes as close as possible to where your aircraft will go. Since it will be in the air, perhaps track the ground plane below where you want your plane to fly... That's not a confusing sentence. ;-)

    As mentioned, sometimes an object placed into a Mocha-tracked shot can jitter--especially if it's far away from tracked areas. A quick fix is to just delete a bunch of camera keys and let Hitfilm interpolate (there's a tiny numerical sketch of why that works at the end of this comment).

    One thing to remember about Mocha (or ANY tracker) is, of course, that Mocha has no clue what it's looking at--it's just faking a solve based on the parallax of planes. This means your tracked world can often end up far off to the side, nowhere near the origin of your 3D workspace. This Martian Brenner tutorial covers aligning your track to your workspace. A good thing to know.

    https://www.youtube.com/watch?v=32NJS_SIuFM

    Since you have PFHoe, I might say try that first, since PFHoe is kind of a click-it-and-forget-it tracker. (And if PFHoe gives you strange locations for your tracked points, you can use the same damn technique used for Mocha to align your PFHoe track to the workspace.) If the PFHoe track is solid, use that. If not, go for Mocha. Point trackers can be more accurate than planar trackers because of the multitude of points, but planar tracking can solve shots that point trackers get lost on. Since you shot your own footage, I will assume you didn't do stupid whip-pans that blur out your potential track points. ;-)

    For some really good information on Mocha, I recommend this webinar. It covers dealing with tracked areas that go out of frame, covers tracking OBJECTS in Mocha (basically, you 3D-solve the scene, then go back and track the objects), and shows a neat trick for "tricking" Mocha into solving a "static" camera, which is great for tracking objects. The 3D model parts use C4D, but you should easily be able to translate that to Hitfilm.

    https://www.youtube.com/watch?v=EI1SQbfL-4w

    I feel for you on the flu. I don't have that, but my allergies are acting up and I have a Benadryl high.
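
    Regarding that "delete keys and interpolate" fix above, here's a rough, purely illustrative sketch (plain Python/NumPy with made-up numbers, nothing to do with Hitfilm's internals) of why thinning keyframes smooths things out:

        # Illustration only: why deleting most of the camera keys and letting
        # the software re-interpolate tends to smooth out tracker jitter.
        import numpy as np

        frames = np.arange(25)
        smooth_move = 0.1 * frames                    # the "real" camera motion
        jitter = 0.05 * np.random.default_rng(1).standard_normal(frames.size)
        solved = smooth_move + jitter                 # a noisy solved position channel

        kept = frames[::6]                            # keep every 6th key, delete the rest
        rebuilt = np.interp(frames, kept, solved[kept])  # linear interpolation between kept keys

        # Between the kept keys the rebuilt curve is a straight line, so the
        # frame-to-frame wobble there is gone; only the error at the kept keys survives.
        print(np.round(solved - smooth_move, 3))
        print(np.round(rebuilt - smooth_move, 3))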

  • Thanks @Triem23, rapid and fantastically detailed as always. I stand corrected on the camera animation thing. I have to admit I have very limited experience with tracking. I haven't shot the footage yet, but of course I was going to avoid any such whip pans. ;-)

    Thanks again for the help.

  • Np. One other note. Make sure you always have some shore or whatnot in shot. Don't fill the frame with water. Reflections and water movement are confusing for trackers. Trees, grass, buildings, bridge, clouds--all of these types of objects provide good tracking areas. 

  • edited February 2016

    @Triem23 Probably deserves its own thread, but for what sort of shots is a planar tracker 'better' than a point tracker? I see multiple tutorials for AE where they just "track the scene" with a few hundred points, and then they're free to create arbitrary planes all over the place by selecting any three points they think are on a plane and attaching whatever they want to them. It seems a lot less fiddly than what we have to do with Mocha (and we still get problems), and they only have to do it once per scene instead of once per plane.

    This Video Copilot tutorial was pretty interesting (not a clickable link, so there isn't a big AE video embedded in the thread; just cut'n'paste it) ;) :

    www.youtube.com/watch?v=ZiXdcziloG8

    VideoTrace uses points, and is a doddle to use. It's a shame it seems to have been abandoned. The models it produces load into Hitfilm just fine, although I've not yet found any way of translating its 3DS Max camera path data into the Maya format Hitfilm recognises. If I could, extracting 3D models (with the correct textures) from a video and re-overlaying them for everything else to interact with realistically would be really, really useful to me.

    Currently, the only way to do it is to use a dolly shot to let it do its 3D shape extraction voodoo, then go back, film from the original static point on a tripod, overlay the models you get on top of that video, line them up by eye, and never move the camera again. That works as a proof of concept, but it's a bit limiting, as you could do most of it with 2D masks.

    I guess what I'm saying is: it would be nice to have the choice and have a point tracker in Hitfilm. Even just adding a third point to the current system and using a bit of maths to calculate a 3D plane would be really, really handy. You can sort of fudge the effect with Quad Warp and 4 tracked corners, but it's not quite the same. ;)
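
    For what it's worth, the "bit of maths" really is small. Here's a rough, purely illustrative sketch (plain Python/NumPy, not anything Hitfilm actually exposes): three non-collinear tracked points pin down a plane via one cross product.

        # Illustration only: a 3D plane from three tracked points.
        import numpy as np

        def plane_from_points(p0, p1, p2):
            """Return (origin, unit normal) of the plane through three points."""
            p0, p1, p2 = (np.asarray(p, dtype=float) for p in (p0, p1, p2))
            normal = np.cross(p1 - p0, p2 - p0)
            length = np.linalg.norm(normal)
            if length < 1e-9:
                raise ValueError("Points are (nearly) collinear; no unique plane.")
            return p0, normal / length

        # Three points on a slightly noisy "tabletop":
        origin, normal = plane_from_points([0, 0, 0], [1, 0, 0.01], [0, 1, -0.02])
        print(origin, normal)  # normal comes out roughly (0, 0, 1)

    From there, orienting a layer to that normal is just a rotation, which is why the idea feels like it should be cheap to add.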

  • Triem23 Moderator
    edited February 2016

    @Palacono Also notice those point tracker shots are usually fairly slow and evenly lit. Once you start getting motion blur in the shot, point trackers get hosed.

    I certainly would never object to a full point-cloud tracker in Hitfilm, but I want scopes first. ;-)

    Actually I like the idea of adding a third point to the existing tracker. This wouldn't be accurate enough for full animations, but would be great to hang a static model or particle system from. 

  • edited February 2016

    @Triem23 Yep, a 3D plane from the 3 points would do for lots of things: occlusion masks, simple texture replacement, models on tables, etc.

    Maybe making the camera static and having the world move around it would make the maths simpler (rough numerical sketch at the end of this comment)? Then you'd get an arbitrary plane that just had a lot of keyframes for the angle changes and origin position, which is pretty much all you need for a lot of things.

    Even if it wasn't particularly accurate, you'd get some usable 3D movement, and you could use Quad Warp on the plane to keep any wandering corners locked in place (like a texture replacement on a wall with perspective) by having them track some (other?) points on the surface. Then, presumably, that Perspective tick box would earn its keep and the texture would look correct, which it doesn't when applied to 2D points.

    An extra point tracker and a bit of maths that's probably in the Public Domain? Even easier than scopes! :D
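
    Here's the rough sketch of that static-camera idea (plain Python/NumPy, made-up transforms, purely illustrative and nothing like how Hitfilm actually stores keyframes): invert the camera's per-frame transform, bake it into the world instead, and the relative motion is identical.

        # Illustration only: a "static camera" is just the world carrying
        # the inverse of the camera's motion.
        import numpy as np

        def translation(x, y, z):
            m = np.eye(4)
            m[:3, 3] = [x, y, z]
            return m

        # Pretend tracked camera drifting along +X over three frames:
        camera_keys = [translation(0.0, 0, 0), translation(0.5, 0, 0), translation(1.2, 0, 0)]

        # Equivalent version: camera fixed at the origin, the world keyframed instead.
        world_keys = [np.linalg.inv(m) for m in camera_keys]

        point = np.array([3.0, 0.0, 0.0, 1.0])  # some point sitting in the scene
        for m_cam, m_world in zip(camera_keys, world_keys):
            seen_by_moving_camera = np.linalg.inv(m_cam) @ point
            seen_by_static_camera = m_world @ point
            assert np.allclose(seen_by_moving_camera, seen_by_static_camera)
        print("Same relative motion either way.")

    Whether that actually makes Hitfilm's side of it any simpler, I have no idea, but the numbers at least line up.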

  • True, but scopes is one I have classified as a *need*, where a 3D point-cloud track is a *want.* ;-) Right now it's difficult to get fully accurate color without scopes (I do stuff for TV stations; broadcast levels are 16-235, not 0-255), and that is a huge deal for *every* user. (I also have XML on the "need" list even though I wouldn't use it. But that, and more export options, would enable Hitfilm to "play well with others" and become a fully integratable tool in a larger facility.)

    All that said, two of the four times I've gone into AE in the last two years were for 3D point tracks on shots I just couldn't do in Mocha. The other two were me working with another artist who was in AE.

  • edited February 2016

    @Triem23, I hear you, which is why I do most of my initial colour correction in GoPro Studio Premium. I could go elsewhere, but it's a doddle to mess about in GP Studio in one window, save the file, and the LUT is updated and picked up in the Cineform AVI file in the video editor, which, mercifully, Hitfilm and Sony Movie Studio both like playing with, even if Hitfilm is a bit slow to recognise the new LUT. The only problem is that GPSP crashes a fair bit, and they've recently killed it off in favour of the free version, which has fewer colour correction tools than Hitfilm. :( It does have the Flux feature though, which is nice, but I need two PCs as they can't coexist on the same machine.

    But a 3-point tracker? Someone could knock that up in an afternoon. :)

    And is that 16-235 still true for everything that's broadcast? I'm always seeing adverts for TVs which have "the blackest blacks and the whitest whites".... 

  • @Palacono For broadcast TV stations, yes, because values under 16 and above 235 still have subcarrier waves. Also, at least in the US, broadcasters still have to be ready for SD (the stations I work for are local govt.--low budget; we're digital, but still SD), so HD stuff is still prepped to meet 16-235 for broadcast. Internet (including Netflix), DVD and Blu-ray can go 0-255, but the 16-235 range is still the mastering standard.

    That said, I still tend to limit most values to 8-235 even if I'm going for the internet. That gives me a little headroom in the grading process when I start adding contrast curves or glows, so I can deepen and brighten without clipping. So in many ways it's actually still better to pretend those limits exist. After all, 8-235 is a little wider than 16-235, but it still leaves some wiggle room.
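
    For anyone who wants the actual arithmetic, here's a tiny, purely illustrative sketch of the full-range vs. broadcast-range mapping (8-bit values only, and not how any particular NLE implements it):

        # Illustration only: mapping 8-bit full range (0-255) to the
        # broadcast/"studio" range (16-235) and back.

        def full_to_studio(v):
            """Squeeze full-range 0-255 into broadcast-legal 16-235."""
            return round(16 + (v / 255) * (235 - 16))

        def studio_to_full(v):
            """Expand 16-235 back out to 0-255."""
            return round((v - 16) / (235 - 16) * 255)

        print(full_to_studio(0), full_to_studio(255))   # 16 235
        print(studio_to_full(16), studio_to_full(235))  # 0 255
        # Values outside 16-235 land outside 0-255 after that expansion and
        # get clipped, which is why leaving a little headroom is cheap insurance.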

  • @Palacono Yeah, it's still true for broadcast sources, and it's illegal to broadcast values outside 16-235. Like Triem23 pointed out, non-broadcast sources aren't limited that way and typically go full range. The gotcha is that sites like YouTube and Vimeo assume everything uploaded is 16-235 and expand it even if it's already full range, so you always want to give yourself some wiggle room. I know you're using VMS instead of Pro, but this guide pretty well explains everything from a Vegas perspective:

    http://www.jazzythedog.com/testing/DNxHD/HD-Guide.aspx
