Does Hitfilm Pro have the tools to fit into a "Professional" work environment?

This discussion was created from comments split from: HitFilm Pro 2017... great for video\VFX, not so much for motion graphics :(.

Comments

  • @Triem23 @Aladdin4d

    Does HitFilm properly support the workflow a professional in the industry would require to get the job done? 

    I imagine TIFF would be the way to go for everyone who doesn't have a really powerful computer, since it stretches beyond previz, or is that a false claim?

    I have seen in other threads on here that big studios actually use HitFilm. I believe Agents of S.H.I.E.L.D. is one of the shows that use HitFilm for some stuff (correction: yes, it's true), which is interesting to me. I'm guessing that, since that was a while ago, they could do more than just previz with HitFilm 2017 compared to what they were able to do with HitFilm 3 Pro, or would EXR still be a limiting factor?

    I'm referring to these comments: 
    3D artists need exr. No exr, no pro artist will use hf.
    Well only those that render with png, jpegs, amateurs. And I'm interested to hear which workflow would exr not be desirable working as a 3d artist.


  • Triem23 Moderator
    edited December 2016

    @CNK this is a tough question to give a definitive answer to, since different studios will have different workflows and different tools. I can only make scattershot observations and attempt intelligent speculation. 

    First, Aladdin and I do make much, if not all (I wouldn't know Aladdin's full work schedule, of course), of our income in video production. However, I do live events (multicam switched production), weddings/private events, and the occasional (minor and local) TV work with the occasional web project. I am basically a one-man band, therefore I don't usually have to worry about passing data, EDLs and work files to others. I am responsible for my own output, whether to DVD, Blu-Ray or Web. For TV work, I've had to worry about matching broadcast limitations, but the stations I've done stuff with, believe it or not, actually have WMV files in the broadcast stream, so giving a final delivery in mp4 (but at a fairly high bitrate--say 50mbps) has actually been acceptable. Yes, I have done shots for television entirely end-to-end in Hitfilm. Yes, I am "professional."

    My toolkit encompasses Vegas Pro, Premiere, Ae, Boris FX (a standalone compositor--little brother of Boris RED), Photoshop, Blender, PD Howler, and a lot of other programs. Basically no single tool can, or should, be able to do absolutely everything. For one thing, the overhead required would be ridiculous, for another, the interface would be bloated beyond belief, and certain ways of approaching data are mostly mutually exclusive (I know you keep hoping for nodes in HF, but to do so FxHome would effectively have to create an entire module separate from the existing Composite Shot since Composite Shot layers have a rigid linear flow and nodes don't. It's very unlikely to happen). 

    The average Hitfilm user is also a one-man-band. Many Hitfilm users are obsessed with trying to use only one program. Studios don't. A single shot might bounce through Maya, Houdini, Fume, Ae, Premiere, and Resolve (among others), and between Mac, Windows and Linux systems. 

    Now. Hitfilm DOES HAVE the output formats to fit into a multi-platform workflow, but there are extra steps involved. You are correct that TIFF is an often-used format for 8 and 16 bit/color channel workflows, but a studio would have tools to convert PNG or OpenEXR. The problem is this is an extra step requiring extra time, thus money. OpenEXR is used more and more, however. Hitfilm doesn't export audio-only, so the best workaround is turning off video tracks, dumping uncompressed AVI and extracting the audio. Studios have the tools, but, again, it's extra time and money. The lack of XML makes it nigh-impossible to transfer EDLs (a studio would have the resources to write custom tools--if they wanted to spend the time and money). That said, XML will transfer cuts only, with no transitions or effects. The native tools for various software are usually proprietary, and "generic" standards like OpenFX are not universal (Ae plug-ins are proprietary, for example. Any third-party plug-in creator has a separate version for Ae from OpenFX.). 

    Now add in that Nuke, Fusion and Ae have been the standard tools for decades, and the truth is that Hitfilm is NOT going to be a standard studio tool for final production. That's fine; that's not what FxHome is trying to do. Understand Adobe does not care about the indy producer. Their subscription model works because, after 20 years in the industry, the studios WILL pay up, and a single studio generates dozens, if not hundreds, of licenses. 

    Back out studios and Hitfilm possibly has a larger user base. Express is a hell of a giveaway, even without add-ons. 

    So, Hitfilm (probably) isn't going to be making final footage for Michael Bay. That said, as you've noted, some studios are using Hitfilm for previz. That's great! That's an important and valuable task. For final shots--well, there's Ignite. That's what Ignite is for--getting studios (and indys on other NLEs) to use Hitfilm tech. Simon Jones once mentioned a major studio (FxHome declined to give specifics) needing Ignite tech support. He specifically noted the Gunfire effect. Given that most movie and TV gunfire is digital, we're probably seeing a lot of Hitfilm gunfire out there, but it's coming from Ignite in another host. 

    This doesn't mean Hitfilm lacks value to a young artist hoping to work on big-budget shows. As a layer-based compositor, Hitfilm is very similar in capabilities and workflow to Ae. Good technique and organization in Hitfilm will directly translate to Ae. Learning one will enable one to pick up the other quickly. Changing platforms is 30% changes in terminology (point/null, Composite Shot/precomp, parent/pick whip, etc.), 50% UI/workflow changes (in Ae you use a context menu to use one layer as a matte for another; in Hitfilm we use the Set Matte effect), and 20% adjusting to tools the other lacks. For a beginner or hobbyist I would recommend Hitfilm over Adobe because, for half the price of a one-year all-Adobe subscription, you get about 75% of the power of a Premiere/Ae combo without being locked out of your own archives if you decide to stop paying. For a working pro I would still recommend looking at the HFP demo--that artist might find they prefer Hitfilm for many tasks. Perhaps that artist likes some effects, but not the workflow? That artist can get Ignite separately (a friend of mine who is a team lead at The Mill [L.A. office] has a personal copy of Ignite for Atomic Particles and Gunfire. He's recommended Ignite to the higher-ups, but I don't know if they bought.). If that artist likes HFP, well, it comes with Ignite, so that also expands their other software!

    Now let's hit this one:


    3D artists need exr. No exr, no pro artist will use hf.
    Well only those that render with png, jpegs, amateurs. And I'm interested to hear which workflow would exr not be desirable working as a 3d artist.

    That's Chibi, and as you read threads, I'm sure you've noticed me saying he's wrong. Period. Remember, *I* do TV work, and, yes, I've used PNG sequences on final shots. I wasn't getting paid enough to work in EXR, and chibi is totally ignoring previz and is obviously ignorant of how crappy broadcast streams are. A digital broadcast of HD material is usually about 10-15 mbps at 8 bits/color channel. This is about 50% higher than a YouTube stream. For OpenEXR he's talking about OpenEXR with aov--which is simply a way to embed multiple alpha channels into a single file. OpenEXR dates to 2001, with aov added in November 2013. There is zero functional difference in rendering/compositing multiple file passes and having a multi-channel file. Basically it makes things a little bit faster and a lot easier to organize. When chibi implies lack of this one feature makes Hitfilm unusable, he's wrong, and is confusing a feature he would like for something critical. 

    This is a common problem for you young whippersnappers--to be a grumpy middle-aged man here, you kids have grown up with such amazing and advanced tools that you're used to a "one button" solution and don't have the background in older/other methods to do workarounds. I'll give a few examples:

    Example(s) one: Ae has "Turbulent Displacement," while Hitfilm has Heat, Smoke, Fluid and Energy Displacement. All of these effects are a Displacement filter with a built-in fractal noise generator. One can get the exact same effect by embedding a comp of some animated Fractal Noise as a source layer for Displacement. These specific Turbulent/Heat/Energy/Fluid/Smoke Displacements combine two tools into a single interface for speed--but understanding how these effects work lets me create those looks in software that doesn't have those particular effects. Vegas, FCPX, Motion, Premiere, Resolve and Avid don't have Turbulent Displacement--but they all have Displacement and Fractal Noise, so one can get the same effect in those programs. Turbulent Displacement is convenient, but it's not critical. Displacement shifts pixels on the X/Y axes from 0 to the max number of pixels specified, based on the selected channel of a control map layer. Functionally, there is zero difference between creating a black and white Fractal Noise as a control map and using the built-in Fractal Noise in Turbulent/Heat/Smoke/Energy/Fluid Displacement. One method is fast and easy, but creating a custom control map is far more powerful. Another related example is the Title Studio in BCC 10/HFP 2017. All this tool does is combine the regular Extruded Text, Type on Text and Layer Deformer into a single effect, so one doesn't have to chain them. It's a bit more convenient, but it doesn't do a damn thing you can't already do in BCC9/HFP4.
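    The principle is easy to sketch outside any particular host. Here's a minimal NumPy version (my own hypothetical sketch, not HitFilm's or Ae's actual implementation) of a displacement filter driven by a grayscale control map, where random noise stands in for animated fractal noise:

    ```python
    import numpy as np

    def displace(image, control_map, max_shift):
        """Shift each pixel along X/Y based on a grayscale control map.

        control_map values lie in [0, 1]; 0.5 means no shift, while 0 and 1
        shift by -max_shift and +max_shift pixels respectively.
        """
        h, w = image.shape[:2]
        ys, xs = np.indices((h, w))
        # Map control values [0, 1] -> integer pixel offsets [-max_shift, +max_shift]
        offset = ((control_map - 0.5) * 2 * max_shift).astype(int)
        src_x = np.clip(xs + offset, 0, w - 1)
        src_y = np.clip(ys + offset, 0, h - 1)
        return image[src_y, src_x]

    # Any grayscale layer works as the control map -- fractal noise is just
    # one convenient choice; random noise stands in for it here.
    rng = np.random.default_rng(0)
    img = rng.random((64, 64))
    noise = rng.random((64, 64))   # stand-in for an animated fractal noise layer
    warped = displace(img, noise, max_shift=8)
    ```

    Swap the noise for a depth map, a gradient, or a photo and you get the glass, heat-haze, or fake-3D looks mentioned later in this thread; that's the power of a custom control map.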

    Example 2: I do a lot of multicam editing, and Vegas is my primary editor (especially with the multicam in Ultimate S for Vegas Pro). Now, I've used Vegas since version 1. I started using Ultimate S in Vegas 6, and Vegas itself didn't add a multicam tool until Vegas 8. Hitfilm has no multicam tools. This doesn't mean you can't edit multicam in Hitfilm, but you have to do it the "old-fashioned" way--multiple tracks and lots of slicing, then moving clips to a top track. It's a lot slower, but that's not the same as being unable to do it at all! In fact, I have a buddy who is a Vegas editor who doesn't have Ultimate S and hates Vegas's native multicam. To this day he genuinely prefers slicing clips and manually moving his cuts to a top track. The annoying thing to me is that I am a far better editor--I'm faster, my cuts are tighter, and he basically has me do all his grading--but he gets more, and higher-paying, gigs than I do. He's a charming bastard where I am pedantic and piss people off. Talent and skill are NOT the most important part of getting the job. 

    OK, Example(s) Three: There's a Sam & Niko (Corridor Digital) vlog where one of their editors (I want to say Wren, but don't quote me on that) is talking about issues he'd had with a media file. Problem was it was an old M-JPEG encoded file, it was interlaced, and, not only did they not know how to deal with this file, they weren't aware of the EXISTENCE of this type of file. Now, Corridor Digital is a group of talented, imaginative guys who do truly excellent work, but--kids in their 20's who were completely lost dealing with a bit of stock footage from 2009? These holes in their knowledge would absolutely kill their chances of getting a lot of positions at a lot of places. Anyone working in broadcast, stock media, archiving, restoration, etc is going to need a working knowledge of older formats, and that knowledge will remain viable for decades more. Fortunately for Corridor, their business doesn't deal with these tasks, and Corridor has attained a well-deserved level of success. But, man, sometimes watching a breakdown or BTS I want to drive up there and impart some old knowledge that would make some of what they do easier. To be fair, those guys could teach me a lot about the new techniques emerging.

    Sorry, as I so often do, I drifted off topic. ;-) Still, understanding underlying principles is, in my opinion, very important. You've read enough of my responses and posts by now to see I often try to give background explanations rather than a simple "follow these steps." Teaching someone a series of steps teaches a specific, narrow thing. Teaching someone the WHY of those steps gives a foundation to grow on your own. Take the discussion on Displacement vs Turbulent/Heat/Smoke/Energy/Fluid Displacement in this thread--talking about Turbulent/Heat/Smoke/Energy/Fluid Displacement teaches someone how to use that particular effect. Telling someone how and why those particular effects work teaches how to create those effects in software that doesn't have them, and, if one is being clever, hopefully points out how one could use other types of control maps with basic Displacement to create custom effects! Understanding control maps allows Displacement to create Predator invisibility effects, glass looks, or even generate fake 3D in a 2D image! 

    Back on topic. To summarize, Hitfilm CAN fit into a multi-program team environment. However, it lacks certain functions that would make it easier. Hitfilm won't replace Ae or Nuke or Resolve in Hollywood anytime soon (even if it had XML import/export, audio export, OpenEXR aov, DNxHD export, RAW/R3D import, etc) since these programs are already so ingrained in the system. But Hitfilm can find a place in supplemental tasks (previz), and Ignite brings Hitfilm effects into other programs. Learning Hitfilm and developing good work habits is still valuable if/when moving to competing layer-based compositors.

    Lack of support for OpenEXR aov shouldn't be a deal breaker unless one is lazy or completely ignorant of workflow more than three years old. 

  • Triem23

    If you want to go ahead and stay in previz and render pngs then that's your choice. But you don't do 3d for a living as you have posted.
    Most professional studios, 3d artists will not work with pngs as a render sequence. Unless they are way behind. Even for still images 3d artists want floating point images with all the buffers accessible in it.
    Now even mid level studios are moving to Deep image compositing.
    If Hitfilm can't even support exr properly they won't be considered in vfx studios mid to highend. Its target audience will remain in the low and indie sector.

    And I've been rendering exr with embedded buffers in vray and lightwave for more than a decade. 2013 what??

  • "There is zero functional difference in rendering/compositing multiple file passes and having a multi-channel file."


    Have you tried manipulating a depth buffer that is 8, 16, 32 bit float?
    You will see the difference between them. :D
    There is a reason why 32bit float is used a lot for compositing.

  • Additional reading about exr 2.0 and deep compositing.
    https://www.fxguide.com/featured/the-art-of-deep-compositing/
    It's trickling down to mid level studios now.

    Hitfilm doesn't even support basic exr buffers yet after three major versions since HF2. It's hard to replace ae, fusion, nuke for compositing in the pro environment without it. Even the one man 3d artist today works with buffers for compositing. But not pngs.

  • @chibi I'll admit I am possibly in error here. It may be that OpenEXR 1.0 had provisions for multichannel rendering (I never used OpenEXR until 2014). However, the press release for OpenEXR 2.0 lists Deep Data at the top of the "New Features" list. Not my fault if the marketing guys listed something as brand new that wasn't. Here's a relevant article. OpenEXR.com lists Nov 2013 as the release date. If I am in error, I can say the OpenEXR devs released bad information. 

    There's still little functional difference between a multichannel file and multiple single files other than the time savings of only opening one file instead of many, but the bottom line is Hitfilm can load, composite and export OpenEXR. Any mid-to-large studio will have the software to either extract individual channels from a Deep Comp or combine multiple passes to a Deep Comp. Yes, it's another step, but you're still arguing that Hitfilm can't be used at all in an OpenEXR workflow, when it can. An extra step or two is inconvenient, but that's not the same as not being possible! 
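    That "little functional difference" claim is easy to check in the abstract. A small NumPy sketch (with made-up beauty and ambient-occlusion passes standing in for EXR channels, and a dict standing in for a multi-channel file) composites the same passes both ways and lands on identical pixels:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    h, w = 32, 32

    # Two hypothetical render passes: a beauty pass and an AO pass.
    beauty = rng.random((h, w, 3))
    ao = rng.random((h, w, 1))

    def comp(beauty_pass, ao_pass):
        # A simple multiply comp: darken the beauty pass by the AO pass.
        return beauty_pass * ao_pass

    # Workflow A: one "multi-channel file" (a dict standing in for it here).
    multi = {"beauty": beauty, "ao": ao}
    result_a = comp(multi["beauty"], multi["ao"])

    # Workflow B: the same passes loaded from separate single-pass files.
    result_b = comp(beauty.copy(), ao.copy())

    assert np.array_equal(result_a, result_b)  # identical pixels either way
    ```

    The container changes how many files you open and how tidy the project bin is; it doesn't change the math the compositor performs on the passes.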

    You're also still talking about mid-to-high level studios when FxHome has clearly stated their target audience is independents and small studios! They're not trying to compete with Adobe, Fusion and Nuke! I mean Pro users are pretty much subsidizing Express users. 

    And it's absolutely true that my 3D work is limited compared to general edits, vfx and mograph, but, um, yeah, I do know industry people working in previz in 2016 rendering PNG. Your "if you want to stay in previz" comment is a bit insulting to those artists. Remember, previz ends up being used to determine production shot lists and how final versions are set up. Compare many modern movies from previz to final and you'll see that the final usually matches the pacing, angles and movement down to the frame. Among other things, previz is used to avoid spending expensive time building extraneous frames--and the side benefit here is that the previz team gets some of the most creative work! These are the guys who determine what the shot is going to be, and, increasingly, previz assets are being handed off to the effects teams to refine. "Through the Looking Glass" and "Dawn of the Planet of the Apes" are two recent movies where the models, animation data (including mocap) and camera data got passed up the chain from previz. In short, previz got to create the shot. The effects teams got to make it pretty. 

    And they still used PNGs for animatics, because when you're going back and forth with the director and producer designing the cool shot you work fast and cheap. Let the effects guys spend lots of money making someone else's animation look finished. :-)

    Fundamentally, I agree Hitfilm should add Deep Data support. I still maintain that the current lack isn't a total deal breaker for the software. I'd rather see mp4 handling on the timeline go from laggy to smooth, XML import/export* and some kind of audio-only export. All three of those would benefit more users (and be of more value to a studio) than Deep Data. 

    *I wouldn't ever use XML import/export. I've made comments earlier about the difference between my wishlist and what's best for the overall user base. I want Spherical particle sim forces, auto targeting/parenting to layers of particle trajectories, forces and emitters as well as transparency and reflection maps for 3D models. XML import/export benefits more users than my wishlist items, and should be a higher priority.

    If there's one killer feature missing for integration of Hitfilm with other programs, it's the total lack of any way of transferring EDLs. Not a missing subset of OpenEXR functions that has a workaround! 

  • @chibi

    "Most professional studios, 3d artists will not work with pngs as a render sequence"

    As a final render to go on to the next stage of the pipeline, probably not, but for other purposes prior to that point it will almost certainly happen.

    "Even for still images 3d artists want floating point images with all the buffers accessible in it."

    That's fine as long as the 3D artist has no intention of ever working with anybody else or seeing his or her work leave the 3D department. When work leaves the 3D artist/department it should conform to what is best for the next stage in the pipeline. Nobody else needs or wants everything the 3D artist does. 

    "Now even mid level studios are moving to Deep image compositing."

    True.

    "If Hitfilm can't even support exr properly they won't be considered in vfx studios mid to highend.."

    Well, that depends on your definition of "properly" now, doesn't it? What HitFilm does handle is well within the OpenEXR spec, so in that sense it is handling it properly. No, it doesn't support multi-channel EXR, but, as you continually ignore, that isn't an absolute requirement either. If it can be a channel in a multi-channel file, that channel can just as easily be a single file. In some workflows there are even major benefits to doing it that way. And since I'm curious, what about mid and high-end studios using TIFF or DPX workflows? Are you going to try and force the same thing on them, telling them they're a bunch of amateurs? How do you think that's going to work out for you?

    "And I've been rendering exr with embedded buffers in vray and lightwave for more than a decade. 2013 what??"

    You've been going on and on and on about multi-channel deep data support. That's not the same thing as what you've been doing for more than a decade. Deep data OpenEXR was released in 2013:

    April 9, 2013 - Industrial Light & Magic (ILM) and Weta Digital announce the release of OpenEXR 2.0, the major version update of the open source high dynamic range file format first introduced by ILM and maintained and expanded by a number of key industry leaders including Weta Digital, Pixar Animation Studios, Autodesk and others.

    The release includes a number of new features that align with the major version number increase. Amongst the major improvements are:

      1. Deep Data support - Pixels can now store a variable-length list of samples. The main rationale behind deep images is to enable the storage of multiple values at different depths for each pixel. OpenEXR 2.0 supports both hard-surface and volumetric representations for Deep Compositing workflows.
      2. Multi-part Image Files - With OpenEXR 2.0, files can now contain a number of separate, but related, data parts in one file. Access to any part is independent of the others, pixels from parts that are not required in the current operation don't need to be accessed, resulting in quicker read times when accessing only a subset of channels. The multipart interface also incorporates support for Stereo images where views are stored in separate parts. This makes stereo OpenEXR 2.0 files significantly faster to work with than the previous multiview support in OpenEXR.
      3. Optimized pixel reading - decoding RGB(A) scanline images has been accelerated on SSE processors providing a significant speedup when reading both old and new format images, including multipart and multiview files.
      4. Namespacing - The library introduces versioned namespaces to avoid conflicts between packages compiled with different versions of the library.


  • No matter what FXHome does, mid to high end studios won't look at HitFilm... because they have toolchains built around insanely high end tools.

    When there are people doing what looks like really high-end work in HitFilm, and have been for years, then it'll start getting taken seriously in higher-end studios... but that is, for those of us who like HitFilm, irrelevant.

    Some folks here have done some really good work with older versions of HitFilm, and because it's a one-stop shop for editing and compositing, and because it now has sophisticated enough color tools to do the job, IMO HitFilm gives us what we need to do great work that will be indistinguishable from professional work... given talent and dedication.

    Ergo, work done in HitFilm can get you hired. It's great for small shops, too; it gives you a lot more for the money than the Adobe suite does, and with a much nicer UI, and all in one... I like not having to do any round tripping for small projects.

    Nodes would be amazing in HitFilm... but probably also a massive effort -- it's probably better to link to Fusion, since Fusion is free. And mind-bogglingly powerful, easily a rival to Nuke. It used to be one of the stalwarts in the Shake days, before Apple killed Shake.

    Every release of HitFilm has been pretty big; the team's giving us more and more professional grade functionality, and is clearly listening. Definitely worth keeping an eye on even if you're not using it yet.

    I'm underusing it right now; hoping to change that... I just need to get some learning time in.


  • @Triem23 Although it's also layer-based, AE has its flowchart view, which is useful for seeing what's being applied to which layers without having to drill them all open. Although there are many other things HF could adopt from AE before that which would be appreciated more.

  • @CNK Triem23 already covered a lot of things, but I did want to add something about your comment on using TIFFs. Your thoughts are on target. TIFF is often still chosen as a "common denominator" format for a lot of reasons. 

    And some more for @chibi:

    "There is a reason why 32bit float is used a lot for compositing".

    True, and that's why HitFilm Pro 2017 now supports 32 bit float compositing and the importing and exporting of 32 bit floating point OpenEXR sequences.
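    The precision gap that motivates this is easy to demonstrate. A quick NumPy sketch (with made-up scene depths, not from any particular renderer) compares a depth buffer round-tripped through 8-bit quantization against the same buffer stored as 32 bit float:

    ```python
    import numpy as np

    # Hypothetical depth buffer: distances from 0.1 to 500 scene units.
    depth = np.linspace(0.1, 500.0, 10000, dtype=np.float64)

    # 8-bit storage: only 256 levels; quantize then reconstruct.
    lo, hi = depth.min(), depth.max()
    q8 = np.round((depth - lo) / (hi - lo) * 255)
    depth_8 = q8 / 255 * (hi - lo) + lo

    # 32-bit float storage keeps the values essentially intact.
    depth_f32 = depth.astype(np.float32)

    err_8 = np.abs(depth - depth_8).max()      # roughly a scene unit of error
    err_f32 = np.abs(depth - depth_f32.astype(np.float64)).max()
    ```

    An error of around a scene unit is exactly the kind of banding that wrecks depth-based fog, DOF, or keying, which is why float buffers matter for compositing.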

    "Hitfilm doesn't even support basic exr buffers yet after three major versions since HF2. It's hard to replace ae, fusion, nuke for compositing in the pro environment without it. Even the one man 3d artist today works with buffers for compositing. But not pngs."

    HitFilm does now support 32 bit float OpenEXR, giving you roughly the same support as Fusion, because, as I've said before, Fusion hates multi-channel EXR files and you are much, much better off not using them with Fusion. To be pedantic, AE doesn't have native support for multi-channel EXR files either. EXR support is supplied via a lite version of a third-party plugin. What the plugin does is extract the channels and present them to AE as single entities to be used as layers in a comp. In other words, it fakes having multiple independent sequences.

    You seem to think multi-channel EXR is the only way to get 32 bit float support. It isn't. If the data can be contained in channels, each channel can be saved as an independent file, and I guarantee you the tools you're using can produce independent sequences just as easily as they produce multi-channel sequences. Compositing with either one is the same.


  • edited December 2016

    @Triem23

    For a beginner or hobbyist I would recommend Hitfilm over Adobe because for half the price of a one-year all-Adobe subscription, you get about 75% of the power of a Premiere/Ae combo without being locked out of your own archives if you decide to stop paying.

    I'm a hobbyist and this is exactly why I came into the Hitfilm fold.  'case you wanted some evidence/validation. 

  • @Triem

    "You're also still talking about mid-to-high level studios when FxHome has clearly stated their target audience is independents and small studios! They're not trying to compete with Adobe, Fusion and Nuke! I mean Pro users are pretty much subsidizing Express users."

    Mid to high level studios need exr support. Period. I don't really care about wedding events.

    Aladdin4d
      "Fusion hates multi-channel EXR files and you are much much better off not using them with Fusion."

    OMG! Are you serious??????? I'm speechless and too lazy to type in so many words why that statement is so very wrong. Keywords for you: Kelly Myers, Fusion, Battlestar Galactica.


    Such long winded posts from mods. Moderators should moderate.


  • Deep Image Compositing with Exr 2.0 will be standard workflow for 3d artists in 5 years. But first we need proper Exr 1.0 support ;)

    https://www.youtube.com/watch?v=x7gt_CHYkj0

  • I have nothing to add, but I did read your explanations/answers, just wanted to point that out -- thank you! :)

    You guys can carry on, eventually someone is going to lose, who will it be, do we need to get Dr Phil in here? ;)

  • @chibi Dead serious, and you might have wanted to choose an example where the artist didn't break out multiple render passes to separate sequences, because that's exactly what Mr. Myers did for most of it. In instances where he didn't, he also ran into the problems and tedium of working with multi-channel EXR files in Fusion. He used exrTrader to get his render passes out of Lightwave, noting this:

    "Step Three: Key Light Pass Settings for Render Buffer Export with exrTrader. exrTrader from db&w is an OpenEXR file saver that takes advantage of the different buffers Lightwave's Rendering Engine has and saves those buffers that you can choose out to a single OpenEXR file (*.exr) or to multiple files individually named and stored to individual directories all at once."

    How about some random quotes on working with multi-channel exr files in Fusion?

    Splitting EXR drives me nuts

    The whole thread also mentions resorting to Python just to make things livable

    Extracting EXR Channels after Loader?

    Again, native sucks, so resort to Python scripts

    Multichannel workflow

    More complaints about Fusion and multi-channel exr along with this choice comment "I think most people think multichannel workflow means you want to render everything in a monolithic exr and do your comp without separating the passes in your flow. There are a dozen reasons why this is bad."

    I could do this all day. Fusion doesn't like multi-channel exr and you can find all kinds of horror stories and lots and lots of advice saying don't use multi-channel.

    You could be doing some great things with HitFilm Pro, but by your own admission it seems you're just too lazy.

