Does HitFilm Utilize the Hardware Acceleration Capabilities of Intel CPUs?

13 Comments

  • @Triem23 - Not automatic - an option, similar to Resolve.

  • I agree with that, but being able to ingest original footage and then generate optimized media that I can edit with in-app is nice... provided that the software manages the linkage between the two well.

    That way I can edit with optimized media, but pop back to original media when I'm color grading, doing VFX, or exporting, and switch again to optimized media when I need to modify the edit.

    Resolve does this pretty well. In theory, I'd be able to import 8K raw footage, generate optimized media, edit it, and switch back to raw for color grading, and also have it send the original raw to Fusion for compositing and VFX.

    Also in theory, this should be possible in HitFilm, with the all-in-one nature of HitFilm making the VFX roundtrip simpler. If it's happening when I want it to, then I don't mind waiting for the software to generate optimized media.

     

  • Triem23 Moderator

    I merely state again that the Export Queue in Hitfilm 2017 makes it possible to batch as much media as you want. 

    Tutorial releasing on Friday. 

  • And to automatically switch back and forth between optimized and original media...?

    If your original footage is H.264, there's no value in going back and forth; all you need is the optimized media and that's it. If you're working with higher-end codecs like Redcode, however, there's a LOT of benefit in being able to transparently go back and forth, especially as the resolutions go up.

     

  • Triem23 Moderator

    @WhiteCranePhoto no auto switching. Still have to manually relink. Not a big deal for a long form event with a few clips, but a pain for a film, or an event with tons of B-roll. 

  • @Triem23, Plus, on a long event with a few clips, there's no reason to shoot in a codec bigger than ProRes 422 HQ at HD resolution, which doesn't require transcoding for editing anyway. :)

     

  • @Triem23 Agreed that will work, but it's just an extra unnecessary step. First import all of your H.264 camera files, then add each one to a timeline, name it, and render it out as a CineForm AVI to a different folder. Then start again by importing all of the AVIs back in.

    Easy way 

    1. Import camera files.
    2. Select all and generate optimized media - select CineForm as an option and go have a cup of coffee (see the external batch sketch after this list).
    3. Edit and render the project out to CineForm or a delivery codec.
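
    Until something like that exists in-app, step 2 could be scripted externally. Here's a minimal sketch, assuming ffmpeg is installed; the folder names are made up, and ProRes 422 HQ stands in for CineForm since that's the encoder I'm sure of:

    ```python
    import subprocess
    from pathlib import Path

    SOURCE_DIR = Path("camera_files")   # hypothetical folder of H.264 originals
    OPTIMIZED_DIR = Path("optimized")   # hypothetical folder for edit-friendly media
    OPTIMIZED_DIR.mkdir(exist_ok=True)

    for clip in sorted(SOURCE_DIR.glob("*.mp4")):
        out = OPTIMIZED_DIR / (clip.stem + ".mov")
        # Re-encode to an intra-frame intermediate so every frame decodes
        # independently and scrubbing stays responsive.
        subprocess.run([
            "ffmpeg", "-i", str(clip),
            "-c:v", "prores_ks", "-profile:v", "3",   # 3 = ProRes 422 HQ
            "-c:a", "pcm_s16le",                      # uncompressed audio
            str(out),
        ], check=True)
        print(f"{clip.name} -> {out.name}")
    ```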

    @WhiteCranePhoto - Good point for pros. You need a render option to choose between the optimized media AVIs or your original files - like Resolve has. Thanks!


  • Triem23 Moderator

    @WhiteCranePhoto true-that, although all the cameras I personally own shoot AVCHD or straight h.264...

    Upgrade cameras or get some Blackmagic Video Assists and take care of external monitors and ProRes/DNxHD recording in one step? The eternal question. 

  • @Triem23 External proxies aren't an option with HitFilm, since there's no way to export a timeline for conforming in other software. If HitFilm had some conform tools then maybe, but right now any time I shoot in a high end codec, HitFilm ends up being a non-starter. 

     

  • Triem23 Moderator

     @WhiteCranePhoto Speaking of Proxies, I'd like Hitfilm to add them! What Hitfilm calls proxies are really prerenders...

  • @Triem23 That would be a big improvement... especially if the devs also integrate the latest RED SDK. HitFilm supports 8K, so why not start importing it directly and proxying it for performance?

  • @WhiteCranePhoto @Triem23 Yes, proxies were added to increase render and export performance for nested comps. That was the original use case.

    A lossy proxy that isn't used for export is a different use case and is definitely useful. Good point about being able to switch the lossy proxy on and off; I like that idea.

    Thanks for all the feedback guys :)

  • There are reasons why Vegas added a smart media proxy mechanism a while ago and Premiere has done so recently. Given the constant timeline performance problem reports, it would behoove Hitfilm to have the same, especially now that Hitfilm has an intermediate export feature. The thing is, Hitfilm does not have a preview/draft mode, which is where an editor would automatically use the media proxy. It would be easy enough to add a "use smart proxy" option to the viewer alongside all the other options that enable/disable various render features.
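
    Just to illustrate the switching idea (nothing like this exists in Hitfilm today - the names and structure below are invented), the core of a smart-proxy mode is a lookup that prefers the proxy during draft playback and the original everywhere else:

    ```python
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Clip:
        original_path: str
        proxy_path: Optional[str] = None   # lossy proxy, if one has been generated

    def media_for(clip: Clip, viewer_mode: str) -> str:
        """Pick which file the viewer should decode for this clip."""
        if viewer_mode == "draft" and clip.proxy_path:
            return clip.proxy_path      # fast, lossy proxy while editing
        return clip.original_path       # full quality for grading, VFX, export

    clip = Clip("A001_UHD.mp4", "A001_proxy.mov")
    print(media_for(clip, "draft"))   # A001_proxy.mov
    print(media_for(clip, "full"))    # A001_UHD.mp4
    ```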

    Vegas actually uses XDCAM (aka MPEG-2) for its proxy mechanism. Even though it's long-GOP, MPEG-2 might be the fastest to decode of the lossy compression setups, assuming no silliness in decoder use.

  • Excellent discussion, but we've strayed a bit from Intel GPUs. Maybe we should start a new thread on intermediates and proxies for 4K editing - and I have no idea how we will handle 8K.

    My ambitions stop at UHD TV viewing/broadcast quality. I understand Hitfilm and other NLEs have been used on movies. What pre- and post-production workflow and hardware do you use with Hitfilm for the sort of quality needed for the big screen?


  • One more thing please. I'm still learning Express, which is limited to HD. If any of you have Pro, I'd like to know how a 4K/UHD project plays/scrubs on the timeline vs. transcoding to CineForm AVI. Thanks

  • @FishyAl The transcode thread has a link to something I put together. Probably not a proper test but something I whipped up in response to a thread post. Here is a direct link.

    https://hitfilm.com/forum/discussion/42285/hitfilm-timeline-4k-performance-demo-video

    Here is the transcode thread just to be complete.

    https://hitfilm.com/forum/discussion/42349/transcoding-for-better-performance-and-easier-editing

  • Aladdin4d Moderator

    @FishyAl NormanPCN already put together a demo of just that

    Hitfilm timeline 4k performance demo video

     

  • edited February 10

    Thanks @NormanPCN, great tests - the results are similar to my experience with other NLEs. It seems 4K is testing the limits of consumer PCs with H.264 and H.265. Even a current 4 GHz 4-core i7 and a decent GPU struggle. My best results after testing CineForm and Grass Valley HQX were with MagicYUV. The file size is 10x, but it's 100% bit-for-bit lossless, every frame is a keyframe, and it's super fast on the timeline. Desktop storage is cheap. I'm adding a 12 TB RAID 0 to my new PC with two Barracuda 7200 HDDs ($500, ~$40/TB), which should also handle the transfer speed. If not, a Samsung M.2 1 TB drive has a 3.5 GB/sec read speed, which should solve the need for buffering.
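
    As a rough sanity check on whether spinning disks keep up (my own back-of-the-envelope numbers, not measurements):

    ```python
    # Rough data rates for a UHD timeline. Assumptions are mine: 8-bit 4:2:2
    # averages 2 bytes/pixel, and a lossless codec lands somewhere near 2:1.
    width, height, fps = 3840, 2160, 30
    bytes_per_pixel = 2.0

    uncompressed = width * height * bytes_per_pixel * fps   # bytes per second
    lossless = uncompressed / 2                             # ~2:1 lossless

    print(f"Uncompressed UHD30: {uncompressed / 1e6:.0f} MB/s")  # ~498 MB/s
    print(f"~2:1 lossless:      {lossless / 1e6:.0f} MB/s")      # ~249 MB/s
    # A two-drive 7200 rpm RAID 0 does very roughly 300-400 MB/s sequential,
    # so one lossless stream fits, but several layers push you toward NVMe.
    ```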

    Ideally we want 4K realtime timeline editing with full 4K preview. Compression is the enemy and decompression the challenge. It's going to take a lot of horsepower with 4 layers on the timeline plus effects with AVC camera files. Moore's law died. 4K is here, HDR TVs are already on the market, and we are headed to 8K and beyond at higher bitrates and color depths. CPUs are unlikely to go beyond 4.5-5 GHz as silicon technology has hit the wall. See The future of computers - Part 1: Multicore and the Memory Wall and The myths of Moore's law.

    More cores will help but only to the extent that parallelism can be used in NLEs and there are tradeoffs as explained above.

    GPU power has grown rapidly but will ultimately be limited like CPUs and memory transfer rates. Gaming is massively parallel and GPUs are great at it, but editing is still CPU-bound.

     

    It seems to me the only solution is to use edit-friendly codec formats to reduce the edit workload as much as possible. CineForm, Avid, GV HQX and Apple are now free, but they are hardly new. Faster codecs like MagicYUV may be the answer. Uncompressed or raw video may be best for edit workload but needs fast transfer rates. A logical workflow might be:

    1. Proxy files using the same codec - low-resolution copies which are faster to edit but have reduced preview quality. It's hard to judge your 4K quality at 720p. Use the original files for the 4K render.
    2. Transcode to a slightly lower resolution (2K) edit-friendly codec for better speed and preview quality. Use the original files for the 4K render (a rough scripted sketch of this option follows the list).
    3. Transcode to an edit-friendly codec at the original 4K quality for better speed and a high quality timeline preview. Use the transcoded files for the 4K render.
    4. Transcode to RAW.
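
    A rough scripted sketch of option 2, assuming ffmpeg is available (the folder names, 2K target and DNxHR settings are just my illustration):

    ```python
    import subprocess
    from pathlib import Path

    SOURCE_DIR = Path("4k_originals")   # hypothetical folder of 4K camera files
    PROXY_DIR = Path("proxies_2k")      # hypothetical folder for 2K edit proxies
    PROXY_DIR.mkdir(exist_ok=True)

    for clip in sorted(SOURCE_DIR.glob("*.mp4")):
        out = PROXY_DIR / (clip.stem + "_2k.mov")
        subprocess.run([
            "ffmpeg", "-i", str(clip),
            "-vf", "scale=2048:-2",                     # downscale to 2K width, keep aspect
            "-c:v", "dnxhd", "-profile:v", "dnxhr_sq",  # intra-frame, scrub-friendly
            "-pix_fmt", "yuv422p",
            "-c:a", "copy",
            str(out),
        ], check=True)
    ```

    Keeping a predictable suffix on the proxies makes swapping back to the originals for the final 4K render a simple relink.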

    Many of these options are already available in Resolve- an NLE that was not designed for consumer PCs.

    The alternative is a change in camera storage technology so that small portable cameras can record low- or no-compression 4K video on one or more 500 GB+ SD cards. It may be possible when you see the size of the new M.2 1 TB SSDs with transfer rates from 500 MB to 3.5 GB/sec.

  • For me 4K is pure hype, but people are into it. HD video delivery bitrates are so poor that 1080 is not as good as it can be. My DVDs look better than most of my TV these days. Very sad. The same DVR can hold twice as many hours as it did a few years ago, and I can see the difference. Crapola. Something like the Super Bowl gets more bitrate, but not much else.

    For heavy 4K work 8-core all the way. GPUs can handle the graphics workload (aka effects) but the video decode needs the CPU. People often think GPUs are faster than CPUs. The ALUs in the GPUs are actually inferior in many ways to CPUs. Serious logic is a weak point of GPUs but a strong point of CPUs.

  • Triem23 Moderator

    @FishyAl As I've pointed out to others here before (usually to people asking why an i3 with Intel HD 4000 can't do 4K in real time), in 2005 SD broadcast really was still dominant. Now we're pushing 4K.

    4K has roughly 32 times the pixel data of SD (NTSC).

    Since 2005 CPUs and GPUs have become about 20 times faster....

    What happens when you cram 32 times the data into 20 times the "bandwidth"? Things slow down... Now I'm oversimplifying here, but not by much. H.265 may be a more efficient compression scheme than H.264, but that data has to be decompressed and reconstructed before the NLE can process it - so, yeah, 32 times the data in 20 times the "bandwidth"...
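
    For what it's worth, a back-of-the-envelope version of that comparison (my own arithmetic - the exact multiplier depends on which SD and 4K frame sizes and bit depths you count):

    ```python
    # Raw pixel counts, SD (NTSC) vs 4K.
    sd = 720 * 480        # NTSC SD frame
    uhd = 3840 * 2160     # UHD "4K"
    dci4k = 4096 * 2160   # DCI 4K

    print(f"UHD vs SD:   {uhd / sd:.1f}x")     # ~24x the pixels
    print(f"DCI4K vs SD: {dci4k / sd:.1f}x")   # ~25.6x
    # Add 10-bit vs 8-bit sources and the raw data multiplier climbs toward
    # the ~30x ballpark, against a roughly 20x hardware speedup.
    ```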

    And now that Moore's law is slowing down, can you imagine what's going to happen when everyone starts pushing for the utter pointlessness of 6K and 8K?

  •  "Since 2005 CPUs and GPUs have become about 20 times faster...."

    Really?

  • Triem23 Moderator

    @NormanPCN Somewhere else in this forum I posted on this with links to the relevant dataset I was using. Right now I've been up for 28 hours. ;-)

    And, of course, any speed comparison is going to vary by benchmark and method used, but I'll stick by that number as "close enough" to make the point.

    For example, the top GPU of 2007 was the GeForce 8800, which has a PassMark score of around 700. The GTX 1080 has a PassMark score of about 13000. Not quite 20x faster, and limited to a single benchmark sample, but supportive of the thesis, so I'm going with it. ;-)

    I see Intel T7300s benching at around 700 and i7-6700s at around 10k, which is only a 14-fold increase. But I also compared the "Extreme" 2007 CPU to the "main" 2017 CPU.

    Which merely means, if anything, I've understated the problem. :-)

    Since I recognize your superior knowledge on hardware, if you disagree with the assertion, I'll defer to your "PCN" status. :-)

  • edited February 10

    @Triem23 - Agreed. I built a state-of-the-art PC in 1997 using Adobe Premiere 4.2. I had to add an expensive SCSI drive to get the 3 MB/sec transfer speed from tape. The Premiere system requirements: "8 MB of RAM for Windows 95; a 100-MB hard drive; an 8-bit color (256-color) display adapter and monitor, and a CD-ROM drive". It cost me a fortune; I think the Pentium alone was $500. A pal of mine working on a linear Betacam system in a TV studio was highly impressed.

    My new PC will be an i7-7700K water-cooled and overclocked to 5 GHz, 32 GB DDR4-3200, a 4 GB GPU card with 3 fans, 12 TB HDD RAID 0, etc., and it fits into the same sized box I had in 1997. It's a lot more than 20x the PC, and it will still struggle with 4K editing.

  • @NormanPCN Please explain. What bitrate are you using on your BDs? MP4 or MPEG-2?

    Thanks Al

  • @Triem23 Yes GPUs are quite a bit faster. Transistor budgets are that much higher. Same for CPUs. For CPUs I tend to ignore the addition of extra cores over time when comparing performance. That is my bias.

    20x just seemed like a big number for a single core performance increase, but a number like that is with all cores blazing. This assumes an app can get a lot of cores blazing.  Contrast this with GPUs where I always consider extra "cores" as important. Again, my bias. GPUs are just used differently than CPUs for the most part and extra "cores" are more important to how GPUs are used.

    @FishyAl Well, Blu-ray bitrates are commonly around 25 Mbps AVC for 1080p24. There was a website that cataloged this once, but the posts have been deleted. On the internet, YouTube/Vimeo are around 5 Mbps AVC these days for 1080p at up to 30 fps, last I tested. DVDs are often around 5+ Mbps MPEG-2 for SD content.
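
    For a sense of scale, those delivery bitrates translate into per-hour file sizes like this (simple arithmetic, my own illustration):

    ```python
    # Rough storage per hour of video at common delivery bitrates.
    def gb_per_hour(mbps: float) -> float:
        return mbps * 1e6 / 8 * 3600 / 1e9   # megabits/s -> gigabytes/hour

    for label, mbps in [("Blu-ray AVC", 25), ("YouTube/Vimeo 1080p", 5), ("DVD MPEG-2", 5)]:
        print(f"{label}: ~{gb_per_hour(mbps):.1f} GB/hour")
    # Blu-ray AVC: ~11.2 GB/hour
    # YouTube/Vimeo 1080p: ~2.2 GB/hour
    # DVD MPEG-2: ~2.2 GB/hour
    ```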

    I don't know what my TV is giving me these days, as I don't know the exact HD size, but I have had the same DVR for years and the same FiOS fiber-optic service (Verizon... now Frontier), and a 1-hour 1080i60 TV program these days takes only about 1% of my DVR, compared to about 2% when I got the service. There very likely was no switch from MPEG-2 to AVC behind the scenes, since I see such poor quality so often these days; such a switch could have lowered their bitrates while preserving quality. I see lots of macroblocking going on even with little movement, and seriously so in the darks depending on the channel. Even back in the day, heavy movement would macroblock compared to Blu-ray if the background was not soft/blurred.

    The videos I encode are from a GoPro mounted to a mountain bike, first-person view, and the trail is bouncy. This does not compress well enough for internet delivery bitrates and looks very soft, even with a lot of flat blue sky in shot. AVC has an inbuilt deblocking filter, so you are not likely to see things get blocky; they just go soft. 12 Mbps is okay. 16 Mbps is better.

    For standard TV/movie fare, where people are standing, walking, and talking, you can get enough compression that stuff can look okay at seriously low bitrates. The content providers, Blu-ray aside, are pushing those boundaries. Now, if someone saw the same broadcast at a quality bitrate, their opinion might change. Some might prefer softer.

    I remember when the Black Hawk Down DVD came out. It was insanely sharp at a similar bitrate to the other DVDs in my collection. I suppose that most directors, and likely actors (faces), don't want the hyper-crisp stuff. With 4K being hyped, this seems contrary to the idea of 4K (at least to me). With today's post-processing one can keep faces/skin soft and the rest sharp. Have your cake and eat it too.

  • Yeah, 4K is overhyped. It's not necessary most of the time. Ironically, all of the paying work I've gotten with my 8K camera has been captured and finished in 2K.

    We're shooting a feature in 8K for the Redcode option rather than because we need 8K, and I expect that we'll finish in 2K unless we get picked up by a distributor that requires a 4K DCP.

     

  • Triem23 Moderator

    @WhiteCranePhoto Reminds me of a "back in the day" short we budgeted for 16mm that we were suddenly shooting on Super 35, because the three other producers wanted to brag.

    It was a bad shoot that led to me leaving the company I co-founded. So, typical indie, I guess. 

  • @Triem23 Yeah, that's pretty typical for indie films. Bragging rights don't make the film any better, just costlier. That was a concern when I contacted my Red rep and told him that I was going to make the Epic-W jump... the 8K part meant high data rates, and so on.

    The thing is that since we don't need an 8K master and there isn't much compositing, we can shoot most of it at 12:1 or 14:1, since 4K is 25% of our capture resolution (which is ridiculous). We end up with lower data rates than 4K in other intermediate codecs, with higher quality overall.

    We gain a lot of versatility, too;  we can push to ISO 3200 and still get clean results, which means we don't need nearly as much light to give us the look we need. We still are going to be modifying and supplementing the light, but we won't need HMIs to pull it off.  Also with the color management in the camera, we'll need less work to color grade in post, which is also a good thing since we're aiming to have the film finished in time for SIFF, which doesn't leave us much time.

    So it turns out that we're able to save money with this camera, but even so the resolution is pointlessly excessive.

    If we were trying to master in 8K, we'd end up blowing the budget on disk storage. :)

     

  • edited February 10

  • @WhiteCranePhoto Wow! What 8K camera do you have? If we are struggling to edit 4K, please explain your 8K workflow. I saw a new codec last month for Premiere that claims 8K editing without proxies!!! http://comprimato.com/jpeg2000-codec-plugin-adobe-premiere/

    I also have a contact at Intel who got REDCINE-X PRO running on a Skull Canyon mini PC. He said: "This past year I also worked with RED Digital Cinema to enable REDCINE-X PRO for Intel Iris Graphics. We gave them a Skull Canyon NUC (containing a “SkyLake GT4”, the largest graphics die we have ever fabricated), and they loved it. The small form factor was a big hit, and they used it to show off their 4K workflow at their IBC booth in Amsterdam. I’ve spoken to some DITs who also appreciate the smallness, and said it is like a mini-workstation that they can use while shooting on-set or in the chase van." I still think Intel graphics (our original topic) has great potential for media work. It just makes sense to have the GPU linked to the CPU instead of sending the work off over the bus to a discrete GPU and waiting for it to return. Intel is also linking up with AMD for future GPU development.

    I'll try AVC at 12 & 16 Mbps bitrates for BD, thanks.

    @NormanPCN Pleased to hear you also work with lowly GoPros.  


  • CNK
    edited February 10

     4K is very controversial on the internet, huh...

    I read that wrong - we're not trying to get smooth 4K playback, because when 4K doesn't work, 8K will. =))))))))

