Does HitFilm Utilize the Hardware Acceleration Capabilities of Intel CPUs?

edited August 2016 in Filmmaking

I'm just reading about Kaby Lake, which has just been announced. I Googled a bit but didn't seem to find a definitive answer. So the question...

Does HitFilm utilize the hardware acceleration capabilities (e.g. Quick Sync, GPU-accelerated encoding/decoding) available on Intel CPUs?

Sometimes my box can play back high-resolution clips smoothly using programs like Windows Media Player, but editing them in HitFilm becomes quite a challenge even without any effects or adjustments.

Cheers.

(Sorry for the poor grammar, it's late at night.)


Comments

  • CNK
    edited August 2016

    Hello,

    We can get to the bottom of your issue if you'd like.

    What are your PC specs?

    Download and post a report of one of the files you're using in HitFilm: https://mediaarea.net/en/MediaInfo

    To answer your question: no, HitFilm doesn't use or take advantage of Intel's Quick Sync feature. The feature isn't all upside, though; it reduces output quality quite a bit in exchange for speed (faster renders).

    I don't see them implementing this in HitFilm anytime soon, mainly because their stance on parity among consumer hardware is clear: they don't want AMD, NVIDIA, or Intel to have an advantage.

    The advantages will still be there in the form of faster processors (Intel is way ahead of AMD), or NVIDIA's massive lead in the high-end GPU market, where AMD can't really compete right now since it hasn't launched any high-end products.

    Lastly, you can't compare dedicated video playback software to a full-blown NLE and VFX application. :)

  • Indeed, rendering speed is not the main concern at the moment; the issue is that just playing back the video in the trimmer stutters.

    PC config: i5-4300U, 8GB RAM, SSD

    Media info:

    General
    Format : MPEG-4
    Format profile : JVT
    Codec ID : avc1 (avc1/isom)
    File size : 1.36 GiB
    Duration : 3 min 14 s
    Overall bit rate mode : Variable
    Overall bit rate : 59.9 Mb/s

    Video
    ID : 1
    Format : AVC
    Format/Info : Advanced Video Codec
    Format profile : High@L5.1
    Format settings, CABAC : Yes
    Format settings, ReFrames : 1 frame
    Format settings, GOP : M=1, N=8
    Codec ID : avc1
    Codec ID/Info : Advanced Video Coding
    Duration : 3 min 14 s
    Bit rate mode : Variable
    Bit rate : 60.0 Mb/s
    Width : 3 840 pixels
    Height : 2 160 pixels
    Display aspect ratio : 16:9
    Frame rate mode : Constant
    Frame rate : 29.970 (30000/1001) FPS
    Color space : YUV
    Chroma subsampling : 4:2:0
    Bit depth : 8 bits
    Scan type : Progressive
    Bits/(Pixel*Frame) : 0.241
    Stream size : 1.36 GiB (100%)

     

  • CNK
    edited September 2016

    I think the problems with playback come down to 2 things:

    - Format

    - Resolution

    You definitely want constant bitrate, and you definitely want an editing codec; you don't want to edit H.264, that's a delivery codec.

    Your laptop isn't the best, but it's not the worst either by any stretch. As far as I'm concerned, HitFilm is not doing that playback on the GPU; it comes down to CPU and disk speed. If I'm wrong someone will correct me, so please don't take that as fact. :)

  • Variable bitrate is not a problem for editing. It is variable frame rate that is the problem for editing.

    4K/UHD is very difficult to edit. It will require a powerful computer to do so in real time for many tasks. For 4K you probably want 8 cores to keep things smooth across transitions and such.

    AVC is probably the highest overhead codec to decode there is. Your file looks like it might be coming from a GoPro (GOP : M=1, N=8). GoPro AVC files are more difficult than most. I don't know why but that is just my past experience.

    You can transcode the source AVC file to Cineform with the GoPro Studio software (free); it should be easier to edit. Or you can transcode to an easier-to-edit AVC file by using Handbrake and the "fast decode" option.
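    If you're comfortable with the command line, ffmpeg can do roughly what Handbrake's "fast decode" option does. This is only a rough sketch, assuming ffmpeg is installed and on your PATH; the filenames and the CRF value are placeholders to tweak:

    import subprocess

    # Hypothetical filenames; point these at your own clip.
    SRC = "source_clip.mp4"
    DST = "source_clip_fastdecode.mp4"

    # Re-encode to an easier-to-decode AVC file. x264's "fastdecode" tune
    # disables CABAC and other decode-heavy features, which is roughly what
    # Handbrake's "fast decode" option does.
    subprocess.run([
        "ffmpeg", "-i", SRC,
        "-c:v", "libx264",
        "-tune", "fastdecode",   # lighter decode load during editing
        "-crf", "18",            # near-visually-lossless; raise for smaller files
        "-c:a", "copy",          # keep the original audio untouched
        DST,
    ], check=True)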

  • Transcoding with "fast decode" seems to help a bit, but still stutters somewhat.

    My question is: since the video stutters even in the trimmer (no edits, no effects, etc.), I wonder whether HitFilm uses any hardware acceleration in this regard...?

    Cheers.

    P.S. It's a DJI drone clip.

  • @LoYukFai Video decoding is generally CPU bound without any hardware acceleration like QuickSync, and you're not likely to see QuickSync-assisted decoding in most editing programs anytime soon. QuickSync decoding acceleration actually requires an active Intel GPU in order to work. That means if you have an Intel CPU but are using a discrete GPU from Nvidia or AMD, then QuickSync decoding is useless.

    As to why you're seeing stuttering: it's because you're working with 4K/UHD video. It's just harsh to deal with, and the minimum requirements for 4K work are pretty high. The lowest stated minimum requirement for editing 4K that you're going to see is a quad-core CPU, with 8 or more cores being the recommended spec, and hyperthreaded logical cores don't count towards the number of needed cores. You have a two-core processor, so you're really maxing out what it's capable of just decoding 4K AVC video. Transcoding to Cineform like NormanPCN mentioned will help, but realistically you might have to switch to a proxy, offline workflow.
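    If you do go the proxy route, a batch like the sketch below can generate 1080p stand-ins to cut with, relinking to the originals before export. Purely illustrative and assuming ffmpeg is installed; the folder names and encode settings are placeholders:

    import pathlib
    import subprocess

    SRC_DIR = pathlib.Path("footage")          # hypothetical folder of UHD originals
    PROXY_DIR = pathlib.Path("footage_proxy")  # 1080p proxies land here
    PROXY_DIR.mkdir(exist_ok=True)

    for clip in SRC_DIR.glob("*.mp4"):
        proxy = PROXY_DIR / clip.name
        # Downscale UHD to 1080p with a quick, decode-friendly encode.
        subprocess.run([
            "ffmpeg", "-i", str(clip),
            "-vf", "scale=1920:-2",   # keep aspect ratio, force an even height
            "-c:v", "libx264", "-preset", "fast", "-crf", "20",
            "-c:a", "copy",
            str(proxy),
        ], check=True)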

  • Hitfilm doesn't use any hardware-specific acceleration--no CUDA, no Quick Sync (Quick Sync isn't very good, actually). Video decode is done by the CPU and render-to-screen by the GPU.

    Unfortunately your machine has a low-mid level processor and the integrated GPU is near the minimum required. 

    As Norman stated, editing 4K smoothly requires a high-end computer. Here are a couple of numbers: your video is encoded at about 60 Mb/s, while an uncompressed stream is around 5690 Mb/s. Your computer has to unpack from 60 to 5690 on the fly, draw that to the screen, AND scale it to fit the relevant panel. This uses massive amounts of computer resources.
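    For anyone who wants to check that math, here's a back-of-the-envelope version. The exact figure depends on whether you assume RGB or 4:2:0 after decode and whether you divide by 1000^2 or 1024^2, so treat the result as ballpark only:

    # Rough uncompressed bandwidth of a UHD stream decoded to 8-bit RGB.
    width, height = 3840, 2160
    fps = 29.97
    bits_per_pixel = 24  # 8 bits per channel x 3 channels (RGB)

    bits_per_second = width * height * bits_per_pixel * fps
    print(f"{bits_per_second / 1e6:,.0f} Mb/s")  # ~5,966 Mb/s, versus the ~60 Mb/s file on disk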

    As Norman said, your footage is packed in a format that is difficult to decode. You could improve editing speed by transcoding to an intermediate editing format like DNxHR or ProRes. However, this will greatly increase file size.
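    As a rough illustration of what that transcode looks like with ffmpeg (an assumption on my part that ffmpeg is available; the filenames and the choice of DNxHR profile are placeholders):

    import subprocess

    SRC = "source_clip.mp4"        # hypothetical UHD source
    DST = "source_clip_dnxhr.mov"  # intermediate for editing

    subprocess.run([
        "ffmpeg", "-i", SRC,
        "-c:v", "dnxhd", "-profile:v", "dnxhr_hq",  # DNxHR HQ intermediate
        "-pix_fmt", "yuv422p",                      # 8-bit 4:2:2 to match that profile
        "-c:a", "pcm_s16le",                        # uncompressed PCM audio in the MOV
        DST,
    ], check=True)

    Expect the resulting file to be many times larger than the 60 Mb/s source.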

    For a good discussion on why editing codecs are preferable to delivery codecs, see this article:  http://telestreamblog.telestream.net/2012/04/save-yourself-frustration-use-editing-formats-when-editing-2/

    Given your specs, chances are your machine isn't powerful enough to edit 4K at full resolution with real-time playback no matter what.

  • It is possible to have the Intel GPU active alongside an AMD/Nvidia GPU, even without a monitor attached to the Intel GPU.

    All the GPUs (AMD, Nvidia, Intel) have hardware AVC decoders built in these days. However, these decoders are designed around the basic single-stream playback that video players do. Video editors have greater needs, such as simultaneous decoding of multiple video streams (transitions, cuts, compositing).

  • @NormanPCN Actually that's not quite good enough for QuickSync decoding. The Intel GPU must be the primary GPU, used for an extended display or switched to on the fly with Lucid Virtu. Win 8 and higher will support a headless or unconnected iGPU but that's impossible with Win 7. Also if you are using Lucid Virtu to switch from a discrete GPU to the iGPU, the discrete GPU will be disabled for the duration of the decoding.

  • @Aladdin4d In the distant past I had a headless setup on Windows 7, but I was doing that to use the Quicksync encoder via Sony AVC in Vegas. I had an AMD main GPU at the time. I never thought about or tried the QS decoder, since nothing I had made use of it.

    There was a process to get "headless" to work on Win 7, at least for encoding. AFAIK you could make the Intel GPU driver think a display was attached, or make it not care that nothing was attached. Not strictly headless, but it can work.

  • @NormanPCN

    There was a process to get "headless" to work on Win 7, at least for encoding. AFAIK you could make the Intel GPU driver think a display was attached, or make it not care that nothing was attached. Not strictly headless, but it can work.

    That's a virtualization technique, and Lucid Virtu is the original Intel-approved method. With Win 8 and up you're still using virtualization, but you gain the ability to have a hybrid solution of a QuickSync decode with a discrete-GPU render. You're not necessarily gaining anything in reality, though, because it really depends on what is implemented in the virtualization layer. In Win 7 you're stuck with the iGPU doing everything, shutting out the discrete GPU during the process. These issues only really turn up when trying to decode using QuickSync.

    Encoding is different, with its own twists and turns. The difference is that Intel assumed content creators interested in using QuickSync to encode would be much more likely to use a discrete GPU, so they made it a little easier to deal with.

  • Thanks for the replies and information.

    A follow-up question: which would be better for editing films in HitFilm, a 4-core CPU with basic integrated graphics, or a 2-core CPU with better graphics (e.g. Iris Pro level)?

    Cheers.

  • edited November 2016

    Intel's integrated GPUs aren't all that impressive, though they're reaching a level of adequacy. You're probably better off with either a better processor or one of those Razer machines that gives you the option of adding a performant GPU via a RazerCore later.

  • edited November 2016

    Intel Quick Sync is now supported by many professional NLEs, including Vegas Pro, Edius 8 and DaVinci Resolve 12.5. They wouldn't use it if it were inferior. Maybe HitFilm should have a closer look.

    http://www.intel.co.za/content/www/za/en/architecture-and-technology/visual-technology/graphics-overview.html

    https://flv.isitetv.com/media/video/2288/video_url_118307_2288.m4v

  • HitFilm should make use of the latest Intel graphics for editing. There are many benefits.

    https://www.youtube.com/watch?v=2LrPUhToNAg 

    "Movie Edit Pro 2017 Plus, the newly released product by Magix, allows filmmakers to deal with the challenges that 4K 10 bit HEVC video and 360 video format present. Hear how Magix succeeded by partnering up with Intel, which allowed them to enable their product for the latest video coding innovations embedded within the 7th Gen Intel Core processor family, and which empowered them with the most complete software development toolset."

     

  • Triem23 Moderator

    @FishyAl You can add Quick Sync support to the wishlist, but your earlier statement that other software wouldn't support Quick Sync if it were inferior is wrong. Quick Sync is inferior.

    Others have done empirical testing. http://www.anandtech.com/show/7007/intels-haswell-an-htpc-perspective/8

  • That video is a bunch of hype. Hitfilm does use the GPU fully for graphics.

    Quicksync is a special-case thing that does not do much beyond being a file encoder (AVC, HEVC), and that encoder is not as good quality-wise as what we already have. Faster, yes. At high bitrates things are fine, but we only use those for intermediates, and Hitfilm has better true-quality intermediate options these days (Cineform, ProRes).

    Quicksync for decode is not a real thing for editors. Fine for a media player, and maybe a transcoder, but not for an editor with multiple media file data streams in flight at the same time.

  • edited February 6

    Pro software like Edius 8 claims a 5x speed increase in H.264 rendering.

    EDIUS_Quick_Sync_Video_Technology

    The latest Kaby Lake GPU has some potent hardware decoding, including 4K 10-bit HEVC.

    Anandtech The Kaby Lake-U/Y GPU - Media Capabilities

    Also, DaVinci Resolve (a GPU-hungry NLE) runs on Intel GPUs.

    DaVinci Resolve* 12.5 Meets Intel® Iris™ Graphics

  • Norman - agreed, but many of us shoot and edit H.264, including 4K in 8-bit at average bitrates, and we want H.264 out, which is compatible with most media players and YouTube and looks great on my SUHD TV.

    I'm using the latest Express, but I haven't seen how it can transcode to the Cineform intermediate codec for editing. This could improve 4K timeline performance, as Cineform is GPU optimised and H.264 is not an edit-friendly codec.

  • Stop torturing yourself with 8-bit h.264 footage!

     

  • Triem23 Moderator

    Hitfilm Express doesn't have tools to transcode to Cineform (Pro 2017 can export Cineform), but it can import it.

    Gopro Studio (free) transcodes to Cineform. 

    You've missed a key point in this discussion. Quick Sync is a much faster encode, but at much lower quality. Every independent test empirically demonstrates that the output quality of Quick Sync-encoded mp4 is far inferior to basically every other encoder. You'll get worse color and more aliasing artifacts. Period.

    Remember, press releases are commercials, and commercials shade things to sound as good as possible. 

    The first post in this thread links to articles discussing why mp4 is terrible for editing, and this thread has optimized settings if you insist on mp4. 

  • WhiteCranePhoto - thanks, but my GoPro, DJI Mavic Pro drone and my DSLR all shoot 8-bit H.264 UHD and HD. Sometimes we can only do what we can afford.

  • Triem23 Moderator

    Then use GoPro Studio to transcode to Cineform for editing, and export h.264 for your final! 

  • @FishyAl that's why I stopped using dSLRs for video... shot on Black Magic cameras for several years. Nowadays you can get used Black Magic cameras at surprisingly low prices...

  • @FishyAl "This could improve 4k timeline performance as Cineform is gpu optomised and H.264 is not an edit friendly codec."

    How so? Cineform does not use the GPU for decode/encode from anything I have seen. Video decode is not a massively parallel task and thus does not fit well with GPU compute.

    Yes, GPUs these days have fixed-function, single-stream AVC and HEVC decoders, but those are often slower than a good decoder on a good CPU, like libavcodec. Sadly, many decoders are not so good performance-wise. Also, not everybody has a fast CPU.

  • edited February 6

    Triem23 - thanks for the links and advice. I use Handbrake, but scripts are over my head. I also use GoPro Studio and the Cineform codec. GoPro Studio won't convert my Sony H.264 files, but I can re-wrap them to MP4. Can Express import the Cineform AVI files if Pro 17 can't?
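    (For anyone wanting to re-wrap outside GoPro Studio, a stream-copy remux with ffmpeg is one option. Just a sketch, assuming ffmpeg is installed; the filenames are made up, and the Sony source may need a container-compatible audio codec:)

    import subprocess

    # Copy the existing video/audio streams into an MP4 container without re-encoding.
    subprocess.run([
        "ffmpeg", "-i", "C0001.MTS",  # hypothetical Sony camera file
        "-c", "copy",                 # stream copy: no quality loss, runs quickly
        "C0001.mp4",
    ], check=True)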

    I fully understand that H.264 MP4 is a long-GOP, highly compressed format and not edit friendly. It is, however, the most common camera format and therefore a challenge for all editing software due to the ever-increasing popularity of UHD/4K. H.265 is even worse. To avoid the need for third-party conversion software, it would be nice if Hitfilm could include options to transcode to a user-selected codec or to lower-res proxy files prior to editing - like DaVinci Resolve 12.5 has done.

    What codec format is used for the Hitfilm optimized media option?

    I accept your point about Quicksync. The Anandtech link was from 2013, so I'd like to research it further. My understanding is that the latest Quicksync is better and faster. I'm busy upgrading my PC to a Kaby Lake i7-7700K with an Asus Prime Z270-A motherboard, 32GB DDR4-3200, a 512GB Samsung 960 M.2 SSD and 8TB in RAID 0.

    I'll also test Magix with my Kaby Lake to see if the video is "a bunch of hype" as Norman suggests. Personally, I see significant benefits to the new Intel GPU encoding/decoding abilities for media software.

    I love Hitfilm and will make Pro my main editor once I'm convinced it can handle my h.264 and 4k.

    PS - If you get a chance, try a new intermediate codec called MagicYUV. It's 100% lossless and the fastest codec I've tried. It works on the timeline of most NLEs that use VFW or QT (except Resolve). I'm using it to edit my 4K on my old Core i5 with real-time, highest-quality timeline preview. It even handles 10/12/14-bit color depth. The free version has a watermark; the full version is $14 or any donation you choose - including none.

    MagicYUV Lossless Video Codec

    Thanks again,

    Al

  • Kaby Lake can handle H.264 a lot better than you'd expect, but it still pretty much sucks. I'm a bit spoiled though; I'm using a 16-bit camera with ludicrous dynamic range and stunning color rendition nowadays. :)

    @NormanPCN GPUs actually are quite good with video. It parallelizes well because it's well defined and structured, and it's SIMD-friendly, so it works well with explicitly parallel implementations. That's why Resolve, Mistika, Scratch, RedCine-X Pro, etc. are GPU-oriented.

  • With QuickSync I understand the hype, but never ever look for demonstrations in an Intel ad and take the information at face value.

    Their QuickSync claims seem nice, but you can't run Intel integrated graphics and a dedicated GPU at the same time; HitFilm will always choose the most powerful graphics card in your system.

    It may be a viable option for people not needing the more accurate encoding option. I just can't wrap my head around why someone would use it, though. A dedicated graphics card is built from the ground up to render stuff; wouldn't a dedicated card run circles around integrated graphics when working on effects and when exporting?

  • Aladdin4d Moderator

    @WhiteCranePhoto NormanPCN is right on this one. Video decompression/decoding is CPU bound even in the apps you mentioned. Debayering is now handled by the GPU in all of them, I think, as is image processing, but none of that happens until you have decompressed video. This is from the Resolve system configuration guide:

    However for editing and grading, the compressed data needs to be decompressed to the full RGB per pixel bit depth that will use four times more processing power of a HD image for the same real time grading performance. The decompression process, like compression, uses the CPU so the heavily compressed codecs need more powerful and a greater number of CPU cores. 

    Once the files are decompressed, as DaVinci Resolve uses the GPU for all image processing, and always at the full color and bit depth, the number of GPU cores and the size of GPU RAM becomes a very important factor when dealing with UHD and 4K-DCI sources and timelines.

  • Yep, dedicated GPUs run screaming circles around Intel's GPUs. GPUs are something Intel has struggled with for years. There's even a possibility, which HardOCP believes is really in the works, that we'll see Intel processors incorporating AMD GPUs in multi-chip modules.

    Some applications are able to use multiple GPUs, but they're using OpenCL or CUDA rather than OpenGL.
