
It’s probably just not that interesting. There’s generally a proprietary encode/decode pipeline on chip. It can generally handle most decode operations with CPU help and a very narrow encoding spec mostly built around being able to do it in realtime for broadcast.

Most of the video you encode on a computer is actually all in software/CPU because the quality and efficiency is better.



> Most of the video you encode on a computer is actually all in software/CPU because the quality and efficiency is better.

I don't think that's true. I bought a ThinkPad laptop, installed Linux, and one of my issues was that watching a YouTube video put the CPU at 60%+ load. The same video on a MacBook barely touched the CPU at all. I finally solved this by installing Arch; once everything worked as it should, CPU load was around 10% for the same video. I didn't try Windows, but I'd expect things to work well there too.

So most video for the average user is probably hardware decoded.


The comment to which you replied was about encoding, not decoding.

There is no reason to do decoding in software when hardware decoding is available.

On the other hand, the choice between hardware encoding and software encoding depends on whether quality or speed is more important. For a video conference, hardware encoding is fine; but for encoding a movie whose original quality must be preserved as much as possible, software encoding is the right choice.
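To make the tradeoff concrete, here's a minimal sketch of how that choice might look as two different ffmpeg invocations; it assumes an ffmpeg build with libx264 (software) and h264_nvenc (NVIDIA hardware) support, and the file names, preset, CRF, and bitrate values are illustrative, not recommendations:

```python
def encode_cmd(src: str, dst: str, *, quality_first: bool) -> list[str]:
    """Build an ffmpeg command line for a quality-first (software)
    or speed-first (hardware) H.264 encode."""
    if quality_first:
        # Software encode: slow preset + CRF rate control trades time
        # for better quality per bit. Good for archiving a movie.
        codec_args = ["-c:v", "libx264", "-preset", "slow", "-crf", "18"]
    else:
        # Hardware encode: real-time friendly, simple fixed bitrate.
        # Good for video conferencing or live streaming.
        codec_args = ["-c:v", "h264_nvenc", "-b:v", "6M"]
    return ["ffmpeg", "-i", src, *codec_args, dst]

print(encode_cmd("movie.mkv", "archive.mp4", quality_first=True))
print(encode_cmd("webcam.mkv", "stream.mp4", quality_first=False))
```

The same idea applies with other hardware encoders (Quick Sync, VideoToolbox, AMF); only the `-c:v` value changes.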


Most hardware encoders suck.


>>> It can generally handle most decode operations with CPU help and a very narrow encoding spec.

This is spot on. Video coding specs are like a huge toolbox, and encoders get to choose whichever subset of tools suits them. And then the hardware gets frozen for a generation.


> Most of the video you encode on a computer is actually all in software/CPU because the quality and efficiency is better.

It depends on what you care about more, you don't always need the best possible encoding, even when you're not trying to record/stream something real time.

For comparison's sake, I played around with some software/hardware encoding options through Handbrake with a Ryzen 5 4500 and Intel Arc A580. I took a 2 GB MKV file of about 30 minutes of footage I have laying around and re-encoded it with a bunch of different codecs:

  codec   method   time      speed     file size   % of original
  H264    GPU      04:47     200 fps   1583 MB     77 %
  H264    CPU      13:43     80 fps    1237 MB     60 %
  H265    GPU      05:20     206 fps   1280 MB     62 %
  H265    CPU      ~30:00    ~35 fps   (aborted: would take too long)
  AV1     GPU      05:35     198 fps   1541 MB     75 %
  AV1     CPU      ~45:00    ~24 fps   (aborted: would take too long)

So for the average person who wants a reasonably fast encode on an inexpensive build, many codecs will be too slow on the CPU, in some cases by close to an order of magnitude. Encoding on the GPU gets you much better speeds, the file sizes are still decent, and at similar bitrates something like H265 or AV1 will in most cases look perceivably better than H264, regardless of whether the encode is done on the CPU or GPU.
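As a sanity check, the size percentages above follow directly from the reported file sizes; this small sketch recomputes them, assuming the 2 GB source is 2048 MB:

```python
# Recompute the "% of original" column from the reported encode sizes.
# Assumes the 2 GB source MKV is 2048 MB (binary interpretation).
SOURCE_MB = 2048

results = {
    ("H264", "GPU"): 1583,
    ("H264", "CPU"): 1237,
    ("H265", "GPU"): 1280,
    ("AV1",  "GPU"): 1541,
}

for (codec, method), size_mb in results.items():
    pct = round(100 * size_mb / SOURCE_MB)
    print(f"{codec} {method}: {size_mb} MB -> {pct}% of original")
```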

So, if I had a few hundred GB of movies/anime locally that I wanted to re-encode to take up less space in long-term storage, I'd probably go with hardware H265 or AV1, and that'd be perfectly good for my needs (I actually did, and it went well).

Of course, that's a dedicated GPU, and Intel Arc is pretty niche in and of itself, but I have to say their AV1 encoder for recording/streaming is also really pleasant, so I definitely think benchmarking this stuff is pretty interesting and useful!

For professional work, the concerns are probably quite different.


>Most of the video you encode on a computer is actually all in software/CPU because the quality and efficiency is better.

That was the case up to like 5 to 10 years ago.

These days it's all hardware encoded and hardware decoded, not the least because Joe Twitchtube Streamer can't and doesn't give a flying fuck about pulling 12 dozen levers to encode a bitstream thrice for the perfect encode that'll get shat on anyway by Joe Twitchtok Viewer who doesn't give a flying fuck about pulling 12 dozen levers and applying a dozen filters to get the perfect decode.


It’s not all hardware encoded. We run huge numbers of transcodes a day, and quality matters for our use case.

Certainly for some use cases speed and low CPU usage matter, but not for all.


Not sure why this is downvoted; all serious Plex use runs on hardware decode on Intel iGPUs, down to an i3. You only pull compute from the CPU for things like subtitles or audio transcoding.


Because Plex and game streaming are not the only use cases for transcoding.


"Most of the video you encode ..."



