📼 The Classroom 🌐chris.ray@paltv.co.uk

Encoding for an IRL programme

Hi, I'm streaming a show that uses a hardware vision mixer to combine the cameras; OBS then streams that output, plus any other videos we want to play, to YouTube. We're on an i5-13600K and an RTX 3060 Ti. How can I make sure I'm getting the highest quality possible, particularly for IRL content rather than gaming content? I'm really asking about YouTube-specific settings rather than OBS ones, as I'm fairly sure I know what I'd do for regular encoding.

The reason I ask is that YouTube is quite nebulous with its technical specifications. I only recently learned that although YouTube will accept uploaded videos at 50fps and 25fps, it won't accept live streams at those rates, and will do some horrific attempt at frame blending instead! Which is a shame as we're UK based, but now that we know, we can plan for it. Our vision mixer will output 1080p60 (some cameras may be 1080i60, but it will deinterlace).

I heard from somewhere (I think it was EposVox) that YouTube will allocate a higher bitrate and a better encoder to content that is 2560x1440 or higher. So I think we'd benefit from streaming 1440p60 to YouTube and letting either OBS or the vision mixer do that scaling. The videos we play out will also be 1440p60.

We'll have a gigabit upload connection, so pretty much any bitrate is possible. At what point do we hit diminishing returns? I know that at some point allocating a larger bitrate starts to do more harm than good once YouTube re-encodes it, so what should we set it at?

Does anyone have any other tips like this, YouTube-centric or otherwise? I've put below the current encoding settings we are using for the NVENC encoder (wish I could use CPU encoding lol), and after my sign-off there's a rough sketch of what I think they amount to: https://i.clr.is/u/1P8T78.png

I know at some point this becomes more opinion based than anything else, but please do just jot down your opinions! Thanks.
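For reference, this is only a rough sketch of the kind of pipeline I mean, not our actual config: a small Python wrapper that pushes a 1080p60 capture to YouTube as 1440p60 through h264_nvenc. The capture device name, bitrate figures and stream key are all placeholders I've made up for illustration.

```python
#!/usr/bin/env python3
"""Sketch only: push a 1080p60 mixer feed to YouTube Live as 1440p60 via NVENC.
Device name, bitrates and stream key below are placeholders, not real settings."""

import subprocess

STREAM_KEY = "xxxx-xxxx-xxxx-xxxx"   # placeholder, not a real key
INPUT = "video=Capture Device"       # placeholder DirectShow source from the vision mixer

cmd = [
    "ffmpeg",
    "-f", "dshow", "-framerate", "60", "-i", INPUT,    # 1080p60 feed in
    # Upscale to 1440p with Lanczos before encoding, so YouTube assigns its higher-tier transcode
    "-vf", "scale=2560:1440:flags=lanczos",
    "-r", "60",
    "-c:v", "h264_nvenc",
    "-preset", "p5",                  # quality/speed trade-off: p1 (fastest) .. p7 (slowest)
    "-rc", "cbr",                     # constant bitrate, as YouTube expects for live ingest
    "-b:v", "20M", "-maxrate", "20M", "-bufsize", "40M",   # illustrative bitrate only
    "-g", "120", "-bf", "2",          # 2-second keyframe interval at 60fps
    "-profile:v", "high", "-pix_fmt", "yuv420p",
    "-c:a", "aac", "-b:a", "160k", "-ar", "48000",
    "-f", "flv", f"rtmp://a.rtmp.youtube.com/live2/{STREAM_KEY}",
]

subprocess.run(cmd, check=True)
```

The Lanczos upscale is the part I'm least sure about, i.e. whether that step is better done in OBS, in the vision mixer, or in the encode chain like this.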