• kevincox@lemmy.ml · 19 days ago

    The reason no one posts bitrates is that it’s not exactly interesting information for the general population.

    But they post resolutions, which are arguably less interesting. The “general public” has been taught to use resolution as a proxy for quality. For TVs and other screens that’s mostly true, but for video it isn’t the best metric (lossless video aside).

    Bitrate is probably a better metric, but even then it isn’t great. Different codecs and encoding settings can result in very different quality at the same bitrate. Still, I think in most cases it correlates better with quality than resolution does.
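
    Since services rarely publish bitrates, you can always check a file yourself. A minimal sketch, assuming ffprobe (ships with FFmpeg) is on your PATH and “input.mp4” is a placeholder path; it reads the container’s reported bitrate and falls back to size × 8 / duration when the container omits it:

    ```python
    # Minimal sketch: read a file's average bitrate with ffprobe.
    # "input.mp4" is a placeholder path.
    import json
    import subprocess

    def average_bitrate(path: str) -> float:
        """Return the average bitrate in bits per second."""
        result = subprocess.run(
            ["ffprobe", "-v", "error",
             "-show_entries", "format=size,duration,bit_rate",
             "-of", "json", path],
            capture_output=True, text=True, check=True,
        )
        fmt = json.loads(result.stdout)["format"]
        if fmt.get("bit_rate"):  # some containers (e.g. MKV) omit this field
            return float(fmt["bit_rate"])
        return float(fmt["size"]) * 8 / float(fmt["duration"])

    print(f"{average_bitrate('input.mp4') / 1e6:.2f} Mbit/s")
    ```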

    The ideal metric would probably be some sort of perceptual quality metric, but none of the existing ones are perfect either. Maybe we should just go back to Low/Med/High for quality descriptions.
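
    For what it’s worth, perceptual metrics like VMAF already produce a 0–100 score, so collapsing one into coarse labels would be trivial. A toy sketch; the thresholds are invented for illustration, not taken from any standard:

    ```python
    # Toy sketch: collapse a 0-100 perceptual quality score (e.g. a VMAF
    # score) into coarse labels. Thresholds are invented for illustration.
    def quality_label(score: float) -> str:
        if score >= 90:
            return "High"
        if score >= 70:
            return "Med"
        return "Low"

    for score in (95.2, 78.4, 51.0):
        print(f"{score}: {quality_label(score)}")
    ```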

    • GissaMittJobb@lemmy.ml · 19 days ago

      I think resolution has one advantage over posting bitrates: whenever a lower-resolution video is rendered on a higher-resolution surface, there will be scaling, with all of its negative consequences for perceived quality. I imagine there’s also an intuitive sense that larger resolution = higher bitrate (necessarily, to capture the additional information).
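
      That intuition can be made concrete as bits per pixel: the same bitrate spread over four times as many pixels leaves a quarter of the data per pixel. A rough sketch with invented example numbers:

      ```python
      # Rough sketch: average bits available per pixel per frame.
      # All stream numbers below are invented examples.
      def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
          return bitrate_bps / (width * height * fps)

      streams = [
          ("1080p @ 8 Mbit/s",  8e6, 1920, 1080, 24),
          ("4K    @ 8 Mbit/s",  8e6, 3840, 2160, 24),
          ("4K    @ 16 Mbit/s", 16e6, 3840, 2160, 24),
      ]
      for name, *args in streams:
          print(f"{name}: {bits_per_pixel(*args):.3f} bits/pixel")
      ```

      At the same bitrate, the 4K stream gets a quarter of the data per pixel, which is why a well-fed 1080p stream can look better than a starved 4K one (codec efficiency aside).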