From what I can tell, it's the quality of the HDR lighting the game uses internally. I'm talking about the HDR lighting that is part of the rendering process (this is different from the HDR-for-TVs that everyone is talking about these days). That HDR lighting is then tonemapped down to a display range so the TV can show it (for normal TVs that range is 0-255 per channel; for an HDR TV it's higher and depends on whether the game uses Dolby Vision or HDR10).
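Roughly what that tonemapping step does, as a minimal sketch (not the game's actual shader; the Reinhard operator, the exposure value, and the 2.2 gamma here are just illustrative assumptions):

```c
/* Sketch: map a linear HDR value into the 0-255 range of a standard (SDR) display.
 * Reinhard tonemapping plus a rough 2.2 gamma encode; values are illustrative. */
#include <math.h>
#include <stdio.h>

unsigned char tonemap_to_8bit(float hdr, float exposure)
{
    float exposed = hdr * exposure;          /* scene-referred -> exposed value   */
    float ldr = exposed / (1.0f + exposed);  /* Reinhard: maps [0, inf) to [0, 1) */
    float gamma = powf(ldr, 1.0f / 2.2f);    /* approximate display gamma encode  */
    return (unsigned char)(gamma * 255.0f + 0.5f);
}

int main(void)
{
    /* A bright HDR highlight (4.0) and a dim value (0.02) both end up in 0-255. */
    printf("%d %d\n", tonemap_to_8bit(4.0f, 1.0f), tonemap_to_8bit(0.02f, 1.0f));
    return 0;
}
```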
With the compressed 32-bit option the game uses an RGBA8 HDR framebuffer, while the 64-bit option uses FP16 (half-float) per channel. It's purely a difference in precision: the higher-precision format gives you fewer instances of banding and white/black crush, but it also doubles the framebuffer size and therefore costs performance.
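To get a feel for the cost, here is a small sketch of the per-frame memory difference (the 1920x1080 resolution is just an assumed example, and a real renderer has additional buffers on top of this):

```c
/* Sketch: RGBA8 stores 4 bytes per pixel, FP16 (16 bits x 4 channels) stores 8,
 * so the 64-bit framebuffer doubles the memory and bandwidth the GPU moves per frame. */
#include <stdio.h>

int main(void)
{
    const unsigned w = 1920, h = 1080;     /* example resolution */
    const unsigned rgba8_bpp = 4;          /* 8 bits x 4 channels  = 32 bits */
    const unsigned fp16_bpp  = 8;          /* 16 bits x 4 channels = 64 bits */

    printf("RGBA8 framebuffer: %.1f MB\n", w * h * rgba8_bpp / (1024.0 * 1024.0));
    printf("FP16  framebuffer: %.1f MB\n", w * h * fp16_bpp  / (1024.0 * 1024.0));
    return 0;
}
```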
9 years ago
Frame buffer format
The actual question: what does the "frame buffer format" graphics option control (compressed 32-bit vs. half16 64-bit), and how strongly does it affect quality and performance?
Thanks in advance.
Read this article, maybe you'll figure something out: http://www.ixbt.com/video2/tech_ss2.shtml 👿