Guest
Recently purchased a monitor that claimed 10bit across all of its online material. It turned out to actually be an 8bit panel that uses "Frame Rate Control" to dither in more color information, i.e. 8bit+FRC.
Now, when it comes to viewing or watching content, I don't even mind 6bit+FRC sometimes. I don't mind 720p. I don't mind H.264 at 17Mbps. But when I buy a product for a specific reason, I would hope that it actually performs that function.
Seems like all of the monitor companies are rating 8bit panels as 10bit, either because they implement FRC dithering or because the panel can accept a 10bit signal.
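For anyone wondering what FRC actually does, here is a minimal sketch of the idea in Python. This is my own illustration of temporal dithering, not any vendor's actual algorithm: a 10bit level falls between two adjacent 8bit levels, so the panel alternates them over a few refreshes so the time-average lands on the target.

```python
def frc_frames(level_10bit, n_frames=4):
    """Return the 8-bit values shown over n_frames whose time-average
    approximates the requested 10-bit level (simplified sketch)."""
    # 10-bit has 4x the levels of 8-bit, so each 8-bit step spans 4 sub-levels.
    base, frac = divmod(level_10bit, 4)  # 8-bit base value + remainder 0..3
    # Show the higher 8-bit value in `frac` of the frames, the lower in the rest.
    return [min(base + 1, 255)] * frac + [base] * (n_frames - frac)

frames = frc_frames(514)          # a 10-bit level between 8-bit 128 and 129
print(frames)                     # [129, 129, 128, 128]
print(sum(frames) / len(frames) * 4)  # time-average, back in 10-bit units: 514.0
```

Your eye integrates those flickering frames into what looks like an intermediate shade, which is why it works reasonably well for viewing but isn't the same as a panel that can physically hold 1024 levels per channel.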
We have had similar "manufacturer claims" in resolution and especially dynamic range specs on cameras. But those are somewhat understandable.
HD ready vs HD vs FullHD
The term "HD" is vague already, so can't really Call anyone out there
4K vs True 4K vs 4K RAW vs 4K red vs Super Sample 4K
This gets a little more contentious. Yet, 4K raw is really a 4K sensor, even if it isn't resolving a 4K color image.
Dynamic Range
Similar to "High Definition", this is a vague spec. But in the opposite way. with "HD", the term or specs itself was vague. With DR, the actual rating can be base on aesthetic preference for actual DR or usable DR. So, in a weird way, Red and Canon DR specs are justified, even if their not reflecting "usable DR".
But bit depth! 8bit vs 10bit is pretty measurable. If we are working in a 24fps world and getting additional frames thrown in for dithering, what does that do? Probably not much. Yet it is still there.
Is there some understandable reason for the discrepancy? I can't think of any at this time. It's like if a camera had a 1080p sensor but shook around to capture four images and combined them into one frame for "4K". You'd want to know that, right? I would.
Seems manufacturers are getting really lazy about specs. Anyone recall when Heinz started putting less ketchup in their bottles? Some granny had a recipe that used the whole bottle, she found out the bottles no longer held the correct amount, and Heinz had to change back to filling them properly.
Even BHphoto and similar sites run a risk of liability, since they state "specs" rather than "Manufacturer Specs". Are they taking on the responsibility for accurate listings? Or could they be sued for false advertising because they don't state that the specs are not their own? BH could help by being stricter with manufacturers and calling their bluff, letting the whole monitor industry get comfortable listing actual panel bit depth. One phone call to each company's tech support would get a rundown of the accurate specs. Though it isn't really their responsibility, they should at least clarify that the specs come from the manufacturer and not from them, even if we all know it's implied.
I actually don't have much issue with 8bit+FRC, and prefer it to plain 8bit, but when I purchase a monitor because of a 10bit rating, I would really hope to get a 10bit monitor. Especially when I pay extra for the feature.
If time permits, and recently time is permitting, I'll go figure out whether there are any true 10bit monitors in the USD$1K range. Colour uniformity and other attributes are important too; I'd probably rather have consistent colour in 8bit than warped, inaccurate 10bit, but that goes without saying. It's just that some of these fake-bit-depth monitors are no better than other monitors, they just carry a "10bit" flag to command more money. In that sense, it feels like actual fraud. The only reason we can't do anything about it is that all of the companies are doing it! And sellers aren't worried about it either. Or is there some way to rationalize it, similar to DR specs?
Any of you guys have a preferred computer monitor from recent times?