So, from what I understood, skin tones are usually supposed to sit between 50 and 70 on the waveform, in IRE terms.
But what about a waveform set to RGB rather than IRE?
Peaks: reds maybe around 200, greens around 170, blues around 130, for typical Caucasian skin. That corresponds to an approximate IRE peak of around 75.
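If it helps, here's roughly the math behind that estimate (just a sketch on my end, assuming Rec.709 luma weights and a studio-range 16-235 mapping to 0-100 IRE; the exact number shifts a bit with different coefficients or ranges):

```python
# Rough sketch: approximate IRE from 8-bit RGB peaks, assuming Rec.709 luma
# weights and a studio-range (16-235) mapping to 0-100 IRE. Illustrative only.
def approx_ire(r, g, b):
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b    # Rec.709 luma
    return (y - 16.0) / (235.0 - 16.0) * 100.0  # studio range -> IRE

print(round(approx_ire(200, 170, 130)))  # ~72, in the same ballpark as 75
```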
Thanks, but I'm a little lost; I think I should have been more precise: I'm using the Sony Vegas waveform, which, with the Studio RGB setting, goes from -20 to 120. There's no RGB indicator here, just luminance. But from what I understood, the "right" (let's say average) luminance (so, not colors) for the skin can be set this way too.
(not sure if I'm clear...or right)
The lines on the waveform still represent IRE values. The scale itself doesn't change even if you have it set to 7.5 IRE, Studio RGB, etc. So, the IRE range is still the same. You still want the skin tones in the same range (though 80 is kind of hot, even for the brightest parts of the skin).
Yes, that's why I prefer to stick between 50 and 70.
But when you say that the scale doesn't change depending on the setting (Studio RGB or IRE), that doesn't seem to be the case (I can take screenshots if you want): checking IRE or RGB doesn't display the same results, and the "curves" change on the waveform...
By the way, if I may ask an additional question: what is the right choice between the two, IRE or RGB? If I remember correctly, it depends on how you plan to broadcast the video, but I'm not sure... I think it was RGB for PAL systems and IRE for NTSC? (Knowing that, in my case, if I broadcast anything, it'll be on the Internet or on PAL systems, since I live in France.)
If you're in PAL (or SECAM, as it were), you don't want them turned on at all. If you're going to the web, you don't want them on. If you're in HD, you don't want them on. It's an NTSC concern, meant to make sure your footage conforms to NTSC equipment -- i.e., how your footage would appear on a scope hooked up to a machine which adds 7.5 IRE setup. You'd still want the same range.
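Just to illustrate what that setup option simulates (my own sketch, not Vegas's exact math): black gets lifted to 7.5 IRE and everything above is rescaled toward the same peak white.

```python
# Sketch of what 7.5 IRE setup does: black is lifted to 7.5 IRE and the rest
# is rescaled, with peak white staying at 100. (Illustration, not Vegas's code.)
def with_setup(ire, setup=7.5):
    return setup + ire * (100.0 - setup) / 100.0

print(with_setup(0.0))    # 7.5   -> black sits at the setup level
print(with_setup(100.0))  # 100.0 -> white is unchanged
```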
But when I say that the scale doesn't change, I mean that the numbers represented on the scale are still IRE values. They're not RGB values.
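To put it another way (just my own sketch, not anything Vegas exposes): the IRE numbers on the scale stay put; what changes is which 8-bit code value lands on a given IRE line, depending on whether the material is treated as studio range or computer range.

```python
# Sketch: which 8-bit code value lands on a given IRE line, depending on range.
# (Illustrative only -- Vegas handles this mapping internally.)
def code_value_at_ire(ire, studio_range=True):
    if studio_range:                  # studio RGB: 16 = 0 IRE, 235 = 100 IRE
        return 16 + ire * (235 - 16) / 100.0
    return ire * 255 / 100.0          # computer RGB: 0 = 0 IRE, 255 = 100 IRE

print(round(code_value_at_ire(70, studio_range=True)))   # ~169
print(round(code_value_at_ire(70, studio_range=False)))  # ~178
```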
Oh, yes, OK... I read a little in the meantime, so I shouldn't check either box here, because the AVCHD from the GH1 comes in Computer RGB and Vegas previews it in Computer RGB too, right?
So, I should uncheck the RGB box in Options >> preview device too, I guess...
Sorry, I was a little confused; I just started learning this month, and all these parameters are confusing at first.
So, last question (I guess), just to be sure: would a video color-corrected/graded with these parameters (so, Computer RGB) look fine both for computer viewing _and_ LCD/plasma TV viewing?
Of course I'll run tests, but since a render can take a long time, it's always better to get advice from experienced people so as not to waste too much time, I guess.
Me again... this is weird: I worked with those settings, and when I render (with Cineform, high quality, RGB 4:4:4), I get a different video from what I see in the Vegas preview.
It seems more contrasty.
What did I miss ?
I thought this was the kind of thing that happens when there's a Studio RGB/Computer RGB problem...
There possibly is -- the preview window expects Computer RGB only, so if your material is Studio RGB it ends up appearing different from what you see in the window. Glenn Chan's website has the best information I've seen on this.
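If it helps to see the math, the remap between the two is just a levels stretch, something like this (a rough sketch of the idea, not the actual Vegas Levels plugin):

```python
# Sketch of the Studio RGB <-> Computer RGB levels remap (the kind of thing a
# "Studio RGB to Computer RGB" Levels preset does; illustrative math only).
def studio_to_computer(v):
    return max(0.0, min(255.0, (v - 16.0) * 255.0 / 219.0))

def computer_to_studio(v):
    return 16.0 + v * 219.0 / 255.0

print(round(studio_to_computer(235)))  # 255 -> studio white becomes computer white
print(round(computer_to_studio(255)))  # 235 -> computer white becomes studio white
```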
The thing is, AVCHD from the GH1 should be Computer RGB, if I'm not mistaken (in 32-bit mode).
Or was I wrong to work in 32-bit mode with GH1 videos?