occiput wrote:You now appear to be saying that you think the problem is connected with the PAL signal in some way.
All along I've been trying to make it clear that my thinking is this: a modification was made to the camera's colour performance, which every piece of equipment up until (but not including) the conversion to digital in the early 90s accepted without protest, broadcasting and displaying the image as the BBC intended (as close as was possible on domestic TV sets of the time). Then the material gets converted to digital, and *if* there was a modification to the colour performance (not to the PAL signal; I never *meant* a modification to the PAL or CVBS signals, so apologies for the confusion), then there is possibly something at this conversion stage, or thereafter, that reacts to this camera mod in some way and produces this side effect.
occiput wrote:What are the tolerances on the PAL signal at the output of a television studio (BBC spec or IBA CoP will do)?
I don't have that information to hand, unfortunately, but I know from people who have worked for both organisations that there were differences; as mentioned, gamma was one. I skipped a couple of your questions because, in all honesty, I didn't have the information to hand to answer those either.
occiput wrote:Which professional video A to D converter(s) produce(s) no output when presented with an out-of-tolerance PAL signal, and how far out of tolerance must the signal be for the converter to reject it?
The "no acceptance" example was offered as an example of digital conversion technology generally, but mainly as seen on domestic flat-panel sets when fed an analogue source that can be unstable, e.g. a VCR or the RF output of early video games. Even so, if you were to feed something like a 405-line recording into a modern A-D converter, it wouldn't accept it; it might recognise that something is there, but it won't decode it. My point is that for any analogue-to-digital converter, the analogue signal has to be in a form the converter is expecting. If it's non-standard, it *might* still be accepted, but that obviously depends on how far out of tolerance the input signal is and on the tolerances the converter accepts. This assumes there is no TBC in the chain, but even TBCs have limits beyond which they can no longer correct the timebase. Those of us who have tried to feed a signal from Betamax or VHS into an Aurora 625>405 converter will have come across this: the Aurora doesn't like the unstable timebase jitter of these formats (it has no TBC), so it switches between the input source and the default "no input" image loaded into its flash memory.
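To illustrate the tolerance point in the abstract, here is a minimal sketch of an acceptance test such as a converter front end might apply to measured line timing. The 64 µs figure is the nominal 625-line PAL line period; the ±0.1 µs window and the sample values are invented for illustration and don't correspond to any real converter's spec.

```python
# Toy sketch: reject an input whose line timing drifts beyond a fixed
# tolerance. Nominal 64 us is the 625-line PAL line period; the 0.1 us
# acceptance window is a hypothetical example value, not a real spec.

NOMINAL_LINE_US = 64.0   # nominal PAL line period
TOLERANCE_US = 0.1       # hypothetical acceptance window

def accepts(line_periods_us):
    """Return True if every measured line period is within tolerance."""
    return all(abs(p - NOMINAL_LINE_US) <= TOLERANCE_US
               for p in line_periods_us)

stable_source = [64.0, 63.95, 64.02, 64.05]   # e.g. a studio feed
jittery_source = [64.0, 63.7, 64.4, 63.9]     # e.g. unstable VHS timebase

print(accepts(stable_source))   # True
print(accepts(jittery_source))  # False
```

The same shape of argument applies to colour parameters: anything outside the window the equipment was designed to expect is, at best, handled unpredictably.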
I know I've been rabbiting on about timebases here (they are the best real-life examples I can think of), but if an ADC doesn't like a non-standard timebase, then who's to say it will always accept a (suspected) slight modification made to the colour performance of a camera 20+ years previously, one that the equipment of the day showed no side effects from when playing or passing the signal?
occiput wrote:Why might the fidelity of colour reproduction of second-generation colour cameras such as the Link 110, 125 or Marconi Mk IX differ from that of the EMI 2001 or Marconi Mk VII?
Because of the different ways in which they produced the image:
EMI 2001s took the output of the green tube to form the basic image, while the luminance tube supplied only the fine detail between about 1.5 and 5.5 MHz. I did wonder whether taking the green image as the starting point might be the cause, but then of course the question is why the ITV examples don't display this.
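A toy one-dimensional sketch of that four-tube idea: take the green tube's signal as the base picture and add only the high-frequency detail extracted from a separate luminance signal. The moving-average filter and the sample values here are invented for illustration; the real camera did this with analogue band-pass filtering (roughly the 1.5 to 5.5 MHz band mentioned above).

```python
# Illustrative only: base picture from the green tube, plus high-band
# detail from the luminance tube. Filter and values are invented.

def low_pass(signal, width=3):
    """Crude moving-average low-pass filter."""
    half = width // 2
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - half):i + half + 1]
        out.append(sum(window) / len(window))
    return out

def combine(green, luminance):
    """Green supplies the base image; luminance supplies only the
    high-frequency detail (full signal minus its low-passed version)."""
    detail = [y - ylp for y, ylp in zip(luminance, low_pass(luminance))]
    return [g + d for g, d in zip(green, detail)]

green = [0.2, 0.2, 0.5, 0.5, 0.2]   # invented sample values
luma  = [0.3, 0.3, 0.8, 0.8, 0.3]
print(combine(green, luma))
```

Note that on flat picture areas the detail term is zero, so the output there is purely the green tube's signal; any colour bias in the green channel would carry straight through to the base image.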
Theory 2: alternatively, it may have been down to the BBC's transcoding equipment, or the settings thereof, picking up on this way of generating the image at the transfer stage. Come to think of it, that ties in with the "settings" theory Michael suggested above, though I doubt it was a moment of madness, as this has affected most BBC material from this period shot on these cameras that I've seen.
Marconi Mk VIIs had a different fidelity again from 2001s. Those used by some ITV companies (I have a fair few Southern programmes shot on these cameras in my DVD collection) make people look tanned; whenever I've seen output from other companies using them, the picture looks like a black and white one coloured in, which is exactly what it was: the luminance tube's output was taken as the starting point, then the output from the R, G and B tubes was added in.
The second generation of colour cameras were (like the Philips PC60 of the first generation) all three-tube designs, which not only gave a slightly softer image than the four-tube ones, but also, I've noticed, don't give skin tones (and therefore colour fidelity) as good as the 2001's. This is how I could tell that only programmes shot on EMI 2001s have this effect: the second-generation cameras have slightly off-colour flesh tones, and their images as seen today don't have this green tint.
I don't want to sound brusque; I only want to make the point clear (ironically the clearest I think I've made it, so sorry for any confusion caused by my trying to pre-empt questions in my long posts above). To sum up my first theory, I'll set it out as a series of chronological numbered points:
1) Some EMI 2001s in a studio at, say, TVC, have been in use for some time. They may have had modifications done already, and now an approved modification to their colour performance is due. The mod is made and they sit awaiting their next assignment.
2) The cameras are lined up, a programme is recorded and, in the days or weeks following, possibly edited. The quality of the image is maintained throughout to the best standard the hardware of the day can manage, as close to BBC spec as possible.
3) The programme is then transmitted, with the playback and transmission hardware again adjusted for the best possible results. The modification made to the cameras brings the image even closer to BBC spec than previously. At no point in the chain, from the image being captured in the camera to its display on a domestic colour TV set, does any piece of hardware raise any form of protest against the mod made to the cameras. If the programme was recorded on Quad, the same goes for any repeat transmissions in the 1980s that may have involved a transfer to, and repeat transmission from, the 1" C-format: the programme looks the absolute best the available hardware can display.
4) By stage four it's the 1990s, and D3 and later DigiBeta have been introduced. Time for the BBC's project of transferring its entire archive of 2" (and 1") tapes to one of these two newer formats, including the example programme from stages 2 and 3. First the signal from the 2" or 1" machine has to pass through an ADC (and other equipment? I was not part of such a project at the time) sitting between the old-format and new-format machines. In my theory, the snag is that the transcoding equipment between the two machines picks up on the mod made to the camera's colour performance decades earlier: the colour fidelity the transcoder is *expecting* is not there; instead it is presented with the modified colour fidelity of a first-generation colour camera. This "tinting" might even have happened in the transfer from Quad/1" to DigiBeta, or in the transfer to DigiBeta of D3 dubs of Quad/1" material made earlier in the project (i.e. PAL, digital or analogue, to component via YUV 4:2:2 subsampling). I've had to read up on digital video theory again after a gap of a few years, so please forgive me if there are large gaps in my knowledge here; much of it has been forgotten in the intervening years.
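For anyone unfamiliar with the 4:2:2 term used in stage 4, here is a minimal sketch of the idea: luminance (Y) is kept at full horizontal resolution while the two colour-difference signals (U and V) are sampled at half that rate. The sample values below are invented; this shows only the sampling structure, not any claim about where the tint arises.

```python
# Minimal illustration of 4:2:2 chroma subsampling: every Y sample is
# kept, but only every second U and V sample. Values are invented.

def subsample_422(y, u, v):
    """Keep all Y samples; keep every second U and V sample."""
    return y, u[::2], v[::2]

y = [16, 30, 60, 90, 120, 150, 180, 235]          # full-rate luminance
u = [128, 130, 126, 124, 129, 131, 127, 125]      # colour difference B-Y
v = [128, 120, 140, 135, 128, 122, 138, 130]      # colour difference R-Y

y2, u2, v2 = subsample_422(y, u, v)
print(len(y2), len(u2), len(v2))  # 8 4 4
```

The point of showing this is simply that the component stage treats luminance and colour differently, so any oddity in the source's colour encoding is handled along a different path from the brightness information.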
What I think we can all agree on is that if you have noticed this effect, it will have happened in a transfer from one format to another; the programmes would not have been originally transmitted with this green tint in dark parts of the picture. I concur that it could have happened at any stage after the initial transfer in the 90s or 2000s, including when being published to DVD, although I seem to remember seeing this effect on broadcast repeats even when DVD was still in relative infancy.
The above is just a theory, the result of racking my brains for an answer of my own. Part of my question here was whether I was right, but mainly what exactly causes this to affect just these programmes, only from the BBC, and among BBC material only the examples shot on the EMI 2001.
I apologise if I'm causing frustration with this theory, but I can't see any way of putting it more clearly than the two attempts I've made in this post alone. In previous posts, in trying to be as clear and accurate as possible, I supplied so much information that I may have gone past the point of clarity and muddied the waters again.
I'll see if I can find examples of this "green tint in dark parts of the image" and post screenshots here, as some people notice it and some don't; it might be better if we could all actually see what I'm asking about in this thread.