
Television Receiver Intermediate Frequencies

 
Posts: 316
Joined: Thu Jun 14, 2012 4:04 am
Location: Mt. Maunganui, New Zealand

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Tue Mar 11, 2014 4:45 am

Early on in this thread, I mentioned this document:

“Choice of Intermediate Frequencies for Domestic Television Receivers” (Union Europeenne de Radiodiffusion Tech 3062-E, April 1954).

Although I have not since found a copy, I have recently found an article in Wireless World 1954 July that discusses this document, namely “Television I.F. Inquiry: Aspects of Receiver Design in Various Countries”, by G.H. Russell.

The EBU report was the outcome of a questionnaire distributed in 1952 December to television receiver manufacturers’ organizations in various countries, to which replies were received during 1953 and 1954. I have attached a copy of the article.

WW 195407 p.322.tif

WW 195407 p.323.tif

WW 195407 p.324.tif

WW 195407 p.325.tif


The additional information certainly requires changes in some of the assumptions that I had previously made.

Firstly, “low” IFs, around 25 MHz vision give or take, were used in the early days in several European 625-line countries, including Italy.

Secondly, “high” IFs were also used in some, but not all of those countries in the early days, although all projected future use of “high” IFs.

Thirdly, although the use of 38.9 MHz was projected for future use by two countries, Germany and Netherlands, it was not reported as already being in use. As the dates of the submissions from the various participants are unknown beyond being in the 1953-54 period, one cannot be sure as to when the use of 38.9 MHz started. But certainly it was not the first “high” IF used for the 625-line system, even if one discounts the Italian exception.

Excluding the Italian case, the initial “high” IFs for the 625-line system were:

41.75 MHz Belgium, Denmark
39.75 MHz Denmark
39.5 MHz Sweden, Switzerland

Of these, 39.75 MHz was projected for future use in Denmark, and 39.5 MHz for future use in Switzerland.

Notwithstanding these plans, it would appear that the 38.9 MHz number adopted by Germany and Netherlands quickly became the standard number, possibly or probably with CCIR backing. Back-of-the-envelope calculations show that amongst 38.9, 39.5, 39.75 and 41.75 MHz, 38.9 MHz was the most favourable in terms of positioning of the fifth harmonic of the IF, putting it in the guard band between the upper edge of the vision sidebands and the sound carrier of channel E7. So this may have been a factor in its selection. 41.75 MHz was the worst case, with the fifth IF harmonic falling on the channel E9 sound carrier. It was also just inside Band I, which was undesirable. Also of note is that the 39.5 MHz number associated with UK 625-line practice had a prior existence in Sweden and Switzerland.
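For what it’s worth, the harmonic arithmetic is easy to reproduce. A minimal Python sketch, with the Band III channel carriers computed from the standard CCIR E-channel plan (vision 175.25 + 7(n−5) MHz, sound 5.5 MHz above vision):

```python
# Fifth-harmonic check for the candidate 625-line vision IFs (values in MHz).
candidate_ifs = [38.9, 39.5, 39.75, 41.75]

def nearest_carrier(freq):
    """Return the Band III carrier (vision or sound) closest to freq."""
    carriers = []
    for n in range(5, 13):  # channels E5 through E12
        vision = 175.25 + 7 * (n - 5)
        carriers.append((f"E{n} vision", vision))
        carriers.append((f"E{n} sound", vision + 5.5))
    return min(carriers, key=lambda c: abs(c[1] - freq))

for f_if in candidate_ifs:
    h5 = 5 * f_if
    name, carrier = nearest_carrier(h5)
    print(f"5 x {f_if} = {h5:.2f} MHz, nearest carrier {name} at {carrier} MHz "
          f"(offset {h5 - carrier:+.2f} MHz)")
```

This reproduces the result above: 5 × 38.9 = 194.5 MHz sits 0.25 MHz below the E7 sound carrier, in the guard band, while 5 × 41.75 = 208.75 MHz lands exactly on the E9 sound carrier.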

Likely then this previously-mentioned article:

W. Holm and W. Werner, “Choice of an intermediate frequency for television receivers to suit the C.C.I.R standard,” Funk und Ton 8, 1954.

was pivotal in the 38.9 MHz story, and I should guess would explain why in the first instance, it was adopted in Germany and the Netherlands. Holm and Werner were both Philips engineers and technical writers as far as I know.

Turning to the Italian case, the “low” and “high” IFs of 25.75 and 45.75 MHz appear to have been drawn directly from American practice. As previously observed, 45.75 MHz fitted the Italian declared IF channel of 40 to 47 MHz, and it becomes evident that the 45.9 MHz number used by Philips was in fact a slight deviation from “standard”.

The Italians seemed to have realized reasonably early on that the unilateral declaration of a protected TV IF channel was not a complete answer. In Wireless World 1954 November it was reported that Italy had asked the CCIR to study the question, and that the above-referenced EBU document was submitted as a contribution to the enquiry. So somewhere there might be a pertinent CCIR document on the subject. This request by Italy may have marked the start of the process of its change from 45.75 to 38.9 MHz.

The French contribution to the EBU report evidently predated the adoption of standard IFs in that country. The typical numbers quoted include, as well as some in the “high” class, others that might be described as being “very high”, up to 80 MHz. One may rationalize the latter by recalling that the original French 819-line channelling plan involved Band III only, not Band I, so that there was no apparent need to keep the IF channel below Band I. The Band I channels, F2, F3 (not used) and F4 evidently arrived with the tête-bêche channelling scheme, whenever that was announced. The IF numbers quoted for France appear to be inconsistent in respect of the vision and sound carrier spacings, which should have been 11.15 MHz. Just possibly separate conversions were used for the vision and sound carriers, but that seems unlikely. Perhaps there was a data transposition glitch somewhere along the way.

Wireless World 1954 August included an article about a French TV receiver using printed circuits. It was described as a single-channel (evidently F8A, as used at Paris and Lille) superhet with oscillator low, although the actual IFs were not quoted. This receiver had a 12AT7 cascode RF amplifier and a 6X8 frequency changer, which looks to have been an unusual combination. RCA introduced the 6X8 triode pentode frequency changer in 1951 as the companion valve to its 6BQ7 cascode double triode, for use in TV VHF tuners where the IF was around 40 MHz. It developed the series cascode circuit because the original Wallman (shunt) cascode was not easy to use with variable tuning; rather it was better suited to fixed frequency band IF use. Applications for the 12AT7/ECC81 included shunt cascode amplifiers. In the case of this French TV receiver, its fixed, single channel nature might have been why 12AT7 was used.

Returning to the Wireless World 1954 July article, this also delineates the US UHF channel allocation guidelines predicated on the general use of the standard 45.75 MHz IF.

Stepping away from the EBU document to the Australian case, I have found a comment in a 1970 Australian publication to the effect that the (vision) IF would eventually be changed to 36.875 MHz (from 36.0 MHz) following an Australian Broadcasting Control Board recommendation. Unfortunately no further explanation was provided.

Back-of-the-envelope calculations suggest that the original 36.0 MHz might have been chosen in part to avoid having Band I channel local oscillator frequencies fall within some of the Band II channels. With a 38.9 MHz IF, the local oscillator frequencies for channels 1 and 2 would have fallen within the vision sidebands of channels 4 and 5 respectively. This conflict was avoided with 36.0 MHz. But it did happen that the local oscillator frequency for channel 5, 138.25 MHz, was right on the vision carrier for channel 5A, which was a later addition. Whether the change to 36.875 MHz was at least in part to alleviate this is unclear; it still resulted in the channel 5 local oscillator frequency falling not too far above the channel 5A vision carrier.
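As a quick check on that arithmetic, here is a Python sketch of the oscillator-high relationship f_LO = f_vision + f_IF. The channel 1 and 2 vision carriers are my assumption from the original Australian VHF plan; channel 5 at 102.25 MHz follows directly from the 138.25 MHz figure above:

```python
# Oscillator-high superhet: f_LO = f_vision + f_IF (all values in MHz).
# Assumed vision carriers from the original Australian VHF channel plan;
# Ch5A (vision 138.25 MHz) was the later addition discussed above.
vision = {"Ch1": 57.25, "Ch2": 64.25, "Ch5": 102.25, "Ch5A": 138.25}

for f_if in (36.0, 38.9, 36.875):
    lo = {ch: f + f_if for ch, f in vision.items()}
    print(f"IF {f_if} MHz: Ch1 LO {lo['Ch1']:.3f}, Ch2 LO {lo['Ch2']:.3f}, "
          f"Ch5 LO {lo['Ch5']:.3f} (Ch5A vision {vision['Ch5A']})")
```

With 36.0 MHz the channel 5 LO lands exactly on the Ch5A vision carrier; with 36.875 MHz it moves to 139.125 MHz, still only 0.875 MHz above it, consistent with the comment above.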

Cheers,

Steve

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Sat Jun 07, 2014 6:22 am

I have updated the tabulation to include all of the information available to date. In large part this update is based upon the Wireless World 1954 July article that in turn was based upon EBU Technical Document 3062.

TV IF 20140607.tif


Additional to my earlier comments, the French numbers given in that WW article may reasonably be interpreted as referring to the 14 MHz IF “channel” limits, rather than to the IFs themselves. From these I have inferred probable actual IFs based upon System E channelling practice.

In my February 25 posting, I said: “Some very recently found information on late-era SAWFs has thrown up some more numbers, particularly in respect of multistandard receivers, but I’ll discuss those separately.”

In particular, there were dual-Nyquist slope SAWFs that had vision IFs at 38.9 and 33.9 MHz. The 38.9 MHz number accommodated Systems B, D, G, H, I, K, K’ and L. The 33.9 MHz number was for System L’, with oscillator high. Thus I have included 33.9 MHz in the list, as it seems likely that this would have had reasonably significant use for System L’. The dual-Nyquist SAWFs appeared to have had a rather steep slope at the 33.9 MHz end, perhaps to minimize incursion into the primary vision response. Insofar as the SAWF group delay response could be tailored separately, this was probably not a major issue, although it would have increased the errors with conventional quasi-synchronous demodulation. One has the impression that System L’ was treated as something of a second-class citizen in multi-standard receivers.

Switched SAWFs for multistandard receivers throw up some other IF oddities that I’ll mention in a subsequent posting.

Some of the questions that remain unanswered include:

Exactly when did the CCIR promulgate the 38.9 MHz standard for System B? I am fairly sure that it was sometime in 1954.

When was the French 28.05 MHz standard for System E established? Most likely later in 1954 or 1955. It might have been a SCART standard, if SCART existed back then.

Why did Australia choose its 36.0 MHz number for System B, and why did it later on change to 36.875 MHz?

Why did Japan stay with its “low”, 26.75 MHz number for so long?

When did BREMA publish its 39.5 MHz number for UK System I? The system itself was recommended in the 1960 TAC report, so BREMA may have done its “homework” soon after that.

Origins and timings of the 38.0 and 37.0 MHz numbers used for Systems D/K in Russia/Eastern Europe and China respectively.

Cheers,

Steve

 
Posts: 242
Joined: Sun Jan 19, 2014 8:09 pm
Location: Hucknall, Notts.

Re: Television Receiver Intermediate Frequencies

Post by colly0410 » Sat Aug 09, 2014 9:25 pm

Now that we're digital (well, most of us) is there a difference in the IFs for the different systems? I'm thinking of DVB-T, DVB-T2, ATSC, ISDB, etc. I know of two countries that use VHF as well as UHF for digital TV (Australia & USA); there are probably others I don't know about. Would that affect the choice of IF?

 
Posts: 3712
Joined: Sun Feb 12, 2012 7:43 pm
Location: North Hykeham, Lincolnshire and Ilford, Essex (but not for much longer ...!)

Re: Television Receiver Intermediate Frequencies

Post by Terrykc » Sat Aug 09, 2014 9:36 pm

Strangely, I was looking at SAW filters for the IF in digital receivers recently and found a difference, albeit a small one, in the centre frequency for SAWs intended for Cable system* use and receivers for off-air reception ...

This didn't make any sense to me ...

* Interestingly, I came across SAW filters with switchable 7/8MHz bandwidths for continental UHF/VHF DTT use ...

 

Re: Television Receiver Intermediate Frequencies

Post by colly0410 » Sat Aug 09, 2014 10:00 pm

Oh yeah Terry, I'd forgotten about the different (7/8 megs) channel widths at VHF & UHF; also I think Australia uses 7 megs channels at UHF, so I presume that would complicate things. Then there's the 625-line digital System N (6 megs I believe) to throw a spanner in the works. :)

 

Re: Television Receiver Intermediate Frequencies

Post by Terrykc » Sun Aug 10, 2014 11:07 am

As far as the Australian and American systems are concerned, it is merely a question of selecting the appropriate SAW to match the system. It is the switchable SAW for sets in countries that formerly used system B/G that I found interesting.

These are probably for use in CATV set top boxes as the European cable systems have inherited a large number of 7MHz channels in a continuous sequence up to 300MHz (32 channels). http://en.wikipedia.org/wiki/European_c ... requencies.

I believe that VHF has been dropped for DTT in Europe but that is not a choice open to the cable operators due to the vast amount of bandwidth they would lose. This will be further complicated by the retention of a large number of analogue channels alongside the new digital ones. (Unlike UK cable networks, which are now completely digital.)

Perhaps, in due course, they will re-engineer the entire frequency range as 8MHz channels ...

When SAWs were used for IF shaping in analogue sets, of course, every system required a different SAW.

The exception was again System B/G receivers where one SAW suited both systems. This, like its discrete component predecessors, merely needed extra traps to suit both channel spacings: no switching was necessary.

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Mon Aug 18, 2014 10:22 am

The attached short item from Wireless World 1955 October confirms the original Australian standard analogue TV IF numbers of 30.5 MHz sound, 36.0 MHz vision. Evidently these were established well before the start of TV transmissions in Australia. Whilst the rationale for the exact numbers chosen was not given, a reasonable inference is that they were a “best fit” to the Australian VHF channel plan, and for example better than the then-recently emerged European 625-line system standard numbers of 33.4 and 38.9 MHz.

Cheers,

Steve

WW 195510 p.494 Australian TV.jpg

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Mon Aug 18, 2014 10:32 am

The attached WW items are interesting in that they show that the consideration of and debate about a British standard TV IF go back to 1949, and that a "high" IF was also suggested back then.

Cheers,

Steve

WW 195202 p.41 TV IF.jpg

WW 195204 p.145 BREMA TV IF.jpg

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Tue Aug 19, 2014 12:43 am

By way of some more background on the French TV IF situation, the attached Wireless World summary of the ITU Stockholm 1952 May VHF meeting mentions both the tête-bêche channelling system and the French use of the 162-to-174 MHz band just below Band III.

So one may assume that the French had worked out the details of their tête-bêche channelling system before the Stockholm meeting. The earliest mention of planning for this meeting that I have is in WW 1952 January. On the other hand, what was apparently not done was the simultaneous development of accompanying standard IFs, at least judging by the French input to the EBU TV IF enquiry. Given that some of the IFs reported to be in use circa 1954 were relatively high and implied Band III-only receivers, I don’t think that we can rule out the possibility that the use of Band I channels was not part of the original tête-bêche plan. That question might be answered if we could find a copy of the ITU Stockholm 1952 report. Of course another possibility is that with Band III channels in the majority, some of the setmakers simply chose high IFs for what were single-channel Band III receivers.

The Belgian TV situation was also noted as a complication. The earliest record I can find of the decision to use its own 625- and 819-line variants is in WW 1952 April. It did not, though, have a major bearing on IF choices insofar as whatever eventuated for the CCIR 625-line standard would also be appropriate for use in Belgium, subject to accommodating the French 819-line requirements in multistandard receivers.

Cheers,

Steve

WW 195210 p.434 ITU Stockholm French TV.jpg

WW 195204 p.147 Belgian TV.jpg

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Sun Aug 24, 2014 9:37 am

Some clues to early French practice are provided in the attached item from Wireless World 1951 November. Surprisingly, there were some dual-standard 441/819-line receivers.

Unfortunately I don’t have the continuation page, but I think what was being said was that 441-line receivers were often of the TRF type (for channel F1), whereas 819-line receivers (at the time for channel F8A) were superhets. I think though that dual-standard receivers would likely have been superhets on both systems with the IF channel not overlapping channel F1. The only IF information I have for 441-line superhet receivers is in respect of the Philips TF390, 13 MHz vision and 9 MHz sound with oscillator low. I doubt though that 441-line IFs were ever standardized.

Cheers,

WW 195111 p.459 Paris TV Show.jpg

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Sun Nov 09, 2014 3:04 am

I recently came across the attached item in Wireless World 1963 April. It shows some preferred TV IF numbers as recorded at the CCIR 1963 February Geneva meeting.

WW 196304 CCIR Geneva TV IF.jpg


In the main, it confirms the numbers already recorded in this thread.

Of note:

Italy was still using 45.75/40.25 MHz, so its transition to the European standard 38.9/32.4 MHz was yet to come.

The USSR was using 34.25/27.75 MHz. That implies that the 38.0/31.5 MHz combination came later, possibly with UHF, possibly with the advent of colour.

The 34.25/27.75 MHz combination was mentioned in an early posting in this thread as having been found at: http://www.scheida.at/scheida/Televisio ... raster.htm. So the CCIR listing provides confirmation that it was used.

Japan was still using 26.75/22.25 MHz, its transition to a “high” IF probably not contemplated at that time.

Not mentioned was the UK System I IF of 39.5/33.5 MHz, but I have still not found information as to exactly when it was established; somewhere in 1962 or perhaps early 1963 would be my estimate. On the other hand, the French System L IF of 32.7/39.2 MHz was listed.

That the USSR 38.0/31.5 MHz combination was, as it were, a second iteration that most likely arrived in the 1960s means that it might well have postdated the Chinese 37.0/30.5 MHz numbers. The working assumption is that 37.0/30.5 MHz was adopted at the start of TV broadcasting in China.

Cheers,

Steve

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Mon Nov 17, 2014 2:27 am

An item that I have so far failed to pin down is exactly when BREMA announced the standard IFs, 39.5 MHz vision and 33.5 MHz sound, for System I in the UK.

The available evidence places it somewhere between the 1961 and 1962 UK annual radio shows.

The WW 1962 October report on the 1962 show noted that many of the setmakers were displaying dual-standard and convertible receivers, and the inference is that these were using the BREMA standard IFs. Certainly the highlighted commentary in the following page shows that the BREMA numbers were in place by then.

WW 196210 p.477 UK TV IF.jpg


The WW 1961 October report on the 1961 show also referred to dual-standard and convertible receivers from the setmakers. It noted the difference between the “TAC” and “CCIR/Gerber” variants of the 625-line system, and that CCIR/Gerber signals had been piped into the show. But also recorded was the fact that both Ekco and Pye had displayed dual-standard receivers intended to receive the TAC variant. Pye had used the CCIR standard vision IF of 38.9 MHz, with sound therefore at 32.9 MHz. Thus it seems unlikely that 39.5 MHz was “in play” at the time.

WW 196110 p.514 Pye Dual-Standard TV.jpg


Back to the 1962 show report, the commentary about the consequences of the 39.5 MHz choice is interesting. As it reads, it is suggested that with AFC to minimize oscillator drift, the European standard number of 38.9 MHz could have been used, in which case receiver image rejection requirements would have been much more relaxed. That the 39.5 MHz choice actually created some difficulties does, I think, lend some credence to the notion I mentioned in my original posting that it owed something to easing of IF strip design in dual-standard receivers, and was not based upon System I requirements alone.

The Pye receiver with 38.9 MHz 625-line vision IF had a video bandwidth of 4.25 MHz. On the face of it, this was rather low, given that 4.5 MHz was needed to match 405-line horizontal definition (at 3 MHz bandwidth) and that the TAC parameters implied that the 5 MHz of the CCIR/Gerber system was not enough. But one may see where the 4.25 MHz came from in the IF passband diagram in the WW article. The basic IF passband was of the double-Nyquist type, with 6 dB points at both 34.65 MHz (405-line vision IF) and 38.9 MHz (625-line vision IF). This no doubt simplified the required switching between the two systems, but also predetermined the 625 vision bandwidth. Possibly some of the setmakers favoured the double-Nyquist basic IF curve because of its simplicity. And thus perhaps the compromise, accommodating this option but with a more acceptable 625 video bandwidth, was to move the 625 IF out to 39.5 MHz. This allowed a 4.85 MHz 625 vision bandwidth, hardly stellar, but somewhat better than 4.25 MHz. 39.5 MHz may have been seen as the practicable upper limit; anything higher would probably have placed the image too close to the (n+10) channel. Also, if the 625 IF “channel” were not to overlap channel B1, then 39.5 MHz was close to the upper limit in this regard as well.
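The bandwidth arithmetic here is simply the spacing of the two Nyquist points; a trivial Python check:

```python
# In a dual-standard double-Nyquist IF strip the 6 dB points sit at the two
# vision IFs, so the usable 625-line video bandwidth is roughly their spacing.
vif_405 = 34.65  # MHz, BREMA 405-line vision IF

for vif_625 in (38.9, 39.5):
    bw = vif_625 - vif_405
    print(f"625 vision IF {vif_625} MHz -> ~{bw:.2f} MHz video bandwidth")
```

This gives the 4.25 MHz figure for a 38.9 MHz vision IF and 4.85 MHz for 39.5 MHz, matching the numbers quoted above.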

The 53 dB image rejection requirement associated with the 39.5 MHz IF choice in turn demanded the use of four-gang UHF tuners, whereas European practice had been to use 3 gangs. So assuming that the receivers at the 1962 show had 4-gang tuners, the 39.5 MHz IF and 53 dB image rejection numbers must have been known at least a few months earlier to allow time for tuner design/redesign. As I understand it, Cyldon’s initial UHF tuner, the UT, was 3-gang, and I imagine that Philips also had 3-gang designs for European use.
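The image proximity follows from the oscillator-high relation f_image = f_signal + 2·f_IF; a quick Python illustration with the 8 MHz UHF channel spacing:

```python
# Oscillator high: the image lies 2 x f_IF above the wanted signal (MHz).
# With 8 MHz UHF channels, 2 x 39.5 = 79 MHz puts the image just 1 MHz
# below the vision carrier of the channel ten above the wanted one.
for f_if in (38.9, 39.5):
    spacing = 2 * f_if
    channels_up = spacing / 8
    print(f"IF {f_if} MHz: image {spacing} MHz up = {channels_up:.3f} channel spacings")
```

At 39.5 MHz the image sits 9.875 channel spacings up, i.e. hard against the (n+10) channel, which is why the image rejection requirement was so demanding.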

Irish TV started System I transmissions on November 01, 1962. So if Irish 625-line and dual-standard receivers used the 39.5 MHz IF from the start, then it must have been known quite a bit in advance of this date to allow adequate initial production of receivers.

WW 196211 p.523 Irish TV.jpg


The System I parameters were essentially defined by the TAC in its 1960 report. The 1961 ITU Stockholm UHF meeting allocations were based upon the use of System I by both the UK and Ireland, although in both cases the system choices were noted as being provisional. But the go-ahead in the UK would not have come until appropriate Government actions following the Pilkington Report of 1962 June. So it would seem very likely that BREMA did its homework on the IF choice (and its consequences) in advance of the go-ahead, in anticipation that System I at UHF would come into use sooner or later.

Cheers,

Steve

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Mon Jul 13, 2015 7:49 am

As mentioned in the Radio Receiver Intermediate Frequencies thread, I have recently come across a couple of articles in “Radio News” magazine for 1947 which provide useful background information on American domestic receiver IF choices. This magazine is available at the excellent American Radio History site, at: http://www.americanradiohistory.com/Rad ... _Guide.htm.

The two articles are in fact part of a long series entitled “Practical Radio Course” by Alfred A. Ghirardi; the two of interest here are Part 53, in the 1947 May issue, and Part 56, in the 1947 November issue. Part 53 discusses IFs for AM, FM and TV receivers. Part 56 elaborates on FM IFs.

In respect of TV receiver IFs, it was stated that the RMA had recommended that the sound IF be in the range 21.25 to 21.9 MHz, and that the vision IF be 26.4 MHz, with oscillator high. This does not quite add up, as a sound IF range of 21.25 to 21.9 MHz implies a vision IF in the range 25.75 to 26.4 MHz, rather than 26.4 MHz alone. It was also noted that pre-1941 TV receivers generally used IFs of 8.25 MHz sound and 12.75 MHz vision. The standard IF range was said to (just) place the images outside of the TV VHF channel range, spanning 44 to 50 MHz (channel A1), 54 to 72 MHz (channels A2 through A4) and 76 to 88 MHz (channels A5 and A6) for a total range of 44 MHz. The high-band channels A7 through A13 were not mentioned, but with a span of 42 MHz (174 to 216 MHz), the RMA IF was also just high enough to keep the images out-of-band.
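The image arithmetic behind that statement can be sketched as follows (oscillator high, so f_image = f_carrier + 2·f_IF; the carrier values are the standard US channel assignments):

```python
# Image frequency with oscillator high: f_image = f_carrier + 2 * f_IF (MHz).
def image(carrier, f_if):
    return carrier + 2 * f_if

# Channel A1 (44-50 MHz): vision 45.25, sound 49.75; channel A7 vision 175.25.
print(image(45.25, 25.75))   # A1 vision image: 96.75, above the 88 MHz low-band edge
print(image(49.75, 21.25))   # A1 sound image: 92.25, also above 88 MHz
print(image(175.25, 25.75))  # A7 vision image: 226.75, above the 216 MHz high-band edge
```

Even for the lowest channel in each band, the images land just above the top of that band, which is the "(just)" in the quoted claim.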

Other contemporary sources allow some crosschecking. “Radio Craft” magazine (http://www.americanradiohistory.com/Rad ... _Guide.htm), in its 1946 October issue, ran Part V of an article series “Television for Today” by Milton S. Kiver. This included comment on intermediate frequencies. It was noted that until quite recently, TV IFs were 8.25 MHz sound and 12.75 MHz vision, but that the RMA had recommended a change to 21.25 MHz sound and 25.75 MHz vision. As with the Radio News article, it was observed that these RMA numbers placed the image band outside of the 44 MHz band occupied by TV channels A1 through A6.

A different perspective though was provided in “Radio-Electronics” magazine (http://www.americanradiohistory.com/Rad ... r_Page.htm) for 1950 April in an article “Television Equipment Standards” by Mathew Mandl. The author asserted that there was no such thing as a standard intermediate frequency in receivers, although most were in the same general range. This is confirmed in a tabulation of then-current receiver IFs. Excluding the outliers, sound IFs lay in the range 21.25 to 22.25 MHz, and vision IFs in the range 25.75 to 26.75 MHz. The outliers were in the 30 MHz range.

A later Radio-Electronics article, 1952 January issue, “44-MC I.F. Amplifiers for TV”, by David T. Armstrong, said differently, and it is easiest to quote the opening paragraphs:

“FM sound i.f. has climbed the stairway to the stars from 2.1 to 4.3 to 10.7 to 21.75, and now to 41.25 mc. Video i.f.'s have moved from 8.25 to 12.25 to 25.75, and now to 45.75.

“The frequency to which an i.f. amplifier is to be tuned is a subject to which much thought and many barrels of ink have been devoted. The search is always for an intermediate frequency with the maximum number of advantages and the minimum number of disadvantages. The choice of a satisfactory frequency has been the subject of much study and debate in the councils of the RTMA.

“The values of 21.25 to 21.9 for sound and 25.75 to 26.4 for video were adopted because it was generally believed that these values represented the most satisfactory compromise of all the factors bearing on the matter. Most current-model television receivers employ the nominal 24-mc i.f.

“Practical field experience with the 24-mc i.f. has been exceptionally good except for the peculiar problems caused by spurious oscillator radiation. There have also been minor disadvantages, including direct i.f. interference from hams and from medical and industrial equipment, from powerful FM stations inducing image interference, and from the international short-wave distortions. It was mainly the problem of oscillator radiation that caused the RTMA to advocate the new 44-mc standard.”


My take on this is that most likely the RMA did recommend the “24 MHz” IF in the 1940s, perhaps quite soon after WWII. And that it recommended the narrow ranges, 21.25 to 21.9 MHz sound and 25.75 to 26.4 MHz vision, rather than specific numbers. But for convenience and brevity, some commentators referred only to the lower edge 21.25/25.75 MHz numbers. Receiver manufacturers mostly stayed within these ranges, or did not move far from them. As observed previously, the 22.25 and 26.75 MHz combination used by a small number of the American setmakers may well have informed Japanese practice.

The opening sentence of the 1952 article, in referring to FM IFs, appears to have conflated both the radio and TV sound cases.

The previously mentioned outliers in respect of the “24 MHz” era included the combinations 31.625/36.125 MHz and 32.8/37.3 MHz. The makers concerned evidently preferred a “high” IF, but they may have been limited as to how high they could go by the need to accommodate the then channel A1, 44 to 50 MHz, something that was not a factor when the RTMA (née RMA) set its “44 MHz” standard. Still, one may see these American 30-something MHz IFs as presaging later European practice, where “high” IFs generally had to keep below the 41 MHz lower edge of Band I.

Cheers,

Steve

 

Re: Television Receiver Intermediate Frequencies

Post by colly0410 » Tue Jul 14, 2015 9:49 am

When I was in the Army our artillery radios tuned from 23.5 to 38 MHz & the infantry/tanks/cavalry etc. from 36 to 60-ish MHz. We used to use around 37 MHz to talk to the infantry if we didn't have an infantry set or it had conked out. What effect (if any) we had on the local TVs I never found out...

P.S. I got threatened with a charge for listening to 'Top of the Pops' on 41.5 MHz on an infantry set when on exercise near Aldershot in the late 70's. :)

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Fri Aug 07, 2015 8:28 am

By way of background information on the original Australian standard TV receiver IFs, I found this comment in Amalgamated Wireless Valve (AWV) “Radiotronics” of 1956 December. This is available at: http://frank.pocnet.net/other/AWV_Radiotronics/.

“The Australian Broadcasting Control Board has allotted a frequency spectrum for use by television receiver manufacturers for the i-f channel which will be kept as free from interfering signals as is possible. The recommendation is to use 36 Mc/s as the i-f vision carrier frequency and 30.5 Mc/s as the i-f sound carrier frequency.”

No further detail was provided, but then the article was primarily about TV IF amplifiers, and not about the IF choice itself. It does suggest though that, in part, the selection was made on the basis that the IF channel, 30.5 MHz sound and 36.0 MHz vision, was, and could be maintained, relatively free of other activities. Perhaps the European standard of 33.4 and 38.9 MHz was not so favourably placed.

Cheers,

Steve

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Sat Apr 02, 2016 10:46 pm

I have recently read an interesting article in Wireless World 1949 July, namely “Television Station Selection – A Look to the Future”, by W.T. Cocking.

A significant part of this article addressed intermediate frequency selection for superheterodyne receivers, including the question of whether the oscillator should be above or below the signal frequency. Particularly interesting was that the author recommended the use of a “high” IF, that is, one just under the lowest channel frequencies, even though he was considering only the case of five-channel Band I coverage.

Amongst the conclusions was that the television receiver of the future would be a “Superheterodyne with 34-Mc/s intermediate frequency and local-oscillator frequency above the signal-frequency.”

The complete article is available here: http://www.americanradiohistory.com/Arc ... 949-07.pdf, p.242ff.

It looks as if it were original work, not for example influenced by concurrent work in the US towards adopting a “high” TV IF. And it well predicted the future of British TV IF practice.

Cheers,

Steve

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Wed Apr 27, 2016 5:32 am

The BBC’s own TV receiving equipment shows a range of IFs, some standard, some non-standard, at least judging by the information available on the web.

Early UHF rebroadcast receivers, such as the RC5M/501 (c.1969) and RC5M/502 (c.1972) had non-standard IFs of 37.5 MHz vision and 31.5 MHz sound. Quite why is not readily deduced.

The later RC5M/503 (c.1985) rebroadcast receiver was also non-standard, but different at 40.75 MHz vision, 34.75 MHz sound.

Those numbers also applied to the RC1/511 grade 2 receiver (c.1982). There is a block schematic of this in Wireless World 1984 July (p.39), with the annotation “40.75 MHz (to suit synth)”.

So, 40.75 MHz was chosen to suit what was probably an early and relatively simple synthesizer design. Working through the numbers, that implies a local oscillator frequency of 512 MHz for channel E21, vision carrier 471.25 MHz. That LO frequency is a multiple of 8, and given the 8 MHz channel spacing, the multiple-of-8 for local oscillator frequency would have applied across the whole of Bands IV and V. Conceivably that was a desideratum for simple synthesizer design. And 40.75 MHz was the number nearest to the standard 39.5 MHz for which it was achieved.
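The multiple-of-8 arithmetic above can be checked with a short sketch (Python, purely illustrative; the E21-E68 channel range, 8 MHz spacing and 471.25 MHz channel E21 vision carrier are the standard UK UHF plan figures):

```python
# Check that a 40.75 MHz vision IF makes every UHF local-oscillator
# frequency an exact multiple of 8 MHz (oscillator-high working,
# 8 MHz channels, vision carrier 471.25 MHz on channel E21).
IF_VISION = 40.75  # MHz

def lo_frequency(channel):
    vision = 471.25 + 8 * (channel - 21)  # vision carrier, MHz
    return vision + IF_VISION             # oscillator above the signal

for ch in range(21, 69):  # channels E21 to E68
    assert lo_frequency(ch) % 8 == 0

print(lo_frequency(21))  # -> 512.0, as for channel E21 in the text

# The standard 39.5 MHz IF does not give this property:
print((471.25 + 39.5) % 8)  # -> 6.75, not on the 8 MHz grid
```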

The RC5M/503 was also synthesized, so the same considerations would have applied.

On the other hand, the UN1/604 (c.1969) and UN1/642 (c.1971) UHF cueing and general-purpose receivers had standard IFs, 39.5 MHz vision and 33.5 MHz sound.

The UN1/585 (c.1969) VHF 405-line cueing receiver had standard IFs, 34.65 MHz vision and 38.15 MHz sound. And the same was true of the experimental 405-line NTSC colour receiver described in BBC Research Department Report T060-7 of 1958.

But the TV/REC/3A (c.1959) 405-line rebroadcast receiver had a non-standard vision IF of 13.2 MHz, quite low even for a Band I-only unit. With oscillator low, the sound IF would have been 9.7 MHz.

The reason for that might be lurking in BBC Monograph #42, “Apparatus for Television and Sound Relay Stations”. In a brief description of the off-air TV receiver, it was said:

“This has been specially designed to have good reliability for unattended working and to minimize the amount of distortion which inevitably arises in the detection process.

“The i.f. stages are arranged to work at a lower frequency than that commonly employed so that delay distortion correction may be more easily introduced.”


Whilst the Monograph did not identify the specific receiver being referred to, it seems likely that it was the TV/REC/3A. Also described was a Band I TV translator of the double-conversion type, but its IF was not given.

Cheers,

Steve

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Sat Jun 11, 2016 9:52 am

Something I have pondered is the earlier Russian TV IF of 34.25 MHz vision, 27.75 MHz sound. This falls between the “low” and the “high” groups, the former usually towards the lower end of the 20 to 30 MHz range, and the latter usually nudging the lower end of Band I/Low Band. So the obvious question is why was it done that unusual way?

Russia had TV channels in Band II as well as in Bands I and III, and in the earlier days, it would appear that only the Band I and Band II channels were used. At least that is indicated in a Wireless World 1956 August article about Russian broadcasting. Even so, it seems unlikely that any standard IF established in the 1950s would not have taken account of future Band III activities.

It is not difficult to imagine that the standard IF was chosen so that its 2nd harmonic fell between the highest Band I and lowest Band II channels, and that its 3rd harmonic fell above the highest Band II channel. Thus the 2nd harmonic would need to be between 66 and 76 MHz, and the 3rd harmonic above 100 MHz. The first condition required an IF in the range 33 to 38 MHz, the second condition an IF above 33.33 MHz. As this range allowed the possibility that the 5th harmonic could fall below the lower edge of Band III, namely 174 MHz, that could have been an added condition, which would then have limited the IF range to 33.33 to 34.8 MHz. Then perhaps within that range 34.25 MHz was the “best fit”.
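The three conditions can be expressed as a small filter (a sketch of the post-facto reasoning above, not a documented procedure; the 66/76/100/174 MHz band edges are as in the text):

```python
# Filter candidate vision IFs against the harmonic conditions above:
#   2nd harmonic between the top of Band I (66 MHz) and bottom of Band II (76 MHz),
#   3rd harmonic above the top of Band II (100 MHz),
#   5th harmonic below the bottom of Band III (174 MHz).
def acceptable(f_if):  # f_if = vision IF in MHz
    return 66 <= 2 * f_if <= 76 and 3 * f_if > 100 and 5 * f_if < 174

# Scan 30 to 40 MHz in 0.05 MHz steps
candidates = [f / 100 for f in range(3000, 4000, 5) if acceptable(f / 100)]
print(candidates[0], candidates[-1])  # -> 33.35 34.75, matching the 33.33-34.8 window

assert acceptable(34.25)      # the chosen Russian IF passes
assert not acceptable(38.9)   # the CCIR number fails the 2nd-harmonic test
```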

That gave a channel 1 local oscillator frequency of 83.5 MHz, maybe far enough clear of channel 3 sound (83.75 MHz) not to be a problem. But the channel 2 oscillator frequency was 93.5 MHz, a potential problem for channel 5 vision at 93.25 MHz. Perhaps channels 2 and 5 were not used in the same area. The oscillator frequencies for the three lower Band III channels, 6, 7 and 8, respectively fell into the three highest Band III channels, 10, 11 and 12, but that must have been viewed as being manageable, perhaps as an (n+4) taboo.

The 1st and 2nd conditions mentioned above were also met by the later 38.0 MHz vision IF, which is perhaps why it was chosen rather than say the European 38.9 MHz number. In this case the 5th harmonic fell at 190 MHz, which happened to be the boundary between channels 7 and 8, so it was harmless. The Band III taboo remained at (n+4), but the interference beat moved from mid-video band (2.25 MHz) right to the edge of the video band (6 MHz), where presumably it was less of a nuisance.

Of course that is all post facto rationalization, and may or may not be correct. But at least it is possible to impute a rational explanation for the 34.25 (and 38.0) MHz numbers.

Cheers,

Steve

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Sun Jun 12, 2016 12:20 am

I also had another look at the Australian original standard TV IF (36.0 MHz vision, 31.5 MHz sound) on a similar basis, given that Australia also used some Band II channels in the earlier years.

The CCIR standard IF of 38.9 MHz would not have created any harmonic problems. But it would have resulted in some local oscillator problems. The channel 1 (56-63 MHz) and 2 (63-70 MHz) oscillator frequencies respectively would have fallen within channels 4 (94-101 MHz) and 5 (101-108 MHz). The 36.0 MHz IF avoided this, with channel 1 oscillator falling in the gap between channels 3 and 4, and channel 2 oscillator at 100.25 MHz, 0.5 MHz below channel 4 sound, where it probably did not do too much harm. The IF 5th harmonic was at 180 MHz, and so at the upper end of the channel 6 (174-181 MHz) vision sideband, with a beat frequency of 4.75 MHz.

One consequence though was that the channel 5 oscillator was at 138.25 MHz, exactly the vision carrier frequency of channel 5A (137-144 MHz). As far as I know, channel 5A, like channel 0 (46-52 MHz), was a later addition, so would not have been included in the initial calculations that arrived at 36.0 MHz. I imagine that channels 5 and 5A would thus not have been assigned in the same area.
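The oscillator placements above can be recomputed with a short sketch (the channel lower edges are the original Australian allocations quoted in the text; the vision carrier is taken as 1.25 MHz above each channel's lower edge):

```python
# Oscillator-high LO frequencies for the original Australian channels,
# comparing the 36.0 MHz IF with the CCIR 38.9 MHz alternative.
channel_lower_edge = {1: 56, 2: 63, 3: 85, 4: 94, 5: 101, '5A': 137}  # MHz

def lo(ch, vision_if):
    vision = channel_lower_edge[ch] + 1.25  # vision carrier, MHz
    return vision + vision_if

print(round(lo(1, 38.9), 2))  # -> 96.15, inside channel 4 (94-101 MHz)
print(lo(1, 36.0))            # -> 93.25, in the gap between channels 3 and 4
print(lo(2, 36.0))            # -> 100.25, 0.5 MHz below channel 4 sound
print(lo(5, 36.0))            # -> 138.25, exactly the channel 5A vision carrier
```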

As with the Russian case, it is an uncorroborated, but a plausible explanation for the chosen IF and why it differed from the CCIR standard.

The later move to 36.875 MHz – for reasons not yet uncovered – would not appear to have caused too much upset although the IF 5th harmonic was then less favourably placed, with a 2.125 MHz channel 7 (181-188 MHz) beat. But that happened in the solid-state era when better receiver interstage screening was possible, so presumably was not a major issue.

Cheers,

Steve

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Sun Jun 12, 2016 10:17 pm

I recently happened upon another piece of information about the development of American TV IFs.

This was in an article in ‘Electronics’ for 1950 November entitled “Production Experience with a 40-mc I-F Receiver”. The background note about the author, D.W. Pugsley, a GE staffer, stated:

“In 1945 the author, as a member of the RMA Television Receiver Committee, was appointed Chairman of the Subcommittee of Standards of Good Engineering Practice. This Subcommittee adopted an i-f of 21.25 to 21.9 mc for the sound i-f and 26.75 to 27.4 mc for the picture i-f.

“Practical experience with the intermediate frequencies has been relatively good except for the problems engendered by oscillator radiation. In addition there have been several minor difficulties such as double conversion effects, direct i-f interference from amateurs, industrial equipment, international short-wave stations, and f-m station image interference.

“The problem of oscillator radiation became so severe that in April 1948 the Television Receiver Committee met and rescinded its recommended standard i-f.

“The author became chairman of a special task force which restudied the problem. Foremost among those involved in this study were John D. Reid and P. G. Holst. One year later, the Task Force recommended an i-f of 41.25 mc for sound and 45.75 mc for the picture. This was subsequently adopted as a standard by RMA.”


That confirms that the original American “low” IF encompassed a narrow range of numbers, and that these were of RMA origin. Effectively the IF “channel” was defined as being anywhere within the range between (21.0 to 27.0 MHz) and (21.65 to 27.65 MHz). It also shows that the “high” IF dated from 1949.

The article is available at: http://www.americanradiohistory.com/Arc ... 950-11.pdf, page 98ff.

Cheers,

Steve

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Mon Jun 13, 2016 7:52 am

Looking again at some unusual cases, TV-FM receivers have sometimes had apparently non-standard IFs on the sound side. For the purposes of this thread, we can look at the cases where the TV sound IF differed from standard. Those where the FM IF differed from standard but the TV sound IF did not are covered in the corresponding radio receiver IF thread.

In the UK, Murphy in 1958 adopted double-conversion on the sound side for its TV-FM receivers. Both TV sound and FM, emerging from the tuner at the System A standard TV sound IF of 38.15 MHz, were downconverted to 6.31 MHz using oscillator-low conversion. At this lower frequency it was easier to provide adequate FM IF selectivity than at 38.15 MHz. This approach was described briefly in Wireless World 1958 October, page 477. Apparently it was not possible to use 10.7 MHz as the 2nd IF because of patterning problems. Assuming that IF harmonics up to the 5th, amongst others, were problematical, the 5th harmonic of 10.7 MHz, namely 53.5 MHz, would have interfered with channel B2, and the 4th might have interfered with B1. On the other hand, with a 6.31 MHz IF, these harmonics fell below Band I.
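That harmonic comparison can be sketched numerically (the approximate 41-68 MHz Band I limits and the up-to-the-5th-harmonic assumption are mine):

```python
# Compare harmonic fallout of a 10.7 MHz versus a 6.31 MHz second sound IF
# against Band I, as discussed above.
BAND_I = (41.0, 68.0)  # MHz, approximate UK Band I limits (assumption)

def harmonics_in_band(f_if, band=BAND_I, up_to=5):
    low, high = band
    return [round(n * f_if, 2) for n in range(2, up_to + 1) if low <= n * f_if <= high]

print(harmonics_in_band(10.7))   # -> [42.8, 53.5]: 4th in channel B1, 5th in B2
print(harmonics_in_band(6.31))   # -> []: all harmonics fall below Band I
```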

Nevertheless, the 10.7 MHz number was sometimes used for TV sound, as for example in the Jason JTV, JTV2 and Monitor/Mercury II FM-TV hi-fi sound tuners. I have read somewhere that when these were used on a common aerial system with a TV receiver, separated only by a resistive splitter, patterning was possible. I suspect that could well have been the interference mechanism that Murphy avoided with its 6.31 MHz 2nd IF. Whether a hybrid splitter would have reduced the interference I don’t know. The Jason JTV2 and Monitor/Mercury II, with but a simple grounded-grid RF amplifier, I imagine would have had greater IF and IF harmonic leakage back through the aerial connection than the original JTV, which I think had a cascode RF amplifier.

But knowingly or otherwise, Jason was following an American precedent, in that some FM-TV sound hi-fi tuners of the 1950s used the 10.7 MHz IF for both FM and TV sound. An example was the Knight SX702, described in Radio-Electronics for 1956 May, page 79ff. For TV sound, this used a Standard Coil TV front end modified to provide a 10.7 MHz output.

Much later, circa 1977, the Pioneer TVX-9500 TV sound tuner, for the American market, was of the double-conversion type and employed a 10.7 MHz 2nd IF. The front end included conventional TV VHF and UHF tuners providing a standard sound IF output of 41.25 MHz, which was then downconverted to 10.7 MHz for feeding into conventional, IC-based FM IF and demodulation circuitry. AFC was provided for the 2nd conversion oscillator, which was on the high side, at 51.95 MHz. I think that it is reasonable to assume that with solid-state circuitry, it was much easier to provide interstage screening than it was with valve circuitry, where heat dissipation was a bigger issue. So 10.7 MHz and harmonics thereof at the aerial connection were probably well suppressed.

There were at least two British TV sound tuners dating from circa 1971, both solid state. The Motion Electronics model was available with VHF and/or UHF tuners, covered AM and FM TV sound, and was said to have an IF of around 35 MHz. The Lowther model was UHF and FM only, with unknown IF, but I’d imagine fairly high.

The “Component TV” era arrived circa 1981. Some of the early TV tuners for the American market had split sound, for better quality than could be obtained with the regular intercarrier system, with its inherent faults. Included in this group were models from Luxman and Sony. Both used a 2nd conversion, from 41.25 MHz to 10.7 MHz, for the sound channel. In these units, stopping 10.7 MHz and its harmonics from getting into the vision IF channel, as well as keeping them away from the aerial terminals, would have been required, but it was evidently quite doable. I don’t know for sure, but I suspect that this kind of split sound approach was developed for the Japanese domestic market when FM-FM TV stereo and dual-channel sound arrived in 1978, and it was found that the intercarrier system was seriously wanting. But I think that the Japanese OEMs did later move to the European-style quasi-split sound for their TV tuners. Still, one may say that 10.7 MHz had non-trivial use as either a TV sound IF or 2nd IF.

Looking backwards again, independent double conversion for TV sound predated the 1958 Murphy case. Excluded here are the early converter-type UHF tuners, which made the whole receiver, vision and sound, double conversion. According to an article in WW 1956 November, page 539ff, second conversion was then the preferred approach for Belgian four-standard TV receivers. The sound 1st IF, 33.4 MHz for Systems B, C and F, and 27.75 MHz for System E, was downconverted to either 7.0 or 11.8 MHz. Apparently either of these minimized interference problems, but it was desirable to use a buffer amplifier between the sound take-off point in the vision IF channel and the sound 2nd frequency changer. That buffer stage would tend to inhibit backward transmission of oscillator and IF signals into the vision IF channel.

So of these four sound IF frequencies:

6.31 MHz appears to have been used only by Murphy. I don’t think that any other UK makers followed its sound channel double-conversion approach, and had they done so, it is open to speculation as to whether they would also have used 6.31 MHz, and beyond that, whether BREMA would have standardized it.

7.0 and 11.8 MHz had at least quasi-standard status in Belgium.

10.7 MHz was of course the de facto worldwide standard for Band II FM receivers, and an actual standard in many countries. And it has been widely used for non-FM applications, as noted in the radio receiver IF thread.


Cheers,

Steve

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Fri Jun 17, 2016 3:21 am

Returning to the American case, I have very recently found what is probably the definitive article on the move to the “high” IF. It is “Television Receiver Intermediate Frequencies”, by Paul F.G. Holst, in the journal ‘Electronics’ for 1948 August. (See: http://www.americanradiohistory.com/Arc ... 948-08.pdf, p.90ff.)

It was a seven-page article that delineated the problems that arose with the original RMA “low” IF, detailed the various and sometimes conflicting requirements and then explored the possibilities, with final detailed treatment and comparison of two options, namely 32.8 MHz sound, 37.3 MHz vision and 41.2 MHz sound, 45.7 MHz vision. The latter was considered the best choice given that TV channel 1, 44 to 50 MHz, had just been vacated. I have excerpted and attached the comparison chart.

Evidently the 41.2/45.7 MHz option was selected by the RMA for standardization, and then slightly adjusted to 41.25/45.75 MHz. Perhaps this was to provide integer edges to the notional IF channel, making it 41.00 to 47.00 MHz.

Cheers,

Steve
Attachments
from Electronics 194808 p.95.gif

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Sat Jun 18, 2016 12:02 am

That American study has prompted another look at the Italian case. And in turn that led to another set of post facto rationalizations.

The Italian TV IF “channel”, 40 to 47 MHz, was decreed in 1952 April, which was ahead of the ITU Stockholm VHF planning meeting, during which the Italian TV channels were defined as being a bit different to the European CCIR standard channels.

I’d say that this channel was probably derived from American practice, with the sound carrier moved downwards from 41.25 to 40.25 MHz to accommodate the CCIR 625-line 5.5 MHz sound spacing.

The American influence was already there. In 1949, both American and French TV transmitters were installed at Torino. The American transmitter adhered to American standards, except that it worked at 625/50 rather than 525/60. It retained the American 6 MHz channel and 4.5 MHz intercarrier, and transmitted in channel A6, 82 to 88 MHz. It may well have been the first transmitter to operate according to what later became the System N parameters. (This from WW 1950 February, p.45).

It seems likely that the existence of that Torino transmitter was why Italy was allowed channel C, 81 to 88 MHz, in the 1952 Stockholm plan. A WW 1959 March listing shows RAI Torino as the only major transmitter operating on channel C.

And just maybe the different arrangement of the Italian Band I and Band III channels was done to accommodate its decreed IF channel.

With an IF channel of 40 to 47 MHz and a vision IF of 45.75 MHz, use of channel E2, 47 to 54 MHz, was probably impracticable. Channel B was aligned with channel E4, 61 to 68 MHz. That left room for just one channel below B, which was channel A at 52.5 to 59.5 MHz, 1.5 MHz below channel E3. One might have expected alignment with channel E3, since that would have put the vision carrier at 55.25 MHz, the same as for channel A2, which was known from American practice to have been workable with a 45.75 MHz vision IF without creating insoluble receiver mixer regeneration problems. Anyway, the channel A vision carrier was 8.5 MHz below that of channel B.

In Band III, the “edge” channels D and H were aligned with channels E5 (174 to 181 MHz) and E10 (209 to 216 MHz) respectively. But three channels only, E, F and G, were fitted between them, rather than the four of the European plan.

A possible explanation is IF harmonic interference. The vision IF (45.75 MHz) fourth harmonic was 183.0 MHz. This would produce a 750 kHz beat with the channel E6/channel D vision carrier (182.25 MHz), which was probably an undesirable situation. That required that channel D be moved upwards, with 1.5 MHz being the apparent minimum change, and the number that was actually used. That put the interference 750 kHz below the vision carrier, at the lower edge of the vestigial sideband, where receiver response was significantly attenuated.

Whatever actual upward movement was chosen, a consequence was that only three channels could be fitted between E5/D and E10/H. Equispacing of these would have provided an 8.75 MHz spacing of the vision carriers. But in practice 8.5 MHz spacing was chosen for channels D and E, and E and F, with 9 MHz between F and G, and between G and H.

As to the reason for that asymmetry, possibly it was to avoid 5th harmonic sound IF (201.25 MHz) interference. Had channel G been 8.75 MHz below channel H, then its vision carrier would have been at 201.5 MHz, which would have allowed the harmonic to cause a 250 kHz beat. But placing it 9 MHz below put the vision carrier at 201.25 MHz, which reduced the beat to zero, at least nominally. One imagines that there would have been some slope demodulation (by the receiver Nyquist slope) of the interfering FM harmonic, but perhaps that was seen as the lesser of evils in a situation where there was not that much freedom to adjust.
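The two beats can be recomputed directly (a post-facto check of the arithmetic above, using the Italian IF pair of 45.75 MHz vision and 40.25 MHz sound):

```python
# Harmonic beats for the Italian IF pair discussed above.
VISION_IF, SOUND_IF = 45.75, 40.25  # MHz

# 4th harmonic of the vision IF against a channel-E6-aligned vision carrier:
print(4 * VISION_IF - 182.25)  # -> 0.75 MHz beat, hence channel D moved up 1.5 MHz

# 5th harmonic of the sound IF against the two candidate channel G carriers:
print(abs(5 * SOUND_IF - 201.5))   # -> 0.25 MHz beat with an 8.75 MHz-spaced carrier
print(abs(5 * SOUND_IF - 201.25))  # -> 0.0, zero beat with the carrier at 201.25 MHz
```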

With 8.5 and 9 MHz channel spacings, I should think that receiver vision IF bandpass adjacent channel traps were designed primarily for the more severe 8.5 MHz case, which was then deemed to provide adequate rejection for the 9 MHz case. That might be why channel A was separated 8.5 MHz from channel C, rather than being adjacent to it, although perhaps as cause rather than consequence. Once the 9 MHz separation between channels G and H had been seen as necessary, thus destroying the 8.75 MHz symmetry, there was probably some juggling of the other separations, and one of the limiting parameters could have been just how low could channel A be taken without risking receiver mixer regeneration problems. Perhaps that determined the 8.5 MHz number, which was then used for recalculating the Band III separations.

Cheers,

Steve

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Thu Jul 07, 2016 4:21 am

Axiomatic here are the interrelationships between TV receiver intermediate frequencies, the range of TV channel frequencies to be received, and the geographical allocations of those channels.

In general, one might expect that channel frequencies would have been assigned first, based upon the ITU Atlantic City 1947 band allocations, then the geographical disposition of those channels, and finally the optimum IFs. But it has not always turned out that way, as shown in the following commentary.

In the US case, the FCC assigned the VHF TV channels in 1945. This was ahead of Atlantic City 1947, at which the US 1945 VHF broadcasting bands (minus TV channel 1) were then adopted for all of Region II. Very quickly the RMA developed its initial “low” IF recommendation in line with those channel assignments. This “low” IF turned out to be problematical, so the RMA withdrew it in 1948 and then developed a new “high” IF, released in 1949. Even though a second iteration was needed in respect of the IF choice, both iterations followed rather than preceded the channel assignments. The “high” IF did not impose any undue restrictions on VHF channel geographical allocations beyond what was anyway reasonable practice.

But the FCC then used the new “high” IF as the basis for its UHF channel geographical allocations. With the UHF channels, in-band images and other interferences were, short of upconversion, unavoidable, so allocations had to be made with these in mind. The FCC had no power to fix a standard receiver IF, but by basing its UHF channel allocations on the RMA standard, it did provide a strong endorsement. So in this case, the channel allocations followed determination of the IF.

In Europe, definition of the VHF TV channels, both within and in some cases without the Atlantic City 1947 band limits, was undertaken at the ITU Stockholm 1952 meeting. Initial geographical channel assignments were also made at that meeting. But at the time, no standard TV receiver IFs had been developed except in Italy.

As noted in the previous posting, it certainly looks as if the prior Italian choice of standard IF, modelled on American practice, determined its need for non-standard channelling in both Bands I and III, which meant that it had one fewer Band III channel than did the other countries in Europe that used System B.

The CCIR standard System B IF of 38.9 MHz was evidently developed circa 1954 to fit the channels assigned at Stockholm 1952.

In the UK case, it was the imminent use of Band III and the consequent advent of multichannel tuning that led to the development of the BREMA standard IF. This was based upon the System A channel frequencies already defined.

Neither the CCIR nor the BREMA IF imposed any undue restrictions on VHF channel geographical allocations beyond what was anyway reasonable practice.

The French case was somewhat different, though. The tête-bêche channelling for System E was defined at Stockholm 1952. And this included a small number of Band I assignments, although originally it had been planned to use only Band III for the 819-line service.

The development of a standard IF, at least one that was practicable, inevitably created restrictions on the use of those Band I channels. The tête-bêche system required oscillator-high for some channels and oscillator-low for others, depending upon whether the chosen IF channel had the vision carrier at the bottom end or the top end. But oscillator-low – with an IF channel just below Band I – was not workable for the Band I channels, as for the most part, oscillator frequencies would have been within the IF channel. So depending upon which way around the IF channel was oriented, either channels F2 and F4, or channel F3 would have been unusable. As it turned out, the standard IF channel was chosen with the vision carrier at the bottom end, meaning that channel F3 was unusable. Thus it seems at least possible that the IF channel orientation was chosen to minimize the disruption to Band I use. Apparently channel F3 was originally assigned to a planned Tours transmitter.

Whether the deliberations over the CCIR standard IF included the multistandard receiver case is unknown. It seems possible, as multistandard receivers were the norm in Belgium from the start. But whether considered or not, the 38.9 MHz number was evidently satisfactory for the Belgian four-system receivers. This put the System E sound at 27.75 MHz, meaning that the IF channel was reversed as compared with the French case. In turn that meant that Belgian receivers would have been unable to be configured to receive channels F2 and F4. But that did not matter, as the reception requirements were for channel F8A and perhaps some others of the Band III channels, but not the Band I F-channels. As recorded previously, Belgian practice then worked out suitable frequencies for the sound second IF.

How the introduction of the French standard IF affected French multistandard receivers, developed quite early on for the border areas such as Strasbourg, is unknown. One assumes that these would have been required to cover the full French VHF channel set, so would have had a System E IF channel with vision carrier-low. On the other hand, vision-carrier high would have been required for channels E2 through E4, to allow oscillator-high operation. Actually though, a tête-bêche IF channel whose basic bandpass shape was determined by a pair of Nyquist slopes centred respectively on 28.05 and 38.9 MHz might have worked. For the System E case, a 39.2 MHz sound trap would have been switched in, and for the Systems B/C/F case, a 33.4 MHz sound trap and a high pass filter at around 33.9 MHz would have been switched in. That way the standard IFs for each system could have been used.

The European UHF channels were defined and initially allocated at the ITU Stockholm 1961 (ST61) meeting. The uniform use of 8 MHz channels helped. As with the US case, it appears that channel allocations were based upon existing standard IFs from VHF TV receiver practice. At least, that is the impression that one obtains from this excerpt from the ST61 Technical Annex. Thus were developed transmitter co-siting patterns such as n, n+3, n+6, n+10, avoiding the “taboo” combinations.

ITU Stockholm 1961 Technical Annex p.57.gif


Unfortunately the IFs used in development of the ITU table were not stated, but one may make reasonable deductions. For the Systems G and H case, almost certainly it would have been the CCIR standard 38.9 MHz. That correlates with the image at channel n+9, and oscillator at n+5. Then the Italian System H case looks to have been calculated at both the CCIR standard and the Italian standard of 45.75 MHz. For the latter, the image was at n+11 and the oscillator at n+6.

The System K (OIRT) case was evidently calculated at both the then-Russian standard IF of 34.25 MHz and also at the CCIR standard IF or something proximate to it. The Russian standard IF put the image at n+8 and oscillator at n+4. Perhaps the Russians were by then pondering their later upward move to 38 MHz, in order to align with the general European pattern.

System I was more likely worked out on the basis of a 38.9 MHz IF, as I don’t think that the BREMA 39.5 MHz number had yet been debated or promulgated.

For the French System L case, the standard IF of 32.7 MHz might have been decided before ST61. For this, the oscillator would have been at n-4, and the image at n-9.

Whereas most of the European countries could, for UHF reception, continue using their existing TV receiver IFs established in the VHF era, both the UK and France were faced with new 625-line UHF systems and the need for dual-standard receivers, which would continue to use the established IFs for VHF reception, but needed “new” IFs for UHF.

In the French case, for simplicity at the receiver end, the System L IF channel was chosen to have the same sound IF as for System E, namely 39.2 MHz. A common sound IF was probably advantageous with AM sound on both systems. Since the IF channel should not encroach on Band I, this in turn meant that the vision IF was at the lower end of the channel, at 32.7 MHz. That required oscillator-low, not a problem at UHF, and not for Band III. But it had implications when System L was extended to Band I, where oscillator-low working was infeasible. The solution was to use inverted channels, with the vision carrier at the high end, for the Band I channels, which then allowed oscillator-high working. Hence System L’.

In the UK case, a common sound IF would have been of less value given that most receivers would have been expected to use the intercarrier technique for System I. Also, System I was to be used in VHF Bands I and III in Ireland near-term, and possibly in the UK in the more distant future. Allowing for Band I use meant oscillator-high working, and so an IF channel with the vision carrier at the high end. Effectively then UK dual-standard receivers would have a tête-bêche IF channel. As noted earlier in this thread, this could have been done with respectively 34.65 and 38.9 MHz IFs for 405 and 625 lines, but the 625-line number was moved up to 39.5 MHz to allow wider 625-line vision bandwidth whilst still using a basic double-Nyquist IF bandpass. This in turn created a requirement for much better receiver image rejection performance, given that the previously defined requirement for co-sited n+10 channels remained.

In the Australian case, the standard IF channel, with 36.0 MHz vision IF, was chosen more-or-less at the same time as, and to be compatible with, the (unusual) VHF channel allocations, something that the CCIR standard number would not have done. Elsewhere outside of Europe with System B, the CCIR standard IF appeared to be satisfactory.


Cheers,

Steve

 

Re: Television Receiver Intermediate Frequencies

Post by Synchrodyne » Fri Jul 08, 2016 11:30 pm

As previously noted, Japan stayed with a “low” IF, 26.75 MHz vision, until circa 1970, after which it changed to a “high” IF, 58.75 MHz vision, that was noticeably higher than any others.

On the face of it, that might be explained by inertia; that is, in 1953, when Japan started TV broadcasting, it simply adopted a variation of the earlier American “low” IF, at a time when the “high” IF was establishing itself in American practice, and for the most part, was yet to arrive in Europe.

But there might have been more to it. Japan adopted a channelling plan that was different to that used in the Americas. Channels J1 through J3 were in Band II, 90 through 108 MHz. Channels J4 through J12 were in Band III, 170 through 222 MHz, with a slight overlap between channels J7 (188 - 194 MHz) and J8 (192 - 198 MHz).

That could have made the use of the American 45.75 MHz IF problematical. Its 2nd harmonic was 91.5 MHz, very close to the channel J1 vision carrier at 91.25 MHz, and its 4th harmonic was at 183 MHz, similarly close to the channel J6 vision carrier at 183.25 MHz. Then the channel J4 oscillator frequency was 217 MHz, close to the J12 vision carrier at 217.25 MHz, although whether J12 was part of the original allocation or whether it was a later addition is unknown.

On the other hand, it does look as if 26.75 MHz was chosen to fit with the channelling plan. Its 4th harmonic was at 107 MHz, within channel J3, but in a position where it probably would not do too much harm. Its 5th harmonic, the highest usually considered, was well below Band III. It also put the channel J4 oscillator at 198 MHz, right on the boundary between channels J8 and J9, so out of harm’s way. Similarly the J5, J6 and J7 oscillators fell on channel boundaries, although the J8 oscillator was at 220 MHz, enough inside channel J12 to be a potential problem, I imagine. If though J12 was a later addition, part of the Band III “creep” beyond its original 216 MHz upper limit, then it would not have been part of the original deliberations.

As well as 26.75 MHz, both 32.75 and 38.75 MHz would have resulted in channel boundary-positioned oscillator frequencies for the lower Band III channels, and both would have avoided the J12 intrusion. But 32.75 MHz had a 3rd harmonic at 98.25 MHz, rather close to the J2 vision carrier at 97.25 MHz. And 38.75 MHz had a 5th harmonic at 193.75 MHz, very close to the J8 vision carrier at 193.25 MHz. So neither would have been a good choice.
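The clash-hunting in the last few paragraphs can be sketched in a few lines. The vision carriers not quoted in the post (J3, J5, J9 through J11) are reconstructed on the assumption of 6 MHz channels with the vision carrier 1.25 MHz above the lower channel edge, so treat them as assumptions:

```python
# Back-of-envelope re-check of the harmonic clashes discussed above.
# Carriers beyond those quoted in the post are reconstructed (assumed):
# 6 MHz channels, vision carrier 1.25 MHz above the lower edge.

VISION = {  # Japanese VHF vision carriers, MHz
    "J1": 91.25, "J2": 97.25, "J3": 103.25,
    "J4": 171.25, "J5": 177.25, "J6": 183.25,
    "J7": 189.25, "J8": 193.25, "J9": 199.25,
    "J10": 205.25, "J11": 211.25, "J12": 217.25,
}

def harmonic_clashes(vif, max_harmonic=5, margin=1.0):
    """IF harmonics falling within `margin` MHz of a vision carrier."""
    hits = []
    for n in range(2, max_harmonic + 1):
        for ch, fc in VISION.items():
            if abs(n * vif - fc) <= margin:
                hits.append((n, ch))
    return hits

for vif in (26.75, 32.75, 38.75, 45.75):
    print(vif, harmonic_clashes(vif))

# Oscillator-high placement: f_osc = f_vision + vision IF.
# With 26.75 MHz, J4's oscillator sits at 198.0 MHz (a channel boundary),
# but J8's sits at 220.0 MHz, inside channel J12.
print({ch: VISION[ch] + 26.75 for ch in ("J4", "J8")})
```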

So by that back-of-the-envelope analysis, it looks as if the original Japanese IF choice was made as the “best fit” at the time, and it just happened to be a “low” IF. Going above 45.75 MHz was probably not a wise move at the time, as it would have complicated IF strip design with the domestic type valves that were likely to be available.

Quite why the change to 58.75 MHz was made is not clear, but avoidance of the channel J12 conflict and better allocation of the UHF channels were likely to have been factors. By the time 58.75 MHz arrived, the solid-state era was well established, and the requisite IF gains were easily obtained, particularly with integrated circuits.

With 58.75 MHz, the 2nd harmonic was above Band II. The 3rd harmonic was at 176.25 MHz, just above the boundary between channels J4 and J5, and unlikely to be problematical. Band III oscillator frequencies were all above the band. And the UHF “taboo” channels were well-removed from wanted channels, and so well down the RF selectivity curve.
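The same kind of quick cross-check bears out the 58.75 MHz figures, using the band edges given earlier in the post (Band II top 108 MHz, Band III top 222 MHz):

```python
# Cross-check of the 58.75 MHz vision IF figures discussed above.

VIF = 58.75          # MHz, later Japanese vision IF
BAND2_TOP = 108.0    # MHz, top of Japanese Band II
BAND3_TOP = 222.0    # MHz, top of Japanese Band III

h2 = 2 * VIF         # 117.5 MHz: 2nd harmonic clears Band II
h3 = 3 * VIF         # 176.25 MHz: just above the 176 MHz J4/J5 boundary
lowest_osc = 171.25 + VIF  # oscillator-high on J4: above Band III

print(h2, h3, lowest_osc)
```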

Cheers,

Steve
