This article was nominated for deletion on 11 August 2021. The result of the discussion was keep.
This is the talk page for discussing improvements to the 1080i article. This is not a forum for general discussion of the article's subject.
This article is rated Start-class on Wikipedia's content assessment scale.
Would someone care to add WHY the 0.03 fps difference exists: how it came about, and what the reason is for its existence? Thanks! —Preceding unsigned comment added by 71.206.65.120 ( talk) 04:03, 17 March 2009 (UTC)
I believe the reason is to avoid a beat with 60 Hz electricity. 29.97 certainly existed before NTSC color. Adamgoldberg ( talk) 12:49, 15 September 2021 (UTC)
I am removing the line about 1080p offering no advantage and being generally unsupported. In addition to 1080p now becoming more widely supported by the latest generation of LCD sets, the statement that it offers no advantage is unsupported by references and certainly arguably untrue (refer to the article on interlacing). For displays such as DLP, plasma and LCD, interlacing must be removed, and the process causes visible artifacts.
Robbins 06:20, 28 November 2006 (UTC)
Is there any reason why anyone would prefer 1080i over 1080p except if you have a CRT-based HDTV set? In uncompressed form the data rate, i.e. pixels per second, is the same, and it is easier and more efficient to compress video in progressive format than in interlaced format.
The main article states: "Because of interlacing 1080i has half the vertical resolution of 1080p." This is not true. 1080i and 1080p have exactly the same spatial resolution but 1080i has less temporal resolution. RastaKins 05:06, 21 March 2007 (UTC)
As far as I'm aware, the main reason 1080i still exists is broadcast engineers' reluctance to change. There is no technical reason interlacing is needed anymore (TVs are quite capable of displaying progressive scan now), but it's still around (much the same as the obscure 29.97 frames/second framerate still existing) - the benefits are questionable (claiming interlacing "doubles the frame rate" is silly, given that there's basically no perceivable difference between 25 and 50 fps - if there were, cinema wouldn't still be 24 fps). 81.152.116.183 ( talk) 02:37, 18 July 2008 (UTC)
It seems few are capable of understanding temporal resolution, which operates in the human visual cortex, and that interlaced video, if presented accurately, provides more total resolution than progressive for the same frame rate (two fields per interlaced frame). The fact that monitor and TV manufacturers wish to abandon it makes no difference, and in fact becomes a self-fulfilling prophecy: since the devices alter the signal, the picture is now slightly degraded in addition to the loss of temporal resolution. — Preceding unsigned comment added by Mydogtrouble ( talk • contribs) 13:58, 13 March 2013 (UTC)
Maybe I'm completely wrong, but I'm sure I've read that 1080i is (often if not always?) 1440*1080 before being stretched to 16:9 (like DVD), at least when being broadcast on terrestrial television. Nova Prime 11:51, 17 January 2007 (UTC)
The terminology used within this article is deprecated. The preferred terminology as described by the ITU and SMPTE has the following format: xxxxy/zz where xxxx is the number of active lines per picture (usually 1080 or 720 when discussing high definition), y is the scanning mode (indicated by a letter i for interlaced scanning or a letter p for progressive scanning), next comes a slash character, and zz is the refresh rate of the picture. Thus standard definition television as used in Europe would be described as 576i/25. 82.127.93.74 17:40, 19 January 2007 (UTC)
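As a purely illustrative aside, the notation described above is mechanical enough to generate programmatically; here is a minimal sketch (the function name and its behaviour are my own, not from the ITU or SMPTE documents):

```python
def itu_label(active_lines: int, scanning: str, refresh_rate: float) -> str:
    """Format a video mode in the ITU/SMPTE style: <active lines><i|p>/<picture rate>."""
    if scanning not in ("i", "p"):
        raise ValueError("scanning mode must be 'i' (interlaced) or 'p' (progressive)")
    # Print whole-number rates without a trailing .0 (25 rather than 25.0)
    rate = int(refresh_rate) if float(refresh_rate).is_integer() else refresh_rate
    return f"{active_lines}{scanning}/{rate}"

print(itu_label(576, "i", 25))   # 576i/25  (European SDTV, as in the example above)
print(itu_label(1080, "i", 25))  # 1080i/25
print(itu_label(720, "p", 50))   # 720p/50
```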
The second paragraph says:
Others, including the European Broadcasting Union (EBU), prefer to use the frame rate instead of the field rate and separate it with a solidus from the resolution as in 1080i/30 and 1080i/25, likewise 480i/30 and 576i/25.
The word "solidus" links to the page for the [ mark "slash,"] which includes a very stuffy note that "slash" and "solidus" are not the same. Jackrepenning 22:05, 22 January 2007 (UTC)
When writing a paper, you are not supposed to list references for things that someone reasonably skilled in the field would already know. For example, 1080i60 uses the same bandwidth as 1080p30. Most people who know something about video display know this is true. A simple calculation of the image size times the frame rate shows it: 60*1920*540 = 30*1920*1080. So does Wikipedia have a different standard for what should have a reference? Daniel.Cardenas 04:25, 4 April 2007 (UTC)
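The arithmetic can be spelled out; a quick sketch counting uncompressed luma samples per second only (chroma, blanking and audio would scale both formats equally):

```python
# 1080i60: 60 fields per second, each field carrying 540 of the 1080 lines
pixels_per_second_1080i60 = 60 * 1920 * 540

# 1080p30: 30 full 1920x1080 frames per second
pixels_per_second_1080p30 = 30 * 1920 * 1080

print(pixels_per_second_1080i60)  # 62208000
print(pixels_per_second_1080p30)  # 62208000
assert pixels_per_second_1080i60 == pixels_per_second_1080p30
```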
I'm baffled. I thought 30 fps was the standard for US television, and that the difference between interlaced and progressive was whether the full frame was completed in one pass or in two passes. I'm no techie, so perhaps this could be explained in layman's terms. The discussion above compounds my confusion rather than ending it. Second question, partly related to the first: Here's a quote from the article: "Due to interlacing 1080i has twice the frame-rate but half the resolution of a 1080p signal using the same bandwidth. This is especially useful in sport-shows and other shows with fast-moving action." I'm not sure what "this," the first word in the second sentence, refers to. It seems to me to refer to the subject of the first sentence, 1080i, which means that 1080i is better for shows with fast-moving subjects. I'm almost sure that's wrong, because of the movement of displayed objects in the picture between the first and second scans of the same frame. But even if it's not wrong, perhaps the pronoun "This" that begins the second sentence should be changed to a noun (The 1080i standard or the 1080p standard) to clear up any confusion.
The first paragraph speaks of frame rates of 25 and 30 Hz for 1080i; yet the comparison table says 1080i is 50 or 60 Hz and 1080p is 24, 25 or 30 Hz. This appears to be a contradiction. —Preceding unsigned comment added by KX36 ( talk • contribs) 06:15, 16 August 2007
This is unclear in the article.
"1080p60", has no ambiguity, (60 full frames per second), but "1080i60" might be interpreted as 60 two-field-frames per second, or it might be interpreted as 60 fields per second.
Can we have a section establishing the standard interpretation? Also we may need to clarify or avoid all usages of "N frames per second". fields or full-frames?
Glueball 10:56, 10 September 2007 (UTC)
Until now I had always seen that number meaning fields per second for interlaced modes, so that PAL-B/G is expressed as 576i50. But I'll have to check. -- 150.241.250.3 07:31, 18 September 2007 (UTC)
Why does 540p redirect here? It's clearly not the same. 83.108.208.28 ( talk) 00:22, 9 July 2009 (UTC)
Please fix my broken citation (#3, as of the time of writing). I could not find a way to link the image, which is on Wikimedia Commons, inside the reference. The image is the only real reliable reference, in this case (as explained in my edit comment). Comanoodle ( talk) 21:02, 20 September 2009 (UTC)
There is an edit by an anonymous editor in the fourth paragraph: "1877x1000 (the actual displayed resolution of a 1920x1080 source) resolutions.", in contrast to the original "1920x1080 resolutions". Where in the world did they get that information? To my knowledge, all HDTV resolutions are displayed as such; no cropping is ever done (which I believe the author is trying to contrast with scaling). If nobody comments, I'll undo that. Elmo Allen ( talk) 04:39, 23 November 2009 (UTC)
I think the article needs to make it clearer that interlaced video doesn't have to suffer any artefacts at all, and can produce exactly the same results as 1080p broadcasts. The image used in the article is a bit misleading. For example, in the UK most HD channels broadcast programmes at 1080i, but the source material is 25 fps, so a 1080p television simply combines the two fields to reproduce the original progressive frame. You'd never under any circumstances see a combing effect, because the two fields come from the exact same original frame. As the article stands, it gives the impression that 1080i is always visually inferior to 1080p, which is nonsense. Teppic74 ( talk) 12:15, 28 September 2011 (UTC)
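To make the weave case concrete, here is a minimal sketch of what a progressive display does with 1080i/25 material that originated as 25 fps progressive frames; the NumPy arrays and values are purely illustrative, not anything taken from the article:

```python
import numpy as np

# A synthetic 1080-line progressive source frame (numbers stand in for pixel values)
frame = np.arange(1080 * 1920).reshape(1080, 1920)

# 1080i transmission carries it as two fields of alternate lines
top_field = frame[0::2]      # lines 0, 2, 4, ...
bottom_field = frame[1::2]   # lines 1, 3, 5, ...

# The receiver "weaves" the two fields back into one frame
rebuilt = np.empty_like(frame)
rebuilt[0::2] = top_field
rebuilt[1::2] = bottom_field

# Both fields were sampled at the same instant, so the reconstruction is exact:
assert np.array_equal(rebuilt, frame)   # no combing is possible here
```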
This article needs much more content!
It only discusses "perfect" content streams -- which is not the real world. What is the full bandwidth of a perfect 1080i stream? What is the real typical bandwidth of OTA broadcasts, cable broadcasts, and Blu-ray sources? How much of what kinds of compression is used, and how does this degrade various kinds of content? What about signal encoding redundancy, error correction and artifacts? - 96.237.4.73 ( talk) 19:08, 14 February 2013 (UTC)
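As one illustrative data point for the first question, the uncompressed studio-interface rate for 1080i can be worked out from the raster dimensions. The sketch below assumes the common 10-bit 4:2:2 sampling; the total-raster figures (2200 samples per line, 1125 lines per frame) are the standard values for 1080-line systems at 30 Hz, and the rest is plain arithmetic:

```python
# Uncompressed 1080i/30 (60 fields per second) at 10-bit 4:2:2
total_samples_per_line = 2200    # active 1920 plus horizontal blanking
total_lines_per_frame = 1125     # active 1080 plus vertical blanking
frames_per_second = 30
bits_per_pixel = 20              # 10-bit luma + 10-bit chroma per pixel in 4:2:2

gross_bits_per_second = (total_samples_per_line * total_lines_per_frame
                         * frames_per_second * bits_per_pixel)
print(gross_bits_per_second / 1e9)   # ~1.485 Gbit/s, the HD-SDI serial rate

# Active picture only (1920x1080), i.e. the payload a codec actually has to compress
active_bits_per_second = 1920 * 1080 * frames_per_second * bits_per_pixel
print(active_bits_per_second / 1e9)  # ~1.244 Gbit/s
```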
Currently, there is a line at the bottom of the lead which says:
The choice of 1080 lines originates with Charles Poynton, who in the early 1990s pushed for "square pixels" to be used in HD video formats. [1]
I have some issues with this. First, the source is Charles Poynton's personal website. In his words, he is "the inventor of the number 1080 found in HDTV standards". There's no independent source, so we are relying on his statements alone. If the claim were more minor, and in his biography article, like "In the early 1990s, Poynton encouraged the use of 1080 lines and square pixels in HD video", I wouldn't have a problem with it. But this line goes beyond a minor claim about himself (WP:ABOUTSELF). It's a claim that he is the originator of 1080 lines (and therefore nobody before him proposed such a thing). That's a little too much to be backed up by his own website alone.
I also have some doubts about the veracity of the claim. According to "European Perspectives on HDTV Studio Production Standards", IEEE Transactions on Broadcasting, Vol. 35, No. 3, September 1989 ( doi: 10.1109/11.35315), pages 281 and 282 (ellipses and bolding mine):
The technical problem is to decide how to line up all parameters other than field rate so as to allow maximum convenience, lowest equipment costs, and maximum quality and converted picture quality. ... An example ... is put forth below ... The system has a common bit-rate and a common image structure (1080 X 1920 elements). The digital pixels are square (1920 X 9/16 = 1080). The use of progressive scanning also gives balanced horizontal and vertical resolution (or square analogue pixels).
So these authors (N. Wassiczek, G.T. Waters, D. Wood) had proposed square pixels and 1080 vertical lines as early as September 1989, earlier than the "early 1990s" stated in that line.
Here's another example. From Future Development of HDTV, CCIR Report BT.1217, page 180:
[CCIR, 1986-90n] suggests that an image frame with 2 250 000 samples will simplify the task of ensuring compatibility with Recommendation 601 and proposes a common image format based on 1080 active scanning lines per frame. This number of scanning lines is derived from a pixel aspect ratio of 1:1.
If you scroll up, you will see two diagrams also referencing 1920 samples and 1080 lines. I wasn't able to find CCIR 1986-90n itself, but judging from the name, I figure it came out between 1986 and 1990, which still seems too early to be consistent with him originating 1080 lines in "the early 1990s", especially since the CCIR 1990 conference ended on 1 June 1990. My speculation is that Mr. Poynton was one of multiple people to come up with the same numbers (by modifying NHK/Sony's de facto standard, 1920 x 1035, to have square pixels). He certainly may have generated support for 1080 lines as a standard. But as it stands, there certainly aren't enough reliable sources to call him the originator/inventor of that number.
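For what it's worth, the square-pixel arithmetic that both of those sources rely on is easy to restate (a worked check, not a quotation from either document):

```latex
% 16:9 picture, 1920 samples per active line, square (1:1) pixels:
\text{active lines} = 1920 \times \frac{9}{16} = 1080
% versus the earlier NHK/Sony raster of 1920 x 1035, whose pixels are not square.
```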
References
HenryMP02 ( talk) 03:58, 3 October 2022 (UTC)
The article incorrectly documented that the odd lines are stored in the first field and the even lines in the second. To understand why this is not correct requires a bit of history.
When interlaced analogue television was developed with the introduction of the 405-line system in 1936, the signal was transmitted as a sequence of odd and even fields. Which came first was technically unimportant; they simply occurred odd-even-odd-even and so on, with no difference other than timing. Just like a broken line in the middle of the road, the line follows the space, which follows the line, which follows the space, and so on.
The specification for the 405-line system specified that the odd field occurred first in the complete video frame, but there was no technical reason why it should do so. It was purely a matter of convention. In fact, with one notable exception, the specifications for all interlaced analogue video systems were written assuming the odd field came first. [1] They could have specified the even field first with no change to anything. The one exception was the US National Television System Committee (NTSC). For reasons that are unlikely ever to become clear, although the makeup of the video signal was more or less identical apart from timings, the NTSC decided to specify the even field as coming first. As already said, this made absolutely no difference - at least in the analogue world.
However, once digital video formats took off, it did make a difference. Digital video formats encode a complete frame as a distinct unit (to facilitate compression, among other reasons). Because they were originally created for NTSC-standard signals, the digital video frame follows the NTSC convention and has the even field first. This survived into the DVD video format and ultimately into the 1080i digital formats. This even-field-first sequence was preserved when the digital formats were carried over to the 625-line analogue systems.
This created an awkward problem, though it should not have done. Since 625-line video is specified as odd field first, someone decided it was necessary to switch the field order to match the digital format's requirement, meaning it had to be switched back on replay (because the digital encoding switches the field order so that the even field, which occurred after the odd field in time, now comes first, giving a time 'hiccup'). The article contains an image of what happens when digital deinterlacing does not work well. The image could equally illustrate what happens when the field order is switched incorrectly - actually a frequent problem with online video that was created on analogue systems and has been converted to non-interlaced digital video without switching the field order back: the fields play out of sequence, giving a similar comb effect.
The unfortunate part was that this was all unnecessary, because the digital encoding could simply have used the even field from the previous frame, which would have completely obviated the issue.
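A minimal sketch of the replay problem being described; the lists and the weave() helper are my own illustration, not anything from a standard, and the field "timestamps" stand in for actual picture content:

```python
# Fields are captured and numbered in time order: 0, 1, 2, ...
fields = list(range(8))

def weave(fields, swap_order=False):
    """Pair consecutive fields into frames, then play them back field by field.
    If swap_order is True, each frame's two fields are replayed in reverse order,
    as happens when the field dominance is switched incorrectly."""
    frames = [fields[i:i + 2] for i in range(0, len(fields), 2)]
    replay = []
    for first, second in frames:
        replay.extend((second, first) if swap_order else (first, second))
    return replay

print(weave(fields))                   # [0, 1, 2, 3, 4, 5, 6, 7]  - smooth motion
print(weave(fields, swap_order=True))  # [1, 0, 3, 2, 5, 4, 7, 6]  - the time 'hiccup' above
```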
References
86.162.147.159 ( talk) 17:03, 5 October 2022 (UTC)