720p VS 1080i

ajk1080

Ukyo's Doctor
Joined
Jul 26, 2003
Posts
1,205
Once and for all, can someone please explain the differences between the two? From my understanding (and I could be completely wrong), 720p is better. From the way that it was explained to me, 1080i scans in all the odd lines, then all the even lines of resolution (odd, even, odd, even, odd, etc), and 720p puts up all the odds and evens at the same time, kind of like one frame of a movie on reel-to-reel film stock. Isn't having zero scan lines better than having odds and evens scan in? Wouldn't having 720p virtually eliminate dot crawl? And then 1080p would be the same technology, just with more dots per inch? I always thought the order of picture quality went 480i < 480p < 1080i < 720p < 1080p. What do I have wrong and what do I have correct, if anything?
 

JHendrix

Jello Pudding Pop, Y'know? Like that whole Bill C
Joined
Jun 27, 2001
Posts
9,436
ajk1080 said:
Once and for all, can someone please explain the differences between the two? From my understanding (and I could be completely wrong), 720p is better. From the way that it was explained to me, 1080i scans in all the odd lines, then all the even lines of resolution, and 720p puts up all the odds and evens at the same time, kind of like one frame of a movie on reel-to-reel film stock. Isn't having zero scan lines better than having odds and evens scan in? Wouldn't having 720p virtually eliminate dot crawl? And then 1080p would be the same technology, just with more dots per inch? I always thought the order of picture quality went 480i < 480p < 1080i < 720p < 1080p. What do I have wrong and what do I have correct, if anything?

You've pretty much got it down well enough.

1080i looks better than 720p for still images, but for moving pictures (especially games) 720p is the shizzle. 1080i is very good for sports and stuff, though; I know the Super Bowl is broadcast in 1080i, as I think it takes up less bandwidth than 720p.

1080p is like the "real" HD resolution.

Personally I'd prefer a CRT over a cheaper LCD, so I'm getting a set that does 1080i and 480p. That's just for my price range though, if you can afford it a really nice big LCD set is the way to go for gaming.
 

jro

Gonna take a lot
20 Year Member
Joined
Oct 11, 2004
Posts
14,429
I have my 360 set to 1080i. Seems that it brings out smaller details in a lot of games, though Hendrix is right about the movement. I've played plenty with both, and I just tend to like the 1080i effect better.

For HD broadcasting, I haven't seen anything other than 1080i yet. Comcast's HD package here (basic, no premium channels) has about 16 HD channels, and all broadcast in 1080i.

I watch HD-DVDs at 1080i, too, and they look real nice. I haven't really messed around with the player at all, but I'd be curious to see how they look in 720p.
 

RAINBOW PONY

DASH DARK ANDY K,
20 Year Member
Joined
Apr 15, 2002
Posts
24,310
1080i is old, and it's not even progressive scan.

If you can't get a 1080p set (like 95% of people), get a 720p set, or a 768p set (same thing). That is the current standard in HDTVs, IMO; if you have an older HDTV it probably only does 1080i.

HD-DVDs and games (xbox 360) will look better in 720p.

I have a 720p sony LCD and it's awesome, I won't be upgrading until 1080p sets are affordable.

BTW, before my Sony LCD I had a 34 inch Sony XBR that did 1080i. I returned it before the 30 day period was up 'cause it just didn't wow me, and I went with the bigger 720p LCD.

If you're big into the 360, you want 720p as 360 games are 720p native.

And you won't even be able to tell the difference with 1080i stuff converted to 720p (certain cable channels).

Also, 1080p encoded stuff (hd-dvds) will look better converted to 720p than 1080i.
 

Buro Destruct

Formerly known as, Buro Destruct, , Southtown Stre
Joined
Jul 27, 2002
Posts
9,058
1080i vs 720p is really a matter of what you're willing to put up with and what you're watching in each resolution.

If my 1080p set weren't currently undergoing repairs (ughhhhhhh) I'd take some up close photographs to really illustrate the "detail" difference between the two resolutions. JHendrix is pretty much right, for more static images, 1080i looks great, however with games, especially 360 games, you'll notice a fuckton of blurring and smudging when things are moving quickly.

720p takes a dip in resolution detail, however, the image is uniformly crisp and clean when objects are moving quickly on-screen.

I don't know how apparent the differences between these resolutions are on a 1080i TV, but on 1080p it's pretty easy to spot.

In conclusion:
1080i = TV
720p = Games
 

ajk1080

Ukyo's Doctor
Joined
Jul 26, 2003
Posts
1,205
jro said:
I have my 360 set to 1080i. Seems that it brings out smaller details in a lot of games, though Hendrix is right about the movement. I've played plenty with both, and I just tend to like the 1080i effect better.

Did Hendrix say the movement looks better in 720p and 1080i is better for still pictures? You think 1080i is more detailed than 720p? And what effect of 1080i are you talking about? I have a 1080i CRT set, and it is beautiful, and I have never played in 720p before (my set only has 1080i). Are games just not optimised yet for 720p, or are there some things that just don't look good in such hi res (corners of polygons, jaggies, textures, etc) in game? This thread was started with games in mind; I couldn't care less about movies, pictures, TV. Some TVs, I have noticed, display all media brilliantly except for games, so I just wanna know: why 1080i over 720p for games?

And can someone give me a technical explanation of 1080i VS 720p VS 1080p?

Edit: posted this a little after the two above me :emb:
 

Buro Destruct

Formerly known as, Buro Destruct, , Southtown Stre
Joined
Jul 27, 2002
Posts
9,058
ajk1080 said:
Did Hendrix say the movement looks better in 720p and 1080i is better for still pictures? You think 1080i is more detailed than 720p? And what effect of 1080i are you talking about? I have a 1080i CRT set, and it is beautiful, and I have never played in 720p before (my set only has 1080i). Are games just not optimised yet for 720p, or are there some things that just don't look good in such hi res (corners of polygons, jaggies, textures, etc) in game? This thread was started with games in mind; I couldn't care less about movies, pictures, TV. Some TVs, I have noticed, display all media brilliantly except for games, so I just wanna know: why 1080i over 720p for games?

And can someone give me a technical explanation of 1080i VS 720p VS 1080p?

Edit: posted this a little after the two above me :emb:
I stole this from Wikipedia, but it might help give you an idea between the different resolutions:
Resolution_chart.png
 

hanafuda

Dr. Brown's Time Machine Mechanic
Joined
Oct 27, 2004
Posts
4,967
Is the difference between interlaced and progressive really that noticeable? I swear I can't tell the difference between the two when I am playing either 480i or 480p. Does it only become noticeable at the higher resolutions?
 

ajk1080

Ukyo's Doctor
Joined
Jul 26, 2003
Posts
1,205
Buro Destruct said:
I stole this from Wikipedia, but it might help give you an idea between the different resolutions:
Resolution_chart.png

From that pic, 1080i covers more area than 720p does on the color chart...
 

Bishamon

Azu Bla, ,
Joined
Aug 24, 2002
Posts
3,624
hanafudaX said:
Is the difference between interlaced and progressive really that noticeable? I swear I can't tell the difference between the two when I am playing either 480i or 480p. Does it only become noticeable at the higher resolutions?

It depends... if you are comparing 480i and 480p with a DVD player, there will likely be little to no difference. Depending on what has the better de-interlacing hardware (the DVD player or the TV), you may even obtain a better image with the DVD player set to interlaced. This is because DVDs are encoded at 480i.

The same thing goes for other sources. For instance, some TVs (such as CRTs) may display 1080i much more easily than 720p, and despite the 360 running 720p native, when the 360 handles the scaling to 1080i the end result may be a better image on certain sets. That being said, the ideal situation is to have the source and TV match the same native resolution; for the 360, that's 720p, and for HD-DVD and Blu-Ray that will (eventually*) be 1080p.



*I say eventually because the current HD-DVD players only output 1080i and the Blu-Ray players I know of are taking the 1080p disc, converting it to 1080i, and then de-interlacing it back to 1080p. The end result isn't ideal, in either case (at least for those who own 1080p sets).
 

Mouse_Master

Support your local Sheriff, ,
Staff member
Joined
Aug 13, 2000
Posts
2,047
hanafudaX said:
Is the difference between interlaced and progressive really that noticeable? I swear I can't tell the difference between the two when I am playing either 480i or 480p. Does it only become noticeable at the higher resolutions?

In my opinion, the difference between *i and *p is more of an eyeballs sort of thing. 1080i offers the best resolution (but this is arguable), but some people can seriously pick up on the flicker. For those people, 720p is the way to go. Better yet, the newest batch of 1080p TVs are supposed to correct the flicker (well, the older batch did too, but many would not do 1080p natively, only upconvert a 1080i signal). To be honest, lots of people do not notice the 'flicker' with 1080i unless you tell them to look for it.

I can see the flicker. 1080i for a computer screen and games is fine for me (mainly Age of Empires 3), but for TV, 720p suits me better.
 

Buro Destruct

Formerly known as, Buro Destruct, , Southtown Stre
Joined
Jul 27, 2002
Posts
9,058
ajk1080 said:
From that pic, 1080i covers more area than the 720p does on the color chart.....
It does. 1080i is a higher screen resolution than 720p, but since 1080i is running interlaced, you'll see sharper edges and less pixelation, while the screen will appear to "blur" as things move across it. This blurring gets worse the faster an object on screen is traveling.
 

galfordo

Analinguist of the Year
15 Year Member
Joined
Mar 14, 2003
Posts
18,418
Another complicating factor that people don't always consider is that not all display technologies even display interlaced video signals.

Plasma, for instance, isn't capable of displaying an interlaced signal the way we are usually told to think about it. Although the frame rate of a 1080i signal will be lower than that of a 720p signal, you may notice very little difference, and most people (myself included) prefer a 1080i signal on a plasma display. Honestly, when gaming I can't tell that much of a difference, but 1080i signals are clearly superior for watching TV/movies.

There's also yet another complicating factor at work here: many displays have a native resolution (e.g., 1366 x 768) that is different from standard broadcast resolutions (e.g., 720p). Converting a 1080i signal down to 768 lines seems to be a better choice than converting a 720p signal up. Perhaps converting down allows the display to retain more information, while converting up forces the internal scaler to "make up" information (i.e., interpolation).

Anyway, there's lots of speculation there, but it does at least serve to illustrate why HDTV tends to make the ordinary consumer shy away.
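The native-panel scaling point above is easy to put in numbers. Here's a rough Python sketch (the helper name is made up for illustration; real scalers also deal with overscan, aspect ratio, and so on):

```python
# Vertical scaling a hypothetical 1366x768 panel must apply to each
# standard HD signal. Only one direction forces the scaler to invent
# lines that were never in the source.

def scale_factor(panel_lines, source_lines):
    """Vertical scaling ratio the panel's internal scaler must apply."""
    return panel_lines / source_lines

# 720p must be scaled UP to fill the 768-line panel, so the scaler
# has to interpolate (make up) new lines:
print(scale_factor(768, 720))   # ~1.067, an upscale

# 1080 source lines are scaled DOWN to 768, so detail is discarded
# rather than invented:
print(scale_factor(768, 1080))  # ~0.711, a downscale
```

This is why, as the post suggests, feeding such a panel 1080i can retain more real information than feeding it 720p, even though neither matches the panel natively.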
 

ajk1080

Ukyo's Doctor
Joined
Jul 26, 2003
Posts
1,205
Buro Destruct said:
It does. 1080i is a higher screen resolution than 720p, but since 1080i is running interlaced, you'll see sharper edges and less pixelation, while the screen will appear to "blur" as things move across it. This blurring gets worse the faster an object on screen is traveling.


I completely understand it now, thanks Buro. And this also answers why specifically 1080p is superior. Not only does it have the higher resolution (1080), but it also won't blur during high speed movements on screen (progressive scan). But I guess superiority is all relative... I mean, if the signal was originally produced in 720p, then a native 720p display would be best, correct? Or are there situations where a signal originally produced in 1080i could look better in 720p, or vice versa?
 

RAINBOW PONY

DASH DARK ANDY K,
20 Year Member
Joined
Apr 15, 2002
Posts
24,310
ajk1080 said:
I completely understand it now, thanks Buro. And this also answers why specifically 1080p is superior. Not only does it have the higher resolution (1080), but it also won't blur during high speed movements on screen (progressive scan). But I guess superiority is all relative... I mean, if the signal was originally produced in 720p, then a native 720p display would be best, correct? Or are there situations where a signal originally produced in 1080i could look better in 720p, or vice versa?

dude, you can debate this till the cows come home.

bottom line

if you want a REAL hdtv, get a progressive signal, a la 720p (unless you have the cash to drop for 1080p, then by all means..)

1080i is old; there is no reason you should be buying an HDTV that only does an interlaced signal nowadays. I mean, the old 4:3 HDTVs out there used 1080i...

if you have a nice open wall, best bang for your buck is a 720p DLP projector for about a grand.

besides that look at the sony line of RPLCDs, the 50 inch A10 is like 1799 now I think, that's what I paid for my 42A10 a year ago!

A few nice DLPs out there as well for well under 2 grand at 46inch +
 

Big Shady

Kyukyogenryu Black Belt
15 Year Member
Joined
Apr 16, 2003
Posts
4,945
It's simple math, really.

1920 x 1080i versus 1280 x 720p

Each frame of 1920 x 1080i = 1,036,800 pixels, because the image is interlaced: you're only getting half of the scan lines at a time, so you're really seeing 1920 x 540 per frame, which equals 1,036,800 pixels.

Now we go to 1280 x 720p = 921,600 pixels per frame. That's about an 11% decrease from 1080i, but what we have to realize is that in 720p the entire frame is getting painted each cycle, versus half the frame getting painted each cycle in 1080i. By painting the whole frame each time, you get rid of flicker and motion blur. Hence, 720p is better than 1080i.
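That arithmetic fits in a few lines of Python (a rough sketch; the function name is made up, and an interlaced pass is more properly called a "field", as the replies below discuss):

```python
# Pixels actually painted in one refresh pass for each format.

def pixels_per_pass(width, height, interlaced):
    """An interlaced pass carries only every other line,
    so it holds half the frame's lines."""
    lines = height // 2 if interlaced else height
    return width * lines

field_1080i = pixels_per_pass(1920, 1080, interlaced=True)   # 1920 * 540
frame_720p  = pixels_per_pass(1280, 720,  interlaced=False)  # 1280 * 720

print(field_1080i)  # 1036800
print(frame_720p)   # 921600

# 720p paints about 11% fewer pixels per pass than a 1080i field,
# but it paints the WHOLE frame every pass.
print(f"{1 - frame_720p / field_1080i:.1%}")  # 11.1%
```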
 

Argentina94

Slug Flyer Pilot
20 Year Member
Joined
Feb 18, 2002
Posts
3,905
I run my 360 on 1080i. I have to since it's a 34" widescreen CRT but at no time, whether with HD sports or my 360 games, does any amount of blurring occur no matter the speed of movement onscreen.

One example that was very noticeable and I'll never forget: I always watch Hockey Night in Canada in HD on my set and really loved the picture quality. Aside from the obvious high-rez image displayed in 1080i, I never noticed anything out of the ordinary. Some time later, I was at my brother's house, who owns a Sony 42" RPLCD, and put the game on, since he also has HD cable. I was amazed at how much blurring there was whenever the players skated, despite the picture clarity. His set had difficulties handling objects moving at high speed onscreen.

Now this might be because of the different technologies in TV sets we own, but again, no such blurring, even an iota noticeable to the naked eye, ever happens on my set running ANYTHING in 1080i.

720p, though, I can't give an opinion on.
 

Magician

A simple man who simply loves gaming.
20 Year Member
Joined
Jan 18, 2002
Posts
10,336
I set my X360 to 720p. My SXRD doesn't refresh quick enough; I always got motion blur/ghosting whenever I used 1080i. 720p keeps things smooth as butter.
 

hanafuda

Dr. Brown's Time Machine Mechanic
Joined
Oct 27, 2004
Posts
4,967
As for the difference between 480i and 480p, I am mainly talking about my PS2 games. For instance, if I play KOF XI in 480i or 480p, I really can't see much of a difference.

Normally I play in 480p, but maybe I will fire it up in 480i now to see if I can see a difference.

Then again, I am using a pretty new Sony Bravia, which has a pretty high screen update rate anyway (regardless of the i/p aspect, I think).

As for 1080i being blurry, I play Gran Turismo 4 on the same set in 1080i all the time. Lots of stuff moving fast on screen ;) I have never noticed any blurring.

Assuming the same set and game could play GT4 in 1080i and 1080p, and you put the two side by side for direct comparison, would the difference be obvious then?
 

thirdkind

Chin's Bartender
20 Year Member
Joined
Oct 4, 2001
Posts
1,573
Big Shady said:
It's simple math, really.

1920 x 1080i versus 1280 x 720p

Each frame of 1920 x 1080i = 1,036,800 pixels, because the image is interlaced: you're only getting half of the scan lines at a time, so you're really seeing 1920 x 540 per frame, which equals 1,036,800 pixels.

Now we go to 1280 x 720p = 921,600 pixels per frame. That's about an 11% decrease from 1080i, but what we have to realize is that in 720p the entire frame is getting painted each cycle, versus half the frame getting painted each cycle in 1080i. By painting the whole frame each time, you get rid of flicker and motion blur. Hence, 720p is better than 1080i.

Interlaced signals are displayed as alternating fields, not frames. Each 1080i field is 1920x540, but those 540 lines are different in each field and the fields alternate 60 times per second, so the resolution your eyes see is 1920x1080. This perceived resolution drops during fast motion because of the inherent limitations of interlaced signals and displays.

As you said, 720p displays the full frame all at once, so it offers superior resolution during fast motion, but during still scenes and scenes without much movement, 1080i offers superior detail.

Any progressive display, which is pretty much anything digital like plasmas and LCDs, should provide superior results with 1080i if the deinterlacing is done properly. Many displays don't deinterlace 1080i properly, so 720p provides a superior result.

I'm ignoring all the bullshit that happens during filming and mastering that drops the actual resolution to something lower than 1920x1080, but for the sake of this discussion, we'll work with signal types rather than true resolution.

As a side note, anyone using any of the currently available HD-DVD players (they're actually all basically the same rebranded Toshiba) should avoid setting the output to 720p. There's a bug in the scaler that downscales the 1080p signal on the disc to 480p and back up to 720p. Stick to 1080i. Of course, if your display does a shitty job of converting 1080i (and many do), you won't see much difference anyway.
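The fields-vs-frames point above can be pictured in code. This is a hypothetical sketch of the naive "weave" style of deinterlacing (real deinterlacers are far more sophisticated, with motion adaptation and cadence detection): two alternating half-height fields get interleaved back into one full frame. When nothing moves between fields you recover the full detail; when things move, the two fields disagree and you get the combing/blur discussed in this thread.

```python
# Naive 'weave' deinterlacing: merge two fields into one frame.

def weave(odd_field, even_field):
    """Interleave two fields (lists of scan lines) into one frame.

    The odd field supplies lines 0, 2, 4, ...; the even field
    supplies lines 1, 3, 5, ... (numbering here is illustrative).
    """
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)
        frame.append(even_line)
    return frame

# Toy 4-line frame split into two 2-line fields:
odd  = ["line0", "line2"]
even = ["line1", "line3"]
print(weave(odd, even))  # ['line0', 'line1', 'line2', 'line3']
```

If the scene changed between the moments the two fields were captured, the woven frame mixes two points in time, which is exactly the motion artifact a progressive signal avoids.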
 

ajk1080

Ukyo's Doctor
Joined
Jul 26, 2003
Posts
1,205
Magician said:
I set my X360 to 720p. My SXRD doesn't refresh quick enough; I always got motion blur/ghosting whenever I used 1080i. 720p keeps things smooth as butter.

The Sony 60" SXRD that does 1080p? That's the TV I'm going to buy! So 720p for the 360 is an amazing display, eh? Are you using component or VGA?
 

Magician

A simple man who simply loves gaming.
20 Year Member
Joined
Jan 18, 2002
Posts
10,336
ajk1080 said:
The Sony 60" SXRD that does 1080p? That's the TV I'm going to buy! So 720p for the 360 is an amazing display, eh? Are you using component or VGA?

Nah, I have an '05 version of the Sony 50" SXRD that does not support a 1080p input; I've heard that the '06 is supposed to support it, though. Pictures at 1080i on the SXRD are drop-dead gorgeous, but 720p is the way to play vids on a projection TV. I use component video.
 

Lastblade

Friend me on Facebook!,
20 Year Member
Joined
Aug 13, 2001
Posts
5,840
Any of you SXRD owners hook up a PC to it via HDMI? I am curious how the overscan is on that....
 

jro

Gonna take a lot
20 Year Member
Joined
Oct 11, 2004
Posts
14,429
Hmm, lot of good info in this thread, and a lot of good reasons to go either way, depending on the capabilities of one's set, really.

@thirdkind: thanks for the tip. I have my Toshiba set to 1080i, but I've considered trying 720p. Needless to say, I shan't be doing that now.
 

IMTheWalrus

Pao Pao Cafe Waiter
Joined
Jul 7, 2003
Posts
1,780
A lot of good comments here, but I think it underscores why CRTs are going away.

Having only 1080i really limits what's available for the TV. In my opinion, if you are gaming you MUST have 720p at least as an option.

The refresh rate is huge too. I don't notice a lot of motion blurring on my LCD for 1080i broadcasts, but my set's response time is very short. I really notice the difference between interlaced and progressive when watching something in 1080i where the image switches completely (like different camera angles); then I see a quick stitching, but I think you really have to look for it and be somewhat of a videophile to pick it up.

I don't think CRTs are a viable option anymore. The cost isn't low enough relative to LCDs to make it worth the sacrifices you are making, most notably huge differences in color, contrast, and display options. The HD CRT sets are huge and weigh a ton as well. LCDs keep dropping in price too.
 