[quote author="gr00vy0ne"][quote author="Gilbert"]define hi-def for yourself.
480i
480p
720i
720p
1080i
1080p
At 1080i you will have 3-5 MB/sec compressed, and the built-in card will not be able to handle it, even if the drive can.
An S-series laptop with an ATI Radeon 9700 might be able to play 1080i in performance mode (in adaptive mode you will drop frames due to the nature of the playback), although 1080p will be a problem.[/quote]
By the way, HD content is officially only 720p and 1080i. 1080p is not a standard, although MS is pushing it as one of the standards for their WMV-HD specification (720p and 1080p). The 480 stuff is considered SD (Standard Definition)... it's high definition relative to good ol' analog 352x240, but not true HD.
It's a big mess nonetheless. When they were coming up with HD they never should have introduced 1080i; it should have just been 720p and/or 1080p. The slightly higher pixel count of 1080i over 720p is lost, since most broadcasters compress the heck out of 1080i to keep bandwidth down. They compress it so much more than 720p that 720p content ends up looking better.
That said... true 1080p content does look somewhat better than 720p. Those nature documentaries in WMV-HD are jaw-droppingly amazing on an LCD TV. Step into Liquid is worth getting if you want to see the difference between DVD and WMV-HD; it makes DVDs look crappy.[/quote]
I tend to disagree. Everything above 480i is considered hi-def, no matter whether someone calls it SD or ED or HD. I personally prefer the numerical labels, since in several years 1080p will be "SD" and something else will be "HD"... Of course, there is always the MS way, Sony's way, my way, your way... Originally everything above 352x240 was considered hi-def; now they even call it digital... at least for the 480 part of it.
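For what it's worth, the 3-5 MB/sec figure for 1080i quoted above checks out on the back of an envelope. Here's a quick Python sketch; the 8-bit 4:2:0 sampling (1.5 bytes/pixel) and the ~30:1 MPEG-2 compression ratio are my own assumptions, not numbers from this thread:

[code]
# Rough data rates for the formats listed above.
# Assumed: 8-bit 4:2:0 chroma subsampling (1.5 bytes/pixel)
# and a ~30:1 MPEG-2 compression ratio, typical of broadcast.
# (720i is omitted -- it was never an actual broadcast format.)

FORMATS = {
    # name: (width, height, frames per second); interlaced formats
    # send two half-height fields per frame, same pixels/sec overall
    "480i": (720, 480, 30),
    "480p": (720, 480, 60),
    "720p": (1280, 720, 60),
    "1080i": (1920, 1080, 30),
    "1080p": (1920, 1080, 60),
}

BYTES_PER_PIXEL = 1.5   # 8-bit 4:2:0
COMPRESSION = 30        # assumed MPEG-2 ratio

for name, (w, h, fps) in FORMATS.items():
    raw = w * h * fps * BYTES_PER_PIXEL   # bytes/sec, uncompressed
    print(f"{name:>5}: raw {raw / 1e6:6.1f} MB/s"
          f" -> ~{raw / COMPRESSION / 1e6:.1f} MB/s compressed")
[/code]

That works out to roughly 3.1 MB/s compressed for 1080i (and ~6 MB/s for 1080p), right in the quoted 3-5 MB/sec range, and it shows why the drive can keep up while the built-in card chokes on decoding.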
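And to put numbers on gr00vy0ne's point that 1080i's pixel-count edge over 720p is only slight, count the pixels actually delivered per second (again my own arithmetic, not anything from the thread):

[code]
# 1080i sends 60 half-height (540-line) fields per second;
# 720p sends 60 full 720-line frames per second.
px_1080i = 1920 * 540 * 60   # pixels/sec from interlaced fields
px_720p = 1280 * 720 * 60    # pixels/sec from progressive frames

print(f"1080i: {px_1080i / 1e6:.1f} Mpx/s")   # ~62.2
print(f"720p : {px_720p / 1e6:.1f} Mpx/s")    # ~55.3
print(f"ratio: {px_1080i / px_720p:.2f}x")    # ~1.12
[/code]

A ~12% edge is easy to erase once broadcasters squeeze 1080i's bitrate harder than 720p's, which is exactly the point made above.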