32" HDTV as HTPC: resolution problems

Discussion in 'OT Technology' started by Supergeek, Jan 29, 2007.

  1. Supergeek

    Supergeek New Member

    Joined:
    Jan 23, 2007
    Messages:
    1,855
    Likes Received:
    0
    Location:
    Colorado
    Basic problem: Outputting DVI in high definition from my computer to my HDTV.

    Details:

    I've got a 32" ILO HDTV that I'm using as one of the outputs on my GeForce 7900 GT. The TV supports 480i/480p/720p/1080i, but when I play videos from the PC it usually reports that it's in 800x600 mode.

    The other output goes to my 20" ViewSonic LCD monitor, which runs at 1680x1050.

    I recently "upgraded" my Nvidia drivers to the latest and greatest, and now I can't view anything higher than roughly 640x480 on my TV without using DScaler. I used to be able to play higher-resolution videos (1280x720, for example), but now I just get "No Signal" on the TV. Lower-resolution videos (624x352, for example) still play fine.

    I installed a bunch of codec packs and alternative players to try to get it working, but I think I've just made things worse. Now I'm getting letterboxing on videos that weren't letterboxed before.

    Help! What's the easiest way to clean up the garbage dump of codecs on my PC? Could the codecs be causing my resolution problems, or is it the drivers? Is there any way to tell?
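
    For the codec side of the question, here is a minimal sketch of one way to see which Video for Windows codecs are registered on an XP-era machine, by reading the Drivers32 registry key with Python (assumes a Python install with the standard winreg module). It only covers VFW codecs; DirectShow filters from codec packs register separately as COM objects under HKCR\CLSID, so they won't show up here.

        # Sketch: list Video for Windows codec entries from the registry.
        # Only VFW entries ("vidc.*" video, "msacm.*" audio) live under this
        # key; DirectShow filters are registered elsewhere.
        import winreg

        DRIVERS32 = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Drivers32"

        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DRIVERS32) as key:
            index = 0
            while True:
                try:
                    name, dll, _ = winreg.EnumValue(key, index)
                except OSError:
                    break  # no more values under Drivers32
                if name.lower().startswith(("vidc.", "msacm.")):
                    print(f"{name:20s} -> {dll}")
                index += 1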
     
  2. retorq

    retorq What up bitch??

    Joined:
    Dec 14, 2006
    Messages:
    6,061
    Likes Received:
    0
    Location:
    Mohave Desert
    I have an Nvidia-based HTPC (a 5900, I think?) with my Sony HDTV hooked up through a DVI-to-HDMI cable. It detects the TV fine, and I can use 480, 720, and 1080 resolutions without any issues. I don't have a monitor hooked up to it at the same time, though. I'd say your issue is the drivers, not the codecs. Have you tried telling the Nvidia software what kind of TV you have instead of letting it detect it on its own?
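
    One way to rule the driver in or out is to check which modes Windows actually offers on the TV output after the update. A minimal sketch, assuming a modern Python with the pywin32 package installed; it lists every mode each attached display device will accept, so you can see whether 1280x720 or 1920x1080 even appears for the TV.

        # Sketch: enumerate the display modes Windows exposes for each output.
        # If no 1280x720 mode is listed for the TV's device, the driver (or its
        # idea of the attached display) is the problem rather than any codec.
        import win32api

        i = 0
        while True:
            try:
                device = win32api.EnumDisplayDevices(None, i)
            except win32api.error:
                break  # no more display devices
            print(f"\n{device.DeviceName}  ({device.DeviceString})")
            j = 0
            while True:
                try:
                    mode = win32api.EnumDisplaySettings(device.DeviceName, j)
                except win32api.error:
                    break  # no more modes for this device
                print(f"  {mode.PelsWidth}x{mode.PelsHeight} "
                      f"@ {mode.DisplayFrequency} Hz, {mode.BitsPerPel}-bit")
                j += 1
            i += 1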
     
  3. Supergeek

    Supergeek New Member

    Joined:
    Jan 23, 2007
    Messages:
    1,855
    Likes Received:
    0
    Location:
    Colorado
    After posting, I figured I probably should have posted in the PC forum.

    As for telling my PC which TV I have hooked up, I'm not sure how to do that. It has automatically detected a specific TV before, off and on over the past few months. I can't recall the brand and model; it was different from the label on my TV, but then a lot of TVs are sold under different brands with the same guts underneath. Most of the time, though, it just has a generic LCD configured there.

    Actually, I installed beta Nvidia drivers, then went back to the latest official drivers, which were one revision ahead of what I had before the beta. (Two steps forward, one step back.)

    I've tried using the Nvidia control panel to tell it it's a 720p or 1080i television, but that doesn't seem to help.

    Something I haven't tried yet is switching to HDMI; I have a DVI-to-HDMI adapter, so maybe the TV will behave more intelligently on its HDMI input.

    Thanks for the input. I'll keep trying.
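
    As a cross-check on the Nvidia applet's 720p setting, it may also be worth asking Windows itself to switch the TV output to 1280x720 through the display-settings API. A minimal sketch assuming pywin32; the device name \\.\DISPLAY2 is an assumption for whichever output the TV actually is (the enumeration sketch in the previous post shows the real names).

        # Sketch: force the TV output to 1280x720 @ 60 Hz via the Windows API,
        # bypassing the Nvidia control panel. TV_DEVICE is assumed; confirm the
        # TV's device name with EnumDisplayDevices first.
        import win32api
        import win32con

        TV_DEVICE = r"\\.\DISPLAY2"  # assumption -- may be DISPLAY1 on your box

        devmode = win32api.EnumDisplaySettings(TV_DEVICE,
                                               win32con.ENUM_CURRENT_SETTINGS)
        devmode.PelsWidth = 1280
        devmode.PelsHeight = 720
        devmode.DisplayFrequency = 60
        devmode.Fields = (win32con.DM_PELSWIDTH | win32con.DM_PELSHEIGHT |
                          win32con.DM_DISPLAYFREQUENCY)

        result = win32api.ChangeDisplaySettingsEx(TV_DEVICE, devmode, 0)
        if result == win32con.DISP_CHANGE_SUCCESSFUL:
            print("720p mode applied")
        else:
            print(f"Mode change refused, code {result}")

    If the call is refused, the driver simply isn't offering a 720p timing on that output, which points back at the drivers or their TV detection rather than anything in the codec stack.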
     
  4. Supergeek

    Supergeek New Member

    Joined:
    Jan 23, 2007
    Messages:
    1,855
    Likes Received:
    0
    Location:
    Colorado
    HDMI didn't help; in fact, it was less flexible than straight DVI.

    Under Display Settings/Advanced, XP recognizes the TV as a Plug and Play monitor. The Nvidia Control Panel recognizes it as a "SHINCO LCD TV."

    Still no joy, so I posted on the AVS HTPC forum as well.
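
    That "SHINCO LCD TV" label comes from the EDID block the TV sends over DVI/HDMI; rebadged sets often carry another maker's EDID, which would also explain the mismatched brand/model seen earlier. Below is a minimal sketch of how the cached EDID can be pulled out of the registry and decoded with Python (winreg), to see exactly what the TV claims to be. The registry path is where Windows normally caches monitor EDIDs, but treat the exact layout as an assumption for this particular setup.

        # Sketch: dump and decode the EDID blocks Windows has cached for each
        # attached monitor, showing the manufacturer code and monitor-name
        # descriptor the TV reports (the source of the "SHINCO LCD TV" label).
        import winreg

        DISPLAY_ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

        def subkeys(key):
            i = 0
            while True:
                try:
                    yield winreg.EnumKey(key, i)
                except OSError:
                    return
                i += 1

        def decode_edid(edid):
            # Manufacturer ID: bytes 8-9, three 5-bit letters packed big-endian.
            word = (edid[8] << 8) | edid[9]
            mfg = "".join(chr(((word >> s) & 0x1F) + ord("A") - 1)
                          for s in (10, 5, 0))
            name = ""
            # Four 18-byte descriptors start at offset 54; tag 0xFC = monitor name.
            for off in (54, 72, 90, 108):
                if edid[off:off + 2] == b"\x00\x00" and edid[off + 3] == 0xFC:
                    name = edid[off + 5:off + 18].decode("ascii", "ignore").strip()
            print(f"  manufacturer={mfg}  name={name!r}")

        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_ROOT) as display:
            for model in subkeys(display):
                with winreg.OpenKey(display, model) as model_key:
                    for inst in subkeys(model_key):
                        try:
                            params = winreg.OpenKey(
                                model_key, inst + r"\Device Parameters")
                            edid, _ = winreg.QueryValueEx(params, "EDID")
                        except OSError:
                            continue  # no EDID cached for this instance
                        print(model, inst)
                        decode_edid(edid)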
     
