Does it make any difference (quality-wise and input-delay-wise) if I use a DisplayPort to HDMI cable directly or a DisplayPort to HDMI adapter, followed by a regular HDMI cable?

  • Onno (VK6FLAB)@lemmy.radio · 5 months ago

    Yes and no.

    At the frequencies HDMI operates at, the path a signal takes can interfere with the signal itself. That’s why a cheap HDMI cable sometimes causes issues where one certified for 4K or 8K doesn’t - carrying more information means higher frequencies, which in turn demand better shielding.
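    As a rough illustration of why, here is the back-of-the-envelope bandwidth arithmetic - a sketch using the standard CTA-861 timing figures and assuming TMDS 8b/10b encoding as used up to HDMI 2.0:

    ```python
    # Back-of-the-envelope on-the-wire bandwidth for common HDMI formats.
    # TMDS (HDMI up to 2.0) uses 8b/10b encoding: wire rate = payload * 10/8.

    def tmds_gbps(h_total, v_total, refresh_hz, bits_per_pixel=24):
        pixel_clock = h_total * v_total * refresh_hz   # pixels per second
        payload = pixel_clock * bits_per_pixel         # payload bits per second
        return payload * 10 / 8 / 1e9                  # 8b/10b overhead, in Gbit/s

    # CTA-861 total timings (active pixels plus blanking)
    print(f"1080p60: {tmds_gbps(2200, 1125, 60):5.2f} Gbps")   # ~4.5, easy for HDMI 1.4
    print(f"4K60:    {tmds_gbps(4400, 2250, 60):5.2f} Gbps")   # ~17.8, right at HDMI 2.0's 18
    print(f"4K120:   {tmds_gbps(4400, 2250, 120):5.2f} Gbps")  # ~35.6, HDMI 2.1 territory
    ```

    Going from 1080p60 to 4K120 is roughly an eightfold jump in signalling rate, which is why a cable that’s fine at one can fail completely at the other.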

    A connector is a potential point where the signal can be degraded if the contact between the two conductors is poor.

    In general, fewer connectors and fewer joins give you a higher chance of success and less chance of interference, but it depends entirely on the distance and the type of signal you’re trying to send.

    In general, the shorter the connection, the less loss.

    It might be that a single longer cable is worse than a connector and a short cable.

    If you already have a connector and an HDMI cable, try it. If you have issues, start by reversing the HDMI cable. It won’t make the electrons reverse or anything like that, but the connection might be slightly different.

    If you have neither, I’d get a cable without a join. Buy from people who take returns.

    Budget will be the determining factor for most people.

    TL;DR: try it.

  • ABCDE@lemmy.world · 5 months ago

    My adapter, and most of the ones I found, could not do 120 Hz at 4K, but the cables were easier and cheaper to buy anyway.

  • kakes@sh.itjust.works · 5 months ago

    I imagine it would depend on the adaptor. If it’s an “active” adaptor (i.e., there is some processing done on the signal by a microprocessor), that introduces at least some lag.

    A “passive” adaptor or a (passive) cable should otherwise add no latency.

    In terms of quality there shouldn’t be much difference, if any, though it may depend on the particular adaptor you use.
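    To put “some lag” in perspective, here is a hypothetical bound, assuming an active adaptor buffers anywhere from one scanline up to a worst-case full frame - the buffering depths and line count here are assumptions for illustration, not measurements of any real converter:

    ```python
    # Hypothetical latency range for an active adaptor, by buffering depth.
    def frame_time_ms(refresh_hz):
        return 1000 / refresh_hz                 # time to deliver one full frame

    def line_time_us(refresh_hz, v_total=2250):  # assumed 4K CTA-861 total line count
        return frame_time_ms(refresh_hz) / v_total * 1000

    for hz in (60, 120):
        print(f"{hz} Hz: one line ~{line_time_us(hz):.1f} us, "
              f"one full frame ~{frame_time_ms(hz):.1f} ms")
    ```

    A line-buffered converter adds only microseconds; only something that buffers entire frames would add input delay you could plausibly notice.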

  • RegalPotoo@lemmy.world · 5 months ago

    Edit: from the other answers, I’m probably wrong - maybe don’t trust this as correct

    I don’t think so - HDMI and DisplayPort actually carry their signals in the same way, so the adapter is basically just converting between two plug types without any smarts in the middle.

    In theory you could get a badly made adapter that adds noise to the signal or something and forces the monitor to down-spec its signal, but I’m not sure how likely that is to come across.

    • SomeoneSomewhere@lemmy.nz · 5 months ago

      HDMI and DP do not carry their signals in the same way. HDMI/DVI use a pixel clock and one wire pair per colour, whereas DP is packet-based.

      “DisplayPort++” is the branding for a DP port that can pretend to be an HDMI or DVI port, so an adapter or cable can convert between the two just by rearranging the pins.

      To go from pure DisplayPort to HDMI, or to go from an HDMI source to a DP monitor, you need an ‘active’ adapter, which decodes and re-encodes the signal. These are bigger and sometimes require external power.

      • realbadat@programming.dev · 5 months ago

        To add to this, virtually all GPUs out there with DP are DP++ and will not require an active adapter.

        Consumers will almost never need an active adapter for DP to HDMI, or for DP to single-link DVI. HDMI to DP will always require an active (powered) adapter, as will DP to dual-link DVI, VGA, or component.
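        Those rules fit in a small decision table. A sketch encoding just what’s stated in this thread (not an exhaustive list of conversions):

        ```python
        # Adapter type needed per conversion, as described in this thread.
        # "passive" = pin remap only (the source can speak HDMI/DVI itself),
        # "active"  = powered decode/re-encode of the signal.
        ADAPTER_NEEDED = {
            ("DP++", "HDMI"):            "passive",
            ("DP++", "DVI single-link"): "passive",
            ("DP",   "HDMI"):            "active",  # pure DP source, no dual-mode
            ("HDMI", "DP"):              "active",  # always active in this direction
            ("DP",   "DVI dual-link"):   "active",
            ("DP",   "VGA"):             "active",
            ("DP",   "component"):       "active",
        }

        print(ADAPTER_NEEDED[("DP++", "HDMI")])  # passive - most consumer GPUs are DP++
        print(ADAPTER_NEEDED[("HDMI", "DP")])    # active - needs a powered converter
        ```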

  • MeekerThanBeaker@lemmy.world · 5 months ago

    In my experience, in IT, cables are usually the way to go over adapters. Adapters tend to break more often than cables.

    Someone else may offer a more thorough differing opinion… like perhaps a very high-quality HDMI cable with a high-quality adapter may be better than a mediocre off-brand standalone cable, but if the price and brand are the same… I’d stick with the single cable.

    Before USB-C became standard, we stuck with HDMI cables and kept different adapters on hand just for convenience, but if we knew a cable was going to be used with the same equipment for a long time, we’d try for the single cable.

  • lurch (he/him)@sh.itjust.works · 5 months ago

    I just know I couldn’t use any of those to get 4K. I needed a cheap ATI Radeon card with a direct HDMI output and a high-quality HDMI-to-HDMI cable, and I had to make sure both supported 4K. The card and the display run a very short test of the cable on each connection; if the cable can’t carry enough bandwidth and the test fails, the card tells the OS and 4K won’t even show up for selection.
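    What the card and display are doing there is essentially a capability check against the link’s tested bandwidth. A loose sketch of that logic - the function name and per-mode numbers are illustrative, not a real driver API:

    ```python
    # Illustrative mode filtering, loosely mirroring what link training decides.
    # Requirements are approximate on-the-wire rates (TMDS 8b/10b, 8-bit RGB).
    MODES_GBPS = {
        "1080p60": 4.46,
        "4K30":    8.91,
        "4K60":    17.82,
    }

    def offered_modes(tested_link_gbps):
        # Modes the link test failed at never get reported to the OS.
        return [m for m, need in MODES_GBPS.items() if need <= tested_link_gbps]

    print(offered_modes(18.0))   # good cable and ports: 4K60 appears
    print(offered_modes(10.2))   # marginal link: 4K60 silently missing
    ```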