Thread: HDMI vs USB
February 9th 21, 08:07 PM, posted to alt.comp.hardware
Paul[_28_]

John T. wrote:
> For a variety of reasons, the Smart TV features on my 43 inch
> LG are a royal PITA, so I've been using my laptop and
> connecting to the TV with a DisplayPort-to-HDMI cable. The TV
> also has USB sockets - is there any significant difference in
> quality between HDMI and USB cables when used for this? I'm
> considering buying a longer HDMI-to-HDMI cable for my wife to
> use with her new laptop. I did try a Google search but got
> lost in the weeds.
> Thanks, John T.


Ports can do many things.

All I can do here is describe the typical cases.

USB on the TV is used for USB Mass Storage: you plug in a USB
flash drive containing movies or photos, and the TV can play
these. The USB sockets don't accept a video signal from a
computer the way the HDMI inputs do, so they're not a
substitute for the HDMI cable.

DisplayPort and HDMI are for the carriage of video signals.
The quality is only as good as the source. If the source
upscales content recorded at low-res, then the picture
looks like ****.

If you had a 4K tier of Netflix (I'm inventing the service
name), it would try to send you movies recorded in 4K, with a
healthy dollop of bits behind them. Those should play nicely
when the computer running the Netflix session is connected,
via cable, to the TV set's HDMI or DP inputs.

Another way to play Netflix is if the TV set has its own
"Netflix app". Then, again, a healthy dollop of 4K content is
shipped down the Ethernet wire (or Wifi) as packets, and makes
a decent-quality picture.

If the Internet service does not have sufficient bandwidth,
Netflix may detect this, and send lower resolution content,
which, when upscaled, doesn't look as good.
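
If you're curious what the player is doing behind the scenes,
here is a toy sketch of that rate-picking logic in Python. The
rung bitrates are invented for illustration; Netflix's real
ladder is more elaborate than this.

LADDER = [           # (label, megabits per second the stream needs)
    ("480p",   3.0),
    ("720p",   5.0),
    ("1080p",  8.0),
    ("4K",    25.0),
]

def pick_rung(measured_mbps, headroom=0.8):
    # Spend at most 80% of measured throughput; keep margin.
    budget = measured_mbps * headroom
    best = LADDER[0]             # worst case: lowest rung
    for rung in LADDER:
        if rung[1] <= budget:
            best = rung
    return best

print(pick_rung(40.0))   # ('4K', 25.0)
print(pick_rung(12.0))   # ('1080p', 8.0) - upscaled on a 4K set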

If the TV set has multiple HDMI connectors, you may be able to
connect more than one computer, then select a particular TV
input and display the signal coming from it. That lets you
switch between your output and the output from your wife's
laptop.

The Wikipedia HDMI and DisplayPort articles, describe the
resolution and refresh limits of the various HDMI and
DisplayPort standards versions. The source equipment
and the destination equipment both have limits, and
the common denominator is what they can use to talk to
one another. Not all computers can drive a TV set
at the highest rate desired.
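
As a back-of-the-envelope check of those limits: the raster
timing below is the standard CTA-861 one for 4Kp60 (blanking
included), and the payload rates come from the version tables
in those Wikipedia articles.

def video_gbps(h_total, v_total, refresh_hz, bits_per_pixel=24):
    # Uncompressed video data rate in Gbit/s, blanking included.
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

# 3840x2160 @ 60 Hz uses a 4400x2250 total raster (594 MHz clock)
need_4kp60 = video_gbps(4400, 2250, 60)   # ~14.26 Gbit/s
need_4kp30 = video_gbps(4400, 2250, 30)   # ~7.13 Gbit/s

links = {
    "HDMI 1.4 (10.2 Gbit/s raw, 8b/10b)": 10.2 * 8 / 10,  # ~8.16
    "HDMI 2.0 (18 Gbit/s raw, 8b/10b)":   18.0 * 8 / 10,  # ~14.4
    "DP 1.2 HBR2 (payload)":              17.28,
}

for name, payload in links.items():
    print(name,
          "4Kp60:", "yes" if payload >= need_4kp60 else "no",
          "4Kp30:", "yes" if payload >= need_4kp30 else "no")

So HDMI 1.4 gear tops out at 4Kp30, and you need HDMI 2.0 or
DP 1.2 at both ends for 4Kp60.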

And generally the standards version doesn't make too much
difference to the cables. There can be quality differences
between cables, which affect the maximum length possible while
using them. The standards are not specific enough for the
label alone to tell you the difference. You might guess that a
six-foot cable would be OK, while stringing 30-50 feet of
cable and doing 4Kp60 might result in colored error "snow"
(sparkles) on the screen. Cables have attenuation, and if the
cable is long enough, and the signal is at a high enough
frequency, the receiver's differential pair can no longer
slice the waveform at threshold and extract digital bits
from it.
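
To put very rough numbers on that, here's a sketch. The loss
coefficient is invented for illustration (real cables vary a
lot), but the shape of the math is the real skin-effect
behavior: loss in dB grows linearly with length and with the
square root of frequency.

import math

K_DB = 0.4   # assumed dB of loss per metre at 1 GHz (invented)

def loss_db(length_m, freq_ghz):
    # Skin-effect copper loss: ~linear in length, ~sqrt(f).
    return K_DB * length_m * math.sqrt(freq_ghz)

def received_fraction(length_m, freq_ghz):
    # Fraction of transmitted amplitude left at the receiver.
    return 10 ** (-loss_db(length_m, freq_ghz) / 20)

# HDMI 2.0 runs 6 Gbit/s per TMDS pair, ~3 GHz fundamental
for metres in (2, 10, 15):
    print(metres, "m:",
          format(received_fraction(metres, 3.0), ".0%"), "left")

With these made-up numbers, the two-metre cable keeps most of
the signal and the fifteen-metre run loses two-thirds of it,
which is where the sparkles come from.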

Summary:

1) Keep the cable short. Don't be an idiot: don't run a signal
from the attic to the basement. A cable (HDMI or DP) from the
couch to the TV in front of you will likely be OK. A six-foot
cable should be fine; a fifty-foot cable, less so.

2) When using a computer to drive the TV, the image is only as
good as the resolution of the content. Upscaling 720x480
content to fill a 4K screen is not going to look all that
good.

3) The same goes for a movie stored on a USB stick. A 4K movie
plays at 4K. A low-res movie upscales too much to look nice.
If you store a DVD movie on a USB stick, think about what it's
recorded as (720x480?). There's some quick arithmetic below
this list showing how much stretching that involves.

4) TV sets have internal "Apps", like Roku. In these cases,
the Internet site has to source nice 4K content, and the ISP
must not throttle the packets (so the playback isn't jerky).
Then it might just work. The signal in this case flows over
the TV set's Ethernet cable, or over its Wifi interface to
your home router, and not through the HDMI cable at all.
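
Some quick arithmetic for items 2 and 3, showing how hard a
DVD frame gets stretched on a 4K panel:

dvd = (720, 480)        # typical NTSC DVD storage resolution
uhd = (3840, 2160)      # a "4K" TV panel

print(f"{uhd[0] / dvd[0]:.2f}x wider, {uhd[1] / dvd[1]:.1f}x taller")
print(f"{(uhd[0] * uhd[1]) // (dvd[0] * dvd[1])} screen pixels "
      "per source pixel")

The upscaler has to invent roughly 24 screen pixels for every
pixel actually in the file, which is why it looks soft.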

Paul