http://www.pcworld.com/article/id,12177 ... icle.html#
Choice Quotes:
"One obstacle we did face involved establishing solid connections with our devices' HDMI ports. In some cases, we connected the cable but no image appeared. Sometimes wiggling the cable fixed the problem, and sometimes it didn't. But the trouble seems to stem from the the standard HDMI connector design used by all cable vendors."
Which ironically leads to:
"In our tests, we had the most trouble when trying to attach Monster's $300 M1000HDMI cable to the Epson's HDMI port. Easily the thickest, stiffest, heaviest model we reviewed, the Monster cable pulled away from the projector's HDMI port, often causing the screen to go blank."
And finally:
"Once you get a good HDMI connection, our tests indicate, you can expect flawless performance from any 4-meter cable, regardless of price."
To sum it all up:
"For its part, digital carries just ones and zeros. In HDMI, if the signal voltage is high, it encodes a one; if low, a zero. The voltage encoded as a one can drop a fair amount and still be distinguishable from voltage encoded as a zero. After a certain point, however, the signal voltage drops so low that ones and zeros look alike, and the TV's receiver chip attempts to guess their value. So rather than gradually diminishing in accuracy, the way an analog signal does, a digital signal may remain perfect up to a critical level and then fail catastrophically."
So, basically it either works flawlessly (100% accuracy) or it noticeably fails. There is little to no "grey area" when it comes to the quality of these digital signals.
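If you want to see that "cliff" behaviour for yourself, here's a quick toy simulation of the idea in the quote above: bits sent as high/low voltages, attenuated, hit with noise, and thresholded at the receiver. The voltage levels and noise figure are made up for illustration, not real HDMI/TMDS parameters, but the shape of the result is the point: the error rate sits at essentially zero until the margin is gone, then it blows up.

```python
# Toy sketch of the "digital cliff": a binary signal stays perfectly
# decodable until noise swallows the voltage margin, then fails abruptly.
# Illustration only; these are not real HDMI electrical parameters.
import random

def bit_error_rate(attenuation, noise=0.05, n_bits=20000):
    """Send random bits as 0 V / 1 V, attenuate the swing, add Gaussian
    noise, threshold at half the received swing, and count decode errors."""
    high = 1.0 * (1.0 - attenuation)   # received "one" level
    threshold = high / 2.0             # receiver decides 1 vs 0 here
    errors = 0
    for _ in range(n_bits):
        bit = random.randint(0, 1)
        received = bit * high + random.gauss(0.0, noise)
        if (received > threshold) != bool(bit):
            errors += 1
    return errors / n_bits

if __name__ == "__main__":
    # The error rate stays at (or near) zero for a long stretch, then
    # rises sharply: the picture is either perfect or obviously broken.
    for att in (0.0, 0.5, 0.7, 0.8, 0.85, 0.9, 0.95):
        print(f"attenuation {att:.2f}: BER = {bit_error_rate(att):.4f}")
```

Run it and you'll see the error rate jump from ~0 to something huge over a narrow range of attenuation, which is exactly the "perfect up to a critical level and then fail catastrophically" behaviour the article describes.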
*EDIT*
I just found a site that does claim there is a difference in signal quality and rates cables based on it:
http://www.avreview.co.uk/news/article/ ... 70/v/1/sp/
Too bad the only comment on their review is this:
"HDMI cables carry only digital signals.
Suggesting a difference in picture quality with different cables is bogus and depreciates the review. Either the signal is decoded at the other end (picture) or it isn't (no picture). If the signal quality was right on the cusp, a TV may attempt to display an incorrectly decoded picture but this would be very obvious (broken images etc.) rather than a subtle change of hue or loss of sharpness.
HDMI 1.3 specifies two categories of cable, category 1 (standard TV and HDTV) and category 2 (above HDTV). No mention is made of this.
To my mind, testing would have included these criteria:
Able to carry 1080P
Compatibility with various equipment
Quality of build, e.g. flexibility, ..."
*Edit 2*
I didn't see the "More" button at the bottom of the post but there is a long discussion there about this topic. Marketplace did an episode on this and from what I remember the signal analyzer could not see any difference between sent and received signals during their testing. Maybe the placebo effect is coming into play here???