Is there much of a difference between using optical vs. digital coax for transmitting legacy Dolby Digital and DTS sound from a Blu-ray player to an older non-HDMI receiver? I've heard from some people that digital coax can handle much more bandwidth than optical, and therefore the sound will be better. Some have also said that digital coax can carry Dolby TrueHD and DTS-HD Master Audio signals, but I'm not sure I believe that, since I imagine the bandwidth of those new formats is probably 100 times higher.

I also heard that Blu-ray discs nowadays have higher quality legacy Dolby Digital and DTS tracks than DVDs do. I'm not talking about Dolby TrueHD or DTS-HD Master Audio, which are lossless, but plain old DD and DTS.

Speaking of lossless, the funny thing is that the other day I was looking at some packages of digital coax and optical cables in a store, and they said "lossless sound" on the package. How can this be? I thought only Dolby TrueHD and DTS-HD Master Audio are lossless. Sounds to me like a marketing gimmick.
Have you ever heard of optical to digital coax converters, or the other way around? I thought that a straight connection between, say, a Blu-ray player and a receiver is best, but what if you have a converter in the middle, like the ones I mentioned above, or even an HDMI converter? Is the sound degraded at all?
Thanks for all your help.
Much has been argued about the "difference" between coax and optical. I have a fairly simple analogy for this...
Consider the digital audio encoding to be Morse Code (digital 1s & 0s are now dots and dashes).
To decode Morse code, you need a sufficiently long string of dots and dashes to decipher the message. The same thing happens with digital bits - you need all of the 1s & 0s to get to the processor to decode and instruct the playback device what sounds are required.
A direct comparison/analogy for coax & optical would be a telegraph line and a light box. If you were to send an "SOS" via Morse code, it makes no difference if it's going down a wire or being flashed from a light box between two ships - the message is still "SOS". The person on the other end (the "receiver") gets the series of dots & dashes and decodes the SOS.
The signals traveling down the coax or optical lines are the same information (UNLESS the output device itself restricts a particular format on a particular output... something I've not heard of, but possible), so it makes zero difference HOW the info gets into the decoding device (receiver). Just like the SOS.... the encoded material is the same, regardless of transmission carrier.
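The analogy above can be sketched in a few lines of code. This is purely a toy model - the Morse table, the "wire" and "light" functions, and the frame layout are all invented for illustration - but it shows the key point: the carrier functions differ, yet the decoded message is identical.

```python
# Toy model: the same Morse-encoded message sent over two different
# "carriers" (a telegraph wire vs. a light box). Both deliver the
# identical symbol stream, so the decoded message is the same.
MORSE = {"S": "...", "O": "---"}
DECODE = {v: k for k, v in MORSE.items()}

def encode(message):
    """Encode a message as a list of Morse symbols, one per letter."""
    return [MORSE[ch] for ch in message]

def send_over_wire(symbols):
    # The coax "telegraph line": electrical pulses, symbols arrive as-is.
    return list(symbols)

def send_over_light(symbols):
    # The optical "light box": a different medium, but the same symbols.
    return list(symbols)

def decode(symbols):
    return "".join(DECODE[s] for s in symbols)

payload = encode("SOS")
via_wire = decode(send_over_wire(payload))
via_light = decode(send_over_light(payload))
print(via_wire, via_light)  # SOS SOS - the carrier doesn't change the message
```

The same holds for S/PDIF over coax or Toslink: the receiver only sees the bitstream, not the medium it rode in on.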
Now - there are physical differences between the cable types. Coax is a much more robust carrier... I've done installations of over 150 feet. Typical consumer-grade optical cables (not the huge glass bundles buried underground, but the normally plastic fiber stuff) are more susceptible to digital errors over distance. This is due to internal reflections that can possibly make the signal unreadable on some devices (this does vary from brand to brand). I've seen optical cables function well up to 75 feet, but barely at 20 feet on other gear, so "your mileage may vary".
Optical to coax (& vice versa) converters are fairly simple, and do no damage to the signal. I've yet to see one fail.
As far as "degraded" signals are concerned - digital audio is mostly "all or nothing". A signal may degrade in a long or poorly constructed cable (optical or coax), but as long as the receiving device can decode it, you'll get perfect reproduction. If it degrades to the point where it cannot be decoded, the result is either HUGE digital errors (lots of clicking, popping, stuttering, etc...) or no sound at all.
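Here's a toy sketch of that "all or nothing" behavior. The frame format (samples plus a CRC32 check value) is invented for illustration, not the real S/PDIF framing, but the principle matches: if every bit arrives intact the output is bit-perfect; flip even one bit and the decoder rejects the frame entirely.

```python
# Toy model of "all or nothing" digital audio: a frame is either
# decoded perfectly or rejected outright - there's no "slightly
# worse sounding" middle ground. Frame layout is made up.
import zlib

def make_frame(samples: bytes) -> bytes:
    # Append a CRC32 so the receiver can verify the frame arrived intact.
    return samples + zlib.crc32(samples).to_bytes(4, "big")

def decode_frame(frame: bytes):
    samples, crc = frame[:-4], int.from_bytes(frame[-4:], "big")
    if zlib.crc32(samples) == crc:
        return samples   # bit-perfect: sounds identical to the source
    return None          # corrupted: the receiver mutes or glitches

audio = b"\x01\x02\x03\x04"
clean = make_frame(audio)
print(decode_frame(clean) == audio)  # True - perfect reproduction

corrupted = bytes([clean[0] ^ 0x01]) + clean[1:]  # flip a single bit
print(decode_frame(corrupted))  # None - total failure, not "worse" sound
```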
I hope this answers your question... Cheers!