What are we better off using: analog or digital signals?
You may have listened to a song on an audio cassette when you were younger, back before CDs and MP3 files were in widespread use. Today, however, CD and MP3 recordings are everywhere, and it is unlikely that you listen to music in any other format. The audio in all of these formats comes from electronic signals, which are messages encoded in electric current.
Audio experts claim that analog signals, the older form of electronic signal, are of better quality than digital signals, since they are continuous and capture the small nuances in the original recording. Although this may once have been true, and analog equipment is less expensive than digital equipment, digital signals are used more often today. Digital signals are rapid “on” and “off” pulses that, compared to analog signals, are much easier to transmit. In addition, unlike analog media (cassettes, vinyl discs, etc.), digital recordings (CDs and MP3 files) do not degrade with repeated playback or copying.
With technology constantly improving, the sound quality difference between analog and digital is almost undetectable today. Or is it? You decide!
When analog signals are converted to digital, their continuous waveforms are approximated using square waves. Why might this be a problem regarding sound quality?
Credit: Wdwd; Source: http://upload.wikimedia.org/wikipedia/commons/9/9a/Digital.signal.svg; License: CC BY-NC 3.0
- How could the problem you determined in the question above be resolved?
- Digital signals are a collection of "on" and "off" pulses. What does this mean regarding the minimum amplitude of a digital square wave?
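To explore the first question numerically, here is a minimal sketch (the sample rate and number of amplitude levels are illustrative assumptions, not values from the article) of how an analog-to-digital conversion samples a smooth sine wave and rounds each sample to a small set of fixed levels, producing the "staircase" approximation and its error:

```python
import math

def quantize(value, levels):
    """Round a value in [-1, 1] to the nearest of `levels` evenly spaced steps."""
    step = 2.0 / (levels - 1)
    return round((value + 1.0) / step) * step - 1.0

sample_rate = 20   # samples per second (deliberately coarse, for illustration)
levels = 4         # only 4 amplitude levels, so the "staircase" is visible

errors = []
for n in range(sample_rate):
    t = n / sample_rate
    analog = math.sin(2 * math.pi * t)   # the continuous signal's value
    digital = quantize(analog, levels)   # the stepped digital approximation
    errors.append(abs(analog - digital))

print(f"worst-case quantization error: {max(errors):.3f}")
```

Notice that every sample can be off by up to half a step; with more levels (more bits) and a higher sample rate, this error shrinks, which is one way the problem in the question can be addressed.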