Hey guys,
I have a DAC in my circuit which I need to add a line-out connection to. Specifically, consumer audio line-out:
The DAC outputs 0 to 5 V unless I've screwed something up, and consumer line-out should be right around ±0.447 V (that's -10 dBV, about 0.316 V RMS). So I figured my first task was to use a voltage divider to get the swing down to about 0.894 V peak-to-peak, and then I'd add my high-pass / DC-blocking filter capacitor and hope that worked.
I ended up finding an online circuit simulator so that I could better understand what was going on, and I set up my circuit. I found that while a 10K and 2.2K resistor gave me the right voltage on their own, once I added the filter capacitor, its resistor, and my load, the voltage dropped quite a bit. Raising the 2.2K to 3.3K fixed the sag, which resulted in this circuit:
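For anyone wanting to check this without the simulator, here's a quick sketch of the loaded-divider math (I'm using my 22K filter resistor as a stand-in for the AC load; the exact numbers won't match the simulator since the cap and the oscillator's DC offset also matter, but it shows the mechanism behind the sag):

```python
# Loaded voltage divider: anything hanging off the divider's midpoint
# appears in parallel with the bottom resistor and pulls the output down.
def divider_out(vin, r_top, r_bottom, r_load=None):
    r_eff = r_bottom if r_load is None else (r_bottom * r_load) / (r_bottom + r_load)
    return vin * r_eff / (r_top + r_eff)

print(divider_out(5.0, 10e3, 2.2e3))        # unloaded: ~0.90 V
print(divider_out(5.0, 10e3, 2.2e3, 22e3))  # with a 22K load: ~0.83 V (the sag)
print(divider_out(5.0, 10e3, 3.3e3, 22e3))  # bumped to 3.3K: ~1.11 V
```

So bumping the bottom resistor does push the loaded output back up, though by this simple math it overshoots a bit; the simulator is presumably accounting for things this ignores.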
10K and 3.3K:
http://bit.ly/P6ikn5
Here, on the left, you have an oscillator which is supposed to simulate a sine wave output from my DAC. Then you have the 10K and 3.3K for the voltage divider portion of the circuit, and then a 33uF cap and 22K resistor for the high-pass filter. (I'm still not sure what the corner frequency for that is; in this simulation it seems to be around 5 Hz, but filter calculators online indicate otherwise.)
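One thing I realized while poking at the calculators: the cap doesn't just see the 22K to ground, it also sees the divider's Thevenin resistance (10K in parallel with 3.3K, about 2.5K) in series with it. Here's the standard first-order corner formula computed both ways, plus the smaller RC pair I tried:

```python
import math

def hp_corner(c, r):
    """-3 dB corner of a first-order RC high-pass: f = 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * r * c)

# Source resistance the cap sees looking back into the divider (Thevenin).
r_thevenin = (10e3 * 3.3e3) / (10e3 + 3.3e3)  # ~2.48K

print(hp_corner(33e-6, 22e3))               # 22K alone: ~0.22 Hz
print(hp_corner(33e-6, 22e3 + r_thevenin))  # including the divider: ~0.20 Hz
print(hp_corner(10e-6, 10e3 + r_thevenin))  # the 10uF / 10K combo: ~1.3 Hz
```

Either way the corner comes out well under 5 Hz, so the ~5 Hz I thought I was seeing in the simulator may just be me misreading its output.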
Initially I was doing most of my testing at the lowest frequencies, 1 Hz to 10 Hz, and I found I could swap out the 33uF cap and 22K resistor for a 10uF cap and a 10K resistor and get results not far off from the original values. I tried this because I thought I might be able to consolidate some components and reduce cost.
In my experimenting, though, I eventually tried 22 kHz and saw a large sag in voltage on my output, which suggests I had created a low-pass filter. One of the filter calculators shows that a resistor followed by a capacitor to ground makes a low-pass, and technically there is a resistor in my circuit before the capacitor, and the capacitor does go to ground through my load, so maybe I inadvertently created one.
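To sanity-check that theory, I worked out the ideal first-order high-pass magnitude response for my RC values (this ignores any parasitics or extra poles the simulator might be modeling, so take it as a rough check only):

```python
import math

def hp_gain(f, r, c):
    """Magnitude of an ideal first-order RC high-pass at frequency f."""
    x = 2.0 * math.pi * f * r * c  # f / fc, where fc = 1/(2*pi*R*C)
    return x / math.sqrt(1.0 + x * x)

print(hp_gain(22e3, 22e3, 33e-6))  # ~1.0: passes 22 kHz essentially untouched
print(hp_gain(1.0, 22e3, 33e-6))   # ~0.98 even down at 1 Hz
```

By this math an ideal series-cap high-pass shouldn't attenuate 22 kHz at all, so if the simulator really shows a big drop up there, I suspect it's coming from something else in the setup rather than this RC pair.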
It then occurred to me that maybe I had too much resistance before the capacitor, so I went back to the Wikipedia article on line level and saw this:
Instead, line level circuits use the impedance bridging principle, in which a low impedance output drives a high impedance input. A typical line out connection has an output impedance from 100 to 600 Ω, with lower values being more common in newer equipment. Line inputs present a much higher impedance, typically 10 kΩ.
And I thought, maybe I need a resistor before that cap which is more in the range of 100 to 600 ohms? So I tried this setup:
100 and 22 ohm resistors:
http://bit.ly/M5m96W
As you can see, I used a 22 ohm resistor there rather than a 33 ohm when I scaled the values down, because I noticed less voltage sag with the lower value, and now I'm getting values closer to what I expect at both of my probes.
I'm a little worried about how much current this will draw from the DAC, though. Worst case, the full 5 V sits across the divider string, so 5 V through the 100 ohms plus the 22 ohms is about 41 mA. But maybe I should be using the 0.9 V I get at my probe in my calculation instead? That gives 9 mA. I think using 5 V here is correct though, since the divider itself draws that current regardless of the load. So I guess I should increase my 100 ohm resistor to maybe 270 ohms, to get the max current draw down below 20 mA, or whatever my DAC is rated for?
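Here's that current math written out, along with what the line input would see as my output impedance. The 270 ohm / 62 ohm pair is just my guess at a scaled-up divider that keeps roughly the same ratio as 100/22, not something I've tested:

```python
# Worst-case DC draw from the DAC: the full 5 V across the whole divider string.
def divider_current_ma(vin, r_top, r_bottom):
    return vin / (r_top + r_bottom) * 1000.0

# Output impedance the line input sees: the two resistors in parallel (Thevenin).
def output_impedance(r_top, r_bottom):
    return (r_top * r_bottom) / (r_top + r_bottom)

print(divider_current_ma(5.0, 100.0, 22.0))  # ~41 mA, probably too much for the DAC
print(divider_current_ma(5.0, 270.0, 62.0))  # ~15 mA with the scaled-up pair
print(output_impedance(270.0, 62.0))         # ~50 ohms, still nicely low for line-out
```

Scaling both resistors up together keeps the divide ratio, and therefore the ~0.9 V swing, about the same while cutting the draw; 62/332 is only slightly higher than 22/122, so the output would come out a hair hotter.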
What do you think? Am I on the right track?
This is the audio portion of my circuit btw. This stuff would be attached to the left of C21 between the DAC and the amp:
http://shawnswift.com/arduino/schematic-audio.png