Need some assistance with a line-out circuit simulation

Hey guys,

I have a DAC in my circuit that I need to add a line-out connection to. Specifically, consumer audio line-out.

The DAC outputs 0..5V unless I've screwed something up, and the line-out should be right around ±0.447V. So I figured my first task was to use a voltage divider to get the swing down to around 0.894V, and then I'd add my high-pass / DC-blocking capacitor and hope that worked.
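(For reference, the 0.447 number falls out of the -10 dBV consumer line-level standard. A quick Python sketch of the conversion, just to show where my targets come from:)

```python
import math

# Consumer line level is nominally -10 dBV (dB relative to 1 V RMS).
dbv = -10
v_rms = 10 ** (dbv / 20)        # ~0.316 V RMS
v_peak = v_rms * math.sqrt(2)   # ~0.447 V peak for a sine wave
v_pp = 2 * v_peak               # ~0.894 V peak-to-peak

print(f"RMS: {v_rms:.3f} V, peak: {v_peak:.3f} V, p-p: {v_pp:.3f} V")
```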

I ended up finding an online circuit simulator so that I could better understand what was going on, and I set up my circuit. I found that while a 10K and 2.2K divider gave me the right voltage on its own, once I added the filter capacitor, its resistor, and my load, the output dropped quite a bit (the downstream resistance ends up in parallel with the bottom resistor, pulling the ratio down). Raising the 2.2K to 3.3K compensated for the sag, and that resulted in this circuit:

10K and 3.3K:
http://bit.ly/P6ikn5

Here, on the left, there's an oscillator that's supposed to simulate a sine wave output from my DAC. Then the 10K and 3.3K form the voltage divider portion of the circuit, followed by a 33uF cap and 22K resistor for a high-pass filter. (I'm still not sure what the corner frequency of that is; in the simulation it seems to be around 5Hz, but online filter calculators indicate otherwise.)
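To sanity-check that corner outside the simulator, here's the first-order RC formula, assuming the cap sees the divider's Thevenin impedance in series and the 22K in parallel with the 10K load (that's my reading of the topology, which may be where I'm going wrong):

```python
import math

def hp_corner(r_ohms, c_farads):
    """-3 dB corner frequency of a first-order RC high-pass."""
    return 1 / (2 * math.pi * r_ohms * c_farads)

C = 33e-6                        # DC-blocking cap
R_div = 1 / (1/10e3 + 1/3.3e3)   # 10K || 3.3K divider source impedance, ~2.48K
R_shunt = 1 / (1/22e3 + 1/10e3)  # 22K bleed || 10K load, ~6.9K

print(f"22K alone:           {hp_corner(22e3, C):.2f} Hz")            # ~0.22 Hz
print(f"with divider + load: {hp_corner(R_div + R_shunt, C):.2f} Hz")  # ~0.52 Hz
```

Neither of those is anywhere near 5Hz, so either I'm misreading the simulator or my model of the circuit is off.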

Initially I was doing most of my testing at the lowest frequencies, 1Hz to 10Hz, and I found I could swap the 33uF and 22K for a 10uF and 10K and get results that weren't far off from the other values. I tried this because I thought I might be able to consolidate some components and reduce cost.

In my experimenting, though, I eventually tried 22kHz and saw a large sag in the output voltage, suggesting I had created a low-pass filter. One of the filter calculators shows that a resistor followed by a capacitor to ground makes a low-pass, and technically there is a resistor in my circuit before the capacitor, and the capacitor does go to ground through my load, so maybe I've inadvertently created one.

It then occurred to me that maybe I had too much resistance before the capacitor, so I went back to that Wikipedia article and saw this:

Instead, line level circuits use the impedance bridging principle, in which a low impedance output drives a high impedance input. A typical line out connection has an output impedance from 100 to 600 Ω, with lower values being more common in newer equipment. Line inputs present a much higher impedance, typically 10 kΩ.

And I thought, maybe I need a resistor before that cap which is more in the range of 100 to 600 ohms? So I tried this setup:

100 and 22 ohm resistors:
http://bit.ly/M5m96W

As you can see, I used a 22 ohm resistor there rather than a 33 ohm when I scaled the values down, because I noticed less voltage sag with the lower value, and now I'm getting values closer to what I expect at both of my probes.

I'm a little worried about how much current this will draw from the DAC though. 5V through 100 ohms is around 50mA. But maybe I should be using the 0.9V I get at my probe in my calculation instead? That gives 9mA. I think using 5V here is correct though. So I guess I should increase the 100 ohm resistor to maybe 270 ohms, to get the max current draw down below 20mA, or whatever my DAC is rated for?
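Writing out the divider arithmetic (the 5V sits across both resistors in normal operation; only a shorted output puts it across the 100 ohm alone — and the 9mA figure divides the 0.9V across the 22 ohm by the 100 ohm resistance, which mixes two different parts of the circuit):

```python
V = 5.0
R1, R2 = 100.0, 22.0   # the divider from the second simulation

i_divider = V / (R1 + R2)   # steady draw through the divider, ~41 mA
i_short = V / R1            # output shorted to ground: only R1 limits it, 50 mA

print(f"Divider draw: {i_divider * 1000:.1f} mA")
print(f"Shorted out:  {i_short * 1000:.1f} mA")
```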

What do you think? Am I on the right track?

This is the audio portion of my circuit btw. This stuff would be attached to the left of C21 between the DAC and the amp:
http://shawnswift.com/arduino/schematic-audio.png

I forgot to mention one thing: the 10K on the right in the simulations is supposed to be the load. According to that Wikipedia article and some amp schematics I've looked at, I think that's right for the input impedance of an external amp.

I've done some more simulations, and I think this is the best so far:
http://bit.ly/LY51VK

Here I've used a 270 and a 59 ohm resistor, which gives me very close to 0.447V peak on the output. I chose these values because 270 ohms across 5V dissipates a little less than 1/10W, so even if the line-out is shorted the resistor won't overheat. It also puts me well below the maximum current rating of the DAC's output in that same situation. The rest of the time the output should be well below 1/10W and 19mA, so under normal usage the components shouldn't be stressed.
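The arithmetic behind those choices, for anyone who wants to check me:

```python
V = 5.0
R1, R2 = 270.0, 59.0

v_div = V * R2 / (R1 + R2)   # divider output swing: ~0.897 V (0..0.897 V)
v_peak = v_div / 2           # ~0.448 V peak once the blocking cap recenters it

# Worst case: line-out shorted to ground, full 5 V across the 270 ohm.
i_short = V / R1      # ~18.5 mA
p_short = V**2 / R1   # ~93 mW, just under 1/10 W

print(f"Divider out: {v_div:.3f} V ({v_peak:.3f} V peak)")
print(f"Shorted: {i_short * 1000:.1f} mA, {p_short * 1000:.0f} mW")
```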

I've also swapped the 33uF cap for a 10uF cap, because I've used those all over the board elsewhere, so it will be cheaper. At 20Hz, the lower limit of human hearing, the simulation is telling me the smaller cap causes a voltage drop of only around 1/500th of a volt, so there should be no audible difference. I tested at 22kHz as well and don't see any issue there with the 10uF cap either. Unfortunately the simulation can't go up to 44kHz, but 22kHz is about the upper limit of human hearing anyway, so I don't think there should be an issue.
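Here's the first-order estimate I'm checking the simulator against (same topology assumption as before: the divider's ~48 ohm source impedance in series, 22K || 10K as the shunt):

```python
import math

def hp_gain(f, r, c):
    """Magnitude of a first-order RC high-pass at frequency f."""
    fc = 1 / (2 * math.pi * r * c)
    return f / math.sqrt(f**2 + fc**2)

r_src = 1 / (1/270 + 1/59)        # ~48 ohm divider source impedance
r_shunt = 1 / (1/22e3 + 1/10e3)   # ~6.9K (22K bleed || 10K load)
r_eff = r_src + r_shunt

v_peak = 0.447
for c in (33e-6, 10e-6):
    drop = v_peak * (1 - hp_gain(20, r_eff, c))
    print(f"{c * 1e6:.0f} uF: {drop * 1000:.2f} mV drop at 20 Hz")
# 33 uF: ~0.27 mV, 10 uF: ~2.9 mV -- same ballpark as the simulator's ~1/500 V
```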

But again, looking for feedback here. I can't be certain I haven't screwed something up with my simulation.

Hm... I also just remembered I wanted to try swapping the 22K for a 10K to see if that would affect things, and it doesn't seem to have a very large effect. In fact, removing it altogether doesn't seem to have a large effect. I think the person who showed me that portion of the line-out circuit said the 22K was there to avoid the pop when connecting something to the line-out, but I'm not sure that's even something I care about, since this circuit is designed to be installed once and left in place. And as far as the high-pass filtering aspect goes, I'm not even sure why I'd want that; the capacitor alone seems to do the job of DC blocking.
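To convince myself the 22K really doesn't matter much while a load is attached, I worked out the corner both ways (same first-order model as above):

```python
import math

def fc(r, c):
    return 1 / (2 * math.pi * r * c)

C = 10e-6        # the cap I settled on above
r_src = 48.0     # ~270 || 59 divider source impedance
r_load = 10e3

with_22k = fc(r_src + 1 / (1/22e3 + 1/r_load), C)
without_22k = fc(r_src + r_load, C)
print(f"with 22K:    {with_22k:.2f} Hz")     # ~2.3 Hz
print(f"without 22K: {without_22k:.2f} Hz")  # ~1.6 Hz
# With a load plugged in, the 22K barely moves the corner. Its real job (as I
# understand it) is to give the cap a DC path to ground when nothing is
# plugged in, so the jack doesn't float and pop on connection.
```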

So what do you think about removing that 22K entirely?

It's just occurred to me that rather than sticking a voltage divider on the DAC's output to get my line-out to the right level, I could instead reduce the DAC's output voltage and change the gain on my amp to compensate. I'd have to recalculate all my resistor and cap values though.

Also, I'm not sure it would actually save me anything in the end if I then implement the HPF and bias resistor as suggested in that schematic.

I'm still not sure what the need for a HPF on the output even is, or what the bias resistor is supposedly doing there, because in my simulations it doesn't seem to do much at all:
http://bit.ly/LYBV8T

(Note: here the oscillator is outputting 0..894mV to simulate what would happen if I reconfigured my DAC, not 0..5V as in my examples above with my current configuration.)

Hm... it turns out that if I want the DAC to output 0.447V peak, I'd need a voltage divider on the Vref pin, which is two resistors. So I'd just end up moving two resistors from the output to the input. And then I'd need to add a 100 ohm or higher resistor on the output to limit the current in a short, like in that last circuit drawing I posted. So I'd actually end up with more resistors, plus a lot of work reconfiguring the amp and stuff, and I'd need larger caps on the amp's input to maintain the same filtering there. So maybe adjusting the DAC's output isn't a great idea.
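For completeness, here's the Vref divider I'd need, assuming the DAC's full-scale output equals Vref (which I'd have to verify against the datasheet):

```python
# Target: DAC full-scale of 0..0.894 V, i.e. 0.447 V peak after DC blocking.
V_supply = 5.0
V_ref_target = 0.894

# Pick the top resistor, solve the divider for the bottom one:
# V_ref = V_supply * R_bot / (R_top + R_bot)
R_top = 10e3
R_bot = R_top * V_ref_target / (V_supply - V_ref_target)
print(f"R_bot ~= {R_bot:.0f} ohms")   # ~2177 ohms, so 2.2K is the nearest standard value
```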

What's your application that you need this exact -10dBV signal? Are you building some sort of test system where you need to generate test-tones at known signal levels?

If you are running real audio signals (such as voice or music) the signal level (volume) jumps around from moment-to-moment anyway, and you've usually got a volume control.

Normal consumer audio equipment is not "calibrated"... The maximum line-level output from a soundcard, CD player, or DVD player is typically 1-2V peak. -10dBV is probably a good minimum guideline.

So if you don't have any special requirements, and you just want to hook-up to a receiver or amp, or something like that... I'd just add the 100 Ohm "safety" resistor and be done with it.

Or, check the "input sensitivity" specs for the amp... Some amps might actually need 1VRMS to hit full-power.

P.S.
I just checked the specs on a random Crown power amp... It said 1.4V (which I assume means 1.4VRMS). So with that particular amp, you'd need some boost following the DAC if you wanted to hit full power.

I suppose I don't need this exact -10dBV signal. I'm new to the whole audio thing, so I didn't know how precise things needed to be. I also see a lot of different line-out designs, and there must be advantages and disadvantages to them, but I don't know what they are. I also don't know that my circuit simulation is accurate, or that there won't be components in the external amp that interact with mine in a way that creates a LPF and gives me muddy sound, or some kind of ground loop that causes oscillations.

I've been researching this stuff, but it's slow going finding answers to these questions. In regards to volume, though, I did recently discover that amps have a sensitivity rating, which is basically the input voltage needed to drive the amp to full output; below that the amp's gain can still boost a quiet source, and above it you risk clipping.

It also looks like there are two different voltages I could output. What Wikipedia calls "consumer audio" uses the RCA outputs with the colorful connectors, where the peak voltage is just below 0.5V. What they call "professional audio" uses the 1/4" and 1/8" jacks, where the peak is just above 1V. The specific values they list don't seem to be strictly adhered to, though, so rather than 1.2V on a 1/4" jack it might just be 1V. Or it could be as high as 2V according to some people, as you also stated.
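The nominal numbers from the standards, so I can see where those figures come from (pro level is specified in dBu, referenced to 0.7746V RMS, rather than dBV):

```python
import math

def dbu_to_vrms(dbu):
    return 0.7746 * 10 ** (dbu / 20)   # 0 dBu = 0.7746 V RMS

def dbv_to_vrms(dbv):
    return 10 ** (dbv / 20)            # 0 dBV = 1 V RMS

for name, vrms in [("consumer (-10 dBV)", dbv_to_vrms(-10)),
                   ("pro (+4 dBu)", dbu_to_vrms(4))]:
    print(f"{name}: {vrms:.3f} V RMS, {vrms * math.sqrt(2):.3f} V peak")
# consumer: 0.316 V RMS / 0.447 V peak
# pro:      1.228 V RMS / 1.736 V peak
```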

Thankfully, like you said, there's a volume control on most amps that adjusts the voltage before the amp amplifies it. But you'd still want to be in some sane voltage range so you don't get tons of clipping and distortion the moment you turn the volume knob past 10%.

I just checked the specs on a random Crown power amp... It said 1.4V (which I assume means 1.4VRMS). So with that particular amp, you'd need some boost following the DAC if you wanted to hit full power.

That's a 1/4" jack though, right? A TRS connector I guess they're called. Cause if it is an RCA connector like most amps have that seems way out of spec.

Or, check the "input sensitivity" specs for the amp... Some amps might actually need 1VRMS to hit full-power.

I have been googling amp sensitivity, and it looks like most amps designed for cars or motorcycles have an input sensitivity of 200mV, but a few go as low as 100mV. The only amps I've seen that list a sensitivity of around 1-2V are home audio amps, like those used for surround sound systems. I haven't come across any musical instrument amps in my searches, but I'm not targeting those anyway. I assume the 1-2V sensitivity on those home audio amps is for a 1/4" jack or something and not for the RCA inputs; I'd assume most RCA inputs adhere to the standard, so a 1-2V sensitivity on them wouldn't make sense.

So if you don't have any special requirements, and you just want to hook-up to a receiver or amp, or something like that... I'd just add the 100 Ohm "safety" resistor and be done with it.

A 100 ohm resistor would be too small to protect against a shorted line input; it'd have to be around 270 ohms to be safe. But if I've got the voltage divider there, is that even necessary, since the 270 ohm is already in series with the capacitor? I don't think it is.
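The short-circuit numbers for both values (assuming the same 1/10W budget as before):

```python
V = 5.0
for r in (100.0, 270.0):
    i = V / r         # current with the output shorted to ground
    p = V**2 / r      # power dumped into the series resistor
    print(f"{r:.0f} ohm: {i * 1000:.0f} mA, {p * 1000:.0f} mW")
# 100 ohm: 50 mA, 250 mW -- well over the 1/10 W budget
# 270 ohm: 19 mA,  93 mW -- just under it
```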

If you're suggesting I don't use a voltage divider at all... my DAC is configured to put out 5V. That's way over spec, and making it output less would just move the voltage divider to the other side of it. So I might as well have the voltage divider. I don't want folks to have to turn the volume down to almost 0 to get acceptable sound; they'll assume something is wrong with my board.

And as for the high-pass filter... I've read in a lot of places that I need a DC-blocking capacitor there anyway, and the voltage needs to swing between - and +, while my DAC outputs 0..5V, so that's kind of necessary.

On a side note, I found that if I lower the cap to 10uF I get rolloff up past 10Hz, which would kill my bass, so I guess I can't swap that out to reduce my unique component count after all. And the 22K resistor... well, it doesn't seem to do much. I guess it's there to reduce popping when switching outputs. The guy who showed me that schematic said it was a bias resistor, but it doesn't seem to be biasing anything. I may not include it, or I may replace it with a 10K. Not sure.

Anyway, I guess I have a decent circuit design here. I've got high-pass rolloff below 5Hz, and the amps this is most likely to be used with have that 100mV to 200mV sensitivity, so they should be able to boost the volume 2-4x if needed.