Help with routing all audio through Brutefir

Status
Not open for further replies.
Following 1201's very useful guide here https://www.diyaudio.com/forums/pc-based/302900-brutefir-dsp-pc-step-step.html and a few other resources, I've managed to integrate BruteFIR with a local WAV file and MPD. I'd like to be able to use it for all audio applications, including Spotify, where I cannot figure out how to incorporate BruteFIR (I got Spotify playback working with librespot, the open-source Spotify client library on GitHub: librespot-org/librespot). Since I'd also like to incorporate analog sources like vinyl later, is there a way to force all audio through BruteFIR?



Searching the web and here, it looks like some combination of alsa's dmix, loopback and dsnoop will do this, but I cannot find enough documentation to understand how this works.



Thanks!
 
Searching the web and here, it looks like some combination of alsa's dmix, loopback and dsnoop will do this, but I cannot find enough documentation to understand how this works.
You are on the right track with the above.

The loopback is a way to connect outputs (sources) to inputs (sinks). Dmix is a way to allow multiple sources to send their audio, at the same time, to the same sink. Keep in mind that in order for dmix to "mix" the samples, all streams must be at the same sample rate. This rate is set by ALSA, not by your program, and is usually (I believe) 48 kHz. I think it is possible to change the dmix sample rate by editing an ALSA config file, but I can't recall off the top of my head which file.
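If the resampling matters to you, the rate can be pinned in an ALSA config file. A minimal sketch, assuming the loopback card is visible by the name "Loopback"; the PCM name "ratefix" is just a placeholder:
Code:
# in ~/.asoundrc or /etc/asound.conf; "ratefix" is a hypothetical name
pcm.ratefix {
	type dmix
	ipc_key 2048            # any unique integer not used by another dmix
	slave {
		pcm "hw:Loopback,0"
		rate 44100          # force dmix to mix at 44.1 kHz
	}
}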

If you type the command
Code:
aplay -L | grep dmix
you should see several lines listed, including one or more containing CARD=Loopback. If not, first activate the loopback by running sudo modprobe snd-aloop.
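To have the loopback driver load automatically at every boot (on systemd-based distributions), it can be listed in a modules-load.d file; the filename is just a convention:
Code:
# /etc/modules-load.d/snd-aloop.conf
snd-aloop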

The loopback (in its default invocation) has 8 "subdevices". Each of these is a separate "pipe" to connect sinks and sources. Each pipe has two ends, one is "device 0" and the other "device 1". The direction, e.g. 0-->1 or 1-->0, is not established initially; the sink and source ends are only determined once you start connecting processes to one of the loopback pipes (subdevices).

The general idea is this:
BruteFIR will use the output (source) end of the loopback pipe as its input. All system audio programs will send their output to the input (sink) end of the loopback, AND they will send it via the DMIX device (not the HW or PLUGHW device), so that multiple programs can play at the same time. Finally, BruteFIR will send its audio on to either a DAC or another loopback subdevice that can be used as the input to another program, e.g. ecasound or another audio-processing program, via ALSA.

Thus we have the scheme:
Code:
system audio program --> dmix:CARD=Loopback,DEV=0
hw:CARD=Loopback,DEV=0 --> BruteFIR
BruteFIR --> hw:0,0 (or whatever the card:device of your DAC)
Note the Loopback usage above will use the first subdevice, because no subdevice is explicitly specified.
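In BruteFIR's config file, this scheme maps roughly onto the input/output sections below. This is only a sketch of the I/O blocks (the filter and coeff definitions are omitted), and the device names and sample format are assumptions you will need to adapt:
Code:
## BruteFIR config sketch: read from the loopback, play to the DAC
input "left", "right" {
	device: "alsa" { device: "hw:Loopback,1"; };  # source end of the loopback pipe
	sample: "S32_LE";
	channels: 2;
};

output "left", "right" {
	device: "alsa" { device: "hw:0,0"; };         # your DAC's card,device
	sample: "S32_LE";
	channels: 2;
};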

It's sometimes useful to "suggest" to all system audio programs that they send their audio to dmix:CARD=Loopback,DEV=0 by declaring it as the "default" ALSA device. One way to do this is to create a file called .asoundrc in your home directory and use this format to define the default device and control:
Code:
pcm.NAME {
	type hw               # Kernel PCM
	card INT/STR          # Card name or number
	[device] INT          # Device number (default 0)     
	[subdevice] INT       # Subdevice number, -1 first available (default -1)
	mmap_emulation BOOL   # enable mmap emulation for ro/wo devices
}
E.g. something like this:
Code:
# Route the default device through dmix into the loopback.
# Note: the dmix plugin needs an ipc_key and a slave definition;
# plain card/device fields as in the hw template above won't work for it.
pcm.!default {
	type plug
	slave.pcm "loopmix"
}

pcm.loopmix {
	type dmix
	ipc_key 1024              # any unique integer
	slave {
		pcm "hw:Loopback,0"   # here we are choosing DEV=0 as the input (sink) end
	}
}

ctl.!default {
	type hw                   # there is no dmix ctl type; point controls at the loopback card
	card Loopback
}
Save the .asoundrc file after making these edits.
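A quick way to check that the default route works (the test file path assumes alsa-utils is installed):
Code:
# plays through dmix into the loopback; you will only hear it
# on the speakers once BruteFIR is running and passing audio on
aplay -D default /usr/share/sounds/alsa/Front_Center.wav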


Then BruteFIR's input would be:
Code:
hw:0,1,0 or hw:0,1 or hw:CARD=Loopback,DEV=1 (the first two assume the Loopback is card 0)
 
If you want to route all audio, including recorded streams, through BruteFIR, I would look at the virtual loopback soundcard snd-aloop. BruteFIR would read from the ALSA loopback source, while all applications would write to the loopback sink sound device. Even PulseAudio could be used, writing to the loopback sink.

Recorded audio from an analog input can be transferred to the loopback sink using e.g. the alsaloop utility (alsaloop - command-line PCM loopback, see its man page), or by PulseAudio.
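As a concrete sketch, a capture-to-loopback bridge with alsaloop could look like this; the capture device name is an assumption to replace with your ADC's:
Code:
# copy the analog capture stream into the loopback sink, ~50 ms latency
alsaloop -C hw:CARD=YourADC,DEV=0 -P hw:Loopback,0 -t 50000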
 
Thanks, everyone. It took a little playing around with the settings, but I finally got it. With the above information from Charlie, the keys were outputting to "dmix:X,0,0" (X = your Loopback card number) from each program and setting the output device to "hdmi" in BruteFIR.

For whatever reason, I had a lot of trouble getting the output devices to work. For example, even though my HDMI card is CARD=0 and the output device is DEV=3, I could not play anything over devices like "hw:0,3"; only "hdmi" would work.
Up next will be figuring out how to have all these start in a proper configuration when I boot-up and then get back to tweaking the system.
 
For whatever reason, I had a lot of trouble getting the output devices to work. For example, even though my HDMI card is CARD=0 and the output device is DEV=3, I could not play anything over devices like "hw:0,3"; only "hdmi" would work.

Configuration files of the "hdmi" ALSA device manage a few switches when the device is opened, typically the output stream preamble bytes (AESxx) and the IEC958 output enable switch.

General "hdmi" device setup: hdmi.conf in the alsa-project/alsa-lib repository on GitHub.

Additional setup for specific cards, e.g. HDA-Intel.conf in the alsa-project/alsa-lib repository on GitHub.
 
Thanks, phofman. I looked at some of these 'generic' ALSA configuration files when I was trying to get my audio set up, but I still don't understand what they are doing or saying. For example, are they run at startup to detect and configure your audio cards, and if so, how are they processed (e.g. by a Bash script or a compiled program)?


I should mention this is my first real experience in using Linux although I do have some experience with programming (generally data analysis languages). Since I learn only by getting my hands dirty and working through a project, it's only now that I'm really starting to get a feel for and understand how Linux works. I am starting to really enjoy working from the terminal, and I like how everything you need is stored in a text file. It makes for a very efficient workflow.
 
The configuration files are processed when the device is opened by the playback application, not at startup. They are written in the ALSA configuration language (see e.g. the Advanced Linux Sound Architecture page on the ArchWiki).

In this case they configure the "IEC958 Playback Default" (SPDIF preamble bytes) and "IEC958 Playback Switch" (enable SPDIF/HDMI output) controls, which you can list e.g. with the command amixer contents.

My 2 cents: you did not have the digital output enabled when outputting directly to the hw:XX device. That is a typical mistake.
 
My 2 cents: you did not have the digital output enabled when outputting directly to the hw:XX device. That is a typical mistake.


I assume that would explain why I could never do anything with the card in alsamixer. Would there be benefit to output directly to the hw: device over what I'm doing now? In either case, just for my own learning, what would I need to do to enable the device?
 
Just look at the config files I linked. Using "hdmi" is "hw:X,3" plus setting the AES preamble and digital output controls.

IME the digital output switch is almost always present in alsamixer (a control of type MIXER), often turned off by default. The other control is of type PCM, and is available only in the more general amixer tool. Usually the AES preamble has good default values and does not have to be set in advance.

But use "hdmi" and everything will be taken care of as configured.
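For completeness, the manual route would look roughly like this; control names vary per card, so list them first (the commands assume card 0):
Code:
# list every control on card 0, including the IEC958 items
amixer -c 0 contents
# turn the digital output on (take the exact control name from the list above)
amixer -c 0 cset name='IEC958 Playback Switch' on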
 