Phase shift from absorption?

Here is a theoretical question, and googling did not help me answer it.

Let's assume a driver radiates through a layer of absorptive material (a layer of mineral wool or foam, say). The sound passes through this wall of material on its way to our ears.

The absorptive material has little absorption at 100 Hz, but very high absorption at 1000 Hz. The sound that reaches our ears has therefore been filtered.

Is there a phase shift associated with this filtering? On the one hand, I could imagine that absorption is a minimum-phase process, so yes, there is a phase shift. On the other hand, since the decrease in SPL is purely mechanical in origin (a thermodynamic friction process), perhaps it is not filtering in the same sense as an electrical filter, and there is no phase shift.

Thoughts?

J.
 
Foam wedges in front of compression drivers have been used in line arrays: the shortest path gets the longest foam wedge.

So the answer is yes, absorptive material does slow down acoustic propagation. Your actual question is whether the delay changes with frequency, and that also occurs.

There should be an ElectroVoice patent on the process.
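You can also check the minimum-phase side of the question numerically. Below is a sketch (the absorber curve, corner frequency, and sample rate are all made-up illustrative numbers, not measurements): take a magnitude response that is roughly flat at 100 Hz and about 20 dB down by 1 kHz, reconstruct the minimum-phase response that goes with it via the real cepstrum, and look at the phase. If the absorption behaves as a minimum-phase system, the attenuation alone forces a nonzero, frequency-dependent phase lag:

```python
import numpy as np

# Hypothetical absorber magnitude response: ~0 dB near 100 Hz,
# approaching -20 dB by 1 kHz (illustrative numbers only).
n = 1024
fs = 48000.0                      # assumed sample rate
f = np.fft.rfftfreq(n, d=1.0 / fs)
mag_db = -20.0 / (1.0 + (300.0 / np.maximum(f, 1.0)) ** 2)
mag = 10.0 ** (mag_db / 20.0)

# Minimum-phase reconstruction from |H| via the real cepstrum
# (homomorphic method): build the even-symmetric log-magnitude
# spectrum, fold its cepstrum to make it causal, exponentiate back.
log_mag = np.log(mag)
full = np.concatenate([log_mag, log_mag[-2:0:-1]])  # even symmetry
cep = np.fft.ifft(full).real
w = np.zeros(n)
w[0] = 1.0
w[1:n // 2] = 2.0
w[n // 2] = 1.0
h_min = np.exp(np.fft.fft(cep * w))[: len(f)]
phase = np.angle(h_min)

# phase is ~0 at DC where |H| is flat, and clearly negative (a lag)
# through the transition band between ~100 Hz and ~1 kHz.
```

So under the minimum-phase assumption the filtering does come with a phase shift, i.e. a delay that varies with frequency. Note this is on top of the plain propagation delay through the material, which is a separate pure-delay (excess phase) term.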