Why so few waveguides?

Recently I've been reading up on the importance of a (somewhat) constant directivity. A waveguide can be used to narrow the beamwidth of tweeters to match the midrange at around the crossover frequency. But waveguides also help limit edge diffraction AND make phase integration easier (Z-offset between mid and tweeter will be smaller). So three benefits for the cost of a bit of plastic (or aluminum).

So why are waveguides not much more common? Is there some drawback I'm missing?
 
The fun part, of course, is that we can add them to the drivers ourselves, keeping DIY truly DIY. Horn adepts will recognise this.

But seriously now. High directivity isn't always preferable in home environments and for casual listening. I suspect low directivity even sells better to the average listener, and something has to be said for that.
 

stv

Member
Joined 2005
Paid Member
Probably because the hi-fi market is very conservative.
Also, waveguides may still be considered "horns" (and thus BAD, BAD, BAD!) by some consumers.
Manufacturers don't want to take that risk.

In contrast, modern studio monitors quite commonly have waveguides (mostly integrated into the enclosure construction).

I also think there is a market mechanism at work: you buy expensive drivers to put in your (flat) baffle.
Driver/tweeter manufacturers may not want us to know how much the baffle/waveguide/enclosure shape impacts the sound, because then they might not be able to sell their expensive drivers.
 
Design philosophy? Some people want wider dispersion.

I've seen a few tweeters that include a waveguide but it complicates things. You either buy them as a unit, or you have to match 2 parts together which is a recipe for returns. And by the time you grit your teeth and start seriously looking at the options, you start finding compression drivers.

And horn design is quite the DIY rabbit hole in its own right.
 
Come to think of it, waveguides are indeed more common in studio monitors; I hadn't thought of that. I don't see how or when wider dispersion would be favorable, though, as it would be limited to the lower frequencies the tweeter produces anyway. And then there are the additional benefits of reduced edge diffraction and better phase integration. Maybe it's just a matter of time before they become more common in DIY and the hi-fi market.
 
If you have a room with decent acoustics, so that the contribution from reflections is more positive than negative (more so if you lean towards orchestral music, where a larger contribution from room reflections is expected), then a wide beamwidth is likely to be preferable. However, even with a wide beamwidth there is still a requirement to control and narrow the beamwidth at higher frequencies if a reasonably natural timbre is to be maintained. I haven't tried it, but I suspect a 4-way with a varying curved baffle could do a good job without waveguides, although with one or more waveguides it is likely easier.

Of course, plenty of people want to reduce the contributions of room reflections, whereas others want to maximize them in order to enhance a sense of spaciousness. Both are perfectly valid objectives, which lead to rather different cabinet configurations and waveguides. A flat baffle is of course a waveguide in that it guides the sound radiation.
 
By the way, there's no objective line between "is a waveguide" and "not a waveguide". A box with a flat baffle and square corners can be thought of as a badly designed LeCleac'h horn with a (swept) 180° conical throat segment, followed by a sharp 90° change.

The inevitable resonances can be a bit sibilant, once you do a direct comparison and know what to listen for.
 
diyAudio Moderator
Joined 2008
Paid Member
Some like to say to use a waveguide the same size as the woofer. It's a guideline, but still an oversimplification. For a waveguide, larger is typically better, so a rule like that leaves little margin for error or practicality in the design.
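To put a rough number on "larger is better": a commonly cited rule of thumb (often attributed to Keele) is that mouth width (inches) × coverage angle (degrees) × lowest controlled frequency (Hz) ≈ 10^6; below that frequency the waveguide loses pattern control. A minimal sketch, assuming that rule and some example round-number mouth sizes:

```python
# Rule of thumb for constant-directivity waveguide mouth size:
#   mouth_width_in * coverage_deg * f_low_Hz ~ 1e6
# Below f_low the waveguide loses pattern control and the coverage
# widens toward that of the bare driver.

def lowest_controlled_freq(mouth_width_m: float, coverage_deg: float) -> float:
    """Approximate lowest frequency (Hz) of directivity control."""
    mouth_width_in = mouth_width_m / 0.0254  # metres -> inches
    return 1e6 / (mouth_width_in * coverage_deg)

if __name__ == "__main__":
    for width_m in (0.10, 0.15, 0.25):  # example tweeter-waveguide mouth widths
        f_low = lowest_controlled_freq(width_m, 90.0)
        print(f"{width_m*100:.0f} cm mouth, 90 deg coverage: "
              f"control down to ~{f_low:.0f} Hz")
```

By this estimate a 10 cm waveguide only holds a 90° pattern down to roughly 2.8 kHz, which is why size matters if the goal is directivity control through a typical 2-3 kHz crossover.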
 
I’d say you have to investigate the X-over range of the midwoofer and its directivity in that range, then pick a waveguide that matches this directivity pattern in that range. Don’t forget to incorporate the baffle and its effect on the directivity when doing this.
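A common first approximation for the midwoofer's directivity near the crossover is the rigid piston in an infinite baffle, whose off-axis pressure follows 2·J1(ka·sinθ)/(ka·sinθ). A minimal sketch of estimating the -6 dB beamwidth this way (the driver radius and crossover frequency are assumed example values, and real cones deviate from the piston model near breakup):

```python
import math

def j1(x: float) -> float:
    """Bessel function J1 via its power series (fine for x < ~20)."""
    total, term = 0.0, x / 2.0
    for m in range(40):
        total += term
        term *= -(x / 2.0) ** 2 / ((m + 1) * (m + 2))
    return total

def piston_response_db(ka: float, theta_deg: float) -> float:
    """Off-axis level (dB re on-axis) of a rigid piston in an infinite baffle."""
    x = ka * math.sin(math.radians(theta_deg))
    if abs(x) < 1e-9:
        return 0.0
    return 20.0 * math.log10(abs(2.0 * j1(x) / x))

def beamwidth_6db(ka: float) -> float:
    """Total -6 dB beamwidth in degrees (scan for the first -6 dB angle)."""
    for tenth_deg in range(1, 900):
        theta = tenth_deg / 10.0
        if piston_response_db(ka, theta) <= -6.0:
            return 2.0 * theta
    return 180.0  # effectively omnidirectional at this ka

if __name__ == "__main__":
    c = 343.0   # speed of sound, m/s
    a = 0.09    # assumed effective cone radius of an 8" midwoofer, m
    f = 2500.0  # assumed crossover frequency, Hz
    ka = 2.0 * math.pi * f * a / c
    print(f"ka = {ka:.2f}, -6 dB beamwidth ~ {beamwidth_6db(ka):.0f} deg")
```

With these assumed numbers the model gives a beamwidth of around 65°, suggesting a waveguide of roughly that coverage for a smooth directivity hand-off at the crossover.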
 
So why are waveguides not much more common? Is there some drawback I'm missing?

1/ Most of the market is uneducated and buys speakers with non-rational things in mind, such as: reviews from magazines living off ads, looks, novelty, price, marketing, a "wow the friends" factor not based on sound quality, a shop close by, and so on!
2/ A waveguide is more expensive for manufacturers (the waveguide itself, extra filter parts), and it's not certain that marketing communication about it is a deciding factor for the buyer.
3/ A waveguide is a trade-off, and at the end of the day brands don't know the customers' rooms, their tastes, their listening distances, etc.
4/ Most magazines want a flat magnitude response, and/or the readers don't understand how to read the tests: power response, DI, etc.!
5/ Eventually, when everyone makes speakers with waveguides, some will ask why there aren't more speakers without them. :p
 
Recently I've been reading up on the importance of a (somewhat) constant directivity...benefits for the cost of a bit of plastic (or aluminum).

So why are waveguides not much more common? Is there some drawback I'm missing?

One factor has not been discussed: among practicing loudspeaker design engineers, there is a general lack of knowledge and expertise with regard to horns and their proper design to meet the needs of home hi-fi use.

[I also own several books on loudspeaker design, i.e., books professing to teach how to integrate drivers into successful loudspeakers, that are completely silent on the subject of horns--especially the DIY books.]

Why?

It's not that horn-loaded loudspeakers don't exist or haven't existed continuously. They've been around since ~1877...in large quantities.

So why do so few manufacturers produce well designed horn-loaded loudspeakers? (This includes hybrid horn HF/direct radiating woofer designs that many people still call "horns"--which is actually inaccurate.) The answer is at least two-fold:
  1. Almost no current loudspeaker designer knows how to actually design new production horns to get desired coverage and distortion performance:
    • People have apparently gotten so used to undersized direct radiating loudspeakers that completely ignore acoustic directivity control that they are now completely ignorant of how to do horn design or even use modern horn designs to integrate into good loudspeaker designs.
  2. Almost no current loudspeaker designer knows how to use the analysis tools made more recently affordable for analyzing candidate horn geometries:
    • The capability to economically analyze acoustic horn designs via BEM and FEA techniques has been relatively recent--well within the careers of most established loudspeaker designers today, so these more established loudspeaker designers never learned the physics/engineering required to do it.
    • Apparently, some practicing horn designers do not use BEM and FEA tools, but rather heuristic "home grown" horn design tools. They apparently do very well using these alternative methods of analysis.
So there are apparently only a few engineers who actually know how to design horns that perform well. Those who do it as part of their job apparently don't design horns frequently, so their experience must be accumulated over a long period while they mostly perform tasks other than designing horns.

I've had the distinct privilege of knowing, over the past 15+ years, one of those design engineers with the requisite knowledge and experience of commercial horn design for a large loudspeaker manufacturer. He doesn't actually share how he does new horn designs with anyone (not even, apparently, within his own company). However, I've picked up from him some experience in spotting horn designs that perform poorly, and in what design criteria should be expected of successful horn designs.

I get the impression that the companies employing these engineers typically do not value their ability to design good horn profiles as much as they probably should. Horn design technology is nowadays often considered old hat and well within the state of the art, and so is often dismissed as "not a full-time job". This seems to be one reason why there are so few acoustic horn design engineers currently.

In my view, this expertise should be regarded by loudspeaker manufacturing companies as at least as valuable as that of compression driver and dome/cone driver designers. Some of the skills the two groups of engineers use are interchangeable and interdependent.

Chris
 
Almost no current loudspeaker designer knows how to actually design new production horns to get desired coverage and distortion performance

One thing that makes me feel quite ill every time is when developers, be it DIY or commercial, claim their horn loudspeaker plays cleanly at high SPL because of its high sensitivity, intentionally or unintentionally ignorant of the fact that the very mechanisms that enable that high sensitivity also increase (nonlinear) distortion.
 
...intentionally or unintentionally ignorant of the fact that the very mechanisms that enable that high sensitivity increase (nonlinear) distortion.
Could you clarify the exact types of nonlinear distortion that you are referring to: harmonic, modulation, (thermal) compression, etc.?

Are you saying that increasing sensitivity (I assume you're talking about voltage sensitivity and not efficiency) lowers the on-axis SPL threshold for the onset of audible nonlinear distortion of a certain type?

Chris
 