30th November 2005, 13:56
RotHorseKid

Originally Posted by till
Only the traffic that goes through apache is counted.
That's what I thought.

Originally Posted by Tenaka
emulate_httpd_log on
Been there, done that.

Could I just let Squid write its httpd-emulated logs to the respective web_log files for the sites?
If that is feasible, how would I configure Squid to write separate logs per IP/domain? I could not find any indication that this is possible.
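For what it's worth, a newer Squid (2.6 and later, as far as I know, ACLs can be attached to the access_log directive there) could split logs per domain roughly like this. The domains and paths below are just examples, check your version's documentation before relying on it:

```
# Hypothetical squid.conf fragment (Squid 2.6+ syntax, NOT 2.5):
# one Apache-style log file per site, selected by destination domain.
logformat combined_like %>a %ui %un [%tl] "%rm %ru HTTP/%rv" %>Hs %<st

acl site_one dstdomain www.example-one.com
acl site_two dstdomain www.example-two.com

access_log /var/log/squid/example-one_access.log combined_like site_one
access_log /var/log/squid/example-two_access.log combined_like site_two
```

With Squid 2.5 I don't see a way to do this natively; splitting a single emulated log afterwards with a script seems to be the usual workaround.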

Originally Posted by till
Why do you want to use Squid, so much traffic that Apache can't handle it?
In tests I measured lower latency with Squid, especially for pages with lots of graphics (as Tenaka already found). The reason is simple: with 50+ different sites on one server, Squid does a better job of serving pages and images from memory than Linux and Apache alone, so requests that never reach Apache avoid the latency introduced by disk I/O.
This matters for commercial sites, where the user's impression of "speed" or "responsiveness" largely comes down to the number of milliseconds between entering a URL or clicking a bookmark/link and the first content being rendered in the browser.
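For context, the accelerator setup I'm describing looks roughly like this in a Squid 2.5-era squid.conf, with Squid listening on port 80 and Apache moved behind it. The port and host values are examples, not my actual config:

```
# Hypothetical Squid 2.5 accelerator sketch:
# Squid answers on port 80 and forwards cache misses to Apache on 8080.
http_port 80
httpd_accel_host 127.0.0.1
httpd_accel_port 8080
httpd_accel_with_proxy off
httpd_accel_uses_host_header on   # needed for name-based virtual hosts
emulate_httpd_log on              # Apache-style access.log format
```

The `httpd_accel_uses_host_header on` line is what makes this work for many name-based virtual hosts at once.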

Secondly, some of my sites use high-latency DB connections, and much of the data coming from these DBs is nearly static. By tuning the web applications to send the correct Cache-Control headers for these pages, I can minimize the number of actual DB queries.
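To illustrate what I mean by "correct Cache-Control headers", here is a minimal sketch (in Python, just for illustration; the function name and the 5-minute lifetime are my own choices, not anything from my actual applications) of the headers a page would emit so that Squid may cache it:

```python
import time
from email.utils import formatdate

def cache_headers(max_age=300):
    """Build HTTP headers that allow a shared cache like Squid to
    store a nearly-static, DB-backed page for max_age seconds."""
    now = time.time()
    return {
        # "public" permits shared caches; max-age bounds the freshness.
        "Cache-Control": "public, max-age=%d" % max_age,
        # Expires/Last-Modified in HTTP-date format for older caches.
        "Expires": formatdate(now + max_age, usegmt=True),
        "Last-Modified": formatdate(now, usegmt=True),
    }
```

While the page is fresh, Squid answers such requests from its cache and the slow DB connection is never touched.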

At least these were my findings; I am open to discussion here.