HowtoForge Forums | HowtoForge - Linux Howtos and Tutorials > ISPConfig 2 > Installation/Configuration

  #1  
Old 28th November 2005, 16:14
RotHorseKid RotHorseKid is offline
Junior Member
 
Join Date: Nov 2005
Location: Germany
Posts: 8
Thanks: 0
Thanked 2 Times in 2 Posts
Squid as a Reverse Proxy for ISPconfig on the same machine

Hello All.

I would like to set up Squid as a reverse proxy for the sites I administer with ISPconfig.
I am using SuSE 9.3 with the perfect setup.

I think by browsing this forum I already found out what to do, I just want to check back to make sure my idea is not completely braindead:

1. Change make_vhost() in /root/ispconfig/scripts/lib/config.lib.php so the sites get created on a different port (say, 8080)
2. Change /etc/apache2/listen.conf so the webserver listens on port 8080 for all my IPs
3. Change /etc/apache2/vhosts/Vhosts_ispconfig.conf so that the existing vhosts listen on port 8080
4. Install and configure Squid with httpd_accel_port set to 8080 and listening on all my IPs
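For reference, steps 2-4 might look roughly like this. This is only a sketch, not tested; the IP address is a placeholder and the Squid 2.5-style accelerator directives are my assumption for the Squid version shipped with SuSE 9.3:

```
# /etc/apache2/listen.conf -- Apache now listens on 8080
Listen 8080

# /etc/apache2/vhosts/Vhosts_ispconfig.conf -- existing vhosts moved to 8080,
# e.g. <VirtualHost 192.0.2.1:8080> instead of <VirtualHost 192.0.2.1:80>

# /etc/squid/squid.conf -- Squid 2.5-style accelerator listening on port 80
http_port 80
httpd_accel_host virtual
httpd_accel_port 8080
httpd_accel_single_host off
httpd_accel_uses_host_header on
```

httpd_accel_uses_host_header is needed here because name-based vhosts require the Host header to be passed through to Apache.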

Please let me know if this approach makes sense. Did I miss anything?
I would also appreciate any insights on whether and what problems this creates for updates.

Regards,
RHK
  #2  
Old 28th November 2005, 17:56
falko falko is offline
Super Moderator
 
Join Date: Apr 2005
Location: Lüneburg, Germany
Posts: 41,701
Thanks: 1,900
Thanked 2,721 Times in 2,562 Posts

Looks good!

Regarding ISPConfig updates, whenever /root/ispconfig/scripts/lib/config.lib.php is changed, you have to edit it again and change the ports to 8080...
__________________
Falko
--
Download the ISPConfig 3 Manual! | Check out the ISPConfig 3 Billing Module!

FB: http://www.facebook.com/howtoforge

nginx-Webhosting: Timme Hosting | Follow me on:
  #3  
Old 28th November 2005, 19:19
Ovidiu Ovidiu is offline
Senior Member
 
Join Date: Sep 2005
Posts: 1,257
Thanks: 75
Thanked 22 Times in 18 Posts

Have you already done a case study? Do you know for which kind of traffic Squid makes sense as an accelerator? I was interested too, because I am hosting a site with a huge image gallery, but I have not found an answer as to when using Squid as an accelerator is indicated...
  #4  
Old 29th November 2005, 16:25
RotHorseKid RotHorseKid is offline
Junior Member
 
Join Date: Nov 2005
Location: Germany
Posts: 8
Thanks: 0
Thanked 2 Times in 2 Posts
Tested it, seems ok

I just tested the setup with one of my domains.

It seems to work as expected.

But - I am not completely sure if the traffic restrictions for sites and clients still work, as much of the traffic is now handled by Squid. The same is true for Webalizer Stats.

Any ideas?
  #5  
Old 29th November 2005, 16:28
till till is online now
Super Moderator
 
Join Date: Apr 2005
Location: Lüneburg, Germany
Posts: 35,433
Thanks: 813
Thanked 5,209 Times in 4,085 Posts

Only the traffic that goes through Apache is counted. Why do you want to use Squid? Do you have so much traffic that Apache can't handle it?

Last edited by till; 29th November 2005 at 21:22.
  #6  
Old 29th November 2005, 18:39
falko falko is offline
Super Moderator
 
Join Date: Apr 2005
Location: Lüneburg, Germany
Posts: 41,701
Thanks: 1,900
Thanked 2,721 Times in 2,562 Posts

Quote:
Originally Posted by RotHorseKid
But - I am not completely sure if the traffic restrictions for sites and clients still work, as much of the traffic is now handled by Squid. The same is true for Webalizer Stats.
As long as Apache is logging to the same log file as before everything should be fine.
  #7  
Old 29th November 2005, 21:27
Ovidiu Ovidiu is offline
Senior Member
 
Join Date: Sep 2005
Posts: 1,257
Thanks: 75
Thanked 22 Times in 18 Posts

Maybe this helps:

Quote:
emulate_httpd_log on

The option emulate_httpd_log, if set to ON, specifies that Squid should emulate the log file format of the Apache web server. This is very useful if you want to use a third party program like Webalizer to analyze the Web Server httpd log file.
Taken from here. See here and here for more info.



I'll try it too if you succeed and you think it's worth the hassle, AND if I manage to get more free time, so maybe next year

###edit###

Just answered one of my own questions:

Quote:
The cache serves references to cachable objects, such as HTML pages and GIFs, and the true httpd (on port 81) serves references to non-cachable objects, such as queries and cgi-bin programs. If a site's usage characteristics tend toward cachable objects, this configuration can dramatically reduce the site's web workload.
So yes, Squid will help my site's performance, because I am hosting a site which gets around 50-100 GB of traffic per month and mainly consists of a picture gallery...

Last edited by Ovidiu; 29th November 2005 at 21:31.
  #8  
Old 30th November 2005, 12:56
RotHorseKid RotHorseKid is offline
Junior Member
 
Join Date: Nov 2005
Location: Germany
Posts: 8
Thanks: 0
Thanked 2 Times in 2 Posts

Quote:
Originally Posted by till
Only the traffic that goes through apache is counted.
That's what I thought.

Quote:
Originally Posted by Tenaka
emulate_httpd_log on
Been there, done that.

Could I just let Squid write its httpd-emulated logs to the respective web_log files for the sites?
If that's feasible, how would I get Squid to write different logs based on IP/domain? I did not find any indication that this is possible.
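One hedged idea (a sketch I have not run in production): since Squid in accelerator mode logs the full request URL, a small script could split Squid's access.log into per-site files after the fact. The log format, field position, and sample line below are my assumptions, not something from ISPconfig:

```python
import re
from collections import defaultdict
from urllib.parse import urlparse

def split_by_host(lines):
    """Group common-log-format lines by the host in the request URL."""
    per_host = defaultdict(list)
    # request field looks like: "GET http://host/path HTTP/1.0"
    pat = re.compile(r'"[A-Z]+ (\S+) HTTP/[\d.]+"')
    for line in lines:
        m = pat.search(line)
        if not m:
            continue
        host = urlparse(m.group(1)).netloc or "unknown"
        per_host[host].append(line)
    return per_host

# hypothetical sample line in Squid's emulate_httpd_log format
sample = [
    '1.2.3.4 - - [30/Nov/2005:12:00:00 +0200] '
    '"GET http://www.example.com/pic.gif HTTP/1.0" 200 1234',
]
for host, entries in split_by_host(sample).items():
    print(host, len(entries))
```

Each per-host bucket could then be appended to the matching site's web_log file, e.g. from a nightly cron job.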

Quote:
Originally Posted by till
Why do you want to use squid, so much traffic that apache cant handle it?
In tests, I found decreases in latency by using Squid, especially for pages with lots of graphics (as Tenaka already found). This is simply because, with 50+ different sites on one server, Squid does a better job of serving pages/graphics from memory than Linux and Apache alone. So by not going through Apache, I reduce the latency introduced by disk I/O.
This is good for commercial pages, where the user's impression of "fastness" or "responsiveness" of a web site can mainly be expressed as the number of milliseconds from the user entering a URL or clicking a bookmark/link until something starts to get rendered in the browser.

Secondly, I have some sites with high-latency DB connections. Much near-static data comes from these DBs. By tuning the web applications to send the correct Cache-Control headers for these pages, I can minimize the number of actual DB queries made.

At least these were my findings, I am open to discussion here.
  #9  
Old 30th November 2005, 13:45
Ovidiu Ovidiu is offline
Senior Member
 
Join Date: Sep 2005
Posts: 1,257
Thanks: 75
Thanked 22 Times in 18 Posts

Quote:
Originally Posted by RotHorseKid
That's what I thought.
a) Could I just let Squid write its httpd-emulated logs to the respective web_log files for the sites?
b) In tests, I found decreases in latency by using Squid, especially for pages with lots of graphics (as Tenaka already found). This is simply for the fact that, for 50+ different sites on one server, Squid does a better job serving pages/graphics from memory than Linux and Apache alone. So by not going through apache, I decrease the latency introduced by disk IO.
c) Secondly, I have some sites with high-latency DB connections. Much near-to-static data comes from these DBs. By tuning the web applications to write the correct Cache-Control headers for these pages, I can minimize the amount of actual DB queries made.
d) At least these were my findings, I am open to discussion here.
Quote:
Originally Posted by till
Only the traffic that goes through apache is counted.
I do not fully understand this. Isn't traffic still going through Apache? The only difference I see is that Apache isn't delivering to the client but to Squid. And Squid, I suppose, is requesting from Apache the same way a client would, so the logfiles should still be OK as usual? Or is there a major point I am missing here?

a) see my above lines
b) great ;-)
c) Can you explain in a little more detail here? Are you talking about optimizing MySQL settings?
d) Please keep in mind that I am talking about theory, so be patient; I have not yet tested this, just been reading about it, and am considering implementing it right now.
  #10  
Old 30th November 2005, 14:10
RotHorseKid RotHorseKid is offline
Junior Member
 
Join Date: Nov 2005
Location: Germany
Posts: 8
Thanks: 0
Thanked 2 Times in 2 Posts
 

Quote:
Originally Posted by Tenaka
I do not fully understand this. Isn't traffic still going through Apache? The only difference I see is that Apache isn't delivering to the client but to Squid. And Squid, I suppose, is requesting from Apache the same way a client would, so the logfiles should still be OK as usual? Or is there a major point I am missing here?
The problem is, AFAIK ISPconfig uses the Apache logs to find out how much traffic each site generates.
With Squid in front of Apache, Squid serves the content it has cached directly; Apache does not know about that, and therefore it will not show up in the logs (the log/web_log files found beneath the web directories).
BUT the traffic is generated anyway (at the external interface of my server, where my ISP measures the traffic; they don't care whether it was cached or Apache served it), and I (or rather my clients) still have to pay for it.
At least I believe that is how it works. Tell me if I am wrong.

Quote:
Originally Posted by Tenaka
c) can you explain in a little bit more detail here? are you talking about optimizing mysql settings?
Not exactly.
I serve a web page that has some content coming from a DB. Connecting to this DB is VEEERY expensive latency-wise (it's Oracle, perhaps you know what I mean; in my case there are round-trips of 1000-1500 ms).
What I know is that some of these pages are static over a long period of time (like the items in a webshop, for example). So I go ahead and serve these pages with a Cache-Control: public HTTP header (think header() in PHP, for example).
Squid then caches these pages, and no DB connections are made at all.
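The same idea as a short sketch, in Python rather than the PHP header() call mentioned above; the function name and the max-age value are arbitrary illustrations, not anything from ISPconfig:

```python
import time
from email.utils import formatdate

def cache_headers(max_age=3600):
    """Response headers that let a shared cache (e.g. Squid) store the page."""
    return {
        # "public" permits shared caches; max-age bounds the freshness lifetime
        "Cache-Control": f"public, max-age={max_age}",
        # Expires as a fallback for HTTP/1.0 caches such as Squid 2.x
        "Expires": formatdate(time.time() + max_age, usegmt=True),
    }

print(cache_headers(600)["Cache-Control"])
```

As long as the page carries no Set-Cookie or authentication headers, Squid can then answer repeat requests from its cache without ever touching the DB.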
All times are GMT +2.