Old 30th July 2007, 06:36
bpmee
Found a solution - but not sure if it is viable

Hi Falko, others,

I found a solution to this problem, but I am not sure if it is sustainable or healthy for the server.

It is here: http://nixforums.org/about55655.html

Basically, HTTPD is telling me that it has hit the per-process limit on open file descriptors (ulimit -n controls how many files a process may have open at once, not a general processing limit).

When I check the current open-file limit by doing:
Code:
ulimit -n
1024
I get 1024.

According to this post, if I increase the open-file limit, for example doubling it to 2048:
Code:
ulimit -n 2048
the problem should be fixed.
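One thing worth spelling out (my understanding, not stated in the linked post): a ulimit set in an interactive shell applies only to that shell and the processes it starts, so running the command by hand works for that session but does not persist across reboots. A quick sketch of the inheritance:

```shell
# A limit set in a shell is inherited by its child processes.
# -S sets the soft limit, which a non-root user may adjust up to the hard limit.
ulimit -Sn 1024        # set this shell's soft open-file limit
sh -c 'ulimit -Sn'     # a child shell reports the inherited value: 1024
```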

I tested this solution and it appeared to work. HTTPD restarted normally and websites were functional.

To make sure HTTPD picks up the higher limit when it restarts in the future, I added:
Code:
ulimit -n 2048
echo Ulimit set to 2048
to my HTTPD init script, /etc/rc.d/init.d/httpd (before the line that starts the daemon, so the limit is inherited by the httpd processes).
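To verify the change actually reached the running daemon (a sketch assuming Linux's /proc filesystem; this step is not from the thread), you can read a process's effective limits directly:

```shell
# /proc/<pid>/limits lists the effective limits of a running process.
# $$ is this shell's own PID; substitute httpd's PID (e.g. from pidof httpd)
# to confirm the daemon really got the 2048 limit after a restart.
grep "Max open files" /proc/$$/limits
```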

My new question: Is this healthy for the server? Will I be putting excessive load on my box? Could this start a runaway server?

Thanks!