Hi Falko, others,
I found a solution to this problem, but I am not sure if it is sustainable or healthy for the server.
It is here: http://nixforums.org/about55655.html
Basically, HTTPD is telling me that the server has hit its limit on open file descriptors.
When I check the current limit by doing:
ulimit -n
I get 1024.
According to this post, if I raise the limit, for example doubling it to 2048:
ulimit -n 2048
the problem should be fixed.
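For reference, here is a minimal shell session showing the check and the bump (assuming a bash shell; I use -S to raise only the soft limit, which does not need root as long as 2048 is at or below the hard limit):

```shell
# Show the current soft limit on open file descriptors
ulimit -n

# Raise only the soft limit for this shell and its children;
# -S leaves the hard limit untouched, so root is not required
# provided 2048 does not exceed the hard limit
ulimit -S -n 2048

# Confirm the new value
ulimit -n
```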
I tested this solution and it appeared to work: HTTPD restarted normally and the websites were functional.
To make sure HTTPD restarts correctly in the future, I added:
ulimit -n 2048
echo Ulimit set to 2048
to my HTTPD init script, /etc/rc.d/init.d/httpd.
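To show roughly where those lines sit, here is an illustrative excerpt of a RHEL-style init script (the start() function, $httpd, and the daemon helper are from the stock script and vary by distribution; only the two ulimit/echo lines are mine):

```shell
# /etc/rc.d/init.d/httpd (excerpt, illustrative -- not the full script)
start() {
        # Raise the open file descriptor limit before httpd is launched,
        # so the daemon inherits the new value
        ulimit -n 2048
        echo "Ulimit set to 2048"
        echo -n $"Starting $prog: "
        daemon --pidfile=${pidfile} $httpd $OPTIONS
        # ... rest of the stock start() function ...
}
```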
My new question: Is this healthy for the server? Will I be putting excessive load on my box? Could this cause a runaway server?