the symptom

If Tomcat just responds with HTTP 500 and you find something like

org.apache.tomcat.jni.Error: Too many open files
       at org.apache.tomcat.jni.Socket.accept(Native Method)
       at org.apache.tomcat.util.net.AprEndpoint$Acceptor.run(AprEndpoint.java:1110)
       at java.lang.Thread.run(Thread.java:619)
Mar 12, 2009 12:01:34 PM org.apache.tomcat.util.net.AprEndpoint$Acceptor run
SEVERE: Socket accept failed

in catalina.out

then you might have too many open file handles.
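To quickly check whether the error is already in the log, grep for it (the path assumes the default $CATALINA_HOME/logs location, adjust it to your installation):

$ grep -c "Too many open files" $CATALINA_HOME/logs/catalina.out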

To be sure … lsof and ulimit are your friends

Use ulimit -a to see your current configuration:


$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 13664
max locked memory       (kbytes, -l) 32
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 13664
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited

So

open files (-n) 1024

shows that each process may have at most 1024 open file handles.
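On Linux you can also check the limit of the already running Tomcat process directly; <tomcat-pid> is a placeholder for the PID of your Tomcat (Java) process:

$ grep "open files" /proc/<tomcat-pid>/limits

This prints the soft and hard limit the process was actually started with, which is what counts, not the limit of your current shell.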

ulimit -n <value> lets you set the value. You might increase it to e.g. 4096. But be sure there is nothing in your code which consumes too many file handles (e.g. streams or sockets that are never closed). And don’t forget to restart the process after tweaking with ulimit.
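For example, a minimal sequence to raise the limit and restart Tomcat from the same shell (the standard $CATALINA_HOME/bin scripts are just an assumption, use whatever script starts your Tomcat). Note that ulimit -n only affects the current shell and processes started from it:

$ ulimit -n 4096
$ ulimit -n
4096
$ $CATALINA_HOME/bin/shutdown.sh
$ $CATALINA_HOME/bin/startup.sh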

To check the currently open file handles, use lsof.

lsof | grep tomcat shows you all open handles of Tomcat (or use grep java if the process does not match on tomcat).

To get a number to compare with your settings, wc comes in handy :-)

$ lsof |grep tomcat|wc -l
136
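Since grep tomcat also matches lines of other processes that merely contain the word, you can count the handles of one specific process by pointing lsof at its PID (<tomcat-pid> is a placeholder, e.g. found with pgrep -f catalina):

$ lsof -p <tomcat-pid> | wc -l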

