Fixing the ‘Too many open files’ Maven problem. Solutions for TeamCity, Hudson, local and SSH shell environments

By neokrates, written on August 30, 2010



One day some builds in your enterprise start to fail with “Too many open files”. Not all of them, and the failure is hard to reproduce. If left untreated, your build chain will become unstable. That means more lost time and more errors unrelated to code. In the end, you will face the “too many file handles” problem everywhere and nothing will build… A better idea is to understand the problem and fix it as soon as possible.

Works for:

✔ Maven2

✔ any Linux

✔ Reactor

What does the problem look like?

There are many situations where Maven needs extra file handles. The first time it can’t get one, the build fails. One popular case is during the execution of an external program: pipes are needed for stdin, stdout and stderr, and each pipe costs file handles. Like:

[04:28:17]: [ERROR] BUILD ERROR
[04:28:31]: [INFO] Failed to create assembly: Failed to retrieve OS environment variables. Reason: Cannot run program "env": error=24, Too many open files
Just how many open file handles?

Maven2 really holds many open file handles to the dependency jar files. In the following sections I presume that you are running a Maven process and know its process id, $M2_BUILD_PID.

List open file handles

Start your big Maven reactor project and list the file handles it holds open:
ls -l /proc/$M2_BUILD_PID/fd/

Profile open file handles behavior of your project

You can also log the number of open file handles every 5 seconds during the build with:
watch -n 5 "date>>fh.log;ls /proc/$M2_BUILD_PID/fd/|wc -l>>fh.log"

(plain ls is used here instead of ls -l, so that the “total” line does not inflate the count by one)
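If you prefer a standalone sampler over watch, a minimal sketch could look like this. It assumes $M2_BUILD_PID is set to the running Maven process id and that /proc is available (i.e. Linux):

```shell
#!/bin/sh
# Sketch of a file-handle sampler; assumes $M2_BUILD_PID points at the
# running Maven process. Appends "HH:MM:SS <count>" to fh.log every
# 5 seconds until the process exits.
while kill -0 "$M2_BUILD_PID" 2>/dev/null; do
    count=$(ls "/proc/$M2_BUILD_PID/fd" | wc -l)
    printf '%s %s\n' "$(date '+%H:%M:%S')" "$count" >> fh.log
    sleep 5
done
```

The resulting fh.log is easy to plot or eyeball later, which matters for the diagnosis below: a flat curve points to reason one, a climbing one to reason two.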

Two possible reasons, two different solutions

Reason one: Maven2 reactor normally holds many open file handles

Maven2 really holds many file handles. How do you know that this is your case? Profile the build as described above. If the average number of open file handles remains stable, that’s not a leak or a code problem, that’s just a feature of Maven.

❓ How many file handles can Maven hold?
In our really big reactor project it was 300–900 open file handles, with an average of maybe 450. Rarely, the number spiked over 900 handles.

👉 Solution is to increase maximal number of open file handles

Type ulimit -n. By default the maximal number of open file handles is 1024, which is too small for many practical purposes.
The solution is to permanently increase the maximal number of open file handles. This fix works because no process is eating all possible file handles; there is just a Maven build which needs some more.
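It is worth checking both limits before editing anything, because they behave differently: the soft limit is what the build actually hits, and a non-root user can only raise it up to the hard limit.

```shell
# Show the current limits; the soft limit is the one the build hits first
ulimit -Sn    # soft limit (often 1024 by default)
ulimit -Hn    # hard limit (ceiling up to which non-root users may raise the soft limit)

# For a quick one-off test, the soft limit can be raised in the current
# shell only, up to the hard limit -- no relogin needed:
# ulimit -Sn 2048
```

The permanent fix below raises both values system-wide, so that every login shell and build agent gets them automatically.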


Edit file handle limits per user

Add the following lines to /etc/security/limits.conf:

* soft nofile 2048
* hard nofile 4096


Ensure it will be loaded during next login

Add the following line to /etc/pam.d/login:

session required pam_limits.so


Activate new settings

How the new settings are activated depends on the system type and access method:

Remote SSH session
If you have SSH access to the system, log out and log in again for the settings to take effect.

Local host / developer workstation, local build
For the local host, a logout or restart is needed for the settings to take effect.

Experiments with PAM settings (the /etc/pam.d/* files) may allow you to enable the changes even without a re-login, but only in some situations like su or sudo. I couldn’t find a general solution, and maybe there is none.

CI systems
For TeamCity and Hudson, the build agents need to be restarted for the settings to take effect.

ulimit -n should print 2048 (or 4096 in some cases). For CI agents or slaves, which is the case with Hudson and TeamCity, you can use a shell execution build step: as with a local build, just run ulimit -n and check the output.
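As a sanity check, a hypothetical build step on the agent could fail fast when the old default is still active, instead of failing mid-build with error=24 (the name and threshold below are illustrative):

```shell
#!/bin/sh
# Hypothetical CI build step: abort early if the agent still runs with
# the old default open-file limit.
REQUIRED=2048
CURRENT=$(ulimit -n)
if [ "$CURRENT" != "unlimited" ] && [ "$CURRENT" -lt "$REQUIRED" ]; then
    echo "Open file limit is $CURRENT, need at least $REQUIRED." >&2
    echo "Edit limits.conf and restart the build agent." >&2
    exit 1
fi
echo "Open file limit OK: $CURRENT"
```

Running this as the first step of the build turns a flaky, hard-to-reproduce failure into an explicit, immediate one.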

Reason two: memory leak, problem in code, not released file handle

Also here, profile the open file handle behavior first. In this case you will see that the number of open file handles keeps climbing as the reactor builds new modules.

👉 Solution is to find the root cause, like plugin or specific base test

Obviously, if the number of open file handles keeps climbing, there must be code which repeatedly creates handles and fails to release them. And there is no general solution, because there is an unlimited number of possibilities for coding errors.

Usual risk groups:

⭐ Execution of plugins, which occurs for each project in the m2 reactor;

⭐ Unit test base classes. It is a common pattern to write a base class which does the routine work and then extend it with specific business logic tests. The advantage is that you don’t need to write trivial code twice. The drawback is that an error in that base code, such as not closing streams, is multiplied across every test that extends it;

⭐ Static code holding data structures that are accessible from every reactor module.
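To narrow down which resource is leaking, it can help to group the open descriptors by the file they point at. A sketch using /proc (assumes $M2_BUILD_PID is set; a leaked stream shows up as the same jar, log file or socket path repeated many times):

```shell
# Group open file descriptors of the Maven process by target path.
# Each entry under /proc/<pid>/fd is a symlink to the open file;
# paths with a suspiciously high count are leak candidates.
for fd in /proc/$M2_BUILD_PID/fd/*; do
    readlink "$fd"
done | sort | uniq -c | sort -rn | head -20
```

Seeing, say, the same test resource open 200 times immediately points at the code that reads it without closing.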

Ideas on how to detect the problematic code:

1. You can profile the number of open file handles during the build. At the same time, you can log the Maven build output with a logger configuration which prints a timestamp for each line. Maybe use the mvn -X option to get more log output. Having the number of open file handles every 5 seconds, and knowing exactly what was running at that moment, may point you in the right direction.

2. Many Java profiling tools, like YourKit or JProfiler, will show you the number of open file handles and in which part of the code they were created. Just download a trial; it is easy to use and free for some weeks.
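For idea 1, if your logger configuration does not already timestamp each line, the Maven output can be stamped in the shell itself, so it lines up with the fh.log samples (the log file name here is illustrative):

```shell
# Prefix every line of Maven output with a wall-clock timestamp so it can
# be correlated with the file-handle samples in fh.log. Portable but slow
# (one date invocation per line) -- fine for debugging runs.
mvn -X clean install 2>&1 | while IFS= read -r line; do
    printf '%s %s\n' "$(date '+%H:%M:%S')" "$line"
done | tee build-timestamped.log
```

Whenever fh.log shows a jump in the handle count, the matching timestamps in build-timestamped.log tell you which module, plugin or test was running.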

Good luck debugging!
