The Sledgehammer Approach

The maximum amount of memory available to the JVM is specified at startup. On 32-bit systems the limit is around 1.5 GB; 64-bit systems (with a corresponding 64-bit JDK) allow a few gigabytes. The limits are defined in the startup scripts or in the Windows Service configuration.

There are always two values:

  • -Xms (or wrapper.java.initmemory): the initial heap size.

  • -Xmx (or wrapper.java.maxmemory): the maximum heap size.

They would look something like this:

-Xms512M -Xmx1536M

Here, the JVM starts with a 512 MB heap and can grow it to at most 1536 MB (1.5 GB) at runtime. As mentioned, this is the upper limit on 32-bit systems. The settings can be found in the following files:

Linux/Unix (./bin/execute.sh):

The relevant line will start something like this:

JAVA_OPTIONS="-Xmx1536M -Xms512M -server ...


Windows, starting the Integration Server in the console (./bin/hub.bat):

The relevant line will start something like this:

set OPTIONS=-Xms512M -Xmx1024M -server ...


Windows, starting as a service (./etc/wrapper.conf):

Here, the details are:

# Initial Java Heap Size (in MB)
wrapper.java.initmemory=512
# Maximum Java Heap Size (in MB)
wrapper.java.maxmemory=1024
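
To double-check which heap settings a running JVM has actually picked up, you can query the process directly. The following is a minimal sketch using the standard JDK diagnostic tools jps and jinfo; it assumes a full JDK (not just a JRE) is installed on the server and that its bin directory is on the path, which may not be the case for every installation.

# List running JVMs with their process IDs and main classes
jps -l

# Print the effective maximum heap size (in bytes) of the chosen process
jinfo -flag MaxHeapSize <PID>

If the reported value does not match what you configured, the process was probably started with different options than the ones you edited.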


If you are using a 64-bit system, you can certainly set the maximum memory limit to a few gigabytes. Be aware, however, that the same Java program uses around 30 percent more memory on a 64-bit JVM, so take this into account. Current memory usage is written at regular intervals to one of the log files: message.log, located in the ./logs/services directory on the Integration Server. The entries look like this:

12:16:59 NORMAL SYSTEM:SERVICEFACTORY:MEMORYTHREAD Total mem: 518912 KB, free mem: 411826 KB, max. mem: 1555392 KB, 20% used

In this example, around 107,000 KB of the 518,912 KB currently allocated to the JVM is in use, which matches the 20 percent shown. Ideally, you should run a few realistic load tests on the profiles that are candidates for high memory use and then check these log entries; this will give you an idea of what to expect in live operation. Unix/Linux users can, of course, use the wonderful tool 'grep', as sketched below.
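
For example, a simple filter like the following pulls the most recent memory entries out of the log; the MEMORYTHREAD tag and the path are taken from the sample entry above and may differ slightly between versions.

grep MEMORYTHREAD ./logs/services/message.log | tail -n 20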

This can keep memory problems at bay for a while, but the title of this section is not 'the sledgehammer approach' for nothing. All that it achieves is to let 'memory hogs' run rampant for longer. Eventually, you will reach a point where even a few gigabytes are no longer enough for the volume of data. What you should do instead is set up your profiles from the start (or at least convert them now) so that they cannot become problems in the first place. Simply keep reading to find out how.