The Intel® MPI Library provides an experimental feature to enable support for Java MPI applications. Java bindings are available for a subset of MPI-2 routines. For a full list of supported routines, refer to the Developer Reference, section Miscellaneous > Java Bindings for MPI-2 Routines.
Running Java MPI applications
Follow these steps to set up the environment and run your Java MPI application:
- Source mpivars.sh from the Intel® MPI Library package to set up all required environment variables, including LIBRARY_PATH and CLASSPATH.
- Build your Java MPI application as usual.
- Update CLASSPATH with the path to the application jar file, or pass the path explicitly with the -cp option of the java command.
- Run your Java MPI application using the following command:
$ mpirun <options> java <app>
where:
- <options> is a list of mpirun options
- <app> is the main class of your Java application
For example:
$ mpirun -n 8 -ppn 1 -f ./hostfile java mpi.samples.Allreduce
Development Recommendations
You can use the following tips when developing Java* MPI applications:
- To reduce memory footprint, you can use Java direct buffers as buffer parameters of collective operations in addition to using Java arrays. This approach allows you to allocate the memory out of the JVM heap and avoid additional memory copying when passing the pointer to the buffer from JVM to the native layer.
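As a minimal sketch of the direct-buffer approach, the example below allocates a buffer outside the JVM heap with standard java.nio calls. The collective call itself is shown only as a comment, because the exact signature in the Intel Java bindings is not reproduced here; names such as MPI.COMM_WORLD and MPI.SUM in that comment are assumptions.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.DoubleBuffer;

public class DirectBufferExample {
    public static void main(String[] args) {
        // Allocate the buffer outside the JVM heap; the native MPI layer
        // can then access it directly, without an extra copy on the
        // JVM-to-native boundary.
        DoubleBuffer sendBuf = ByteBuffer
                .allocateDirect(4 * Double.BYTES)
                .order(ByteOrder.nativeOrder())
                .asDoubleBuffer();
        for (int i = 0; i < sendBuf.capacity(); i++) {
            sendBuf.put(i, i + 1.0);
        }

        // Hypothetical collective call (signature is an assumption, not the
        // verified Intel MPI Java binding):
        // MPI.COMM_WORLD.allReduce(sendBuf, recvBuf, 4, MPI.DOUBLE, MPI.SUM);

        System.out.println(sendBuf.isDirect());
    }
}
```

A heap-allocated double[] would also be accepted as a buffer parameter, but a direct buffer avoids the copy step described above.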
- When you create Java MPI entities such as Group, Comm, Datatype, and similar, memory is allocated on the native layer and is not tracked by the garbage collector. Therefore, this memory must be released explicitly. Pointers to the allocated memory are stored in a special pool and can be deallocated using one of the following methods:
- entity.free(): frees the memory backing the entity Java object, which can be an instance of Comm, Group, etc.
- AllocablePool.remove(entity): frees the memory backing the entity Java object, which can be an instance of Comm, Group, etc.
- AllocablePool.cleanUp(): explicitly deallocates the memory backing all Java MPI objects created by that moment.
- MPI.Finalize(): implicitly deallocates the memory backing all Java MPI objects that have not been explicitly deallocated by that moment.
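The lifecycle above follows a common native-resource pattern: entities register in a pool at creation, can be freed individually, and any leftovers are swept at finalization. The plain-Java analogue below illustrates that behavior; all class and method names here are illustrative stand-ins, not the Intel MPI Library implementation.

```java
import java.util.Collections;
import java.util.IdentityHashMap;
import java.util.Set;

// Illustrative stand-in for the pool of native allocations described above.
final class NativeEntityPool {
    private static final Set<NativeEntity> POOL =
            Collections.newSetFromMap(new IdentityHashMap<>());

    static void register(NativeEntity e) { POOL.add(e); }

    // Analogue of AllocablePool.remove(entity): release one tracked entity.
    static void remove(NativeEntity e) {
        e.releaseNative();
        POOL.remove(e);
    }

    // Analogue of AllocablePool.cleanUp(), and of the implicit sweep that
    // MPI.Finalize() performs for anything not freed explicitly.
    static void cleanUp() {
        POOL.forEach(NativeEntity::releaseNative);
        POOL.clear();
    }

    static int trackedCount() { return POOL.size(); }
}

// Illustrative stand-in for an entity such as Comm or Group.
class NativeEntity {
    private boolean freed;

    NativeEntity() { NativeEntityPool.register(this); }

    // Analogue of entity.free().
    void free() { NativeEntityPool.remove(this); }

    void releaseNative() { freed = true; /* native free would happen here */ }

    boolean isFreed() { return freed; }
}
```

Creating two entities and calling free() on one leaves a single tracked entry; cleanUp() then releases the remainder, mirroring how the finalize-time sweep catches whatever was not deallocated explicitly.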