Android support has been added for libidx, libeblearn and libeblearntools.
See eblearn/tools/mobile/android/eblearn/README.txt for a face detection demo.
In the Android demo, all assets included in the .apk are given a .mp3 extension so that the packaging step does not compress them. However, as you can see in eblearn.java, the assets are transferred to /sdcard/eblearn with the .mp3 extension stripped.
This is much simpler than passing file descriptors and offsets to the C++ library: files can be opened directly by path, and conf files trained on Linux/Mac/Windows work without any change to file paths, as long as those paths are relative to the assets root directory.
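The extension-stripping step described above can be sketched as plain Java, without the Android-specific AssetManager plumbing. All names here (stripMp3, copyAsset) are hypothetical illustrations, not the actual helpers in eblearn.java:

```java
import java.io.*;

public class AssetCopy {
    // Remove the ".mp3" suffix that was added only to prevent APK compression.
    // (Hypothetical helper; the real logic lives in eblearn.java.)
    static String stripMp3(String name) {
        return name.endsWith(".mp3")
            ? name.substring(0, name.length() - 4)
            : name;
    }

    // Copy one asset stream to destDir, restoring its original file name,
    // so the C++ library can later open it by plain path.
    static void copyAsset(InputStream in, File destDir, String assetName)
            throws IOException {
        File out = new File(destDir, stripMp3(assetName));
        try (OutputStream os = new FileOutputStream(out)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) > 0) {
                os.write(buf, 0, n);
            }
        }
    }

    public static void main(String[] args) {
        // "face.conf.mp3" inside the .apk becomes "face.conf" on the sdcard.
        System.out.println(stripMp3("face.conf.mp3"));
    }
}
```

In the real demo the input stream would come from the activity's asset manager and destDir would be /sdcard/eblearn; only the renaming logic is shown here.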
The Android demo is generic and not limited to faces: its behavior depends entirely on the conf file you supply, so it can be used as-is to detect other object classes.