-
Enhancement
-
Resolution: Fixed
-
P4
-
7u6
-
None
-
generic
-
generic
When building from the JDK repository, the architecture of the build machine determines the architecture of the JDK being built, at least on Linux: if you build on a 64-bit machine, the build tries to produce a 64-bit JDK.
As powerful 64-bit build servers become available, we want to be able to do a 32-bit build on such systems.
Hotspot already supports this: if you set ARCH_DATA_MODEL=32, it sets all the necessary build and compiler flags for a 32-bit build, even on a 64-bit system. (Of course, your 64-bit system must have all the necessary 32-bit development libraries installed.)
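A minimal sketch of kicking off such a hotspot build on a 64-bit Linux host. ARCH_DATA_MODEL=32 is the variable named above; the build directory and make target in the comment are placeholders, not part of the report:

```shell
# Force a 32-bit hotspot build on a 64-bit host by exporting
# ARCH_DATA_MODEL before invoking make.
export ARCH_DATA_MODEL=32
echo "building with ARCH_DATA_MODEL=${ARCH_DATA_MODEL}"
# cd hotspot/make && make product    # placeholder build step, not run here
```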
For a JDK build you can force 32-bit by setting mach=i586 on the make invocation. This replaces the value read via "uname -m" and then sets ARCH/SRCARCH/LIBARCH etc. correctly. (You can't set ARCH directly, as it would also be passed to hotspot during a full control build, and hotspot wants ARCH=i686 where the JDK wants ARCH=i586 - so we change the mach setting instead.)
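As a sketch, the override as it would appear on the command line; "uname -m" is the value the build normally reads, and the make target in the comment is a placeholder:

```shell
# The build normally derives the architecture from "uname -m",
# which reports the real hardware (e.g. x86_64 on a 64-bit host).
uname -m
# Passing mach=i586 on the make line replaces that detected value:
echo "make mach=i586"
# make mach=i586 all    # placeholder target, not run here
```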
In addition, the JDK build lets the compiler (gcc) target its default platform - so on a 64-bit system gcc would output 64-bit binaries. To deal with this we need to pass -m32 as a gcc/ld flag. This can be done by setting the environment variables OTHER_CFLAGS, OTHER_CPPFLAGS, OTHER_CXXFLAGS and OTHER_LDFLAGS to contain "-m32".
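Concretely, the environment setup might look like the following; the variable names and the "-m32" value are the ones given above, while the build invocation itself is omitted:

```shell
# Export the OTHER_* flag variables so that -m32 reaches gcc and ld
# during the JDK build.
export OTHER_CFLAGS="-m32"
export OTHER_CPPFLAGS="-m32"
export OTHER_CXXFLAGS="-m32"
export OTHER_LDFLAGS="-m32"
echo "${OTHER_CFLAGS} ${OTHER_CXXFLAGS} ${OTHER_LDFLAGS}"
```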
Even then there are places in the build where this does not work (see suggested fix for details):
- the OTHER_XXXFLAGS are not used on the compile command line
- the OTHER_XXXFLAGS are set directly in the Makefile (= instead of +=), replacing the value from the environment
- recursive makes interact badly with conditional variables that are also set in the environment
These issues can all be worked around as per the suggested fix.
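The second item above (a plain "=" assignment clobbering a value exported from the environment) can be sketched as follows; the flag value shown is illustrative, not taken from the actual Makefiles:

```make
# Before: "=" discards any OTHER_CFLAGS exported from the environment,
# so an -m32 passed in from the shell is silently dropped:
#   OTHER_CFLAGS = -fno-strict-aliasing
# Workaround: append instead, so the environment value (and its -m32)
# survives:
OTHER_CFLAGS += -fno-strict-aliasing
```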
However, a better fix would be for the JDK build to use ARCH_DATA_MODEL to select a 32-bit or 64-bit ARCH (etc.) as requested and pass the right flags to the compiler directly.
There may also be other issues lurking if any lower-level makefiles specifically check the build architecture.
- relates to
-
JDK-7171653 32-bit cross-compile on 64-bit build host generates 64-bit data for awt/X11 leading to crash
- Resolved