JDK / JDK-8205687

TimeoutHandler generates huge core files


    • Type: Bug
    • Resolution: Fixed
    • Priority: P4
    • Fix Version/s: 11
    • Affects Version/s: None
    • Component: hotspot
    • Resolved In Build: b27

        In test/failure_handler/src/share/conf/linux.properties

        We use gcore to generate core files:
        native.core.app=gcore
        native.core.args=-o ./core.%p %p
        native.core.params.timeout=3600000

        This is problematic because gcore seems to dump all reserved memory to disk, not just the committed or paged-in memory.

        For example, if you run the following program:
        --- mem.c ---
        #include <sys/mman.h>
        #include <stdio.h>

        int main() {

          // mmap 2GB of reserved memory
          // Note: with MAP_ANON the fd argument should be -1 for portability
          void* mem = mmap(0, 2ULL * 1024 * 1024 * 1024, PROT_NONE, MAP_PRIVATE | MAP_ANON, -1, 0);
          if (mem == MAP_FAILED) {
            perror("mmap failed");
            return -1;
          }

          for (;;) {
            // Spin forever
          }

          return 0;
        }
        ---
        $ gcc -Wall mem.c
        $ ./a.out

        and in another terminal run:
        $ gcore <pid of a.out>

        This generates a 2.1G core file.

        If you then run with this instead:
        $ kill -SIGABRT <pid of a.out>

        The program crashes (as expected) and creates a much smaller 240K core file.

        This is indicative of the overhead of using gcore. When running a small Java program with G1 and default flags, the numbers are:
        gcore: 5.3GB
        kill -SIGABRT: 648M

        This usage of gcore is problematic in our testing farm, as it eats up the disk space on our testing machines.

        This is even more problematic with ZGC, which always reserves (but does not commit) huge memory areas (17 TB).

              Assignee: Igor Ignatyev (iignatyev) (Inactive)
              Reporter: Stefan Karlsson (stefank)