JDK-6407840

Perm gen keeps growing, class loaders not released?


    • Type: Bug
    • Resolution: Cannot Reproduce
    • Priority: P2
    • Fix Version/s: None
    • Affects Version/s: 5.0u6
    • Component: hotspot
    • Subcomponent: gc
    • CPU: sparc
    • OS: solaris_8

      This is from the customer's escalation #1-15690147, case #64944067.

      Our customer has a scenario involving multiple re-deployments to
      a WebLogic 9.1 cluster in which WebLogic instances hang and sometimes
      run out of perm gen memory. The problem occurs for them after
      4 to 5 re-deployments.
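
      As background, here is a minimal, hypothetical sketch of this
      failure mode (not the customer's code; the registry, path, and
      class names are invented): each re-deployment creates a fresh
      class loader, and a single surviving reference to anything it
      loaded pins the loader (and every class it defined) in perm gen:

        import java.net.URL;
        import java.net.URLClassLoader;
        import java.util.ArrayList;
        import java.util.List;

        public class ClassLoaderLeakSketch {
            // Stands in for any long-lived registry (a static cache,
            // JDBC DriverManager, a ThreadLocal, ...) that outlives
            // each deployment.
            static final List<Object> registry = new ArrayList<Object>();

            public static void main(String[] args) throws Exception {
                URL[] appClasspath = { new URL("file:/tmp/app/") };
                for (int deploy = 0; deploy < 5; deploy++) {
                    ClassLoader loader = new URLClassLoader(appClasspath);
                    Class<?> c = loader.loadClass("com.example.AppClass");
                    // Leak: instance -> class -> loader. The loader and
                    // every class it defined can never be collected, so
                    // perm gen usage grows with each "re-deployment".
                    registry.add(c.newInstance());
                }
            }
        }

      If nothing holds such a reference, the loaders should become
      collectable; the SA data below suggests they are recognized as
      dead but not unloaded.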

      Data files are located in
      /net/cores.central/cores/dir18/64944067/Mar28

      Perm gen usage shows a steadily increasing trend after each
      re-deployment (see jmap-heap.out):

        85.78%, 91.38%, 96.99%, 98.05%, 99.99%
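
      (The jmap-heap.out figures were presumably collected against the
      live process with JDK 5's jmap, which prints perm gen usage; the
      pid is hypothetical:)

      % jmap -heap <pid> > jmap-heap.out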

      There is a corresponding increase in the number of
      weblogic.utils.classloaders.GenericClassLoader instances
      in the heap (see jmap-histo.out):

      10152 141 weblogic.utils.classloaders.GenericClassLoader
      12672 176 weblogic.utils.classloaders.GenericClassLoader
      15192 211 weblogic.utils.classloaders.GenericClassLoader
      15192 211 weblogic.utils.classloaders.GenericClassLoader
      17064 237 weblogic.utils.classloaders.GenericClassLoader
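
      (The columns are total bytes and instance count. The histograms
      were presumably produced with something like the following; the
      pid is hypothetical:)

      % jmap -histo <pid> > jmap-histo.out
      % grep GenericClassLoader jmap-histo.out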

      BEA verified that WebLogic does not hold on to the
      weblogic.utils.classloaders.GenericClassLoader objects in any way
      that would prevent them from being unloaded.


      Out of the 5 re-deployments, CU collected 3 core files:
      one at the beginning - core.032806_093309 (core1)
      one in between - core.032806_093727 (core2)
      one at the end - core.032806_094645 (core3)
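
      (On Solaris such cores can be captured from the live process with
      gcore, which writes core.<pid> by default; the pid is hypothetical
      and the timestamped names suggest the files were renamed
      afterwards:)

      % gcore <pid>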

      Perm gen stats were collected by running SA on the core files:

      % grep GenericClassLoader sa.permstat.core1 | wc
           141 846 12775
      % grep GenericClassLoader sa.permstat.core1 | grep live | wc
           121 726 10909
      % grep GenericClassLoader sa.permstat.core1 | grep dead | wc
            20 120 1866

      % grep GenericClassLoader sa.permstat.core3 | wc
           237 1422 21294
      % grep GenericClassLoader sa.permstat.core3 | grep live | wc
            73 438 6597
      % grep GenericClassLoader sa.permstat.core3 | grep dead | wc
           164 984 14697
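
      (The sa.permstat.* files were presumably generated with SA's
      permstat facility, e.g. via jmap run against the java binary and
      each core; paths here are illustrative:)

      % jmap -permstat $JAVA_HOME/bin/java core.032806_093309 > sa.permstat.core1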


      There were 20 dead GenericClassLoaders at the beginning but 164
      at the end (while the live count fell from 121 to 73). The dead
      loaders appear to accumulate over time instead of being unloaded.

      The GenericClassLoader growth is just one thing that stands out
      from an initial analysis of the core files; there might be others.
      Please evaluate.

      Note: I failed to run hat on the heap dump generated from
            the core file using SA:

      % java -jar hat.jar -stack false heap.bin.core1
      Started HTTP server on port 7000
      Reading from heap.bin.core1...
      Dump file created Sun Apr 02 22:48:19 PDT 2006
      java.io.IOException: Name not found for id 0xd4b3f008
              at hat.parser.HprofReader.getNameFromID(HprofReader.java:598)
              at hat.parser.HprofReader.getNameFromID(HprofReader.java:584)
              at hat.parser.HprofReader.read(HprofReader.java:212)
              at hat.parser.Reader.readFile(Reader.java:90)
              at hat.Main.main(Main.java:149)
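
      (The heap dump itself was presumably produced from the core with
      jmap's binary heap-dump option; the exact invocation is an
      assumption:)

      % jmap -heap:format=b $JAVA_HOME/bin/java core.032806_093309
      % mv heap.bin heap.bin.core1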


      Sunsolve shows the following related bugs. As the problem
      was reproduced on JDK 1.5.0_06 with the -client option,
      these bugs should not be the cause:

      * 5033614 ClassLoaders do not get released by GC, causing OutOfMemory in Perm Space
        - fixed in JDK 5.0u1
      * 4896986 Soft Refs not cleared before throwing OOM on failed perm allocation
        - fixed in JDK 1.5
      * 4957990 PermHeap bloat in and only in server VM
        - CU is running with -client

            Assignee: Poonam Bajaj Parhar
            Reporter: Lawrence Chow
