JDK-8238858

G1 Mixed gc young gen sizing might cause the first mixed gc to immediately follow the prepare mixed gc



    • Type: Enhancement
    • Resolution: Fixed
    • Priority: P4
    • Fix Version: 16
    • Affects Version: 15
    • Component: hotspot
    • Subcomponent: gc
    • Resolved In Build: b04


      G1 sizes the young gen to the minimum value allowed (G1NewSizePercent) during mixed gc by default, because the default MaxGCPauseIntervalMillis is MaxGCPauseMillis+1.
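The effect of that default can be sketched with a little arithmetic (a minimal illustration using the documented default of MaxGCPauseMillis=200; variable names are mine, this is not HotSpot source):

```java
// Sketch: the MMU goal implied by the default flag values.
public class DefaultMmuSketch {
    public static void main(String[] args) {
        long maxGCPauseMillis = 200;                          // default -XX:MaxGCPauseMillis
        long maxGCPauseIntervalMillis = maxGCPauseMillis + 1; // default derivation described above
        // MMU goal: at most maxGCPauseMillis of gc time in any window of
        // maxGCPauseIntervalMillis, i.e. the mutator is only guaranteed
        // 1ms out of every 201ms by default.
        double minMutatorUtilization =
            (double) (maxGCPauseIntervalMillis - maxGCPauseMillis) / maxGCPauseIntervalMillis;
        System.out.printf("guaranteed mutator utilization: %.2f%%%n",
            minMutatorUtilization * 100); // ~0.50%
    }
}
```

Because the default MMU goal is this lax, G1 is free to shrink the young gen to its minimum during mixed gc.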

      This causes very small young gens (and edens, i.e. as little as 1 region) during mixed gc. In particular, the first mixed gc will follow the Prepare Mixed gc very quickly, because the survivor space for the Prepare Mixed gc is calculated using the current (young-only) young gen size. This means that for the next (mixed) gc, eden will be *very* small, down to only 1 region.

      E.g. heap size 3.5g, log output:

      Young gc:
      44,337s: GC(42) Eden regions: 755->0(726)
      44,337s: GC(42) Survivor regions: 90->91(106) // i.e. at end of young-only gc there are 91 survivor regions, max capacity is 106

      [Concurrent mark happens as GC(43)]

      Prepare Mixed gc (~2s later):
      46,288s: GC(44) Eden regions: 726->0(1)
      46,288s: GC(44) Survivor regions: 91->92(103) // i.e. at the end of the prepare-mixed gc there are 92 survivor regions, max capacity is 103

      This is a problem because the number of survivor regions (92) is already larger than the total young gen size allowed during the next mixed gc (87 regions according to G1NewSizePercent). So the next (mixed) gc will get 1 eden region (the minimum allowed), which makes that next mixed gc occur almost instantly and breaks the MMU goal.
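That arithmetic can be checked with a small sketch (assuming a 3500M heap and a 2M G1 region size, which reproduce the 87-region limit; illustrative only, not HotSpot code):

```java
public class MixedGcEdenSketch {
    public static void main(String[] args) {
        long heapBytes = 3_500L * 1024 * 1024;   // ~3.5g heap (assumed 3500M)
        long regionBytes = 2L * 1024 * 1024;     // assumed 2M G1 region size
        int g1NewSizePercent = 5;                // default G1NewSizePercent
        long totalRegions = heapBytes / regionBytes;                // 1750
        long maxMixedYoung = totalRegions * g1NewSizePercent / 100; // 87
        long survivors = 92;                     // survivors at the end of GC(44)
        // Eden for the next mixed gc is clamped to the 1-region minimum:
        long eden = Math.max(maxMixedYoung - survivors, 1);         // 1
        System.out.println("young gen limit during mixed gc: " + maxMixedYoung);
        System.out.println("eden regions for next mixed gc:  " + eden);
    }
}
```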

      "Technically" only because, by default, the goal is 200ms of gc time per 201ms window (in that configuration), but GCTimeRatio will spike quite a bit.

      Mixed GC (~50ms later!):
      46,340s: GC(45) Eden regions: 1->0(75)
      46,340s: GC(45) Survivor regions: 92->12(12)
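Why this only "technically" meets the MMU goal can be seen with a rough window check (pause durations below are assumed for illustration; the log only shows start timestamps):

```java
public class MmuWindowSketch {
    public static void main(String[] args) {
        double windowMs = 201;   // default MMU window: MaxGCPauseMillis + 1
        double budgetMs = 200;   // allowed gc time per window (MaxGCPauseMillis)
        double pause1Ms = 30;    // assumed duration of GC(44)
        double pause2Ms = 30;    // assumed duration of GC(45)
        double gapMs = 50;       // GC(45) starts ~50ms after GC(44)
        // Both pauses and the gap fit inside a single 201ms window:
        double spanMs = pause1Ms + gapMs + pause2Ms;   // 110ms
        double gcTimeInWindow = pause1Ms + pause2Ms;   // 60ms
        System.out.println("gc time in one window: " + gcTimeInWindow
            + "ms of a " + budgetMs + "ms budget");
        // 60ms <= 200ms, so the MMU goal is technically met -- but the mutator
        // ran only 50ms out of 110ms, so the gc time ratio spikes.
    }
}
```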

      One solution would be to size the survivor regions during the Prepare Mixed gc based on the next (i.e. mixed) gc's young gen size and the MMU. This does not happen at the moment though.
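That idea could be sketched as follows (the helper name and the eden-reserve parameter are hypothetical, not HotSpot code): cap the survivor target for the Prepare Mixed gc by the next mixed gc's young gen limit, leaving room for a useful eden.

```java
public class PrepareMixedSurvivorSketch {
    // Hypothetical helper: limit survivors so the next mixed gc still has
    // room for minEdenRegions of eden within its young gen limit.
    static long survivorTargetForPrepareMixed(long mixedYoungGenLimit, long minEdenRegions) {
        return Math.max(mixedYoungGenLimit - minEdenRegions, 0);
    }

    public static void main(String[] args) {
        long mixedYoungGenLimit = 87; // young gen limit during mixed gc (from the example above)
        long minEdenRegions = 50;     // assumed eden reserve derived from the MMU
        long target = survivorTargetForPrepareMixed(mixedYoungGenLimit, minEdenRegions);
        System.out.println("survivor target for Prepare Mixed gc: " + target); // 37
        // With at most 37 survivor regions instead of 92, the next mixed gc
        // would get 50 eden regions rather than 1.
    }
}
```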





              Assignee: tschatzl Thomas Schatzl
              Reporter: tschatzl Thomas Schatzl