- Bug
- Resolution: Duplicate
- P2
- None
- 5.0u10
- x86
- windows
This bug is filed for bookkeeping purposes, intended for the Java sustaining team.
(attention Vaibhav Kulkarni)
This is a regression that happens only with a special FVB build (dated 7/26), which
is based on the 5.0u9 source base.
This problem was reported by customer Starwood, case #64990026.
Here are notes from 7/31:
* I can reproduce the memory leak condition with the FVB (7/26) and
  not with 1.5.0_07. For that reason, I believe there is a
  regression in the 1.5.0_09-based FVB that causes the memory leak.
  Note that on 7/28 I did reproduce a recursive condition in
  sun.font.TrueTypeFont.getTableBuffer, which I have not
  reproduced on 7/31. I am not sure whether this memory leak could be the root
  cause of the recursive StackOverflowError experienced earlier.
Recap of the steps to reproduce:
1. Log on to http://10.5.20.148:8080/lightspeed?overridejvm=true
   user: 1200  pass: summer
2. In the View combo box under House List, select "Arrivals".
   You will see only "green" guest entries (in fact I see only one entry).
   Select that entry, then select "modify".
   (If there is any error popup, select "Open Reservation in Modify Mode".)
3. In the new screen, select the "Pre-Block" link somewhere towards
   the middle of the screen.
4. When the popup completes loading, select "close window".
**Now, repeat step 4 many times (I actually did it more than 100 times to
capture the info below).**
* On 7/28, I did reproduce the OOM. On 7/31 I didn't repeat enough times
  to hit the OOM, but it would have gotten there had I continued.
  I captured a histogram after every 10 "pre-block" operations.
  (Data in /net/cores.central/cores/dir31/64990026/0731.)
- With 1.5.0_07, there is no apparent memory leak:
% grep Total jdk.15007.log
Total 315264 20434040
Total 237805 17303224
Total 260587 18634024
Total 336298 21276856
Total 278028 19764064
Total 250131 17731728
Total 346252 22808504
Total 287370 20099544
- With the 1.5.0_09 FVB, the heap grows steadily (memory leak):
% grep Total jdk.15009.log
Total 310245 20094872
Total 389023 23688552
Total 446246 25589584
Total 552226 31323656
Total 647131 36291376
Total 734108 40492440
Total 801207 41570784
Total 870433 47707488
Total 938731 51164936
Total 1007282 54318936
Total 1093861 55100928
Total 1172903 61592904
Total 1203508 61686680
- With 1.5.0_07, the SortTableData instance count grows and shrinks:
% grep SortTableData jdk.15007.log
24: 3120 74880 com.lightspeed.web.sorttable.SortTableData
20: 3120 74880 com.lightspeed.web.sorttable.SortTableData
13: 9360 224640 com.lightspeed.web.sorttable.SortTableData
10: 31200 748800 com.lightspeed.web.sorttable.SortTableData
11: 12480 299520 com.lightspeed.web.sorttable.SortTableData
23: 3120 74880 com.lightspeed.web.sorttable.SortTableData
11: 31200 748800 com.lightspeed.web.sorttable.SortTableData
11: 12480 299520 com.lightspeed.web.sorttable.SortTableData
- With the 1.5.0_09 FVB, the SortTableData instance count only grows:
% grep SortTableData *9*
24: 3120 74880 com.lightspeed.web.sorttable.SortTableData
11: 28080 673920 com.lightspeed.web.sorttable.SortTableData
8: 46800 1123200 com.lightspeed.web.sorttable.SortTableData
5: 78000 1872000 com.lightspeed.web.sorttable.SortTableData
4: 109200 2620800 com.lightspeed.web.sorttable.SortTableData
3: 137280 3294720 com.lightspeed.web.sorttable.SortTableData
3: 159120 3818880 com.lightspeed.web.sorttable.SortTableData
3: 180960 4343040 com.lightspeed.web.sorttable.SortTableData
3: 202800 4867200 com.lightspeed.web.sorttable.SortTableData
3: 224640 5391360 com.lightspeed.web.sorttable.SortTableData
3: 252720 6065280 com.lightspeed.web.sorttable.SortTableData
3: 277680 6664320 com.lightspeed.web.sorttable.SortTableData
3: 287040 6888960 com.lightspeed.web.sorttable.SortTableData
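For reference, the "Total" comparison above can be checked mechanically. The sketch below is a hypothetical helper (not part of the original analysis): it scans a log containing repeated class histograms, such as jdk.15007.log or jdk.15009.log above, collects the "Total <#instances> <#bytes>" summary lines, and reports whether total heap bytes only ever grow (the 1.5.0_09 leak signature) or grow and shrink (the 1.5.0_07 behavior). The class name HistogramTrend and the exact log layout are assumptions based on the grep output shown.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper, not from the original report. Usage: java HistogramTrend jdk.15009.log
public class HistogramTrend {
    public static void main(String[] args) throws IOException {
        List<Long> totals = new ArrayList<Long>();
        BufferedReader in = new BufferedReader(new FileReader(args[0]));
        String line;
        while ((line = in.readLine()) != null) {
            // Histogram summary lines look like: "Total   315264   20434040"
            String[] f = line.trim().split("\\s+");
            if (f.length == 3 && f[0].equals("Total")) {
                totals.add(Long.valueOf(f[2]));   // total-bytes column
            }
        }
        in.close();
        boolean onlyGrows = true;
        for (int i = 1; i < totals.size(); i++) {
            if (totals.get(i) < totals.get(i - 1)) {
                onlyGrows = false;                // heap shrank at some checkpoint
                break;
            }
        }
        System.out.println(totals.size() + " checkpoints; "
                + (onlyGrows ? "total heap bytes only grow (possible leak)"
                             : "total heap bytes grow and shrink (no obvious leak)"));
    }
}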
Here are notes from 7/28:
I am able to reproduce the OutOfMemoryError after running enough times (e.g., 40 "pre-block" operations)
with the FVB (7/26).
I took several checkpoints where I captured Java thread dumps and class histograms.
There appears to be a recursion happening between checkpoint #4 and
checkpoint #5, followed by the OOM.
"thread applet-com.lightspeed.web.sorttable.SortTableApplet.class" prio=4 tid=0x0d1d5408 nid=0x65c runnable [0x070d4000..0x070ff9ec]^M
at java.lang.Throwable.fillInStackTrace(Native Method)^M
at java.lang.Throwable.<init>(Unknown Source)^M
at java.lang.Exception.<init>(Unknown Source)^M
at java.io.IOException.<init>(Unknown Source)^M
at java.nio.channels.ClosedChannelException.<init>(Unknown Source)^M
at sun.nio.ch.FileChannelImpl.ensureOpen(Unknown Source)^M
at sun.nio.ch.FileChannelImpl.position(Unknown Source)^M
at sun.font.TrueTypeFont.getTableBuffer(Unknown Source)^M
- locked <0x106bb160> (a sun.font.TrueTypeFont)^M
at sun.font.TrueTypeFont.getTableBuffer(Unknown Source)^M
- locked <0x106bb160> (a sun.font.TrueTypeFont)^M
<many lines deleted>
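The repeated getTableBuffer frames above, each sitting under a fresh ClosedChannelException, look like a retry-by-recursion pattern: a read fails because the channel is closed, the code reopens the file and calls itself again. The sketch below is illustrative only and is not the actual sun.font.TrueTypeFont source; it just shows how such a pattern, if something (for example another thread) keeps closing the channel between retries, never bottoms out and allocates a new HeapByteBuffer and ClosedChannelException on every attempt, which would be consistent with the stack above and the histogram surge noted below.

import java.io.FileInputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.ClosedChannelException;
import java.nio.channels.FileChannel;

// Illustrative sketch only; not the actual sun.font.TrueTypeFont code.
public class RecursiveRetrySketch {
    private final String fontPath;    // hypothetical font file path
    private FileChannel channel;

    RecursiveRetrySketch(String fontPath) throws IOException {
        this.fontPath = fontPath;
        reopen();
    }

    // Reads "length" bytes at "offset"; on ClosedChannelException it reopens
    // the file and recursively retries, with no bound on the retry depth.
    ByteBuffer getTableBuffer(long offset, int length) throws IOException {
        try {
            ByteBuffer buf = ByteBuffer.allocate(length); // new HeapByteBuffer per attempt
            channel.position(offset);                     // throws ClosedChannelException if closed
            channel.read(buf);
            return buf;
        } catch (ClosedChannelException e) {
            reopen();
            // If the channel is closed again before this retry completes
            // (e.g. by a cleanup/disposer thread), the recursion never ends:
            // the stack fills with getTableBuffer frames and the heap with
            // ClosedChannelException and HeapByteBuffer instances.
            return getTableBuffer(offset, length);
        }
    }

    private void reopen() throws IOException {
        channel = new FileInputStream(fontPath).getChannel();
    }
}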
Class histograms from the Java heap were collected at several checkpoints.
I noticed a surge in the number of instances of the following Lightspeed classes
between checkpoint #4 and #5, which might be related to the recursion/OOM:
- com.lightspeed.web.sorttable.SortTableData
- com.lightspeed.web.sorttable.common.MultiLineCellRenderer
checkpoint#   rank   total #instances   total #bytes   class name
--------------------------------------------------------------------------
1 NA NA NA NA
2 11: 18720 449280 com.lightspeed.web.sorttable.SortTableData
3 11: 21840 524160 com.lightspeed.web.sorttable.SortTableData
4 9: 43680 1048320 com.lightspeed.web.sorttable.SortTableData
5 13: 21840 524160 com.lightspeed.web.sorttable.SortTableData
--------------------------------------------------------------------------
1 NA NA NA NA
2 45: 54 19872 com.lightspeed.web.sorttable.common.MultiLineCellRenderer
3 42: 63 23184 com.lightspeed.web.sorttable.common.MultiLineCellRenderer
4 35: 126 46368 com.lightspeed.web.sorttable.common.MultiLineCellRenderer
5 40: 72 26496 com.lightspeed.web.sorttable.common.MultiLineCellRenderer
A surge in the following NIO-related classes is observed after the OOM:
--------------------------------------------------------------------------
1 377: 6 288 java.nio.HeapByteBuffer
2 516: 4 192 java.nio.HeapByteBuffer
3 517: 4 192 java.nio.HeapByteBuffer
4 537: 4 192 java.nio.HeapByteBuffer
5 28: 471 70608 java.nio.HeapByteBuffer
--------------------------------------------------------------------------
1 NA NA NA NA
2 NA NA NA NA
3 NA NA NA NA
4 NA NA NA NA
5 34: 1465 35160 java.nio.channels.ClosedChannelException
--------------------------------------------------------------------------
- duplicates
  JDK-6351698  Regression: 4506928 testcase is passing with 142_10-b03 but failing with 142_11-b01  (Resolved)