FULL PRODUCT VERSION :
java version "1.7.0"
Java(TM) SE Runtime Environment (build 1.7.0-b147)
Java HotSpot(TM) 64-Bit Server VM (build 21.0-b17, mixed mode)
FULL OS VERSION :
Linux collab-dev-401 2.6.32-36-server #79-Ubuntu SMP Tue Nov 8 22:44:38 UTC 2011 x86_64 GNU/Linux
A DESCRIPTION OF THE PROBLEM :
HashMaps that are created, kept alive for a number of GC generations, and then released accumulate on the heap: a key in the map is erroneously treated as holding a reference to the enclosing HashMap when it does not, causing a memory leak.
THE PROBLEM WAS REPRODUCIBLE WITH -Xint FLAG: Did not try
THE PROBLEM WAS REPRODUCIBLE WITH -server FLAG: Did not try
STEPS TO FOLLOW TO REPRODUCE THE PROBLEM :
Execute the provided code. Set -XX:NewSize so that the garbage collector is invoked repeatedly by the time-killer method (generateAllocations). Set a small maximum heap so the crash occurs in a reasonable time. Over time, the memory used by the application gradually increases until it eventually runs out of memory.
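For example, an illustrative invocation (the flag values and iteration count here are assumptions, not the submitter's exact settings): java -XX:NewSize=8m -Xmx512m com.gracenote.jvm.TestHashMap 1000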
EXPECTED VERSUS ACTUAL BEHAVIOR :
Expected: Memory should peak at a relatively small value as the garbage collector runs routinely. Performance should drop slightly when GC kicks in, then stay constant.
Actual: Memory climbs slowly but continuously. On average, each successive GC run is less effective than the one before, and performance gradually degrades as the garbage collector runs more and more frequently.
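The climb can be observed directly from within the test. Below is a minimal sketch of a helper that logs used heap after each outer iteration; this class and its names are illustrative assumptions, not part of the original reproducer:

import java.util.logging.Level;
import java.util.logging.Logger;

// Illustrative helper (assumed, not part of the submitted source): logs the
// currently used heap so the gradual climb can be observed between iterations.
public class HeapLogger {
    public static void logUsedHeap(int iteration) {
        Runtime rt = Runtime.getRuntime();
        long usedBytes = rt.totalMemory() - rt.freeMemory();
        Logger.getLogger(HeapLogger.class.getName())
              .log(Level.INFO, "Iteration " + iteration + ": used heap = " + usedBytes + " bytes");
    }
}

Calling HeapLogger.logUsedHeap(i) at the end of each outer iteration in main shows each successive collection reclaiming less memory than the one before.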
REPRODUCIBILITY :
This bug can be reproduced always.
---------- BEGIN SOURCE ----------
package com.gracenote.jvm;

import java.util.HashMap;
import java.util.logging.Level;
import java.util.logging.Logger;

/**
 * Reproducer: builds large nested HashMap structures, holds them across many
 * GC cycles while churning short-lived allocations, then drops them and repeats.
 *
 * @author dmcennis
 */
public class TestHashMap {

    /**
     * @param args args[0] is the number of outer iterations to run
     */
    public static void main(String[] args) {
        for (int i = 0; i < Integer.parseInt(args[0]); ++i) {
            // Build a large nested map structure, keep it alive while
            // generateAllocations() churns the young generation, then drop it.
            ClassName map = new ClassName();
            for (int j = 0; j < 100; ++j) {
                map.generateAllocations();
            }
            Logger.getLogger(TestHashMap.class.getName()).log(Level.INFO, "Iteration " + i);
        }
    }

    public static class ClassName {

        HashMap<Item, HashMap<Item, HashMap<Item, HashMap<Item, Item>>>> map =
                new HashMap<Item, HashMap<Item, HashMap<Item, HashMap<Item, Item>>>>();

        public ClassName() {
            for (int i = 0; i < 100; ++i) {
                Item item = new Item(i);
                map.put(item, new HashMap<Item, HashMap<Item, HashMap<Item, Item>>>());
                for (int j = 0; j < 100; ++j) {
                    Item item2 = new Item(j);
                    map.get(item).put(item2, new HashMap<Item, HashMap<Item, Item>>());
                    for (int k = 0; k < 100; ++k) {
                        Item item3 = new Item(j);
                        map.get(item).get(item2).put(item3, new HashMap<Item, Item>());
                        for (int l = 0; l < 100; ++l) {
                            Item item4 = new Item(l);
                            map.get(item).get(item2).get(item3).put(item4, item);
                        }
                    }
                }
            }
        }

        /**
         * Walks the whole structure, allocating short-lived Doubles on each leaf
         * to force frequent garbage collection (the "time killer" method).
         */
        public void generateAllocations() {
            for (Item i : map.keySet()) {
                for (Item j : map.get(i).keySet()) {
                    for (Item k : map.get(i).get(j).keySet()) {
                        for (Item l : map.get(i).get(j).get(k).keySet()) {
                            Double d = new Double(Math.random());
                            Double e = new Double(Math.random());
                        }
                    }
                }
            }
        }
    }

    /** Map key whose hash code and ordering are derived from a boxed double. */
    public static class Item implements Comparable {

        Double i;
        HashMap<Double, Double> k = new HashMap<Double, Double>();

        public Item(double val) {
            i = val;
        }

        @Override
        public int hashCode() {
            String ret = i.toString();
            return ret.hashCode();
        }

        @Override
        public boolean equals(Object obj) {
            return (compareTo(obj) == 0);
        }

        @Override
        public int compareTo(Object o) {
            return i.compareTo(((Item) o).i);
        }
    }
}
---------- END SOURCE ----------
CUSTOMER SUBMITTED WORKAROUND :
Avoid using a large number of HashMaps unless they are static objects (never reclaimed), or subdivide jobs into smaller data sets and restart the JVM after every so many objects have been processed.
java version "1.7.0"
Java(TM) SE Runtime Environment (build 1.7.0-b147)
Java HotSpot(TM) 64-Bit Server VM (build 21.0-b17, mixed mode)
FULL OS VERSION :
Linux collab-dev-401 2.6.32-36-server #79-Ubuntu SMP Tue Nov 8 22:44:38 UTC 2011 x86_64 GNU/Linux
A DESCRIPTION OF THE PROBLEM :
HashMaps created and kept for a number of GC generations, then released, accumulates HashMaps where the map erroneously has a key declared to be a reference to the enclosing HashMap when it does not, causing a memory leak.
THE PROBLEM WAS REPRODUCIBLE WITH -Xint FLAG: Did not try
THE PROBLEM WAS REPRODUCIBLE WITH -server FLAG: Did not try
STEPS TO FOLLOW TO REPRODUCE THE PROBLEM :
Execute provided code. Set NewSize so that the garbage collector is called repeatedly by the given time killer method. Set a small maximum heap to have reasonable crash times. Over time, the memory used by this application gradually increases until it eventually runs out of memory.
EXPECTED VERSUS ACTUAL BEHAVIOR :
Expected: Memory should peak at a relatively small value as the garbage collector runs routinely. Performance should drop slightly as the gc kicks in, then stay constant.
Actual: memory continues climbing slowly. On average, each successive gc run is less effective than the one before. Performance gradually slows as the garbage collector runs more and more frequently.
REPRODUCIBILITY :
This bug can be reproduced always.
---------- BEGIN SOURCE ----------
/**
*
*/
package com.gracenote.jvm;
import java.util.HashMap;
import java.util.logging.Level;
import java.util.logging.Logger;
/**
* @author dmcennis
*
*/
public class TestHashMap {
/**
* @param args
*/
public static void main(String[] args) {
for(int i=0;i<Integer.parseInt(args[0]);++i){
ClassName map = new ClassName();
for(int j=0;j<100;++j){
map.generateAllocations();
}
Logger.getLogger(TestHashMap.class.getName()).log(Level.INFO,"Iteration "+i);
}
}
public static class ClassName{
HashMap<Item,HashMap<Item,HashMap<Item,HashMap<Item,Item>>>> map = new HashMap<Item,HashMap<Item,HashMap<Item,HashMap<Item,Item>>>>();
public ClassName(){
for(int i=0;i<100;++i){
Item item = new Item(i);
map.put(item, new HashMap<Item,HashMap<Item,HashMap<Item,Item>>>());
for(int j=0;j<100;++j){
Item item2 = new Item(j);
map.get(item).put(item2, new HashMap<Item,HashMap<Item,Item>>());
for(int k=0;k<100;++k){
Item item3 = new Item(j);
map.get(item).get(item2).put(item3, new HashMap<Item,Item>());
for(int l=0;l<100;++l){
Item item4 = new Item(l);
map.get(item).get(item2).get(item3).put(item4, item);
}
}
}
}
}
public void generateAllocations(){
for(Item i : map.keySet()){
for(Item j : map.get(i).keySet()){
for(Item k : map.get(i).get(j).keySet()){
for(Item l : map.get(i).get(j).get(k).keySet()){
Double d = new Double(Math.random());
Double e = new Double(Math.random());
}
}
}
}
}
}
public static class Item implements Comparable{
Double i;
HashMap<Double,Double> k = new HashMap<Double,Double>();
public Item(double val){
i = val;
}
@Override
public int hashCode() {
String ret = i.toString();
return ret.hashCode();
}
@Override
public boolean equals(Object obj) {
return (compareTo(obj)==0);
}
@Override
public int compareTo(Object o) {
return i.compareTo(((Item)o).i);
}
}
}
---------- END SOURCE ----------
CUSTOMER SUBMITTED WORKAROUND :
not using large number of HashMaps unless it is a static object (never reclaimed) or subdivide jobs into smaller data sets, restarting the JVM every so many objects.