- Bug
- Resolution: Duplicate
- P3
- None
- 1.1, 1.1.2
- None
- x86, sparc
- solaris_2.5.1, windows_95
Name: sgC58550 Date: 03/17/97
When serializing a Hashtable containing large strings (100 KB or more),
the following exception is generated:
java.io.UTFDataFormatException
at java.io.DataOutputStream.writeUTF(DataOutputStream.java:310)
at java.io.ObjectOutputStream.writeUTF(ObjectOutputStream.java:1032)
at java.io.ObjectOutputStream.outputString(ObjectOutputStream.java:480)
at java.io.ObjectOutputStream.checkSpecialClasses(ObjectOutputStream.java:301)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:182)
at java.util.Hashtable.writeObject(Hashtable.java:406)
at java.io.ObjectOutputStream.outputObject(ObjectOutputStream.java:629)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:225)
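The report does not include the reporter's own code. A minimal sketch along
these lines (class name, key, and string size are assumptions, not taken from
the report) should reproduce the same stack trace on the affected 1.1.x releases:
------- HashtableRepro.java (sketch) -------
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.util.Hashtable;

public class HashtableRepro {
    public static void main(String[] args) throws IOException {
        // Build a value string of roughly 100 KB.
        StringBuffer value = new StringBuffer();
        for (int i = 0; i < 100 * 1024; i++)
            value.append('x');

        Hashtable table = new Hashtable();
        table.put("key", value.toString());

        // Hashtable.writeObject() writes each entry via
        // ObjectOutputStream.writeObject(); on 1.1.x a String ends up in
        // writeUTF() and its 2-byte length field, as in the stack trace above.
        ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream());
        out.writeObject(table);   // java.io.UTFDataFormatException on 1.1.x
        out.close();
    }
}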
The documentation states that writeUTF stores the length of the
string in 2 bytes. This implies a limit of 65535 bytes of encoded data
per string, i.e. at most 64 KB (fewer characters when they need
multi-byte UTF encodings).
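A small sketch (the class name is an assumption) shows the boundary directly
with DataOutputStream: 65535 single-byte characters are accepted, one more
makes writeUTF throw:
------- WriteUTFLimit.java (sketch) -------
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.UTFDataFormatException;

public class WriteUTFLimit {
    public static void main(String[] args) throws IOException {
        DataOutputStream out = new DataOutputStream(new ByteArrayOutputStream());

        // 65535 ASCII characters encode to 65535 bytes and still fit
        // into the 2-byte length field written by writeUTF().
        StringBuffer s = new StringBuffer();
        for (int i = 0; i < 65535; i++)
            s.append('a');
        out.writeUTF(s.toString());

        // One additional character exceeds the field and writeUTF throws.
        try {
            out.writeUTF(s.toString() + "a");
        } catch (UTFDataFormatException e) {
            System.err.println("writeUTF limit exceeded: " + e);
        }
    }
}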
This is clearly a bug in serialization, but it could also be
argued that the implementation of writeUTF is rather naive :)
company - , email - ###@###.###
======================================================================
The following program shows the problem. If the String is 64 KB
or larger, writeObject throws an exception:
> java t >/dev/null
write-exception: java.io.UTFDataFormatException
------- t.java -------
import java.io.IOException;
import java.io.ObjectOutputStream;

public class t {
    public static void main(String arg[]) {
        // Build a 65536-character string (256*64 copies of "test"), i.e. >= 64 KB.
        String s = "", a = "test";
        for (int i = 0; i < 256 * 64; i++)
            s = s + a;
        try {
            // Serializing the string fails with java.io.UTFDataFormatException
            // because the String is written with writeUTF().
            new ObjectOutputStream(System.out).writeObject(s);
        } catch (IOException e) {
            System.err.println("write-exception: " + e);
        }
    }
}
company - , email - ###@###.###
=======================================================================
- duplicates
  - JDK-4025564 ObjectInputStream.readObject() failures when size of Object (Closed)