Name: saC57035 Date: 07/13/98
The java.text.DecimalFormat.applyPattern(String pattern) method sets the
minimum integer digit count incorrectly when every digit in both the
integer part and the fraction part of the pattern is optional.
For example, after applying the pattern "##.##", the minimum integer
digits should be 0, but getMinimumIntegerDigits() returns 1.
Here is a test demonstrating the bug:
-----------------TestD.java------------------------
import java.text.*;

public class TestD {
    public static void main(String[] args) {
        DecimalFormat sdf = new DecimalFormat();
        String pattern = "##.##";
        System.out.println("Applying pattern \"" + pattern + "\"");
        sdf.applyPattern(pattern);
        int minIntDig = sdf.getMinimumIntegerDigits();
        if (minIntDig != 0) {
            System.out.println("Test failed");
            System.out.println("    Minimum integer digits : " + minIntDig);
            System.out.println("    new pattern: " + sdf.toPattern());
        } else {
            System.out.println("Test passed");
            System.out.println("    Minimum integer digits : " + minIntDig);
        }
    }
}
---------- Output ----------------
Applying pattern "##.##"
Test failed
Minimum integer digits : 1
new pattern: #0.##
---------------------------------
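A possible workaround (a sketch, not part of the original report; the class
name Workaround is chosen for illustration) is to override the minimum
integer digit count explicitly after applying the pattern:

```java
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class Workaround {
    public static void main(String[] args) {
        // Use US symbols so the decimal separator is '.' regardless of
        // the default locale.
        DecimalFormat df = new DecimalFormat("##.##",
                DecimalFormatSymbols.getInstance(Locale.US));

        // applyPattern (invoked by the constructor) leaves the minimum
        // integer digits at 1; reset it to 0 by hand.
        df.setMinimumIntegerDigits(0);

        System.out.println(df.getMinimumIntegerDigits()); // 0
        // With no integer digit required, the leading zero is dropped.
        System.out.println(df.format(0.25)); // .25
    }
}
```

The explicit setter takes precedence over whatever the pattern parser
computed, so the formatter then behaves as the all-optional pattern intends.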
======================================================================
- duplicates
  - JDK-4134300 DecimalFormat bug added in 1.1.6 (Closed)