JDK-4147295

java.text.DecimalFormat.applyPattern doesn't set minimum integer digits


    • CPU: sparc
    • OS: solaris_2.5



      Name: dfC67450 Date: 06/10/98



      java.text.DecimalFormat.applyPattern(String pattern) doesn't set the minimum
      integer digits correctly. For example, after applying the pattern "#,###", the
      minimum integer digits should be 0, but getMinimumIntegerDigits() returns 1.

      Here is the test demonstrating the bug:

      -----------------TestD.java------------------------
      import java.text.*;

      public class TestD {
          public static void main(String[] args) {
              DecimalFormat sdf = new DecimalFormat();
              String pattern = "#,###";
              System.out.println("Applying pattern \"" + pattern + "\"");
              sdf.applyPattern(pattern);
              // "#" denotes an optional digit, so "#,###" should leave the
              // minimum integer digits at 0.
              int minIntDig = sdf.getMinimumIntegerDigits();
              if (minIntDig != 0) {
                  System.out.println("Test failed");
                  System.out.println(" Minimum integer digits : " + minIntDig);
                  System.out.println(" new pattern: " + sdf.toPattern());
              } else {
                  System.out.println("Test passed");
                  System.out.println(" Minimum integer digits : " + minIntDig);
              }
          }
      }
      ---------Output from the test---------------------
      Applying pattern "#,###"
      Test failed
        Minimum integer digits : 1
        new pattern: #,##0
      --------------------------------------------------
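
      A possible workaround, not part of the original report: a caller can reset the
      minimum integer digits explicitly after applying the pattern, using the
      setMinimumIntegerDigits method inherited from NumberFormat. The toPattern()
      output of "#,##0" reflects the same problem, since '0' in a pattern denotes a
      required digit. The sketch below assumes the only goal is to restore the
      expected value of 0; the class name TestDWorkaround is illustrative.

      -------------TestDWorkaround.java-----------------
      import java.text.DecimalFormat;

      public class TestDWorkaround {
          public static void main(String[] args) {
              DecimalFormat df = new DecimalFormat();
              df.applyPattern("#,###");
              // applyPattern leaves the minimum integer digits at 1 (the bug),
              // so set it back to 0 explicitly.
              df.setMinimumIntegerDigits(0);
              System.out.println(" Minimum integer digits : "
                                 + df.getMinimumIntegerDigits()); // prints 0
          }
      }
      --------------------------------------------------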

      ======================================================================

            Assignee: aliusunw Alan Liu (Inactive)
            Reporter: dfazunensunw Dmitri Fazunenko (Inactive)