However, changing the card table size is a tradeoff between the time it takes to scan the card table area and the other per-card processing costs (clearing the card table, memory overhead).
A rule of thumb is to look at card scanning time: if it takes a significant fraction of your pauses (rough guideline: 30-40%; see BigramTester), then strongly consider a smaller card table size; if it takes a very small part of the pause time (see: SPECjbb), it might be useful to increase its size.
Essentially, the size of a card directly impacts card scan time: it determines how much work is needed to find the potentially single interesting reference within that card. The larger the card, the more objects you might search in vain.
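The tradeoff above can be sketched with some simple card arithmetic. This is an illustrative model of a G1-style card table, not actual HotSpot code; the names (`cardTableEntries`, `cardIndex`) are made up for the example:

```java
// Sketch of the card-size tradeoff, assuming each card covers a
// power-of-two chunk of the heap. Illustrative only, not HotSpot code.
public class CardMath {
    // Number of card table entries needed to cover a heap. Smaller cards
    // mean a larger card table (more memory, more entries to clear), but
    // each dirty card covers less heap to scan in vain.
    static long cardTableEntries(long heapBytes, int cardSizeBytes) {
        return (heapBytes + cardSizeBytes - 1) / cardSizeBytes;
    }

    // Card index for a heap offset, with cards of 2^log2CardSize bytes.
    static long cardIndex(long heapOffset, int log2CardSize) {
        return heapOffset >>> log2CardSize;
    }

    public static void main(String[] args) {
        long heap = 8L * 1024 * 1024 * 1024; // 8 GB heap
        // 512-byte cards vs 128-byte cards: 4x more table entries,
        // but 4x less heap scanned per dirty card.
        System.out.println(cardTableEntries(heap, 512)); // 16777216
        System.out.println(cardTableEntries(heap, 128)); // 67108864
        System.out.println(cardIndex(1000, 9)); // offset 1000 -> card 1
    }
}
```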
This is very hard to gauge for people trying to tune the VM. What would help is a measure of how many references into the collection set were found while scanning the cards, i.e. a ratio of bytes scanned per interesting reference found.
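A minimal sketch of the proposed metric. This is not existing HotSpot code or log output; the counters (`bytesScanned`, `refsIntoCset`) are hypothetical per-pause values that the collector would have to gather during card scanning:

```java
// Illustrative sketch of the proposed scan-efficiency metric: bytes
// scanned per reference into the collection set. A high ratio suggests
// cards are too large (much scanning in vain); a low ratio suggests
// the card size could be increased.
public class ScanEfficiency {
    static double bytesPerInterestingRef(long bytesScanned, long refsIntoCset) {
        // Guard against pauses where no interesting references were found.
        return refsIntoCset == 0 ? Double.POSITIVE_INFINITY
                                 : (double) bytesScanned / refsIntoCset;
    }

    public static void main(String[] args) {
        // Hypothetical counters for one pause.
        long bytesScanned = 64L * 1024 * 1024; // 64 MB of cards scanned
        long refsIntoCset = 131072;            // refs into the collection set
        System.out.printf("scan efficiency: %.1f bytes/ref%n",
                          bytesPerInterestingRef(bytesScanned, refsIntoCset));
    }
}
```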
This metric should be added to the GC logging.