I have a query against an Oracle database that returns a decimal value for a price. Ruby’s Oracle connector returns that as a BigDecimal object.
Once I have all the items I have to aggregate them. As part of the aggregation I run this code:
```ruby
@hash[id].price = ((@hash[id].price * @hash[id].fill_volume) +
                   (item.price * item.fill_volume)) /
                  (@hash[id].fill_volume + item.fill_volume)
```
Each time this code runs it gets slower, and after about 250 iterations it gets really slow. Here's the time it takes to do the 269th iteration (the times are UNIX time with microseconds):
…just over a second! This seems to happen more when the same id recurs across many iterations.
A very simple solution is to store the price as a Float rather than a BigDecimal. When I do that, I get the following times:
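The conversion can happen once, at the point the value comes back from the driver, so everything downstream works with machine floats. A minimal sketch (the value here is made up; the driver call site is assumed):

```ruby
require 'bigdecimal'

raw   = BigDecimal("10.37")  # what the Oracle connector hands back
price = raw.to_f             # convert once, up front

# Float arithmetic is constant-time per operation: a Float is a fixed
# 64-bit double, so it cannot accumulate digits the way BigDecimal does.
```

This trades exact decimal representation for fixed 64-bit precision, which is why the loop cost stays flat no matter how many times the update runs.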
Significantly faster (and given I have tens of thousands of lines to parse, that's a big thing), and well worth the potential loss of accuracy. What I'd like to know is why BigDecimal behaves like this (at least in Ruby 1.8.6). Any ideas?
UPDATE 20101221: Matt Patterson has had a look into it, and it appears that BigDecimal arithmetic is O(n²) (or worse) in the number of digits as the BigDecimal gets larger. It looks like I'm going to have to look at BigDecimal#limit and BigDecimal#round.
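If the problem is digit growth, capping the digits should keep the cost flat while staying in decimal arithmetic. A sketch of the `BigDecimal.limit` approach, using the same made-up numbers as above (the cap of 15 digits is an arbitrary choice for illustration):

```ruby
require 'bigdecimal'

# Global cap: round every BigDecimal computation result to at most
# 15 significant digits.
BigDecimal.limit(15)

price       = BigDecimal("10.37")
fill_volume = BigDecimal("100")
item_price  = BigDecimal("10.11")
item_volume = BigDecimal("300")

250.times do
  price = ((price * fill_volume) + (item_price * item_volume)) /
          (fill_volume + item_volume)
  fill_volume += item_volume
end

# With the cap in place the digit count stays bounded, so the
# per-iteration cost no longer grows.
puts price.to_s.length
```

`BigDecimal.limit` is process-global, which may be too blunt for some code; the alternative is to call `BigDecimal#round` on the result of each division, trimming digits only where this particular loop needs it.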