I've just been running the universe test on one of my machines and I've noticed a bug in the parser similar to the scale problem identified in the md-gromacs test-set.
On fast machines, the result given isn't in seconds anymore, it's in milliseconds.
This leads to nonsensical results on fast machines like the one I tested on: it reported times of 7000ms, which the parser interprets as 7000s.
I'm looking into the parser now to work out how best to fix it.
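For illustration, here's a minimal sketch of one possible fix, assuming the raw result string carries a unit suffix like "7000ms" (the helper name and the set of recognized units are my assumptions, not the parser's actual internals): normalize every timing result to seconds before it's stored, treating a bare number as seconds to match the old behaviour.

```python
import re

# Hypothetical unit table: multipliers to convert each suffix to seconds.
UNIT_TO_SECONDS = {
    "s": 1.0,
    "ms": 1e-3,
    "us": 1e-6,
}

def parse_time_seconds(raw: str) -> float:
    """Parse a timing result like '7000ms' or '7.5s' into seconds."""
    match = re.fullmatch(r"\s*([0-9]+(?:\.[0-9]+)?)\s*(s|ms|us)?\s*", raw)
    if match is None:
        raise ValueError(f"unrecognized time result: {raw!r}")
    value, unit = match.groups()
    # A bare number is assumed to be seconds, matching the old behaviour.
    return float(value) * UNIT_TO_SECONDS[unit or "s"]

# 7000ms should now come out as 7 seconds, not 7000.
assert parse_time_seconds("7000ms") == 7.0
assert parse_time_seconds("7.5s") == 7.5
```

Normalizing at parse time keeps every downstream consumer working in a single unit, so fast and slow machines stay comparable regardless of what unit the test itself emits.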