I'm sorry, but I totally disagree; the final version can be much faster, even more than you think.
Originally Posted by hax0r
I'm studying Software Engineering, and when we build software (a "debug"/alpha version), it comes with debug information and a lot of "bulk" code in it.
There's no real reason to build it as a release version with all the optimizations turned on, since it's a DEV version.
First, we want everything to work the way it's supposed to before compiling it as a release version.
For example: if you print a single line to the console or to a file (a very basic instruction, but heavily used for log files etc.) after every action, say after every query you send to a SQL database, the overall performance of the database will drop drastically.
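As a rough sketch of that point (DEBUG_LOG and run_query below are hypothetical names, not taken from any real project): debug-only logging is typically done with a macro like this, so a debug build pays for a write on every query while a release build compiles the call away entirely.

#include <stdio.h>

#ifdef DEBUG
#define DEBUG_LOG(...) fprintf(stderr, __VA_ARGS__)   /* debug build: log every action */
#else
#define DEBUG_LOG(...) ((void)0)                      /* release build: compiled out */
#endif

static void run_query(const char *sql)
{
    DEBUG_LOG("executing: %s\n", sql);  /* costly in debug, free in release */
    /* ... actually send the query to the database here ... */
}

int main(void)
{
    for (int i = 0; i < 100000; i++)
        run_query("SELECT 1");
    return 0;
}

Build with -DDEBUG and every one of those 100,000 calls writes a line to stderr; build without it and the loop contains no I/O at all. That is exactly the kind of gap an alpha benchmark can end up measuring.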
Of course, a test is a means of getting information about what you are doing.
So this benchmark is not totally useless.
But you shouldn't care about drops in performance right now so much as about increases.
The next alpha can have results totally different from this one, and the following ones will change in the same way.
Until all the optimizations are turned on and all the debug features are removed, you can't say it's worse.
Anyway, none of these drops will stay like this; the developers do take care of performance, and everything will change. A drop of 90% in performance will not carry over to the release.
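To make the debug-vs-release gap concrete, here is a minimal sketch using the standard C assert()/NDEBUG mechanism (table_is_consistent and lookup are hypothetical names): release builds are conventionally compiled with -DNDEBUG, which removes every assert() at compile time, so an expensive sanity check that dominates a debug-build benchmark simply doesn't exist in the release.

#include <assert.h>
#include <stdio.h>

/* Hypothetical O(n) sanity check run before every lookup in debug builds. */
static int table_is_consistent(const int *t, int n)
{
    for (int i = 1; i < n; i++)
        if (t[i - 1] > t[i])
            return 0;
    return 1;
}

static int lookup(const int *t, int n, int key)
{
    assert(table_is_consistent(t, n));  /* vanishes when NDEBUG is defined */
    for (int i = 0; i < n; i++)
        if (t[i] == key)
            return i;
    return -1;
}

int main(void)
{
    int t[] = { 1, 3, 5, 7 };
    printf("%d\n", lookup(t, 4, 5));
    return 0;
}

Compile with gcc -O0 and the check runs on every call; compile with gcc -O2 -DNDEBUG and it is gone, along with its cost.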
Hope this helps everyone understand what an "alpha, beta, etc." is.
Anyone with a default Ubuntu install from 9.10 onwards?
Originally Posted by V!NCENT
Ubuntu doesn't follow this development model. It imports packages from Debian, and it has its own kernel team. It doesn't do separate debug/release versions of packages; it does stripped debug builds. The packages that are in Ubuntu are not in an alpha state; they're just current versions.
Originally Posted by dl.zerocool
It does not import everything from Debian.
Originally Posted by garytr24
They also build their own versions based on SVN sources.
It's common to see Ubuntu packages with "svn" in their version string.
Canonical-supported packages are not all simply "imported" from Debian.
The sources are!
That's why they go through all these stages before getting to a release.
They are not just waiting for new packages to land in Debian and then integrating them, and thankfully so, I would say.
Any dev will tell you that during a testing stage there is no guarantee that there isn't some code here or there that slows the whole thing down.
So you can't base your judgement solely on alpha/beta benchmarks.
I'm not here to say whether this test is useful or not, just to draw attention to that point.
I don't know how useful benchmarks are for the initial alpha versions of distributions. I would fully expect that boot speed is the least of the team's worries at this point. The team has put it in the spotlight, yes, but boot speed is not something that just happens when you throw everything together; once they have the right components in place, then they can start worrying about speed.
Or ask the rfs4 guys; they must be laughing at the ext4 ones right now. Recovering a filesystem correctly after a software crash is a solved problem. Giving a useful level of data integrity without running slower than molasses has been done too; ext3 does it. I don't know how these guys managed to screw up a formerly good filesystem so badly.
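For context on the data-integrity complaint: the pattern applications use to replace a file crash-safely is write-to-temp, fsync, then rename. A minimal sketch of that pattern in C (save_config is a hypothetical name; error handling abbreviated):

#include <stdio.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>

static int save_config(const char *path, const char *data, size_t len)
{
    char tmp[4096];
    snprintf(tmp, sizeof tmp, "%s.tmp", path);

    int fd = open(tmp, O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (fd < 0)
        return -1;

    if (write(fd, data, len) != (ssize_t)len) {
        close(fd);
        unlink(tmp);
        return -1;
    }

    /* Force the new contents to stable storage *before* the rename;
       without this, a crash can leave an empty file on filesystems
       that delay allocation. */
    if (fsync(fd) != 0) {
        close(fd);
        unlink(tmp);
        return -1;
    }
    close(fd);

    /* rename() atomically replaces the old file with the new one. */
    return rename(tmp, path);
}

int main(void)
{
    const char *text = "setting=1\n";
    return save_config("app.conf", text, strlen(text)) == 0 ? 0 : 1;
}

Much of the ext4 controversy was that many applications skipped the fsync() step; ext3's default data=ordered journaling happened to cover for them, while ext4's delayed allocation did not.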
Originally Posted by PerfMonk