This is an EU-funded initiative that proposes to give a quality score to open source projects, in the hope that this can be used when people are deciding what software to use. As of this writing it's still very much a shell, and the main content there is the paper which was successfully submitted to the International Conference on Open Source Systems. In the paper they mention that maintaining a quality score for OSS projects will benefit users in two ways: it helps them choose between alternative projects, and the quality of projects in general will rise as developers improve their processes to increase their quality score.
So how do they propose to measure project quality? Well, they don't go into any detail, but they mention:
- metric analysis of source code
- repository checkin and mailing list characteristics for project vibrancy
- bug characteristics (frequency, response time, type, ...)
- size of development team for project longevity
It's a pity that they don't currently go into any detail on how they will compute the score from the above measurements, as I can't see how it could be done with enough accuracy to satisfy the procurement needs of users. In my opinion (and something I've thought about previously), one very important factor that can be measured, and that users could/should use when choosing OSS, is the "openness" of a project. This could be seen as a subset of OSS project quality I suppose.
So how could this "Openness Score" be determined? From my experience the most important attribute is a public mailing list. It's nice to have bug and patch trackers which filter valid items from the mailing list, and even nicer to have access to the latest source. So something like the following pseudo code would be a useful metric (a runnable sketch follows the table below):
```
score = 0

if binary components
    score -= 10

if mailing list
    score += 10

if SCM used
    score += 15

if bug tracking
    score += 4
    if !mailing list
        score += 4

if patch tracking
    score += 3
    if !mailing list
        score += 3
    if bug tracking
        score -= 2

release frequency = min(10, (yearly frequency/4)*10)      //once a quarter = 10
beta release frequency = min(5, (yearly frequency/12)*5)  //once a month = 5

if !SCM used
    release frequency *= 1.5
    beta release frequency *= 2

score += release frequency
score += beta release frequency
```
With these metrics the minimum score possible is -10 and the maximum is 45.
Here are some examples for projects I've some experience with:
| project | maintainer | mailing list | bug tracker | patch tracker | SCM | releases/year (beta) | score |
|---|---|---|---|---|---|---|---|
| util-linux | Adrian Bunk | no | no | no | no | 0 (1) | 00.84 |
| coreutils | Jim Meyering | yes | yes | yes | yes | 7 (4) | 43.33 |
| FSlint | Pádraig Brady | no | yes | no | yes | 3 (0) | 30.50 |
Note the "yes" values have links to the actual mailing lists etc., so a list like this would make a nice portal for people wanting to interact with Open Source projects.
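For concreteness, here is a minimal Python sketch of the pseudo code above, applied to the FSlint row of the table. The `Project` fields and the `openness_score` name are just my own illustrative choices; the weights and formulas are the ones listed above.

```python
from dataclasses import dataclass

@dataclass
class Project:
    name: str
    binary_components: bool = False
    mailing_list: bool = False
    bug_tracking: bool = False
    patch_tracking: bool = False
    scm_used: bool = False
    releases_per_year: float = 0
    beta_releases_per_year: float = 0

def openness_score(p: Project) -> float:
    score = 0.0
    if p.binary_components:
        score -= 10
    if p.mailing_list:
        score += 10
    if p.scm_used:
        score += 15
    if p.bug_tracking:
        score += 4
        if not p.mailing_list:
            score += 4          # a bug tracker matters more without a list
    if p.patch_tracking:
        score += 3
        if not p.mailing_list:
            score += 3          # likewise for a patch tracker
        if p.bug_tracking:
            score -= 2          # overlaps with the bug tracker
    # Release cadence: quarterly releases earn the full 10,
    # monthly betas earn the full 5.
    releases = min(10, (p.releases_per_year / 4) * 10)
    betas = min(5, (p.beta_releases_per_year / 12) * 5)
    if not p.scm_used:
        # Without public SCM, releases are the only way to get the source.
        releases *= 1.5
        betas *= 2
    return score + releases + betas

# FSlint: bug tracker and SCM but no mailing list, 3 releases a year
# -> 15 + (4 + 4) + 7.5 = 30.5, matching the table entry.
print(openness_score(Project("FSlint", bug_tracking=True, scm_used=True,
                             releases_per_year=3)))
```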
Update Jan 2007:
The Register reported that SQO-OSS is one of four parts of a €50m EU project coordinated by UK Open Source Services firm Sirius.
Also, the util-linux project has been forked and its quality has much improved from the score presented above.
Update Aug 2007:
I noticed the Qualoss project, which has a nice list of related EU-funded projects.
Update Jun 2009:
I noticed Tom 'spot' Callaway's OSS Fail metrics.
Update Sep 2016:
I noticed the Apache Project Maturity Model.