Hi Kohsuke,
Kohsuke Kawaguchi wrote:
> I'd propose you count a skipped test as a failure. If you just don't
> run such a test at all, it's too easy to forget it and leave it
> skipped forever.
Counting a skipped test as a failure kind of defeats the purpose of
skipping it: these tests already fail or error. I want to make sure that
the WSIT/JAXWS developers who are required to run these tests before
check-in can assume that every test that actually runs is expected to
pass... which has not been the case thus far. Expecting a developer to
go look at Hudson to confirm that he/she is seeing the same number and
kind of failures/errors would likely cause more problems than skipping
tests that are known to be broken.
I think it would be really handy if we could report the count/list of
skipped tests separately... either have JUnit keep track of them
(somehow) or add code to the harness itself to track and record them.
Does something like that sound acceptable?
thanks,
Ken