quality@glassfish.java.net

Re: Request for comments : FishCAT, the way forward

From: Kristian Rink <rink_at_planconnect.de>
Date: Thu, 1 Jul 2010 08:50:56 +0200

Folks;

On 30.06.2010 09:12, Judy Tang wrote:
> But let me write something fun, the discussion here seems like a
> debate between socialism and capitalism :-)

Oh well, I hope this discussion didn't really come across like that. :)
Actually, talking about things being "fun": if anything, I think
"capitalism-vs.-socialism" (or a clash of systems on whatever level)
is the last thing this is about... People are probably into a thing
like FishCAT for a whole set of completely different reasons, including
being a personal stakeholder in the project (for example when
running/deploying applications on top of the system), being technically
enthusiastic and curious about things, simply enjoying communication
and collaboration in an open "tech/geek" environment, or maybe out of
altruism as well. There's a whole diverse bunch of different
motivations in my opinion, and none of them is better or worse than
the others. ;)

Anyway, _final_ notes on that as far as I am concerned. Let's move
on. :)



> http://wiki.glassfish.java.net/Wiki.jsp?page=FishCAT2010SecHalf
>
>
> Current Focus for this FishCAT phase
>
> Focus of FishCAT will be on testing GlassFish 3.1, but we may want to
> check there are no regressions in 3.0.1 and verify that key bugs in
> 3.0 that were marked as fixed in 3.0.1 were indeed fixed correctly.
[...]
> There are many issues fixed from 3.0 release. If you can prove
> regression from version 3.0 to 3.0.1 or 3.1, please send a special
> message to the quality alias with subject 'Regression proved!' and the
> issue number. Please see points awarded for this type of test.

True, IMO this should definitely be part of FishCAT's future direction.
However, I have been thinking a lot lately about the term "CAT" as in
"community acceptance test", and I do have a few thoughts on that...

- Generally, going by the terminology alone, it seems to be a "test
phase", probably late in a software project's development cycle, meant
to figure out whether (or not) a "community" is willing and ready to
"accept" a piece of technology, a product, ... in its present state
and, if not, to provide feedback on why that is, so it can be made
better. I wonder how, within the GlassFish development process, this
differs in objectives and target group selection from, say, a beta
testing program?

- Given a "community", formed by whichever rules and motivations, each
person involved will have a very special set of requirements based upon
which (s)he will be willing (or not willing/ready) to "accept" the
given development state. Considering a few different kinds of
stakeholders that might be part of FishCAT are...

 - people who do consulting, training, coaching on top of GlassFish /
   Java EE or write books / documentation on these topics,

 - people who sell and support applications that are meant to run
   inside a "Java EE compliant" container,

 - people who are involved in other Java (EE) open source projects
   and want their applications / libraries to work with GlassFish, too,

 - people who run and deploy "application bundles" or SaaS offerings
   including a Java EE container which may happen to be GlassFish,
   provided the platform does all they need,

 - people who run Java EE hosting environments for customers to run
   their applications in and, consequently, require an application
   server platform ready and capable of dealing with virtually all
   kinds of Java EE applications out there,

 - spare time enthusiasts interested in the platform, ready and willing
   to play with it (and possibly, in some areas, take it for a hard
   test drive),

 - students who want to learn both about the internals of a Java EE
   application server and, ideally, about how a professional, yet
   close-to-community software development process works,

 - ... ?

Each of these groups of stakeholders is likely to have a different
(though not disjoint) set of requirements and, consequently, criteria
for (not) accepting the platform in a Community Acceptance Test
procedure.

- Likewise, there are different general areas of testing in which
acceptance might (or might not) be gained and which, consequently,
should be part of the testing, most notably...

 - runtime aspects -> including everything relating to using GlassFish
   as a runtime and deployment platform for any kind of application,
   whether critical or not,

 - management aspects -> including everything that relates to keeping
   track of the server while running, like monitoring, logging,
   debugging, general module / component / resource administration, ...

 - development aspects -> including everything related to building
   applications targeted at running on this platform (like IDE tooling,
   documentation, ease of testing in a local development
   environment, ...),

 - documentation aspects -> including availability and quality of
   documentation relating to any real-life aspects of the server
   platform that might be of interest.

- People will have different amounts of time at hand, as well as
different levels of intensity at which they can interact with the
FishCAT community:

 - Someone who is testing while deploying his/her applications to
   GlassFish, making them run on the latest revision and communicating
   only when encountering issues along the way, is likely to be pretty
   quiet as soon as all the applications in question behave as they
   should in the new environment.

 - A student, for example, who is learning about software testing
   and debugging will have lots of time at hand for planned and
   coordinated testing, trying to track down actual issues and
   possibly even figuring out how to fix them.

 - People doing Java EE coaching and consulting, possibly preparing
   to move to the new server platform, are likely to provide input on
   a lot of things concerning ease of use and ease of development on
   the platform.

Thoughts? I wonder whether it would be worth setting up a table
enumerating the various groups of stakeholders and their specific
requirements, to see who can do what?

Cheers,
K.

-- 
Dipl.-Ing.(BA) Kristian Rink * Software- und Systemingenieur
planConnect GmbH * Könneritzstr. 33 * 01067 Dresden
fon: 0351 215 203 71 * cell: 0176 2447 2771 * mail: rink_at_planconnect.de
Amtsgericht Dresden HRB: 20 015 * St.-Nr. FA DD I 201 / 116 / 05360
Geschäftsführer: Stefan Voß, Karl Stierstorfer