[Ns-developers] Request for Merge -- Validation
mathieu.lacage at sophia.inria.fr
Thu May 7 00:34:30 PDT 2009
On Wed, 2009-05-06 at 21:50 -0700, Tom Henderson wrote:
> Mathieu Lacage wrote:
> > On Tue, 2009-05-05 at 19:18 -0700, Tom Henderson wrote:
> >>> 1) Changes to regression.py to allow non-trace-based tests. The files in
> >>> regression/tests work just like they used to, except if there's an attribute
> >>> called "trace_compare" in the test module. In this case, it tells
> >>> regression.py to just look at the return code from the test program. See
> >>> regression/tests/test-rng-exponential.py for an example.
> >> I'd prefer that we do not make trace-based comparison the default
> >> choice, as a hint to not lean on this mode of testing too much.
> > +1
> >>> 2) There are four new .py files in regression/tests corresponding to the
> >>> tests for four of the rng distributions I socialized a while back.
> >>> 3) There is a new valver (validation and verification) directory to hold
> >>> dedicated tests.
> >> I would prefer tests/ to valver/.
> > +1
> >>> 4) In the valver directory there is an rng directory in which you can find
> >>> four .cc files corresponding to the chi-square tests for the four
> >>> distributions.
> >>> N.B. The rng validation tests introduce a GSL dependency.
> >>> With this in place, we can begin to write validation and verification tests
> >>> that do not use the trace comparison function.
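(As a side note, the kind of chi-square goodness-of-fit check being described could be sketched in pure python — this is illustrative only, not the actual GSL-based code in valver/rng; the function name, bin count, and critical value are mine:)

```python
import math
import random

def chi_square_exponential(samples, rate, num_bins=10):
    """Chi-square goodness-of-fit statistic for exponential samples.

    Bin edges sit at equal-probability quantiles of the exponential
    CDF F(x) = 1 - exp(-rate * x), so every bin has the same expected
    count under the null hypothesis.
    """
    n = len(samples)
    expected = n / num_bins
    edges = [-math.log(1.0 - k / num_bins) / rate for k in range(1, num_bins)]
    counts = [0] * num_bins
    for x in samples:
        i = 0
        while i < len(edges) and x > edges[i]:
            i += 1
        counts[i] += 1
    return sum((c - expected) ** 2 / expected for c in counts)

random.seed(1)
samples = [random.expovariate(1.5) for _ in range(10000)]
stat = chi_square_exponential(samples, rate=1.5)
# Compare stat against the chi-square critical value for
# num_bins - 1 = 9 degrees of freedom (about 16.92 at the 5% level);
# a test program would return non-zero when the statistic exceeds it.
```

A real valver test would do the same computation with GSL (hence the new dependency) and signal the result through its exit code.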
> >> My main comment/question is that there doesn't seem to be any
> >> integration with the unit test framework, which could also be performing
> >> the rng checks you are doing.
> >> - what type of test should be run by ./waf check, and what type of test
> >> by ./waf --regression? I suppose that trace-based and python script
> >> tests are constrained to the python-based regression framework, but what
> >> is the guidance to give to people to put C++ test programs into the unit
> >> tests or into the python-based framework?
> > I am worried that the need to add a .py file for each new test, as well
> > as the need to write a main function for each new test, increases the
> > cost of writing tests. I feel that we should try to lower the barrier to
> > writing tests as much as possible, and that the current regression+valver
> > python-based wrappers raise that barrier too much.
> How would you prefer to store the metadata about the test (e.g. whether
> a regression trace should be checked) if not in a separate .py file?
Well, we could assume for now that anything under regression/ should be
checked for regression trace output. Could we not?
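(A rough sketch of what I mean — the helper name is invented and the real regression.py logic may differ; the trace_compare attribute is the one from Craig's patch:)

```python
import types

def should_compare_traces(test_module, test_path):
    """Hypothetical dispatch: trace comparison is the default for
    anything under regression/, unless the test module explicitly
    opts out via a trace_compare attribute."""
    if not test_path.startswith("regression/"):
        return False
    return getattr(test_module, "trace_compare", True)

# A module that opts out, as test-rng-exponential.py is described to:
mod = types.ModuleType("test_rng_exponential")
mod.trace_compare = False
assert not should_compare_traces(mod, "regression/tests/test-rng-exponential.py")

# A module without the attribute keeps the trace-based default:
legacy = types.ModuleType("legacy_test")
assert should_compare_traces(legacy, "regression/tests/legacy-test.py")
```

The directory prefix alone would then carry most of the metadata, without a .py file per test.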
> >> - can C++ tests in the new directory use the src/core/test.h framework
> >> including macros? Should we have an example along those lines?
> > I feel that src/core/test.h could indeed be used to write our new tests,
> > but maybe it would make sense to consider ditching src/core/test.h
> > altogether in favour of something easier to use, such as
> > http://code.google.com/p/googletest/ (it should be a matter of dropping
> > gtest.h/cc in src/core). I would be fine with doing the work of porting
> > our existing tests to a new framework such as this one.
> Can you clarify, are you suggesting to investigate googletest and that
> you would like to prototype this by porting existing unit tests to this
> framework before proceeding further on Craig's patch?
I think we need to make sure that the barrier to writing tests is low.
We could reuse the src/core/test.h code to lower it relative to Craig's
current patch. We could also go even further and reuse something like
googletest, which does a pretty amazing job at making it easy to write
tests. I could live without this fancy stuff, but my point is that I
think it would be a good investment of everyone's time to try to lower
the barrier to writing tests as much as humanly possible. This would
make it much easier for us to ask code submitters to provide tests.
> >> It seems like it would be useful to have these additional examples
> >> available for ns-3.5:
> >> 1) a deterministic model verification example, such as a TCP example
> >> 2) a python-based example script that doesn't rely on trace comparison
> >> but at this stage, it would be helpful to first resolve the ./waf check
> >> vs. ./waf regression question, and to get agreement that something like
> > A related question is where newly-written tests should go. Should new
> > tcp tests go in src/internet-stack/tcp-test.cc? Should they go in
> > tests/tcp-test.cc, or in tests/internet-stack/tcp-test.cc? I would be
> > fine either way and would be happy to move the current unit tests
> > somewhere in tests/ if we feel that this is the way to go.
> I don't have a strong preference, but in the past, I have found it
> convenient to have the unit tests co-located with the implementation so
> I can quickly see example usage. I was assuming that self-contained
> unit tests could stay with the implementation, but that tests that
> required composition of several models could be in a separate test
> directory.
Do you have examples of such tests that require the composition of
several models?