[OTDev] Performance testing and monitoring

Christoph Helma helma at in-silico.de
Fri Mar 12 17:20:00 CET 2010


Dear Vedrin,

You can restart the tests for our services (with the exception of the
fminer test that I mentioned in my previous email). In fact, we would
need some continuous testing to find out what caused the high latencies
you mentioned.

For the destructive tests I will send you requests in the format below
(probably after the Easter holidays - I do not want to troubleshoot
during the holidays).
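To give you an idea already, such a request could look like this (the
name, URIs and the curl call are just placeholders for illustration, not
our actual service details):

  Name: lazar-prediction
  Description: Builds a lazar model from a training dataset
  Access: curl -X POST -d "dataset_uri=http://example.org/dataset/1" http://example.org/algorithm/lazar
  Frequency: 1 query per hour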

And from time to time I will have to ask you to stop testing (e.g.
while debugging server configurations - it is hard to spot errors if
there are too many entries).

Best regards,
Christoph

PS: Can you also create scripts with sequences of calls (e.g. create
dataset, create fminer features, wait for task, create lazar model, make
prediction, check for the correct answer, delete model, delete feature
dataset, delete dataset)? That is how we test internally - it guarantees
that you have all necessary resources available and that it is possible
to clean up with a post-test hook.
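A rough sketch of such a sequence script, using plain curl calls - all
URIs, parameters and the task/status convention below are assumptions
for illustration only, not our actual interfaces - could be:

  #!/bin/sh
  # Hypothetical sequence test; every URI, parameter and the task/status
  # convention used here is a placeholder, not the real service interface.
  set -e
  SERVICE="http://example.org/opentox"

  # create a dataset from a local training file
  dataset=$(curl -s -X POST -F "file=@training.csv" "$SERVICE/dataset")

  # create fminer features for the dataset; assume a task URI is returned
  task=$(curl -s -X POST -d "dataset_uri=$dataset" "$SERVICE/algorithm/fminer")

  # wait for the task to finish, then fetch the feature dataset URI
  while [ "$(curl -s "$task/status")" = "Running" ]; do sleep 5; done
  features=$(curl -s "$task/result")

  # create a lazar model and make a prediction for a test compound
  model=$(curl -s -X POST -d "dataset_uri=$dataset&feature_dataset_uri=$features" "$SERVICE/algorithm/lazar")
  prediction=$(curl -s -X POST -d "compound_uri=http://example.org/compound/1" "$model")

  # check for the correct answer (the expected value is a placeholder)
  echo "$prediction" | grep -q "1.0" || echo "FAILED: unexpected prediction"

  # post-test hook: delete everything that was created
  curl -s -X DELETE "$model"
  curl -s -X DELETE "$features"
  curl -s -X DELETE "$dataset"

The actual calls will of course have to match the real service
interfaces once we send you the request details.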

Excerpts from Vedrin Jeliazkov's message of Fri Mar 12 16:15:56 +0100 2010:
> Dear Partners,
> 
> Since some of you have expressed concerns that the performance testing
> and monitoring we've been running so far is not suited to your needs,
> I've changed our setup -- from now on only IDEA's services will be
> continuously tested and monitored by default:
> 
> http://ambit.uni-plovdiv.bg/cgi-bin/smokeping.cgi
> 
> In fact, we use these test results on a daily basis in our development
> work and don't mind sharing them publicly.
> 
> We will perform testing and monitoring for any other services only on
> demand. Those of you who wish to have some of your services
> tested/monitored through our platform should send me the following
> details for each target you would like to test:
> 
> Name: <short name of the resource being tested>
> Description: <description of the service being provided>
> Access: <curl call to be executed>
> Frequency: <desired number of queries per given time period>
> 
> Partners who prefer to deploy their own performance testing and
> monitoring solutions are of course free to do so. They should provide a
> full description of their setup and findings and allow third parties to
> run independent tests in order to confirm their results. The following
> OpenTox requirements, defined [1] during the technical meeting in
> Munich in June 2009, are particularly relevant:
> 
> -- performance;
> -- scalability;
> -- stress;
> -- failover;
> -- reliability;
> -- robustness;
> -- interoperability;
> 
> All partners performing software development should ensure that
> their software meets these goals in an optimal way.
> 
> Kind regards,
> Vedrin
> 
> [1] http://www.opentox.org/dev/testing/testingoverview


