Reg: DHIS2 docker instance

Hi Jason & Paulo,

Hope you are doing good.

It has been a long time! We are writing tests for metadata sync. All the tests are added to the repo here.

In the above repository, navigate to the path API_Test/testcases/metadatasync/integration/.

There you will find a generic test, "ImportMetadataTest", which we wrote to verify how various types of metadata entities sync from the HQ/central instance to the local/field instance.

There are two ways of running the test:

  1. Running the test without any database on HQ or Local.

This tests how sync behaves for various metadata entities on two fresh instances with no data model on them. All we need is metadata versions in this folder: API_Test/testdata/metadatasync/versiondata

The folder can hold any number of versions; it depends on how the user wants metadata sync to happen and which metadata associations or disassociations the user wants to test. For now I have kept two version files.

To run the test for a given version, say Version_1, run: env version="Version_1" mocha ImportMetadataTest.js --timeout 20000. This command can be added to a shell script to run one version after the other, as in the integrationTestsWithoutDB.sh file. The test first imports the version data on the HQ/central instance using the import API, and then the Local/field instance syncs the version from HQ.
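For illustration, the env-driven import step might look roughly like this with mocha and chakram (a sketch only: the instance URL, credentials, and payload path are assumptions, not the actual repo contents):

```js
// Sketch of the import step in an ImportMetadataTest.js-style test. Run as:
//   env version="Version_1" mocha ImportMetadataTest.js --timeout 20000
var chakram = require('chakram');
var expect = chakram.expect;

// The version under test is picked up from the environment.
var version = process.env.version || 'Version_1';

// Assumed locations and credentials; in the real suite these would live in a config/env file.
var hq = 'http://hq.example.com';
var auth = { auth: { user: 'admin', pass: 'district' } };
var payload = require('./testdata/metadatasync/versiondata/' + version + '.json');

describe('Import metadata ' + version + ' on HQ', function () {
    it('imports the version payload through the import API', function () {
        var response = chakram.post(hq + '/api/metadata', payload, auth);
        expect(response).to.have.status(200);
        return chakram.wait();
    });
});
```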

Once the version is synced to Local/Field, we run two assertions. The first compares the data at

http://local/api/metadata/version/Version_1/data with http://HQ/api/metadata/version/Version_1/data.

The second compares all the entities present in that version individually: say the version contains an array of data elements, it picks each data element and compares them one by one, then continues likewise for the other entity types.

e.g., it will compare http://local/api/dataElements/{id} with http://HQ/api/dataElements/{id}.
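A sketch of these two assertions with chakram might look like the following (instance URLs and credentials are again assumptions):

```js
// Sketch of the two sync assertions: whole-version compare, then entity-by-entity compare.
var chakram = require('chakram');
var expect = chakram.expect;

var hq = 'http://hq.example.com';
var local = 'http://local.example.com';
var auth = { auth: { user: 'admin', pass: 'district' } };
var version = process.env.version || 'Version_1';

describe('Sync assertions for ' + version, function () {
    it('version data matches between Local and HQ', function () {
        return Promise.all([
            chakram.get(local + '/api/metadata/version/' + version + '/data', auth),
            chakram.get(hq + '/api/metadata/version/' + version + '/data', auth)
        ]).then(function (responses) {
            // First assertion: the whole version payload is identical on both instances.
            expect(responses[0].body).to.deep.equal(responses[1].body);
        });
    });

    it('each data element matches between Local and HQ', function () {
        return chakram.get(hq + '/api/metadata/version/' + version + '/data', auth)
            .then(function (response) {
                var elements = response.body.dataElements || [];
                // Second assertion: compare every entity in the version, one by one.
                return Promise.all(elements.map(function (de) {
                    return Promise.all([
                        chakram.get(local + '/api/dataElements/' + de.id, auth),
                        chakram.get(hq + '/api/dataElements/' + de.id, auth)
                    ]).then(function (pair) {
                        expect(pair[0].body).to.deep.equal(pair[1].body);
                    });
                }));
            });
    });
});
```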

  2. Running the test on two instances where HQ already has n versions [a pre-defined database]. We use the same script to import version by version. It also runs a couple of assertions on the metadata once synced. The first assertion is the same as above: it compares the data at http://local/api/metadata/version/Version_1/data with http://HQ/api/metadata/version/Version_1/data.

But the next-level comparison is a bit different: it compares the entity data present in

http://local/api/metadata/version/Version_1/data with http://local/api/dataElements/{id}.

The version payload does not carry the entire JSON for any entity, only very limited details, so we compare just those minimal entities, fetching them from Local/Field with JSON field filters in the API call.

We chose this kind of assertion because, say, a user has Version_1 containing a data element named abcd1234, and the name might have changed to abcd12345 in Version_2. Since HQ has all n versions in it, comparing the full JSON of that element on HQ and Local would show different names, so we took this approach instead.
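A minimal sketch of that filtered comparison (DHIS2's ?fields= filter is standard, but the URLs, credentials, and the exact fields compared here are assumptions):

```js
// Sketch of the minimal comparison for the pre-loaded-HQ case: compare only the
// fields the version payload actually carries, fetched via the ?fields= filter.
var chakram = require('chakram');
var expect = chakram.expect;

var local = 'http://local.example.com';
var auth = { auth: { user: 'admin', pass: 'district' } };
var version = process.env.version || 'Version_1';

describe('Minimal entity comparison for ' + version, function () {
    it('version payload entities match the minimal entities on Local', function () {
        return chakram.get(local + '/api/metadata/version/' + version + '/data', auth)
            .then(function (response) {
                var elements = response.body.dataElements || [];
                return Promise.all(elements.map(function (de) {
                    // Fetch only the minimal fields, so later renames in newer
                    // versions on HQ do not enter the comparison.
                    return chakram.get(local + '/api/dataElements/' + de.id + '?fields=id,name', auth)
                        .then(function (entity) {
                            expect(entity.body.id).to.equal(de.id);
                            expect(entity.body.name).to.equal(de.name);
                        });
                }));
            });
    });
});
```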

Can you please have a look at this and let me know if any changes are required.

Thanks & Regards,

Nalinikanth M

···

On Thu, Aug 18, 2016 at 11:32 AM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hey All,

I am Nalinikanth, QA on the MSF-OCA project, and we are using DHIS2. We are building API automation suites as part of our project. We are working along with Jason P. and Paulo. We have been discussing how to take this forward, and you can find our discussion thread in this mail.

Please do comment or provide feedback if you have any ideas or thoughts around the same.

I am attaching the repos as well for your reference.

https://github.com/msf-oca-his/API_Test

https://github.com/dhis2/api-tests

Feedback from the community would be much appreciated.

Thanks & Regards,

Nalinikanth



On Wed, Aug 10, 2016 at 3:41 PM, Vanya Seth vanyas@thoughtworks.com wrote:

Hi All

It makes sense to make this discussion public, so that other members of the community can also provide their inputs.

Regards

Vanya



On Tue, Aug 9, 2016 at 1:36 AM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

is it the idea to keep a different repo with the tests for metadata versioning?

Regarding how to set up data, I don’t have strong opinions on this. I think we should try one approach and see if it works. The initial idea was to have a docker image already baked with the data we want for each test execution, which we can control using docker compose.

https://github.com/dhis2/api-tests/blob/master/docker-compose.yml

– Paulo

On Wed, Aug 3, 2016 at 3:21 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason/ Paulo,

Hope you are doing well.

As a part of the API testing we have written some tests for the metadata versioning APIs, which is a core feature contributed by us to DHIS2 version 2.24. We made minor changes to the folder structure. We leverage before and after functions to set up and tear down data. Please have a look at the tests here, and do let us know any feedback on them.

I have been through the repo that Paulo was working on; the way he extracted the version into the env.js file looks okay, but we did it in a slightly different way. Either way, that helps give the tests the ability to run across multiple versions of DHIS2.

One more thing to discuss: we can do contract testing of APIs, which might not need predefined data in the database, but in some cases, like testing dataValueSets or other similar APIs, we need data that is already set up. Similarly, we want to leverage the API testing to do integration tests as well. This requires a database to be set up before the tests run on the system. For that we can have an empty DHIS2 instance on which we set up the data, and remove the database once the tests have run. We are looking at two ways to accomplish this:

  1. Setting up the database from a dump using SQL scripts.
  2. Creating data using the metadata import API, where the setup runs before the tests.
    We feel setting up metadata using the API will be more useful, as we can leverage it irrespective of the database we are using and it creates data properly across versions, whereas a SQL setup would have to be maintained and migrated for every DHIS2 release. So we would rather not go the first way, and feel the second way of setting up the data required for tests makes more sense; a sketch of it follows below. Can you please share your thoughts on this as well?
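To make the second option concrete, here is a minimal sketch of an API-driven setup in a mocha before hook (the fixture payload, instance URL, and credentials are placeholders, not our actual test data):

```js
// Sketch of option 2: create test metadata through the import API before the
// tests run, and tear it down afterwards. All names below are placeholders.
var chakram = require('chakram');
var expect = chakram.expect;

var base = 'http://dhis2.example.com';
var auth = { auth: { user: 'admin', pass: 'district' } };

// A small metadata fixture kept alongside the tests.
var fixture = {
    dataElements: [
        { name: 'Test element', shortName: 'Test element', domainType: 'AGGREGATE',
          valueType: 'INTEGER', aggregationType: 'SUM' }
    ]
};

describe('dataValueSets tests', function () {
    before(function () {
        // Import the fixture; POST /api/metadata is DHIS2's metadata import endpoint.
        var response = chakram.post(base + '/api/metadata', fixture, auth);
        expect(response).to.have.status(200);
        return chakram.wait();
    });

    after(function () {
        // Tear-down would delete the created entities here, e.g. via
        // DELETE /api/dataElements/{id}, or simply drop the throwaway instance.
    });

    it('reads data value sets built on the imported metadata', function () {
        // contract/integration assertions against the imported metadata go here
    });
});
```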

Thanks & Regards,

Nalinikanth M



On Tue, Jun 28, 2016 at 5:54 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Thank you Paulo, Enjoy your vacation we can discuss once you are back :slight_smile:


Thanks & Regards,

Nalinikanth M


On Tue, Jun 28, 2016 at 5:51 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi, I think Jason is on vacation and I’m also leaving tomorrow. Just a heads up that the repo for the tests is now this one. https://github.com/dhis2/api-tests

BR,

Paulo

On Tue, Jun 28, 2016 at 1:00 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason & Paulo,

Hope you are doing good. We were busy pushing the metadata sync feature to DHIS2 trunk to make it in time for the 2.24 release. We are done with that, and I got some time to resume the automation. I was looking at https://github.com/pgracio/dhis2-api-system-test/; the tests are good and I would also like to use the same kind of test structure. Some clarifications though:

  1. How will we maintain the tests with respect to versioning of APIs?

As we know, DHIS2 will now be versioning its APIs, and there will likely be support for the last three API versions. So we should be mindful of leveraging these tests for future versions while at the same time keeping them for previous versions as well.

One possible approach we thought of: say we wrote tests against the version 23 APIs and then the version 24 APIs are released; we can clone the 23 repo into a new repo for version 24, run all the tests, and raise bugs for valid breakages or fix the tests if the contract of the APIs has changed. This way we can have multiple repos for multiple versions of the APIs. The only thing we need to take care of is extracting the URL into an env file to make it easy to maintain (a small sketch follows after this list). Or we can have a folder for each version in a single repo.

  2. We already discussed having tests that set up the required data using APIs, which looks good for now. This should work fine when we test APIs for data elements, data sets, etc. But in the bigger picture, if we have to write tests for APIs like dataValueSets (which return the data values of a data set), the entities involved are data elements, data sets, users, and organisation units, with a good number of associations between them. So what do you think about such cases? Can we have a small database to preset these associations, on which we can write tests and assert?
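For the env-file idea in point 1, a hypothetical env.js could be as small as this (names and defaults are made up for illustration):

```js
// Hypothetical env.js: one place for the instance URL and the DHIS2 API
// version, so the same suite can be pointed at different API versions.
module.exports = {
    baseUrl: process.env.dhis2_url || 'http://localhost:8080',
    apiVersion: process.env.api_version || '24',
    // api('/dataElements') -> http://localhost:8080/api/24/dataElements
    api: function (path) {
        return this.baseUrl + '/api/' + this.apiVersion + path;
    }
};
```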

Understanding the above things would help us make the tests scalable.

If you have any other things apart from this, we can discuss them as well. Please share your opinions on these things.

Thanks & Regards,

Nalinikanth M



On Mon, Jun 13, 2016 at 6:16 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason,

I see where you are coming from in terms of the testing perspective. Different DBs can be a good input for testing the metadata import and export APIs specifically. But for other APIs that need a known DB state to test against, DBs that are not compatible with the version being tested would be a problem.

@Paulo

I do agree with you on Option 3, so let us continue this email chain; if necessary we can set up a call. So let us keep the discussion going here.

Regards,

Nalinikanth M



On Mon, Jun 13, 2016 at 12:22 PM, Jason Pickering jason.p.pickering@gmail.com wrote:

Hi there.

Here is my perspective. The entire purpose of the integration tests is to test these types of scenarios. “Is it possible to perform a bulk metadata export from an arbitrary database?” is sort of the test, I think. Well, in this case, the developer of the API (Morten) tells you not to use this database because it is “old”. Well, it may be old, but it is also on the current version, so if this feature is supposed to work, well, it should work. If not, then we need to figure out why. That is the purpose of the test. I would expect this same test to work on any arbitrary database, so I think it’s perfectly legitimate, and see no reason why we should not test the SL database. Having said that, I think we should also test others, such as Trainingland, and enable the tests in such a way as to allow people to arbitrarily test whichever system they wish. For the main tests, I think we should use the SL database specifically because it is “old” and in many ways resembles a system which has been around a long time. Specifically for that reason, it should be tested, at least for certain test scenarios.

And having said all of that, we should not be testing scenarios which the feature developers wish us to test. That is not the point of these tests either. Currently the feature devs are writing their own tests, which is never really a good thing. The purpose of having an external team to develop these tests is to test things which maybe the feature devs don’t consider or don’t want to test.

Hope that helps to clarify my thinking here on what the original intent of these integration tests were. Does that help?

Regards,

Jason

P.S. Paulo’s time is very limited on this project, as he is acting as a part-time consultant to HISP Nordic. I suggest that we try to limit the need for calls unless really urgent, especially if Paulo needs to be involved. If you still feel a call is needed, let’s try to start with me and then bring Paulo in as needed. Paulo, you OK with that?



On Mon, Jun 13, 2016 at 8:35 AM, Paulo Grácio paulogracio@gmail.com wrote:

Hi, option 3 seems to me the best approach for now. What do you think?

Today I have a very busy day, but probably tomorrow morning we can have a call. What about 08:00AM CEST?

/Paulo

On Mon, Jun 13, 2016 at 7:55 AM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hey Paulo,

Thanks for your efforts, will try and let you know :slight_smile:

Jason & Paulo,

Our team sees a potential problem in using the SL database, and we are also unsure how to go ahead with the tests, especially regarding which database to use.

Here are the options that we are looking at:

Option 1: Set up an empty vanilla instance. It is an empty database where we can set up data using APIs and tear it down once the tests are done. The entire dataset can be set up from a JSON file, or data can be created as required for every test.

Option 2: Set up a database with a known state, e.g. the SL database. The state is maintained, and we set up the database before starting the execution of the tests. As we are using Docker, we will have a new instance every time, i.e. a fresh SL database.

Option 3: A database with a known state and very little metadata in it, to which we can add new data using APIs, as required for every test.

Can we have a call to discuss this further? A 30-minute call would do.

Thanks & Regards,

Nalinikanth M



On Sat, Jun 11, 2016 at 2:38 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi,

I have built a new image that can now be used to start a database container for 2.23-sierra-leone:

image: pgracio/dhis2-db:2.23-sierra-leone

Give it a try and let me know if you have problems.

Regards,

Paulo

On Fri, Jun 10, 2016 at 10:44 AM Jason Pickering jason.p.pickering@gmail.com wrote:

I suggest that we use the SL demo. Reason being, it is stable, and does not change that much. I think that we can start with this. The Trainingland database is still under very active development. However, I don’t feel it makes a big difference. What is important is that we use a database which we know the state of. I think if Paulo can build a docker image tied to a given revision of the database, and we base our tests off of that, that would be the best approach.

Regards,

Jason

On Thu, Jun 9, 2016, 15:24 Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

I was using this as a reference to write the tests. https://github.com/dareid/chakram/blob/master/examples/spotify.js

Currently using an empty database, but we can use training. No strong opinions on this. Normally I prefer to have tests that don’t depend on database state, but in some situations it might be very difficult to create the desired state before running the test.

To make sure we are all on the same page it’s important that we use pull requests, before we merge things to master.

BR,

Paulo

On Thu, Jun 9, 2016 at 2:57 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason,

It is about what state of database we are going to use: the training database, Sierra Leone, any other known-state database, or a vanilla instance. Basically, what is the state of the database?

And how the tests will look: we can write them in so many ways, so this is about making sure that we are all on the same page.

A 30-minute call would be enough for this.

Best Regards,

Nalinikanth



On Thu, Jun 9, 2016 at 6:20 PM, Jason Pickering jason.p.pickering@gmail.com wrote:

Hi Nalinikanth,

Paulo and I have quite limited time for this activity. Could you outline what the call would be about?

Regards,

Jason

On Thu, Jun 9, 2016, 14:48 Paulo Grácio paulogracio@gmail.com wrote:

which time zone are you in?

On Thu, Jun 9, 2016 at 2:42 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Paulo,

I am working on it. Yes, I am actually looking to discuss a few things with you and Jason. Can we set up a call based on your availability? I am planning to have a call with you and Jason next week.

@Paulo, @Jason

Please let me know your availability.


Thanks & Regards,

Nalinikanth M


On Thu, Jun 9, 2016 at 5:54 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

are you doing any work on the system tests? I had a look at your repo and was considering merging that with what I have.

BR,

Paulo

On Fri, Jun 3, 2016 at 1:34 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Paulo,

Thanks for your valuable inputs. I will try it and come back to you.

Regards,

Nalinikanth



On Fri, Jun 3, 2016 at 3:22 PM, Paulo Grácio paulogracio@gmail.com wrote:

HI Nalinikanth,

Currently, new dhis2 war files for version 2.23 are generated and published by this job: http://ci.dhis2.org/job/dhis2-2.23/. One of the final steps is copy-to-dhis2com.sh, which makes the new war available for download at https://www.dhis2.org/download/releases/2.23/dhis.war

I think we could have a downstream job that generates a new docker image from the previously generated war file and publishes it to docker hub. Automation for this can be found here:

https://github.com/pgracio/dhis2-docker/blob/master/docker-build.sh.

Once the docker image is successfully generated, we can run the system tests using docker compose.

https://github.com/pgracio/dhis2-api-system-test/blob/master/docker-compose.yml

All of this can be executed on the Jenkins server, if the server has the capacity to handle it, so there is no need to spin up new environments to execute the tests.

I see this as an initial step to introduce system tests. With the pipeline flow that I have described, we’ll still deploy the war file even if we detect potential errors during the system tests. A more long-term vision for the dhis2 pipeline would be:

#1 - build the dhis2 war file, without copying it to dhis2.com

#2 - build the docker image, without publishing it to docker hub

#3 - run the system tests; on success go to #4, else notify of broken tests.

#4 - copy the war file to dhis2.com and publish the docker image to docker hub.

Feel free to challenge this, it’s just one opinion. I guess the dhis2 developer community might have a say on this.

Best regards,

Paulo

On Fri, Jun 3, 2016 at 7:48 AM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

@Paulo: It is more about how the test environments are set up, say setting up a docker environment and running the tests on it, and how the tests are affected when the environment is set up using different continuous integration servers, say Jenkins/Travis/Go etc. This kind of thing is what I meant by maintaining environments.

Regards,

Nalinikanth



On Thu, Jun 2, 2016 at 12:23 AM, Paulo Grácio paulogracio@gmail.com wrote:

@Nalinikanth what exactly do you mean by “maintain environments”?

Best regards,

Paulo

On Wed, Jun 1, 2016 at 1:57 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

@Paulo: Thank you for your quick response. It’s working fine now.

I had a look at your API tests repo. It looks good; I wrote some tests quite a while ago using the same framework. I tried to extract the data out of the tests to reduce dependencies and make things easy to maintain. You can find them here. Please have a look at them and let me know your opinion.

@Jason & @Paulo: Maybe next week we can have a call to talk about how the tests should look and how we can maintain environments.

Thanks & Regards,

Nalinikanth M



On Fri, May 27, 2016 at 8:57 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

@Gracio I’m glad about your response, and I had a look at the API test repo; it looks good. I am on vacation till Tuesday. Will get back to you with my thoughts on it as soon as I’m back from vacation. I would love to talk more on the agreement as well; maybe we can set up a call later next week, or some time when it’s feasible for all of us.

Thanks & Regards,

Nalinikanth M


On 27-May-2016, at 7:46 PM, Paulo Grácio paulogracio@gmail.com wrote:

@Nalinikanth I have updated the repo; it should now be possible to start the service using the training database.

https://github.com/pgracio/dhis2-docker/blob/master/docker-compose.yml

https://hub.docker.com/r/pgracio/dhis2-db/tags/

Let me know if you have problems.

Best regards,

Paulo

On Fri, May 27, 2016 at 1:00 PM Paulo Grácio paulogracio@gmail.com wrote:

@Nalinikanth, as Jason has mentioned, I have created this repo to hold the API system tests:
https://github.com/pgracio/dhis2-api-system-test

This is an initial spike with 2 very basic tests. Please have a look to see if we can reach a common agreement on how to do the tests. It includes some manual steps, but soon I’ll add some automation to it to run the tests every time a new version is available.

Share your thoughts.

Best regards,

–Paulo

On Fri, May 27, 2016 at 12:18 PM Paul Grácio <paulogracio@gmail.com> wrote:

Hi Nalinikanth,

glad you are using dhis2-docker scripts :slight_smile:

Currently the dhis2-db image only works for versions 2.21 and 2.20; this needs some care from my side. I guess you are trying to run the latest version, 2.23.

@Jason, is there a snapshot database dump that works with version 2.23?

Best regards,

Paul Grácio

On Fri, May 27, 2016 at 11:00 AM Jason Pickering jason.p.pickering@gmail.com wrote:

Hi there.

I have been meaning to mail you about this. Paulo has another repo here

https://github.com/pgracio/dhis2-api-system-test

which we started last week. It includes some very simple Chakram-based tests.

I think this is more or less what we discussed a few weeks back. Paulo will also be working with us on this.

Maybe Paulo can comment more on the database.

I have another repo here

https://github.com/jason-p-pickering/dhis2-docker

which loads the Trainingland database. I think this should point you in the right direction.

At any rate, we should probably start to issue some PRs on Paulo’s repo and then eventually we will pull this into the main DHIS2 group repo.

Best regards,

Jason

On Fri, May 27, 2016 at 10:54 AM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

To add context, I am Nalinikanth M, QA at ThoughtWorks. We are working on DHIS2 for an MSF project. We wanted to automate a few tests on DHIS2. I got the docker repository from Jason, as we were looking into setting up test environments. As part of our test plan we want to use Docker instances to run the automated tests.

On Fri, May 27, 2016 at 1:55 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Gracio,

We are using the scripts from your repository to set up a docker environment for dhis2. We were able to get the application up on docker and can use it, but we are unable to get the Sierra Leone database loaded into the application. Can you please help us resolve this issue?

P.S.: We are new to docker; we are following your Readme and the docker documentation to set things up.

Thanks & Regards,

Nalinikanth M





Hi Nalinikanth,

is there any reason to keep a separate repo with these tests? It would be nice to have the tests as part of https://github.com/dhis2/api-tests

There is also a pipeline to execute the tests in Travis https://travis-ci.org/dhis2/api-tests

The docker image build is also automated; it runs once per day and publishes images to Docker Hub: https://hub.docker.com/r/dhis2/dhis2-web/tags/

Best regards,

Paulo

···


Hi Paulo,

There is no specific reason; these tests were still under development. They do have some dependencies, though; I will have a look and see how I can merge them.

Thanks & Regards,

Nalinikanth M

···

On Wed, Sep 28, 2016 at 10:04 AM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

is there any reason to keep a separate repo with these tests? It would be nice to have the tests as part of https://github.com/dhis2/api-tests

There is also a pipeline to execute the tests in Travis https://travis-ci.org/dhis2/api-tests

Docker image build is also automated, runs once per day, and is publishing images to Docker Hub https://hub.docker.com/r/dhis2/dhis2-web/tags/

Best regards,

Paulo

On Tue, Sep 27, 2016 at 2:41 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason & Paulo,

Hope you are doing good.

It has been a long time! We are writing tests for metadata sync. All the tests are added to the repo here.

In the above repository navigate to the path API_Test/testcases/metadatasync/integration/

there is a generic test “ImportMetadataTest" which we wrote for testing how various types of metadata entities will sync from HQ/central instance to local/field.

There are two ways of running the test

  1. To run this test without any database on HQ and Local.

To test how sync is behaving with respect to various metadata entities on two new instances without any data model on it. All we need is to have metadata versions in this folder - API_Test/testdata/metadatasync/versiondata

We can have any number of versions in the folder. It depends on how user wants metadata sync to happen or what all metadata associations or disassociations user wants to test. For now I kept two version files.

To run the test for Version_1 run this should be run using "env version=“Version_2” mocha ImportMetadataTest.js --timeout 20000” which is can be added to a shell script to run version one after the other like it is in integrationTestsWithoutDB.sh file. This will first import data on HQ/Central instance using import api and then Local/field instance will sync the version from HQ.

Once the version is synced to Local/Field then we are doing two tests. One is asserting the data in

http://local/api/metadata/version/Version_1/data with http://HQ/api/metadata/version/Version_1/data by comparing them.

Later it will compare all the entities(which are present in that version) individually say we have a array of data elements then it will pick all the data elements and compare one by one and continues for other entities as well.

e.g: It will compare http://local/api/dataElements/id with http://HQ/api/dataElements/id

  1. To test how sync is behaving with respect to various metadata entities on two instances where HQ already have n versions[Pre defined database]. We are using the same script to import version by version. It will also do a couple of assertions on top of the metadata when synced. The first assertion being same as above it will compare the data in http://local/api/metadata/version/Version_1/data with http://HQ/api/metadata/version/Version_1/data.

But the next level comparison is a bit different it will compare the entities by fetching the entity data which is present in

http://local/api/metadata/version/Version_1/data with http://local/api/dataElements/id

Here there won’t be entire json for any entity on http://local/api/metadata/version/Version_1/data this will contain very limited details we are just comparing the minimal entities getting them from Local/Field using jsonfilters in api call.

We had this kind of assertions because say user has Version_1 and has a data element abcd1234 and the name might have changed in Version_2 abcd12345 as HQ has got n versions in it so if we want to compare json of it on both HQ and Local we have different names so we took this approach.

Can you please have a look at this and let me know if any changes are required.

Thanks & Regards,

Nalinikanth M

On Thu, Aug 18, 2016 at 11:32 AM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hey All,

I am Nalinikanth, QA on the MSF-OCA project and we are using DHIS2. We are building API automation suites as a part of our project. We are working along with Jason. P and Paulo. We have been discussing on how to take this forward and you can find our discussions thread in this mail.

Please do comment or provide feedback if you have any ideas or thoughts around the same.

I am attaching the repos as well for your reference.

https://github.com/msf-oca-his/API_Test

https://github.com/dhis2/api-tests

Feedback from the community would be well appreciated.

Thanks & Regards,

Nalinikanth


Thanks & Regards,

Nalinikanth M

Quality Analyst

Email
nalinim@thoughtworks.com
Telephone
+91 9052234588
ThoughtWorks

On Wed, Aug 10, 2016 at 3:41 PM, Vanya Seth vanyas@thoughtworks.com wrote:

Hi All

It makes sense to make this discussion public. So, that other members of the community can also provide their inputs.

Regards

Vanya


Thanks & Regards,

Nalinikanth M

Quality Analyst

Email
nalinim@thoughtworks.com
Telephone
+91 9052234588
ThoughtWorks

On Tue, Aug 9, 2016 at 1:36 AM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

is it the idea to keep a different repo with the tests for metadata versioning?

Regarding how to setup data I don’t have strong opinions on this. I think we should try one approach and see if it works. The initial idea was to have a docker image already baked with the data we want, for each test execution, that we can control using docker compose.

https://github.com/dhis2/api-tests/blob/master/docker-compose.yml

– Paulo

On Wed, Aug 3, 2016 at 3:21 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason/ Paulo,

Hope you are doing well.

As a part of the API testing we have written some test for the metadata versioning APIs, which is a core feature contributed by us to DHIS2 version 2.24. We did minor changes to the folder structure. We leverage before and after functions to setup and tear down data. Please have a look at the tests here. Please do let us know any feedback on the tests.

I have been through the repo that Paulo was working on, the way he extracted the version in env.js file looks okay but we did it in a slightly different way. That anyway would help us in providing the ability for tests to run across multiple versions of DHIS2.

One more thing to discuss upon is we can do contract testing of APIs which might not need a predefined data in the data base but, in some cases like when we test datavaluesets or any other similar APIs we might need some data which should already be set up. Similarly, we want to leverage the API testing to do integration tests as well. This will require a database set up to be done before the tests run on the system. For that we can have a DHIS2 empty instance on which we can set up the data and remove the database once the tests are run. We are looking at two ways to accomplish this:

  1. Setting the database dump using sql scripts.
  2. We can create data using metadata import API(using import API to set up metadata), where the set up will run before the tests.
    We how ever feel setting up metadata using APIs will be useful as we can leverage it irrespective of the database we are using and it will be able to create data properly across versions. Where as setting up the database using sql might have to be maintained and should be migrated properly for every version of DHIS2 release. So we are a kind of not wanting to implement this way. So we feel the second way of setting up data required for tests makes more sense. Can you please share your thoughts on this as well.

Thanks & Regards,

Nalinikanth M


With Regards
ThoughtWorks Technologies

Hyderabad

–Stay Hungry Stay Foolish!!

On Tue, Jun 28, 2016 at 5:54 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Thank you Paulo, Enjoy your vacation we can discuss once you are back :slight_smile:


Thanks & Regards,

Nalinikanth M

Quality Analyst

Email
nalinim@thoughtworks.com
Telephone
+91 9052234588
ThoughtWorks

On Tue, Jun 28, 2016 at 5:51 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi, I think Jason is on vacation and I’m also leaving tomorrow. Just a heads up that the repo for the tests is now this one. https://github.com/dhis2/api-tests

BR,

Paulo

On Tue, Jun 28, 2016 at 1:00 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason & Paulo,

Hope you are doing good. We were busy with pushing the Metadata sync feature to DHIS2 trunk to make it in time for 2.24 release. We are done with that and I got some time to resume the automation. I was looking at https://github.com/pgracio/dhis2-api-system-test/, the tests are good and I would also like to same kind of test structure. Some clarifications though:

  1. How will we maintain the tests with respect to versioning of APIs?

As we know, now DHIS will be versioning APIs and there is going to be likely support for last three versions of APIs. So, we should be mindful of leveraging these tests for the future versions at the same time keeping them for previous versions as well.

We thought one possible approach, say we wrote tests on 23 APIs and then 24 APIs are released, we can clone the 23 repo and can create a new repo for 24 version, run all the tests and can raise bugs for valid breakages or fix the tests if required(if there is any change in contract of the APIs). So, this way we can have multiple repos for multiple versions of APIs. Only thing we need to take care of is extracting the URL to env file to make it easy to maintain. Or we can have a folder for each version in the single repo.

  1. As we already discussed about having the tests where we can set up required data using APIs which looks good for now. This should actually work fine when we test APIs for data elements, data sets etc. But in a bigger picture if we have to write tests for APIs like datavaluesets(which will give the data values of a data set). The entities involved here are “data elements, data sets, users, organisation units” and there are good number of associations involved in this scenario. So what do you think about such cases? Can we have a small database to preset these associations on which we can write tests and assert.

Understanding the above things would help us in making the tests scalable.

If you have any other things apart from this, we can discuss them as well. Please share your opinions on these things.

Thanks & Regards,

Nalinikanth M



On Mon, Jun 13, 2016 at 6:16 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason,

I see where you are coming from, from a testing perspective. Different DBs can be good input for testing the metadata import and export APIs specifically. But for other APIs that need a known DB state to test against, DBs that are not compatible with the version being tested would be a problem.

@Paulo

I do agree with you on Option 3, so let us keep the discussion going here in this email chain; if necessary we can set up a call.

Regards,

Nalinikanth M



On Mon, Jun 13, 2016 at 12:22 PM, Jason Pickering jason.p.pickering@gmail.com wrote:

Hi there.

Here is my perspective. The entire purpose of the integration tests is to test these types of scenarios. "Is it possible to perform a bulk metadata export from an arbitrary database?" is, in essence, the test, I think. Well, in this case, the developer of the API (Morten) tells you not to use this database because it is “old”. Well, it may be old, but it is also on the current version, so if this feature is supposed to work, it should work. If not, then we need to figure out why. That is the purpose of the test. I would expect this same test to work on any arbitrary database, so I think it's perfectly legitimate, and I see no reason why we should not test the SL database. Having said that, I think we should also test others, such as Trainingland, and set the tests up in such a way that people can arbitrarily test whichever system they wish. For the main tests, I think we should use the SL database specifically because it is “old” and, in many ways, resembles a system which has been around a long time. Specifically for that reason, it should be tested, at least for certain test scenarios.

And having said all of that, we should not be testing only the scenarios which the feature developers wish us to test. That is not the point of these tests either. Currently the feature devs are writing their own tests, which is never really a good thing. The purpose of having an external team develop these tests is to test the things which the feature devs perhaps don't consider or don't want to test.

Hope that helps to clarify my thinking here on what the original intent of these integration tests were. Does that help?

Regards,

Jason

P.S. Paulo's time on this project is very limited, as he is acting as a part-time consultant to HISP Nordic. I suggest that we try to limit the need for calls unless something is really urgent, especially if Paulo needs to be involved. If you still feel a call is needed, let's start with me and then bring Paulo in as needed. Paulo, you OK with that?


Thanks & Regards,

Nalinikanth M

Quality Analyst

Email
nalinim@thoughtworks.com
Telephone
+91 9052234588
ThoughtWorks

On Mon, Jun 13, 2016 at 8:35 AM, Paulo Grácio paulogracio@gmail.com wrote:

Hi, option 3 seems to me the best approach for now. What do you think?

Today I have a very busy day, but probably tomorrow morning we can have a call. What about 08:00AM CEST?

/Paulo

On Mon, Jun 13, 2016 at 7:55 AM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hey Paulo,

Thanks for your efforts, will try and let you know :slight_smile:

Jason & Paulo,

Our team sees a potential problem in using the SL database, and we are also unsure how to go ahead with the tests, especially regarding which database to use.

Here are the options that we are looking at:

Option 1: Set up an empty vanilla instance. This is an empty database where we can set up data using the APIs and tear it down once the tests are done. The entire dataset can be set up from a JSON file, or data can be created as required for every test.

Option 2: Set up a known state of database, e.g. the SL database. The state is maintained, and we set up the database before starting the test run. As we are using Docker, we get a fresh instance, i.e. a fresh SL database, every time.

Option 3: Have a known state of database with very little metadata in it, to which we can add new data as required for every test, using the APIs.

Can we have a call to discuss this further? A 30-minute call would do.

Thanks & Regards,

Nalinikanth M


Jason P. Pickering
email: jason.p.pickering@gmail.com
tel:+46764147049

On Sat, Jun 11, 2016 at 2:38 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi,

I have built a new image that can now be used to start a database container for 2.23-sierra-leone:

image: pgracio/dhis2-db:2.23-sierra-leone

Give it a try and let me know if you have problems.

Regards,

Paulo

On Fri, Jun 10, 2016 at 10:44 AM Jason Pickering jason.p.pickering@gmail.com wrote:

I suggest that we use the SL demo. Reason being, it is stable, and does not change that much. I think that we can start with this. The Trainingland database is still under very active development. However, I don’t feel it makes a big difference. What is important is that we use a database which we know the state of. I think if Paulo can build a docker image tied to a given revision of the database, and we base our tests off of that, that would be the best approach.

Regards,

Jason

On Thu, Jun 9, 2016, 15:24 Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

I was using this as a reference to write the tests. https://github.com/dareid/chakram/blob/master/examples/spotify.js

Currently I am using an empty database, but we can use the training one. No strong opinions on this. Normally I prefer tests that don't depend on database state (a self-contained example is sketched below), but in some situations it might be very difficult to create the desired state before running the test.
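A state-independent test would carry its own fixture, roughly like this; the location of the created id in the response is an assumption and may differ between DHIS2 versions:

```javascript
// Sketch of a self-contained test: create the fixture, assert, clean up.
var chakram = require('chakram');
var expect = chakram.expect;

var baseUrl = process.env.DHIS2_URL || 'http://localhost:8080';
var auth = { auth: { user: 'admin', pass: 'district' } };

describe('data element CRUD, independent of database state', function () {
    var createdId;

    before('create the fixture', function () {
        return chakram.post(baseUrl + '/api/dataElements', {
            name: 'Temp DE', shortName: 'Temp DE', domainType: 'AGGREGATE',
            valueType: 'NUMBER', aggregationType: 'SUM'
        }, auth).then(function (resp) {
            createdId = resp.body.response.uid; // assumed response shape
        });
    });

    it('can read back what it created', function () {
        var response = chakram.get(baseUrl + '/api/dataElements/' + createdId, auth);
        expect(response).to.have.status(200);
        return chakram.wait();
    });

    after('remove the fixture', function () {
        return chakram.delete(baseUrl + '/api/dataElements/' + createdId,
            undefined, auth);
    });
});
```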

To make sure we are all on the same page, it's important that we use pull requests before we merge things to master.

BR,

Paulo

On Thu, Jun 9, 2016 at 2:57 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason,

It is about what state of database we are going to use: say the training database, Sierra Leone, any other known state of database, or a vanilla instance. Basically, what is the state of the database?

Also, how the tests will look: we can write them in so many ways, so we want to make sure we are all on the same page.

A 30-minute call would be enough for this.

Best Regards,

Nalinikanth



On Thu, Jun 9, 2016 at 6:20 PM, Jason Pickering jason.p.pickering@gmail.com wrote:

Hi Nalinikanth,

Paulo and I have quite limited time for this activity. Could you outline what the call would be about?

Regards,

Jason

On Thu, Jun 9, 2016, 14:48 Paulo Grácio paulogracio@gmail.com wrote:

which time zone are you in?

On Thu, Jun 9, 2016 at 2:42 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Paulo,

I am working on it. I am actually looking to discuss a few things with you and Jason. Can we set up a call based on your availability? I am planning to have it next week.

@Paulo, @Jason

Please let me know your availability.


Thanks & Regards,

Nalinikanth M

Quality Analyst

Email
nalinim@thoughtworks.com
Telephone
+91 9052234588
ThoughtWorks

On Thu, Jun 9, 2016 at 5:54 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

are you doing any work on the system tests? I had a look at your repo and was considering merging it with what I have.

BR,

Paulo

On Fri, Jun 3, 2016 at 1:34 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Paulo,

Thanks for your valuable inputs. I will try it out and come back to you.

Regards,

Nalinikanth



On Fri, Jun 3, 2016 at 3:22 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

Currently, new dhis2 war files for version 2.23 are generated and published by this job: http://ci.dhis2.org/job/dhis2-2.23/. One of the final steps is copy-to-dhis2com.sh, which makes the new war available for download at https://www.dhis2.org/download/releases/2.23/dhis.war

I think we could have a downstream job that generates a new docker image using the previously generated war file and publishes it to docker hub. Automation for this can be found here:

https://github.com/pgracio/dhis2-docker/blob/master/docker-build.sh.

Once the docker image is successfully generated, we can run the system tests using docker compose.

https://github.com/pgracio/dhis2-api-system-test/blob/master/docker-compose.yml

All of this can be executed on the Jenkins server, if the server has the capacity to handle it, so there is no need to spin up new environments to execute the tests.

I see this as an initial step to introduce system tests. With the pipeline flow that I have described, we'll still deploy the war file even if we detect potential errors during system test. A more long-term vision for the dhis2 pipeline would be:

#1 - build the dhis2 war file, without copying the file to dhis2.com

#2 - build the docker image, without publishing to docker hub

#3 - run the system tests; if they succeed, go to #4, else notify about the broken tests

#4 - copy the war file to dhis2.com and publish the docker image to docker hub

Feel free to challenge this, it's just one opinion. I guess the dhis2 developer community might have a say on this.

Best regards,

Paulo

On Fri, Jun 3, 2016 at 7:48 AM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

@Paulo: It is more about how the test environments are set up: say, setting up a docker environment and running the tests on it, and how the tests are affected when the environment is set up under different continuous integration servers, say Jenkins/Travis/GO etc. That kind of thing is what I meant by maintaining environments.

Regards,

Nalinikanth



On Thu, Jun 2, 2016 at 12:23 AM, Paulo Grácio paulogracio@gmail.com wrote:

@Nalinikanth what exactly do you mean by “maintain environments”?

Best regards,

Paulo

On Wed, Jun 1, 2016 at 1:57 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

@Paulo: Thank you for your quick response. It's working fine now.

I had a look at your API tests repo. It looks good; I wrote some tests quite a while ago using the same framework, and tried to extract the data out of the tests to reduce dependencies and make things easier to maintain. You can find them here. Please have a look at them and let me know your opinion.

@Jason & @Paulo: Maybe next week we can have a call to talk about how the tests should look and how we can maintain environments.

Thanks & Regards,

Nalinikanth M



On Fri, May 27, 2016 at 8:57 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

@Gracio, glad to hear your response, and I had a look at the api test repo; it looks good. I am on vacation till Tuesday and will get back to you with my thoughts as soon as I'm back. I would love to talk more about the agreement as well; maybe we can set up a call later next week, or some time when it's feasible for all of us.

Thanks & Regards,

Nalinikanth M

Sent from my iPhone

On 27-May-2016, at 7:46 PM, Paulo Grácio paulogracio@gmail.com wrote:

@Nalinikanth I have updated the repo; it should now be possible to start the service using the training database.

https://github.com/pgracio/dhis2-docker/blob/master/docker-compose.yml

https://hub.docker.com/r/pgracio/dhis2-db/tags/

Let me know if you have problems.

Best regards,

Paulo

On Fri, May 27, 2016 at 1:00 PM Paulo Grácio paulogracio@gmail.com wrote:

@Nalinikanth, as Jason has mentioned, I have created this repo to hold the API system tests:
https://github.com/pgracio/dhis2-api-system-test

This is an initial spike with 2 very basic tests. Please have a look to see if we can reach a common agreement on how to do the tests. It includes some manual steps, but soon I'll add an automation mechanism so the tests run every time a new version is available.

Share your thoughts.

Best regards,

–Paulo

On Fri, May 27, 2016 at 12:18 PM Paul Grácio <paulogracio@gmail.com> wrote:

Hi Nalinikanth,

glad you are using dhis2-docker scripts :slight_smile:

Currently the dhis2-db image only works for versions 2.21 and 2.20; this needs some care from my side. I guess you are trying to run the latest version, 2.23.

@Jason, is there a snapshot database dump that works with version 2.23?

Best regards,

Paul Grácio

On Fri, May 27, 2016 at 11:00 AM Jason Pickering jason.p.pickering@gmail.com wrote:

Hi there.

I have been meaning to mail you about this. Paolo has another repo here

https://github.com/pgracio/dhis2-api-system-test

which we started last week. It includes some very simple Chakram-based tests.

I think this is more or less what we discussed a few weeks back. Paolo will also be working with us on this.

Maybe Paolo can comment more on the database.

I have another repo here

https://github.com/jason-p-pickering/dhis2-docker

which loads the Trainingland database. I think this should point you in the right direction.

At any rate, we should probably start to issue some PRs on Paolo’s repo and then eventually, we will pull this into the main DHIS2 group repo.

Best regards,

Jason

On Fri, May 27, 2016 at 10:54 AM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

To add context, I am Nalinikanth M, QA at ThoughtWorks. We are working on DHIS2 for an MSF project. We wanted to automate a few tests on DHIS2. I got the docker repository from Jason as we were looking for setting up test environments. As a part of our Test plan we want to use Docker instances to run automation tests.

On Fri, May 27, 2016 at 1:55 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Gracio,

We are using the scripts from your repository to set up a docker environment for dhis2. We were able to get the application up on docker and can use it, but we are unable to load the Sierra Leone database into the application. Can you please help us resolve this issue?

P.S.: We are new to docker; we are following your Readme and the docker documentation to set things up.

Thanks & Regards,

Nalinikanth M

Quality Analyst

Email
nalinim@thoughtworks.com
Telephone
+91 9052234588
ThoughtWorks


Jason P. Pickering
email: jason.p.pickering@gmail.com
tel:+46764147049

Thanks & Regards,

Nalinikanth M

Quality Analyst

Email
nalinim@thoughtworks.com
Telephone
+91 9052234588
ThoughtWorks


Hey Jason/Paulo,

I have sent a pull request to merge the tests that we have so far.

I ran into an issue while running a test that covers all the APIs. The test calls an API with improper authorization, which should return a 401 status. I used to get this status earlier, but now I don't; it is either an issue with chakram, or we are not getting a proper JSON response from DHIS2. I will merge again once I have fixed this issue; please let me know if you have any insights.
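For reference, the check is essentially the following; the endpoint and credentials are placeholders. If DHIS2 now answers with a login page or a redirect instead of a JSON 401, the assertion would see a different status, which could explain the change:

```javascript
// Sketch of the authorization check that stopped returning 401.
var chakram = require('chakram');
var expect = chakram.expect;

it('returns 401 for improper authorization', function () {
    var response = chakram.get(process.env.DHIS2_URL + '/api/dataElements',
        { auth: { user: 'admin', pass: 'wrong-password' } });
    expect(response).to.have.status(401);
    return chakram.wait();
});
```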

When running the integration tests, they span two environments, so I added a new environment file, aligned the same way as the existing ones, so we can pass it to the tests at run time. I will be adding more data checks for sync and compare; once they are done I will merge them. The two-instance file is along the lines of the sketch below.
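A rough sketch of that file, with illustrative variable names:

```javascript
// Sketch of the environment file for integration tests spanning the
// HQ/central and Local/field instances; URLs are supplied at run time.
module.exports = {
    hq:    { baseUrl: process.env.HQ_URL    || 'http://hq:8080' },
    local: { baseUrl: process.env.LOCAL_URL || 'http://local:8080' }
};
```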

All the contract tests for versioning should be run on a fresh DB instance, and in the same order as in the versioningContractTests.sh file. We tried keeping these tests independent but could not achieve it: there is no delete API for metadata versions, so once a version is created it has to be deleted from the backend, and for now the tests are coupled. The delete API is in our pipeline; once it is developed I will decouple these tests.
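For illustration, the same fixed order could be driven from Node instead of the shell script; the test file names here are hypothetical:

```javascript
// Run the coupled versioning contract tests in a fixed order against a
// fresh instance, mirroring what versioningContractTests.sh does.
var execSync = require('child_process').execSync;

['createVersionTest.js', 'listVersionsTest.js'].forEach(function (file) {
    execSync('mocha ' + file + ' --timeout 20000', { stdio: 'inherit' });
});
```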

Please let me know if you have any questions.

Thanks & Regards,
Nalinikanth M.

···

On Wed, Sep 28, 2016 at 10:17 AM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Paulo,

There is no specific reason; these were under development. But they have dependencies between them; I will have a look and see how I can merge them.

Thanks & Regards,

Nalinikanth M

On Wed, Sep 28, 2016 at 10:04 AM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

is there any reason to keep a separate repo with these tests? It would be nice to have the tests as part of https://github.com/dhis2/api-tests

There is also a pipeline to execute the tests in Travis: https://travis-ci.org/dhis2/api-tests

The docker image build is also automated; it runs once per day and publishes images to Docker Hub: https://hub.docker.com/r/dhis2/dhis2-web/tags/

Best regards,

Paulo


On Thu, Aug 18, 2016 at 11:32 AM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hey All,

I am Nalinikanth, QA on the MSF-OCA project and we are using DHIS2. We are building API automation suites as a part of our project. We are working along with Jason. P and Paulo. We have been discussing on how to take this forward and you can find our discussions thread in this mail.

Please do comment or provide feedback if you have any ideas or thoughts around the same.

I am attaching the repos as well for your reference.

https://github.com/msf-oca-his/API_Test

https://github.com/dhis2/api-tests

Feedback from the community would be well appreciated.

Thanks & Regards,

Nalinikanth



On Wed, Aug 10, 2016 at 3:41 PM, Vanya Seth vanyas@thoughtworks.com wrote:

Hi All

It makes sense to make this discussion public, so that other members of the community can also provide their inputs.

Regards

Vanya


Thanks & Regards,

Nalinikanth M

Quality Analyst

Email
nalinim@thoughtworks.com
Telephone
+91 9052234588
ThoughtWorks

On Tue, Aug 9, 2016 at 1:36 AM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

is the idea to keep a different repo with the tests for metadata versioning?

Regarding how to set up data, I don't have strong opinions. I think we should try one approach and see if it works. The initial idea was to have a docker image already baked with the data we want for each test execution, which we can control using docker compose.

https://github.com/dhis2/api-tests/blob/master/docker-compose.yml

– Paulo


Hi,

try to use postman or curl to see if you get the expected response back. Or just enable debug in chakram to print request and response details to the console:
http://dareid.github.io/chakram/jsdoc/module-chakram.html#.startDebug
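For example, the debug switch is just:

```javascript
// chakram logs full request/response details while debug is enabled.
var chakram = require('chakram');

chakram.startDebug();   // from here on, each request/response is printed
// ... re-run the call that should return 401 and inspect the raw reply ...
chakram.stopDebug();
```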

Before merging, try to get a green pipeline :slight_smile: I'm not sure it's broken because of your changes… please have a look.

BR,

Paulo

···

On Wed, Sep 28, 2016 at 10:17 AM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Paulo,

There is no specific reason these were under development. But these were dependent will have a look and see how can I merge them.

Thanks & Regards,

Nalinikanth M


Thanks & Regards,

Nalinikanth M

Quality Analyst

Email
nalinim@thoughtworks.com
Telephone
+91 9052234588
ThoughtWorks

On Wed, Sep 28, 2016 at 10:04 AM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

is there any reason to keep a separate repo with these tests? It would be nice to have the tests as part of https://github.com/dhis2/api-tests

There is also a pipeline to execute the tests in Travis https://travis-ci.org/dhis2/api-tests

Docker image build is also automated, runs once per day, and is publishing images to Docker Hub https://hub.docker.com/r/dhis2/dhis2-web/tags/

Best regards,

Paulo

On Tue, Sep 27, 2016 at 2:41 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason & Paulo,

Hope you are doing good.

It has been a long time! We are writing tests for metadata sync. All the tests are added to the repo here.

In the above repository navigate to the path API_Test/testcases/metadatasync/integration/

there is a generic test “ImportMetadataTest" which we wrote for testing how various types of metadata entities will sync from HQ/central instance to local/field.

There are two ways of running the test

  1. To run this test without any database on HQ and Local.

To test how sync is behaving with respect to various metadata entities on two new instances without any data model on it. All we need is to have metadata versions in this folder - API_Test/testdata/metadatasync/versiondata

We can have any number of versions in the folder. It depends on how user wants metadata sync to happen or what all metadata associations or disassociations user wants to test. For now I kept two version files.

To run the test for Version_1 run this should be run using "env version=“Version_2” mocha ImportMetadataTest.js --timeout 20000” which is can be added to a shell script to run version one after the other like it is in integrationTestsWithoutDB.sh file. This will first import data on HQ/Central instance using import api and then Local/field instance will sync the version from HQ.

Once the version is synced to Local/Field then we are doing two tests. One is asserting the data in

http://local/api/metadata/version/Version_1/data with http://HQ/api/metadata/version/Version_1/data by comparing them.

Later it will compare all the entities(which are present in that version) individually say we have a array of data elements then it will pick all the data elements and compare one by one and continues for other entities as well.

e.g: It will compare http://local/api/dataElements/id with http://HQ/api/dataElements/id

  1. To test how sync is behaving with respect to various metadata entities on two instances where HQ already have n versions[Pre defined database]. We are using the same script to import version by version. It will also do a couple of assertions on top of the metadata when synced. The first assertion being same as above it will compare the data in http://local/api/metadata/version/Version_1/data with http://HQ/api/metadata/version/Version_1/data.

But the next level comparison is a bit different it will compare the entities by fetching the entity data which is present in

http://local/api/metadata/version/Version_1/data with http://local/api/dataElements/id

Here there won’t be entire json for any entity on http://local/api/metadata/version/Version_1/data this will contain very limited details we are just comparing the minimal entities getting them from Local/Field using jsonfilters in api call.

We had this kind of assertions because say user has Version_1 and has a data element abcd1234 and the name might have changed in Version_2 abcd12345 as HQ has got n versions in it so if we want to compare json of it on both HQ and Local we have different names so we took this approach.

Can you please have a look at this and let me know if any changes are required.

Thanks & Regards,

Nalinikanth M


Thanks & Regards,

Nalinikanth M

Quality Analyst

Email
nalinim@thoughtworks.com
Telephone
+91 9052234588
ThoughtWorks

On Thu, Aug 18, 2016 at 11:32 AM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hey All,

I am Nalinikanth, QA on the MSF-OCA project and we are using DHIS2. We are building API automation suites as a part of our project. We are working along with Jason. P and Paulo. We have been discussing on how to take this forward and you can find our discussions thread in this mail.

Please do comment or provide feedback if you have any ideas or thoughts around the same.

I am attaching the repos as well for your reference.

https://github.com/msf-oca-his/API_Test

https://github.com/dhis2/api-tests

Feedback from the community would be well appreciated.

Thanks & Regards,

Nalinikanth


Thanks & Regards,

Nalinikanth M

Quality Analyst

Email
nalinim@thoughtworks.com
Telephone
+91 9052234588
ThoughtWorks

On Wed, Aug 10, 2016 at 3:41 PM, Vanya Seth vanyas@thoughtworks.com wrote:

Hi All

It makes sense to make this discussion public. So, that other members of the community can also provide their inputs.

Regards

Vanya


Thanks & Regards,

Nalinikanth M

Quality Analyst

Email
nalinim@thoughtworks.com
Telephone
+91 9052234588
ThoughtWorks

On Tue, Aug 9, 2016 at 1:36 AM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

is it the idea to keep a different repo with the tests for metadata versioning?

Regarding how to setup data I don’t have strong opinions on this. I think we should try one approach and see if it works. The initial idea was to have a docker image already baked with the data we want, for each test execution, that we can control using docker compose.

https://github.com/dhis2/api-tests/blob/master/docker-compose.yml

– Paulo

On Wed, Aug 3, 2016 at 3:21 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason/ Paulo,

Hope you are doing well.

As a part of the API testing we have written some test for the metadata versioning APIs, which is a core feature contributed by us to DHIS2 version 2.24. We did minor changes to the folder structure. We leverage before and after functions to setup and tear down data. Please have a look at the tests here. Please do let us know any feedback on the tests.

I have been through the repo that Paulo was working on, the way he extracted the version in env.js file looks okay but we did it in a slightly different way. That anyway would help us in providing the ability for tests to run across multiple versions of DHIS2.

One more thing to discuss upon is we can do contract testing of APIs which might not need a predefined data in the data base but, in some cases like when we test datavaluesets or any other similar APIs we might need some data which should already be set up. Similarly, we want to leverage the API testing to do integration tests as well. This will require a database set up to be done before the tests run on the system. For that we can have a DHIS2 empty instance on which we can set up the data and remove the database once the tests are run. We are looking at two ways to accomplish this:

  1. Setting the database dump using sql scripts.
  2. We can create data using metadata import API(using import API to set up metadata), where the set up will run before the tests.
    We how ever feel setting up metadata using APIs will be useful as we can leverage it irrespective of the database we are using and it will be able to create data properly across versions. Where as setting up the database using sql might have to be maintained and should be migrated properly for every version of DHIS2 release. So we are a kind of not wanting to implement this way. So we feel the second way of setting up data required for tests makes more sense. Can you please share your thoughts on this as well.

Thanks & Regards,

Nalinikanth M


With Regards
ThoughtWorks Technologies

Hyderabad

–Stay Hungry Stay Foolish!!

On Tue, Jun 28, 2016 at 5:54 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Thank you Paulo, Enjoy your vacation we can discuss once you are back :slight_smile:


Thanks & Regards,

Nalinikanth M

Quality Analyst

Email
nalinim@thoughtworks.com
Telephone
+91 9052234588
ThoughtWorks

On Tue, Jun 28, 2016 at 5:51 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi, I think Jason is on vacation and I’m also leaving tomorrow. Just a heads up that the repo for the tests is now this one. https://github.com/dhis2/api-tests

BR,

Paulo

On Tue, Jun 28, 2016 at 1:00 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason & Paulo,

Hope you are doing good. We were busy with pushing the Metadata sync feature to DHIS2 trunk to make it in time for 2.24 release. We are done with that and I got some time to resume the automation. I was looking at https://github.com/pgracio/dhis2-api-system-test/, the tests are good and I would also like to same kind of test structure. Some clarifications though:

  1. How will we maintain the tests with respect to versioning of APIs?

As we know, now DHIS will be versioning APIs and there is going to be likely support for last three versions of APIs. So, we should be mindful of leveraging these tests for the future versions at the same time keeping them for previous versions as well.

We thought one possible approach, say we wrote tests on 23 APIs and then 24 APIs are released, we can clone the 23 repo and can create a new repo for 24 version, run all the tests and can raise bugs for valid breakages or fix the tests if required(if there is any change in contract of the APIs). So, this way we can have multiple repos for multiple versions of APIs. Only thing we need to take care of is extracting the URL to env file to make it easy to maintain. Or we can have a folder for each version in the single repo.

  1. As we already discussed about having the tests where we can set up required data using APIs which looks good for now. This should actually work fine when we test APIs for data elements, data sets etc. But in a bigger picture if we have to write tests for APIs like datavaluesets(which will give the data values of a data set). The entities involved here are “data elements, data sets, users, organisation units” and there are good number of associations involved in this scenario. So what do you think about such cases? Can we have a small database to preset these associations on which we can write tests and assert.

Understanding the above things would help us in making the tests scalable.

If you have any other things apart from this, we can discuss them as well. Please share your opinions on these things.

Thanks & Regards,

Nalinikanth M


Thanks & Regards,

Nalinikanth M

Quality Analyst

Email
nalinim@thoughtworks.com
Telephone
+91 9052234588
ThoughtWorks

On Mon, Jun 13, 2016 at 6:16 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason,

I see where you are coming from in terms of testing perspective. Different DBs can be a good input to test metadata import and export api in specific. But for a known state of DB to exist for other apis to be tested, DBs that are not compatible with the version being tested would be a problem.

@Paulo

I do agree with you on Option 3 so let us continue this email chain If necessary then we can setup a call. So let us keep the discussion going on here.

Regards,

Nalinikanth M


Thanks & Regards,

Nalinikanth M

Quality Analyst

Email
nalinim@thoughtworks.com
Telephone
+91 9052234588
ThoughtWorks

On Mon, Jun 13, 2016 at 12:22 PM, Jason Pickering jason.p.pickering@gmail.com wrote:

Hi there.

Here is my perspective. The entire purpose of the integration tests are to test these types of scenarios. Is it possible to perform a bulk metadata export from an arbitrary database, is sort of the test I think. Well, in this case, the developer of the API (Morten) tells you not to use this database because it is “old”. Well, it may be old, but it is also on the current version, so if this feature is supposed to work, well, it should work. If not, then we need to figure out why. That is the purpose of the test. I would expect this same test to work on any arbitrary database, so I think its perfectly legitimate, and see no reason why we should not test the SL database. Having said that, I think we should also test others, such as Trainingland, and enable the tests in such a way to allow people to arbitrarily test which ever system they wish. For the main tests, I think we should use the SL database specifically because it is “old” and in many ways, resembles a system which has been around a long time. Specfically for that reason, it should be tested, at least for certain test scenarios.

And having said all of that, we should not be testing scenarios which the feature developers wish us to test. That is not the point of these tests either. Currently the feature devs are writing their own tests, which is never really a good thing. The purpose of having an external team to develop these tests is to test things which maybe the feature devs don’t consider or don’t want to test.

Hope that helps to clarify my thinking here on what the original intent of these integration tests were. Does that help?

Regards,

Jason

P.S. Paulo’s time is very limited on this project, as he is acting as a part-time consultant to HISP Nordic. I suggest that we try and limit the need for calls unless really urgent, especially if Paulo needs to be involved. If you still feel a call is needed, lets try and start with me and then bring in Paulo in as needed. Paulo, you OK with that?


Thanks & Regards,

Nalinikanth M

Quality Analyst

Email
nalinim@thoughtworks.com
Telephone
+91 9052234588
ThoughtWorks

On Mon, Jun 13, 2016 at 8:35 AM, Paulo Grácio paulogracio@gmail.com wrote:

Hi, option 3 seems to me the best approach for now. What do you think?

Today I have a very busy day, but probably tomorrow morning we can have a call. What about 08:00AM CEST?

/Paulo

On Mon, Jun 13, 2016 at 7:55 AM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hey Paulo,

Thanks for your efforts, will try and let you know :slight_smile:

Jason & Paulo,

As Our team see a potential problem in using SL database, even we are confused of how to go ahead with tests, specially on what database.

Here are the options that we are looking at:

Option 1: Set up an empty vanilla instance. It is an empty database where we can set up data using APIs and can tear down once the tests are done. Entire data can be set up using a Json file or data can be created as required for every test.

Option 2: Set up a known state of database, e.g. the SL database. The state is maintained, and we set up the database before starting the execution of the tests. As we are using Docker, every time we will have a fresh instance, i.e. a fresh SL database.

Option 3: We can have a known state of database with very little metadata in it, to which we can add new data using APIs as required for every test.

Can we have a call to discuss this further? A 30-minute call would do.

Thanks & Regards,

Nalinikanth M



On Sat, Jun 11, 2016 at 2:38 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi,

I have built a new image that can now be used to start a database container for 2.23-sierra-leone:

image: pgracio/dhis2-db:2.23-sierra-leone

Give it a try and let me know if you have problems.

Regards,

Paulo

On Fri, Jun 10, 2016 at 10:44 AM Jason Pickering jason.p.pickering@gmail.com wrote:

I suggest that we use the SL demo. Reason being, it is stable, and does not change that much. I think that we can start with this. The Trainingland database is still under very active development. However, I don’t feel it makes a big difference. What is important is that we use a database which we know the state of. I think if Paulo can build a docker image tied to a given revision of the database, and we base our tests off of that, that would be the best approach.

Regards,

Jason

On Thu, Jun 9, 2016, 15:24 Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

I was using this as a reference to write the tests. https://github.com/dareid/chakram/blob/master/examples/spotify.js
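In that style, a test is just a mocha spec with chakram assertions. A minimal sketch, assuming a local instance and the demo admin credentials (both placeholders, not what the repo actually uses):

    var chakram = require('chakram');
    var expect = chakram.expect;

    describe('dhis2 api', function () {
      it('returns system info as JSON', function () {
        // URL and credentials are placeholders for whatever env.js provides
        var response = chakram.get('http://localhost:8080/api/system/info', {
          auth: { user: 'admin', pass: 'district' }
        });
        expect(response).to.have.status(200);
        expect(response).to.have.header('content-type', /json/);
        return chakram.wait();
      });
    });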

Currently using an empty database, but we can use training. No strong opinions on this. Normally I prefer to have tests that don’t depend on database state, but in some situations it might be very difficult to create the desired state before running the test.

To make sure we are all on the same page, it’s important that we use pull requests before we merge things to master.

BR,

Paulo

On Thu, Jun 9, 2016 at 2:57 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason,

It is about what state of database we are going to use: say the training database, Sierra Leone, any other known state of database, or a vanilla instance. Basically, what is the state of the database?

Also, how the tests will look; as we can write them in so many ways, we want to make sure that we are all on the same page.

A 30-minute call would be enough for this.

Best Regards,

Nalinikanth



On Thu, Jun 9, 2016 at 6:20 PM, Jason Pickering jason.p.pickering@gmail.com wrote:

Hi Nalinikanth,

Paulo and I have quite limited time for this activity. Could you outline what the call would be about?

Regards,

Jason

On Thu, Jun 9, 2016, 14:48 Paulo Grácio paulogracio@gmail.com wrote:

Which time zone are you in?

On Thu, Jun 9, 2016 at 2:42 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Paulo,

I am working on it. Yeah, I am actually looking to discuss a few things with you and Jason. Can we set up a call based on your availability? I am planning to have it with you and Jason next week.

@Paulo, @Jason

Please let me know your availability.


Thanks & Regards,

Nalinikanth M


On Thu, Jun 9, 2016 at 5:54 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

Are you doing any work on the system tests? I had a look at your repo and was considering merging that with what I have.

BR,

Paulo

On Fri, Jun 3, 2016 at 1:34 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Paulo,

Thanks for your valuable inputs. I will try them and come back to you.

Regards,

Nalinikanth



On Fri, Jun 3, 2016 at 3:22 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

Currently, new dhis2 war files for version 2.23 are generated and published by this job: http://ci.dhis2.org/job/dhis2-2.23/. One of the final steps is copy-to-dhis2com.sh, which makes the new war available for download at https://www.dhis2.org/download/releases/2.23/dhis.war

I think we could have a downstream job that generates a new docker image using the previously generated war file and publishes it to docker hub. Automation for this can be found here:

https://github.com/pgracio/dhis2-docker/blob/master/docker-build.sh.

Once the docker image is successfully generated we can run the system tests using docker compose:

https://github.com/pgracio/dhis2-api-system-test/blob/master/docker-compose.yml

All of this can be executed on the Jenkins server, if the server has the capacity to handle it, so there is no need to spin up new environments to execute the tests.

I see this as an initial step to introduce system tests. With the pipeline flow that I have described, we’ll still deploy the war file even if we detect potential errors during system test. A more long-term vision for the dhis2 pipeline would be:

#1 - build the dhis2 war file, without copying the file to dhis2.com

#2 - build the docker image, without publishing it to docker hub

#3 - run the system tests; on success go to #4, else notify about the broken tests

#4 - copy the war file to dhis2.com and publish the docker image to docker hub.

Feel free to challenge this, it’s just one opinion. I guess the dhis2 developer community might have a say on this.

Best regards,

Paulo

On Fri, Jun 3, 2016 at 7:48 AM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

@Paulo: It is more about how the test environments are set up, say setting up a docker environment and running tests on it, and how the tests are affected when the environment is set up using different continuous integration servers, say Jenkins/Travis/GO etc. That kind of thing is what I meant by maintaining environments.

Regards,

Nalinikanth



On Thu, Jun 2, 2016 at 12:23 AM, Paulo Grácio paulogracio@gmail.com wrote:

@Nalinikanth, what exactly do you mean by “maintain environments”?

Best regards,

Paulo

On Wed, Jun 1, 2016 at 1:57 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

@Paulo: Thank you for your quick response. It’s working fine now.

I had a look at your API tests repo. It looks good; I wrote some tests quite a while ago using the same framework. I tried to extract the data out of the tests to reduce dependencies and make things easier to maintain. You can find them here. Please have a look at them and let me know your opinions.

@Jason & @Paulo: Maybe next week we can have a call to talk about how the tests should look and how we can maintain environments.

Thanks & Regards,

Nalinikanth M



On Fri, May 27, 2016 at 8:57 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

@Gracio I’m glad about your response, and I had a look at the API test repo; it looks good. I am on vacation till Tuesday; I will get back to you with my thoughts as soon as I’m back. I would love to talk more on the agreement as well; maybe we can set up a call late next week, or some time when it’s feasible for all of us.

Thanks & Regards,

Nalinikanth M

Sent from my iPhone

On 27-May-2016, at 7:46 PM, Paulo Grácio paulogracio@gmail.com wrote:

@Nalinikanth I have updated the repo; it should now be possible to start the service using the training database.

https://github.com/pgracio/dhis2-docker/blob/master/docker-compose.yml

https://hub.docker.com/r/pgracio/dhis2-db/tags/

Let me know if you have problems.

Best regards,

Paulo

On Fri, May 27, 2016 at 1:00 PM Paulo Grácio paulogracio@gmail.com wrote:

@Nalinikanth, as Jason has mentioned, I have created this repo for the API system tests:
https://github.com/pgracio/dhis2-api-system-test

This is an initial spike with 2 very basic tests. Please have a look to see if we can reach a common agreement on how to do the tests. It includes some manual steps, but soon I’ll add some automation to run the tests every time a new version is available.

Share your thoughts.

Best regards,

–Paulo

On Fri, May 27, 2016 at 12:18 PM Paul Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

glad you are using dhis2-docker scripts :)

Currently the dhis2-db image only works for versions 2.21 and 2.20; this needs some care from my side. I guess you are trying to run the latest version, 2.23.

@Jason, is there a snapshot database dump that works with version 2.23?

Best regards,

Paul Grácio

On Fri, May 27, 2016 at 11:00 AM Jason Pickering jason.p.pickering@gmail.com wrote:

Hi there.

I have been meaning to mail you about this. Paulo has another repo here

https://github.com/pgracio/dhis2-api-system-test

which we started last week. It includes some very simple Chakram-based tests.

I think this is more or less what we discussed a few weeks back. Paulo will also be working with us on this.

Maybe Paulo can comment more on the database.

I have another repo here

https://github.com/jason-p-pickering/dhis2-docker

which loads the Trainingland database. I think this should point you in the right direction.

At any rate, we should probably start to issue some PRs on Paulo’s repo, and then eventually we will pull this into the main DHIS2 group repo.

Best regards,

Jason

On Fri, May 27, 2016 at 10:54 AM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

To add context, I am Nalinikanth M, QA at ThoughtWorks. We are working on DHIS2 for an MSF project and wanted to automate a few tests on it. I got the docker repository from Jason as we were looking into setting up test environments. As a part of our test plan we want to use Docker instances to run automation tests.

On Fri, May 27, 2016 at 1:55 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Gracio,

We are using the scripts from your repository to set up a docker environment for dhis2. We were able to get the application up on docker and can use it, but we are unable to get the Sierra Leone database onto the application. Can you please help us resolve this issue?

P.S.: We are new to Docker; we are following your Readme and the Docker documentation to set things up.

Thanks & Regards,

Nalinikanth M


Hey Paulo,

Forgot to mention: I renamed two variables in env.js from auth to properRequestParams, as they hold all the request headers, not just the authorization details. If that is okay I will go ahead and fix your tests as well; otherwise I will change mine.
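For reference, a minimal sketch of what I mean in env.js — the URL and header values are placeholders, not our actual config:

    // env.js (sketch; values are placeholders)
    module.exports = {
      baseUrl: process.env.baseUrl || 'http://localhost:8080',
      // formerly `auth`; renamed because it carries all request headers,
      // not just the Authorization header
      properRequestParams: {
        headers: {
          Authorization: 'Basic ' + Buffer.from('admin:district').toString('base64'),
          'Content-Type': 'application/json',
          Accept: 'application/json'
        }
      }
    };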

Thanks & Regards,
Nalinikanth M

···

On Thu, Oct 6, 2016 at 5:25 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi,

Try to use postman or curl to see if you get the expected response back. Or just enable debug in chakram to print request and response details to the console:
http://dareid.github.io/chakram/jsdoc/module-chakram.html#.startDebug
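Something like this, as a minimal sketch (the endpoint is a placeholder):

    var chakram = require('chakram');

    chakram.startDebug();  // print request/response details to the console
    chakram.get('http://localhost:8080/api/dataElements')
      .then(function () {
        chakram.stopDebug();  // stop logging once done
      });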

Before merging, try to get a green pipeline :) I’m not sure it’s broken because of your changes… please have a look.

BR,

Paulo

On Thu, Oct 6, 2016 at 1:46 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hey Jason/Paulo,

I have sent a pull request to merge the tests that we have till now.

I got an issue while running a test across all the APIs. The test calls an API with improper authorization; when an API is called with improper authorization it should give a 401 status. I used to get this status earlier, but now I don’t; it is either an issue with chakram, or maybe we are not getting a proper JSON response from DHIS2. I will merge again once I have fixed this issue. Please let me know if you have any insights about this.
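For context, the failing check is essentially of this shape — a sketch with placeholder URL and credentials, not the exact test from the PR:

    var chakram = require('chakram');
    var expect = chakram.expect;

    describe('authorization', function () {
      it('returns 401 for improper credentials', function () {
        var response = chakram.get('http://localhost:8080/api/dataElements', {
          auth: { user: 'wrong', pass: 'credentials' }
        });
        expect(response).to.have.status(401);  // currently not coming back as 401
        return chakram.wait();
      });
    });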

The integration tests span two environments, so I added a new environment file, aligned the same way as the existing ones, which we can pass at run time. I will be adding more data checks to sync and compare; once they are done I will merge them.

Also, all the contract tests for versioning should be run on a fresh DB instance, in the same order as in versioningContractTests.sh. We tried keeping these tests independent but couldn’t achieve that: there is no delete API for versions yet, so once a version is created it has to be deleted from the backend, and for now these tests are coupled. The delete API is in our pipeline; once it is developed I will decouple these tests.

Please let me know if you have any questions.

Thanks & Regards,
Nalinikanth M.

On Wed, Sep 28, 2016 at 10:17 AM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Paulo,

There is no specific reason; these were under development. But they had dependencies; I will have a look and see how I can merge them.

Thanks & Regards,

Nalinikanth M


On Wed, Sep 28, 2016 at 10:04 AM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

Is there any reason to keep a separate repo with these tests? It would be nice to have the tests as part of https://github.com/dhis2/api-tests

There is also a pipeline to execute the tests in Travis https://travis-ci.org/dhis2/api-tests

The Docker image build is also automated; it runs once per day and publishes images to Docker Hub: https://hub.docker.com/r/dhis2/dhis2-web/tags/

Best regards,

Paulo


On Thu, Aug 18, 2016 at 11:32 AM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hey All,

I am Nalinikanth, QA on the MSF-OCA project and we are using DHIS2. We are building API automation suites as a part of our project. We are working along with Jason. P and Paulo. We have been discussing on how to take this forward and you can find our discussions thread in this mail.

Please do comment or provide feedback if you have any ideas or thoughts around the same.

I am attaching the repos as well for your reference.

https://github.com/msf-oca-his/API_Test

https://github.com/dhis2/api-tests

Feedback from the community would be well appreciated.

Thanks & Regards,

Nalinikanth



On Wed, Aug 10, 2016 at 3:41 PM, Vanya Seth vanyas@thoughtworks.com wrote:

Hi All

It makes sense to make this discussion public, so that other members of the community can also provide their inputs.

Regards

Vanya



On Tue, Aug 9, 2016 at 1:36 AM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

Is it the idea to keep a different repo with the tests for metadata versioning?

Regarding how to set up data, I don’t have strong opinions. I think we should try one approach and see if it works. The initial idea was to have a docker image already baked with the data we want for each test execution, which we can control using docker compose:

https://github.com/dhis2/api-tests/blob/master/docker-compose.yml

– Paulo

On Wed, Aug 3, 2016 at 3:21 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason/ Paulo,

Hope you are doing well.

As a part of the API testing we have written some tests for the metadata versioning APIs, a core feature contributed by us to DHIS2 version 2.24. We made minor changes to the folder structure, and we leverage before and after functions to set up and tear down data. Please have a look at the tests here, and do let us know any feedback on them.

I have been through the repo that Paulo was working on; the way he extracted the version into the env.js file looks okay, but we did it in a slightly different way. Either way, it helps give the tests the ability to run across multiple versions of DHIS2.

One more thing to discuss: we can do contract testing of APIs, which might not need predefined data in the database; but in some cases, like when we test datavaluesets or other similar APIs, we need data that is already set up. Similarly, we want to leverage the API testing to do integration tests as well. This will require a database to be set up before the tests run on the system. For that we can have an empty DHIS2 instance on which we can set up the data, and remove the database once the tests are run. We are looking at two ways to accomplish this:

  1. Setting up the database dump using SQL scripts.
  2. Creating data using the metadata import API (using the import API to set up metadata), where the setup runs before the tests; see the sketch after this list.

We feel setting up metadata using APIs will be the more useful option: we can leverage it irrespective of the database we are using, and it will create data properly across versions. A SQL database setup, on the other hand, would have to be maintained and migrated properly for every DHIS2 release, so we would rather not implement it that way. Hence we feel the second way of setting up the data required for tests makes more sense. Can you please share your thoughts on this as well?
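As a rough sketch of the second approach — endpoint, payload and credentials here are illustrative placeholders, not the real test data:

    var chakram = require('chakram');
    var expect = chakram.expect;

    var baseUrl = 'http://localhost:8080';  // placeholder; the real value comes from env.js
    var auth = { auth: { user: 'admin', pass: 'district' } };

    describe('dataElements api', function () {
      before(function () {
        // set up the metadata the tests depend on via the import API
        var metadata = {
          dataElements: [{
            name: 'SyncTestElement01', shortName: 'SyncTestElement01',
            domainType: 'AGGREGATE', valueType: 'NUMBER', aggregationType: 'SUM'
          }]
        };
        return chakram.post(baseUrl + '/api/metadata', metadata, auth);
      });

      it('lists the imported element', function () {
        var response = chakram.get(baseUrl + '/api/dataElements?filter=name:eq:SyncTestElement01', auth);
        expect(response).to.have.status(200);
        return chakram.wait();
      });

      // after() would tear the created metadata back down again
    });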

Thanks & Regards,

Nalinikanth M



On Tue, Jun 28, 2016 at 5:54 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Thank you Paulo. Enjoy your vacation; we can discuss once you are back :)


Thanks & Regards,

Nalinikanth M


On Tue, Jun 28, 2016 at 5:51 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi, I think Jason is on vacation and I’m also leaving tomorrow. Just a heads up that the repo for the tests is now this one: https://github.com/dhis2/api-tests

BR,

Paulo

On Tue, Jun 28, 2016 at 1:00 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason & Paulo,

Hope you are doing good. We were busy pushing the metadata sync feature to DHIS2 trunk to make it in time for the 2.24 release. We are done with that, and I got some time to resume the automation. I was looking at https://github.com/pgracio/dhis2-api-system-test/; the tests are good and I would like the same kind of test structure. Some clarifications though:

  1. How will we maintain the tests with respect to versioning of APIs?

As we know, DHIS2 will now be versioning its APIs, and there will likely be support for the last three API versions. So we should be mindful of leveraging these tests for future versions while keeping them for previous versions as well.

We thought of one possible approach: say we wrote tests against the 23 APIs and then the 24 APIs are released; we can clone the 23 repo into a new repo for version 24, run all the tests, and raise bugs for valid breakages or fix the tests if required (if there is any change in the contract of the APIs). This way we can have multiple repos for multiple versions of the APIs. The only thing we need to take care of is extracting the URL into the env file to make it easy to maintain (see the env.js sketch below). Or we can have a folder for each version in a single repo.

  2. We already discussed having the tests set up the required data using APIs, which looks good for now. This should work fine when we test APIs for data elements, data sets etc. But in the bigger picture, if we have to write tests for APIs like datavaluesets (which returns the data values of a data set), the entities involved are data elements, data sets, users and organisation units, and there are a good number of associations involved in that scenario. So what do you think about such cases? Can we have a small database to preset these associations, on which we can write tests and assert?

Understanding the above things would help us in making the tests scalable.

If you have any other things apart from this, we can discuss them as well. Please share your opinions on these things.
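Regarding the env-file extraction under point 1, a minimal sketch — the variable names and URL are placeholder assumptions, not settled conventions:

    // env.js (sketch): pick the API version at run time
    var apiVersion = process.env.api_version || '24';

    module.exports = {
      // e.g. http://localhost:8080/api/24
      apiUrl: (process.env.dhis2_url || 'http://localhost:8080') + '/api/' + apiVersion
    };

The same suite could then be pointed at the 23 APIs with something like env api_version=23 mocha, whether we branch per version or keep one repo with a folder per version.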

Thanks & Regards,

Nalinikanth M



Hi Nalinikanth,

Please fix those tests. It’s important that when you do this kind of refactor you update all the dependencies. Also, to improve the quality of the PR reviews, it would be nicer to have smaller PRs instead of a big-bang approach.

– Paulo

···

On Thu, Oct 6, 2016 at 5:25 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi,

try to use postman or curl to see if you get the expected response back. Or just enable debug in chakram to print request and response details to the console.
http://dareid.github.io/chakram/jsdoc/module-chakram.html#.startDebug

Before merge try to get a green pipeline :slight_smile: not sure it’s broken because of your changes… please have a look.

BR,

Paulo

On Thu, Oct 6, 2016 at 1:46 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hey Jason/Paulo,

I have sent a pull request to merge the tests that we have till now.

I got an issue while running a test for all the APIs The test is calling an API with improper authorization. When an API is called with improper authorization it should give 401 status. I used to get this status earlier but now couldn’t get the status it should be either issue with chakram or may be the we are not getting a proper json response from DHIS2. I will merge again once I fixed this issue. please let me know if you have any insights about this.

When running integration tests it should span across two environments so added a new environment file and aligned in the same way so we can pass it during run time for tests. Will be adding more data checks to sync and compare, once they are done will merge it.

And all the contract tests for versioning should be run on a fresh db instance and this should be run in the same order as in versioningContractTests.sh folder we tried keeping this tests independent but couldn’t succeed in achieving that as we don’t have a delete API to delete versions so once a version is created we need to delete it from backend so for now these tests are coupled. The delete API is there in our pipeline once that is developed I will decouple these tests.

Please let me know if you have any questions.

Thanks & Regards,
Nalinikanth M.

On Wed, Sep 28, 2016 at 10:17 AM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Paulo,

There is no specific reason these were under development. But these were dependent will have a look and see how can I merge them.

Thanks & Regards,

Nalinikanth M

On Wed, Sep 28, 2016 at 10:04 AM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

is there any reason to keep a separate repo with these tests? It would be nice to have the tests as part of https://github.com/dhis2/api-tests

There is also a pipeline to execute the tests in Travis https://travis-ci.org/dhis2/api-tests

Docker image build is also automated, runs once per day, and is publishing images to Docker Hub https://hub.docker.com/r/dhis2/dhis2-web/tags/

Best regards,

Paulo

On Tue, Sep 27, 2016 at 2:41 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason & Paulo,

Hope you are doing good.

It has been a long time! We are writing tests for metadata sync. All the tests are added to the repo here.

In the above repository navigate to the path API_Test/testcases/metadatasync/integration/

there is a generic test “ImportMetadataTest" which we wrote for testing how various types of metadata entities will sync from HQ/central instance to local/field.

There are two ways of running the test

  1. To run this test without any database on HQ and Local.

To test how sync is behaving with respect to various metadata entities on two new instances without any data model on it. All we need is to have metadata versions in this folder - API_Test/testdata/metadatasync/versiondata

We can have any number of versions in the folder. It depends on how user wants metadata sync to happen or what all metadata associations or disassociations user wants to test. For now I kept two version files.

To run the test for Version_1 run this should be run using "env version=“Version_2” mocha ImportMetadataTest.js --timeout 20000” which is can be added to a shell script to run version one after the other like it is in integrationTestsWithoutDB.sh file. This will first import data on HQ/Central instance using import api and then Local/field instance will sync the version from HQ.

Once the version is synced to Local/Field then we are doing two tests. One is asserting the data in

http://local/api/metadata/version/Version_1/data with http://HQ/api/metadata/version/Version_1/data by comparing them.

Later it will compare all the entities(which are present in that version) individually say we have a array of data elements then it will pick all the data elements and compare one by one and continues for other entities as well.

e.g: It will compare http://local/api/dataElements/id with http://HQ/api/dataElements/id

  1. To test how sync is behaving with respect to various metadata entities on two instances where HQ already have n versions[Pre defined database]. We are using the same script to import version by version. It will also do a couple of assertions on top of the metadata when synced. The first assertion being same as above it will compare the data in http://local/api/metadata/version/Version_1/data with http://HQ/api/metadata/version/Version_1/data.

But the next level comparison is a bit different it will compare the entities by fetching the entity data which is present in

http://local/api/metadata/version/Version_1/data with http://local/api/dataElements/id

Here there won’t be entire json for any entity on http://local/api/metadata/version/Version_1/data this will contain very limited details we are just comparing the minimal entities getting them from Local/Field using jsonfilters in api call.

We had this kind of assertions because say user has Version_1 and has a data element abcd1234 and the name might have changed in Version_2 abcd12345 as HQ has got n versions in it so if we want to compare json of it on both HQ and Local we have different names so we took this approach.

Can you please have a look at this and let me know if any changes are required.

Thanks & Regards,

Nalinikanth M

On Thu, Aug 18, 2016 at 11:32 AM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hey All,

I am Nalinikanth, QA on the MSF-OCA project and we are using DHIS2. We are building API automation suites as a part of our project. We are working along with Jason. P and Paulo. We have been discussing on how to take this forward and you can find our discussions thread in this mail.

Please do comment or provide feedback if you have any ideas or thoughts around the same.

I am attaching the repos as well for your reference.

https://github.com/msf-oca-his/API_Test

https://github.com/dhis2/api-tests

Feedback from the community would be well appreciated.

Thanks & Regards,

Nalinikanth

On Wed, Aug 10, 2016 at 3:41 PM, Vanya Seth vanyas@thoughtworks.com wrote:

Hi All

It makes sense to make this discussion public. So, that other members of the community can also provide their inputs.

Regards

Vanya

On Tue, Aug 9, 2016 at 1:36 AM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

is it the idea to keep a different repo with the tests for metadata versioning?

Regarding how to setup data I don’t have strong opinions on this. I think we should try one approach and see if it works. The initial idea was to have a docker image already baked with the data we want, for each test execution, that we can control using docker compose.

https://github.com/dhis2/api-tests/blob/master/docker-compose.yml

– Paulo

On Wed, Aug 3, 2016 at 3:21 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason/ Paulo,

Hope you are doing well.

As a part of the API testing we have written some test for the metadata versioning APIs, which is a core feature contributed by us to DHIS2 version 2.24. We did minor changes to the folder structure. We leverage before and after functions to setup and tear down data. Please have a look at the tests here. Please do let us know any feedback on the tests.

I have been through the repo that Paulo was working on, the way he extracted the version in env.js file looks okay but we did it in a slightly different way. That anyway would help us in providing the ability for tests to run across multiple versions of DHIS2.

One more thing to discuss upon is we can do contract testing of APIs which might not need a predefined data in the data base but, in some cases like when we test datavaluesets or any other similar APIs we might need some data which should already be set up. Similarly, we want to leverage the API testing to do integration tests as well. This will require a database set up to be done before the tests run on the system. For that we can have a DHIS2 empty instance on which we can set up the data and remove the database once the tests are run. We are looking at two ways to accomplish this:

  1. Setting the database dump using sql scripts.
  2. We can create data using metadata import API(using import API to set up metadata), where the set up will run before the tests.
    We how ever feel setting up metadata using APIs will be useful as we can leverage it irrespective of the database we are using and it will be able to create data properly across versions. Where as setting up the database using sql might have to be maintained and should be migrated properly for every version of DHIS2 release. So we are a kind of not wanting to implement this way. So we feel the second way of setting up data required for tests makes more sense. Can you please share your thoughts on this as well.

Thanks & Regards,

Nalinikanth M

On Tue, Jun 28, 2016 at 5:54 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Thank you Paulo, Enjoy your vacation we can discuss once you are back :slight_smile:

On Tue, Jun 28, 2016 at 5:51 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi, I think Jason is on vacation and I’m also leaving tomorrow. Just a heads up that the repo for the tests is now this one. https://github.com/dhis2/api-tests

BR,

Paulo

On Tue, Jun 28, 2016 at 1:00 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason & Paulo,

Hope you are doing good. We were busy with pushing the Metadata sync feature to DHIS2 trunk to make it in time for 2.24 release. We are done with that and I got some time to resume the automation. I was looking at https://github.com/pgracio/dhis2-api-system-test/, the tests are good and I would also like to same kind of test structure. Some clarifications though:

  1. How will we maintain the tests with respect to versioning of APIs?

As we know, now DHIS will be versioning APIs and there is going to be likely support for last three versions of APIs. So, we should be mindful of leveraging these tests for the future versions at the same time keeping them for previous versions as well.

We thought one possible approach, say we wrote tests on 23 APIs and then 24 APIs are released, we can clone the 23 repo and can create a new repo for 24 version, run all the tests and can raise bugs for valid breakages or fix the tests if required(if there is any change in contract of the APIs). So, this way we can have multiple repos for multiple versions of APIs. Only thing we need to take care of is extracting the URL to env file to make it easy to maintain. Or we can have a folder for each version in the single repo.

  1. As we already discussed about having the tests where we can set up required data using APIs which looks good for now. This should actually work fine when we test APIs for data elements, data sets etc. But in a bigger picture if we have to write tests for APIs like datavaluesets(which will give the data values of a data set). The entities involved here are “data elements, data sets, users, organisation units” and there are good number of associations involved in this scenario. So what do you think about such cases? Can we have a small database to preset these associations on which we can write tests and assert.

Understanding the above things would help us in making the tests scalable.

If you have any other things apart from this, we can discuss them as well. Please share your opinions on these things.

Thanks & Regards,

Nalinikanth M

On Mon, Jun 13, 2016 at 6:16 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason,

I see where you are coming from in terms of testing perspective. Different DBs can be a good input to test metadata import and export api in specific. But for a known state of DB to exist for other apis to be tested, DBs that are not compatible with the version being tested would be a problem.

@Paulo

I do agree with you on Option 3 so let us continue this email chain If necessary then we can setup a call. So let us keep the discussion going on here.

Regards,

Nalinikanth M

On Mon, Jun 13, 2016 at 12:22 PM, Jason Pickering jason.p.pickering@gmail.com wrote:

Hi there.

Here is my perspective. The entire purpose of the integration tests are to test these types of scenarios. Is it possible to perform a bulk metadata export from an arbitrary database, is sort of the test I think. Well, in this case, the developer of the API (Morten) tells you not to use this database because it is “old”. Well, it may be old, but it is also on the current version, so if this feature is supposed to work, well, it should work. If not, then we need to figure out why. That is the purpose of the test. I would expect this same test to work on any arbitrary database, so I think its perfectly legitimate, and see no reason why we should not test the SL database. Having said that, I think we should also test others, such as Trainingland, and enable the tests in such a way to allow people to arbitrarily test which ever system they wish. For the main tests, I think we should use the SL database specifically because it is “old” and in many ways, resembles a system which has been around a long time. Specfically for that reason, it should be tested, at least for certain test scenarios.

And having said all of that, we should not be testing scenarios which the feature developers wish us to test. That is not the point of these tests either. Currently the feature devs are writing their own tests, which is never really a good thing. The purpose of having an external team to develop these tests is to test things which maybe the feature devs don’t consider or don’t want to test.

Hope that helps to clarify my thinking here on what the original intent of these integration tests were. Does that help?

Regards,

Jason

P.S. Paulo’s time is very limited on this project, as he is acting as a part-time consultant to HISP Nordic. I suggest that we try and limit the need for calls unless really urgent, especially if Paulo needs to be involved. If you still feel a call is needed, lets try and start with me and then bring in Paulo in as needed. Paulo, you OK with that?

On Mon, Jun 13, 2016 at 8:35 AM, Paulo Grácio paulogracio@gmail.com wrote:

Hi, option 3 seems to me the best approach for now. What do you think?

Today I have a very busy day, but probably tomorrow morning we can have a call. What about 08:00AM CEST?

/Paulo

On Mon, Jun 13, 2016 at 7:55 AM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hey Paulo,

Thanks for your efforts, will try and let you know :slight_smile:

Jason & Paulo,

As Our team see a potential problem in using SL database, even we are confused of how to go ahead with tests, specially on what database.

Here are the options that we are looking at:

Option 1: Set up an empty vanilla instance. It is an empty database where we can set up data using APIs and can tear down once the tests are done. Entire data can be set up using a Json file or data can be created as required for every test.

Option 2: Set up a known state of database eg., say SL database. The state is maintained and we will be setting up the database before starting the execution of tests. As we are using docker every time we will have new instance say fresh SL database.

Option 3: We can have a know state of database with very low metadata in it. Where in we can add new data when required using APIs, as required for every test.

Can we have a call to discuss more on this. A 30 minutes call would do.

Thanks & Regards,

Nalinikanth M

On Sat, Jun 11, 2016 at 2:38 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi,

I have build a new image that can be used now to start a database container for 2.23-sierra-leone

image: pgracio/dhis2-db:2.23-sierra-leone

Give it a try and let me know if you have problems.

Regards,

Paulo

On Fri, Jun 10, 2016 at 10:44 AM Jason Pickering jason.p.pickering@gmail.com wrote:

I suggest that we use the SL demo. Reason being, it is stable, and does not change that much. I think that we can start with this. The Trainingland database is still under very active development. However, I don’t feel it makes a big difference. What is important is that we use a database which we know the state of. I think if Paulo can build a docker image tied to a given revision of the database, and we base our tests off of that, that would be the best approach.

Regards,

Jason

On Thu, Jun 9, 2016, 15:24 Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

I was using this as a reference to write the tests. https://github.com/dareid/chakram/blob/master/examples/spotify.js

Currently using an empty database, but we can use training. No strong opinions on this. Normally I prefer to have tests that don’t depend on database state, but in some situations it might be very difficult to create the desired state before running the test.

To make sure we are all on the same page it’s important that we use pull requests, before we merge things to master.

BR,

Paulo

On Thu, Jun 9, 2016 at 2:57 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason,

It is about what state of database we are going to use: the training database, Sierra Leone, any other known state, or a vanilla instance. Basically, what should the state of the database be?

Also, what should the tests look like? Since we can write them in so many ways, we want to make sure we are all on the same page.

A 30-minute call would be enough for this.

Best Regards,

Nalinikanth

On Thu, Jun 9, 2016 at 6:20 PM, Jason Pickering jason.p.pickering@gmail.com wrote:

Hi Nalinikanth,

Paulo and I have quite limited time for this activity. Could you outline what the call would be about?

Regards,

Jason

On Thu, Jun 9, 2016, 14:48 Paulo Grácio paulogracio@gmail.com wrote:

Which time zone are you in?

On Thu, Jun 9, 2016 at 2:42 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Paulo,

I am working on it. I am actually looking to discuss a few things with you and Jason. Can we set up a call based on your availability? I am planning to have it with you both next week.

@Paulo, @Jason

Please let me know your availability.

On Thu, Jun 9, 2016 at 5:54 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

Are you doing any work on the system tests? I had a look at your repo and was considering merging it with what I have.

BR,

Paulo

On Fri, Jun 3, 2016 at 1:34 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Paulo,

Thanks for your valuable inputs. I will try and will come back to you.

Regards,

Nalinikanth

On Fri, Jun 3, 2016 at 3:22 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

Currently, new DHIS2 WAR files for version 2.23 are generated and published by this job: http://ci.dhis2.org/job/dhis2-2.23/. One of the final steps, copy-to-dhis2com.sh, makes the new WAR available for download at https://www.dhis2.org/download/releases/2.23/dhis.war

I think we could have a downstream job that generates a new Docker image using the previously generated WAR file and publishes it to Docker Hub. Automation for this can be found here:

https://github.com/pgracio/dhis2-docker/blob/master/docker-build.sh.

Once the Docker image is successfully generated, we can run the system tests using Docker Compose:

https://github.com/pgracio/dhis2-api-system-test/blob/master/docker-compose.yml

All of this can be executed on the Jenkins server, if it has the capacity to handle it, so there is no need to spin up new environments to execute the tests.

I see this as an initial step to introduce system tests. With the pipeline flow that I have described, we will still deploy the WAR file even if we detect errors during the system tests. A longer-term vision for the DHIS2 pipeline would be:

#1 - Build the DHIS2 WAR file, without copying it to dhis2.com.

#2 - Build the Docker image, without publishing it to Docker Hub.

#3 - Run the system tests; on success go to #4, otherwise notify about the broken tests.

#4 - Copy the WAR file to dhis2.com and publish the Docker image to Docker Hub.
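As a rough sketch of what such a downstream job could look like (the image tag, Dockerfile, and test command are illustrative assumptions, not the actual CI configuration):

    #!/bin/bash
    # Hypothetical downstream job: build an image from the upstream WAR,
    # test it, and publish only if the system tests pass.
    set -e

    IMAGE=dhis2/dhis2-web:2.23-SNAPSHOT   # illustrative tag
    docker build -t "$IMAGE" .            # assumes a Dockerfile that bundles the WAR
    docker-compose up -d                  # start web + db containers for testing
    npm test                              # run the Chakram/mocha suite
    docker-compose down
    docker push "$IMAGE"                  # reached only if the tests passed (set -e)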

Feel free to challenge this, it’s just one opinion. I guess the DHIS2 developer community might have a say in this as well.

Best regards,

Paulo

On Fri, Jun 3, 2016 at 7:48 AM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

@Paulo: It is more about how the test environments are set up, e.g. setting up a Docker environment and running the tests on it, and how the tests are affected when the environment is set up by different continuous integration servers, say Jenkins, Travis, GoCD, etc. That kind of thing is what I meant by maintaining environments.

Regards,

Nalinikanth

On Thu, Jun 2, 2016 at 12:23 AM, Paulo Grácio paulogracio@gmail.com wrote:

@Nalinikanth, what exactly do you mean by “maintain environments”?

Best regards,

Paulo

On Wed, Jun 1, 2016 at 1:57 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

@Paulo: Thank you for your quick response. It’s working fine now.

I had a look at your API tests repo. It looks good; I wrote some tests quite a while ago using the same framework. I tried to extract the data out of the tests to reduce coupling and make things easier to maintain. You can find them here. Please have a look at them and let me know your opinion.

@Jason & @Paulo: Maybe next week we can have a call to talk about what the tests should look like and how we can maintain environments.

Thanks & Regards,

Nalinikanth M

On Fri, May 27, 2016 at 8:57 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

@Grácio: I’m glad about your response, and I had a look at the API test repo; it looks good. I am on vacation till Tuesday and will get back to you with my thoughts as soon as I’m back. I would love to talk more about the agreement as well; maybe we can set up a call later next week, or whenever it’s feasible for all of us.

Thanks & Regards,

Nalinikanth M

Sent from my iPhone

On 27-May-2016, at 7:46 PM, Paulo Grácio paulogracio@gmail.com wrote:

@Nalinikanth: I have updated the repo; it should now be possible to start the service using the training database.

https://github.com/pgracio/dhis2-docker/blob/master/docker-compose.yml

https://hub.docker.com/r/pgracio/dhis2-db/tags/

Let me know if you have problems.

Best regards,

Paulo

On Fri, May 27, 2016 at 1:00 PM Paulo Grácio paulogracio@gmail.com wrote:

@Nalinikanth, as Jason has mentioned, I have created this repo for the API system tests:
https://github.com/pgracio/dhis2-api-system-test

This is an initial spike with two very basic tests. Please have a look to see if we can reach a common agreement on how to write the tests. It includes some manual steps, but soon I’ll add an automation mechanism to run the tests every time a new version is available.

Share your thoughts.

Best regards,

–Paulo

On Fri, May 27, 2016 at 12:18 PM Paul Grácio <paulogracio@gmail.com> wrote:

Hi Nalinikanth,

glad you are using the dhis2-docker scripts :)

Currently the dhis2-db image only works for versions 2.21 and 2.20; this needs some care from my side. I guess you are trying to run the latest version, 2.23.

@Jason, is there a snapshot database dump that works with version 2.23?

Best regards,

Paul Grácio

On Fri, May 27, 2016 at 11:00 AM Jason Pickering jason.p.pickering@gmail.com wrote:

Hi there.

I have been meaning to mail you about this. Paulo has another repo here:

https://github.com/pgracio/dhis2-api-system-test

which we started last week. It includes some very simple Chakram-based tests.

I think this is more or less what we discussed a few weeks back. Paulo will also be working with us on this.

Maybe Paulo can comment more on the database.

I have another repo here

https://github.com/jason-p-pickering/dhis2-docker

which loads the Trainingland database. I think this should point you in the right direction.

At any rate, we should probably start to issue some PRs on Paulo’s repo, and then eventually we will pull this into the main DHIS2 group repo.

Best regards,

Jason

On Fri, May 27, 2016 at 10:54 AM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

To add context, I am Nalinikanth M, QA at ThoughtWorks. We are working on DHIS2 for an MSF project and want to automate a few tests on DHIS2. I got the Docker repository from Jason while we were looking into setting up test environments. As a part of our test plan, we want to use Docker instances to run the automation tests.

On Fri, May 27, 2016 at 1:55 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Gracio,

We are using the scripts from your repository to set up a Docker environment for DHIS2. We were able to get the application up on Docker and can use it, but we are unable to load the Sierra Leone database into the application. Can you please help us resolve this issue?

P.S.: We are new to Docker; we are following your README and the Docker documentation to set things up.

Thanks & Regards,

Nalinikanth M

Quality Analyst

Email: nalinim@thoughtworks.com
Telephone: +91 9052234588
ThoughtWorks


Jason P. Pickering
email: jason.p.pickering@gmail.com
tel:+46764147049


Hi Paulo,

I have fixed the tests and committed them.

···

On Thu, Oct 6, 2016 at 5:45 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

Please fix those tests. It’s important that when you do this kind of refactoring you update all the dependencies. Also, to improve the quality of PR reviews, it would be nicer to have smaller PRs instead of a big-bang approach.

– Paulo

On Thu, Oct 6, 2016 at 2:11 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hey Paulo,

Forgot to mention: I renamed two variables in env.js from auth to properRequestParams, since they hold all the headers, not just the authorization details. If that is okay I will go ahead and fix your tests as well; otherwise I will change my tests.

Thanks & Regards,
Nalinikanth M

On Thu, Oct 6, 2016 at 5:25 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi,

Try using Postman or curl to see if you get the expected response back, or just enable debug in Chakram to print the request and response details to the console:
http://dareid.github.io/chakram/jsdoc/module-chakram.html#.startDebug
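For example, a quick check from the command line (the port and credentials are illustrative):

    # Expect an HTTP 401 when calling the API with bad credentials.
    curl -i -u wrong:password http://localhost:8080/api/dataElements.json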

Before merging, try to get a green pipeline :) I am not sure it’s broken because of your changes… please have a look.

BR,

Paulo

On Thu, Oct 6, 2016 at 1:46 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hey Jason/Paulo,

I have sent a pull request to merge the tests that we have till now.

I hit an issue while running a test across all the APIs. The test calls an API with improper authorization, which should return a 401 status. I used to get this status earlier, but now I don’t; it is either an issue with Chakram, or maybe we are not getting a proper JSON response from DHIS2. I will merge again once I have fixed this issue. Please let me know if you have any insights about this.

The integration tests need to span two environments, so I added a new environment file, structured the same way as the existing ones, which we can pass in at runtime. I will be adding more data checks to sync and compare; once they are done I will merge them.

All the contract tests for versioning should be run on a fresh DB instance, in the same order as in the versioningContractTests.sh file. We tried keeping these tests independent but couldn’t achieve that: there is no delete API for versions, so once a version is created it has to be deleted from the backend, and for now the tests are coupled. The delete API is in our pipeline; once it is developed I will decouple these tests (see the sketch below).
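For illustration, the ordered run amounts to chaining the suites so that each one only starts if the previous one passed (the test file names here are hypothetical, not the actual contents of versioningContractTests.sh):

    # Run the versioning contract tests in a fixed order on a fresh instance.
    mocha createVersionTest.js --timeout 20000 && \
    mocha syncVersionTest.js --timeout 20000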

Please let me know if you have any questions.

Thanks & Regards,
Nalinikanth M.

On Wed, Sep 28, 2016 at 10:17 AM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Paulo,

There is no specific reason; these were under development. But they have dependencies; I will have a look and see how I can merge them.

Thanks & Regards,

Nalinikanth M

On Wed, Sep 28, 2016 at 10:04 AM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

Is there any reason to keep a separate repo with these tests? It would be nice to have them as part of https://github.com/dhis2/api-tests

There is also a pipeline to execute the tests in Travis: https://travis-ci.org/dhis2/api-tests

The Docker image build is also automated; it runs once per day and publishes images to Docker Hub: https://hub.docker.com/r/dhis2/dhis2-web/tags/

Best regards,

Paulo


On Thu, Aug 18, 2016 at 11:32 AM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hey All,

I am Nalinikanth, QA on the MSF-OCA project and we are using DHIS2. We are building API automation suites as a part of our project. We are working along with Jason. P and Paulo. We have been discussing on how to take this forward and you can find our discussions thread in this mail.

Please comment or provide feedback if you have any ideas or thoughts on this.

I am attaching the repos as well for your reference.

https://github.com/msf-oca-his/API_Test

https://github.com/dhis2/api-tests

Feedback from the community would be much appreciated.

Thanks & Regards,

Nalinikanth

On Wed, Aug 10, 2016 at 3:41 PM, Vanya Seth vanyas@thoughtworks.com wrote:

Hi All

It makes sense to make this discussion public, so that other members of the community can also provide their input.

Regards

Vanya

On Tue, Aug 9, 2016 at 1:36 AM, Paulo Grácio paulogracio@gmail.com wrote:

Hi Nalinikanth,

Is the idea to keep a separate repo with the tests for metadata versioning?

Regarding how to set up data, I don’t have strong opinions. I think we should try one approach and see if it works. The initial idea was to have a Docker image already baked with the data we want for each test execution, which we can control using Docker Compose:

https://github.com/dhis2/api-tests/blob/master/docker-compose.yml
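For example, the baked image could be selected per test run through an environment variable that the compose file references (a sketch only; the ${DB_IMAGE} variable in docker-compose.yml is an assumption, not the current setup):

    # Pick the pre-baked database image for this run, then bring up the stack.
    export DB_IMAGE=pgracio/dhis2-db:2.23-sierra-leone
    docker-compose up -d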

– Paulo

On Wed, Aug 3, 2016 at 3:21 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason/ Paulo,

Hope you are doing well.

As a part of the API testing, we have written some tests for the metadata versioning APIs, a core feature contributed by us to DHIS2 version 2.24. We made minor changes to the folder structure, and we leverage before and after hooks to set up and tear down data. Please have a look at the tests here, and do let us know any feedback on them.

I have been through the repo Paulo was working on. The way he extracted the version into the env.js file looks okay, but we did it in a slightly different way; either way, it helps give the tests the ability to run across multiple versions of DHIS2.

One more thing to discuss: we can do contract testing of APIs, which may not need predefined data in the database. But in some cases, such as testing datavaluesets or similar APIs, we need data that is already set up. Similarly, we want to leverage the API testing to do integration tests as well, which requires the database to be set up before the tests run on the system. For that we can have an empty DHIS2 instance on which we set up the data, and remove the database once the tests have run. We are looking at two ways to accomplish this (see the sketch after this list):

  1. Setting up the database dump using SQL scripts.
  2. Creating data using the metadata import API (using the import API to set up metadata), where the setup runs before the tests.

We feel that setting up metadata using APIs will be more useful: we can leverage it irrespective of the database we are using, and it will create data correctly across versions. Setting up the database using SQL, on the other hand, would have to be maintained and migrated properly for every DHIS2 release, so we would rather not implement it that way. We therefore feel the second way of setting up the data required for the tests makes more sense. Can you please share your thoughts on this as well?
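A minimal sketch of the second approach (the credentials, host, and payload file are illustrative; /api/metadata is the standard DHIS2 metadata import endpoint):

    # Import the metadata the tests need before the suite runs.
    curl -s -u admin:district -H "Content-Type: application/json" \
         -d @testdata/required-metadata.json \
         http://localhost:8080/api/metadata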

Thanks & Regards,

Nalinikanth M

On Tue, Jun 28, 2016 at 5:54 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Thank you, Paulo. Enjoy your vacation; we can discuss once you are back :)

On Tue, Jun 28, 2016 at 5:51 PM, Paulo Grácio paulogracio@gmail.com wrote:

Hi, I think Jason is on vacation and I’m also leaving tomorrow. Just a heads-up that the repo for the tests is now this one: https://github.com/dhis2/api-tests

BR,

Paulo

On Tue, Jun 28, 2016 at 1:00 PM Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason & Paulo,

Hope you are doing well. We were busy pushing the metadata sync feature to DHIS2 trunk to make it in time for the 2.24 release. We are done with that, and I have got some time to resume the automation. I was looking at https://github.com/pgracio/dhis2-api-system-test/; the tests are good, and I would like to use the same kind of test structure. Some clarifications, though:

  1. How will we maintain the tests with respect to versioning of APIs?

As we know, DHIS2 will now be versioning its APIs, likely with support for the last three versions. So we should be mindful of leveraging these tests for future versions while keeping them working for previous versions as well.

One possible approach we thought of: say we wrote tests against the 2.23 APIs and then the 2.24 APIs are released; we can clone the 2.23 repo into a new repo for the 2.24 version, run all the tests, and raise bugs for valid breakages or fix the tests if required (if the contract of the APIs has changed). This way we can have multiple repos for multiple API versions. The only thing we need to take care of is extracting the URL into an env file to make maintenance easy (see the sketch after this list). Alternatively, we can have a folder for each version in a single repo.

  2. As we already discussed, we can set up the required data using APIs, which looks good for now. This should work fine when we test APIs for data elements, data sets, etc. But in the bigger picture, what if we have to write tests for APIs like datavaluesets (which returns the data values of a data set)? The entities involved are data elements, data sets, users, and organisation units, and there are a good number of associations in this scenario. So what do you think about such cases? Can we have a small database that presets these associations, on which we can write tests and assert?
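Coming back to the env-file idea in point 1, a minimal sketch (the variable and file names are illustrative, and the versioned /api/<n>/ path is the URL form DHIS2 is introducing for API versioning, so treat the exact path as an assumption):

    # env-24.sh (hypothetical): everything version-specific lives here.
    export BASE_URL="http://localhost:8080/api/24"

    # The same suite can then target another API version just by sourcing
    # a different env file before the run:
    source env-24.sh && mocha testcases/ --timeout 20000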

Understanding the above would help us make the tests scalable.

If you have any other things apart from this, we can discuss them as well. Please share your opinions on these things.

Thanks & Regards,

Nalinikanth M

On Mon, Jun 13, 2016 at 6:16 PM, Nalinikanth Meesala nalinim@thoughtworks.com wrote:

Hi Jason,

I see where you are coming from in terms of the testing perspective. Different DBs can be good input for testing the metadata import and export APIs in particular. But for the other APIs, which need a known DB state to exist, DBs that are not compatible with the version being tested would be a problem.

@Paulo

I do agree with you on Option 3, so let us continue this email chain; if necessary we can set up a call. Let us keep the discussion going here.

Regards,

Nalinikanth M
