Many times in the past I have created tests for applications that use a complex database structure to produce some results. Every time I have resorted to setting up a static data structure and then running my tests against that collection of data.
The problem with my old approach is that it is not very stable. What if someone tampers with your data? What if you want to move the tests to a UAT environment, or worse yet - production?
The ideal scenario, I think, is to have the test:
1. Set all the necessary data up
2. Do the test
3. Clean up the data
But that sounds a lot simpler than it is.
Setting the data up
Using Soap UI I created a bunch of JDBC test steps to create every tiny bit of data that I needed. I put all of these steps in a disabled test case that I trigger manually from the actual cases.
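Triggering a disabled case from Groovy only takes a few lines. Something along these lines works as a Groovy test step in the real test case (the suite and case names are just examples; use whatever you named yours):

```groovy
// Groovy test step in the real test case: kick off the disabled data-setup case.
import com.eviware.soapui.support.types.StringToObjectMap

def project   = testRunner.testCase.testSuite.project
def setupCase = project.getTestSuiteByName("Test data setup").getTestCaseByName("Create test data")
def runner    = setupCase.run(new StringToObjectMap(), false)  // false = run synchronously
assert runner.status.toString() == "FINISHED" : "Data setup failed: ${runner.reason}"
```

Disabled only means the case is skipped when the suite runs; it can still be executed programmatically like this.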
I divided the data into two categories:
- Stable data, or "master data"
This data will most probably exist in all environments, but not necessarily with the same ids. Therefore I needed a bunch of "find XXX" test steps. I then transferred all those values into properties I could access later (a sketch of this follows right after the list).
- Dynamic data
This is the data I will perform my actual tests against. It's important that I have full control over this data to create valid tests and to be able to trust the results. For this I created a bunch of JDBC steps that insert data into different tables.
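To give an idea of what the "find and transfer" part can look like: something like this Groovy step, placed right after a JDBC find step, does the trick. The step, column and property names below are made up for the example:

```groovy
import com.eviware.soapui.support.XmlHolder

// "Find customer" is the name of the preceding JDBC step; CUSTOMER_ID and
// customerId are an example column and property name.
def xml    = context.expand('${Find customer#ResponseAsXml}')
def holder = new XmlHolder(xml)
def id     = holder.getNodeValue("//Results/ResultSet/Row[1]/CUSTOMER_ID")
assert id : "No customer found. Is the master data missing in this environment?"
testRunner.testCase.testSuite.project.setPropertyValue("customerId", id)
```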
One thing I discovered is that having GUIDs as primary keys, as opposed to having auto-ids, makes testing easier. I created some Groovy test steps that produced new GUIDs for all new entities. I stored these values as properties for later access.
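The GUID steps themselves are tiny. A sketch, with made-up property names:

```groovy
// Mint fresh GUIDs for everything this run will insert and keep them as
// project properties. The property names are just examples.
def project = testRunner.testCase.testSuite.project
["newCustomerId", "newOrderId", "newOrderRowId"].each { name ->
    project.setPropertyValue(name, UUID.randomUUID().toString())
    log.info "Generated ${name} = ${project.getPropertyValue(name)}"
}
```

The JDBC insert steps can then reference the ids with property expansions like ${#Project#newCustomerId} in the SQL.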
As you can see on the right - there were a whole bunch of steps to make this happen...
Testing against the data
This is the easy part. Since all the values I needed were transferred into properties, I could use them in my actual tests. I actually put the setup data test case in a disabled test suite, and that is quite useful. In my tests I don't want to care about where the data comes from or how it is created. I only want to be provided with some values that will make my tests do what I want them to. Properties in Soap UI are your friend here.
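As an example, a script assertion can check a response against the id the setup created, without the test knowing anything about the database. The property name, namespace and element name here are made up:

```groovy
import com.eviware.soapui.support.XmlHolder

// Compare the response against the id the data setup stored as a project property.
def expectedId = context.expand('${#Project#newCustomerId}')
def holder     = new XmlHolder(messageExchange.responseContentAsXml)
holder.namespaces["ns"] = "http://example.com/customers"  // example namespace
assert holder.getNodeValue("//ns:CustomerId") == expectedId
```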
Cleaning up
To clean up I created another test case in my disabled suite that took all of the generated Ids and deleted them from the database. No trace whatsoever! But... the infrastructure to do this is quite tiresome. Read on...
Maintaining tests
I wanted to create my data once and then do many tests against it. In fact, I wanted to create some base data that would be available for all tests and then some per-test data. The per-test data could be set up using a test step in my test; it fits nicely under a GIVEN step (see my post on BDD in Soap UI). But the clean-up fits nowhere in the test. Soap UI gives you Setup and TearDown possibilities on many levels, so for this I put a TearDown script on my test case.
I created a disabled test case that took care of the cleaning and I made a Groovy script call to execute it.
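The TearDown script itself only needs to run the disabled clean-up case synchronously. Something like this, with example names again:

```groovy
// TearDown script on the test case: remove the per-test data even if the test failed halfway.
import com.eviware.soapui.support.types.StringToObjectMap

def project     = testRunner.testCase.testSuite.project
def cleanupCase = project.getTestSuiteByName("Test data setup").getTestCaseByName("Clean up test data")
cleanupCase.run(new StringToObjectMap(), false)
```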
While this is nice and dandy for the per-test data, the global data was still an issue. I could put a setup and teardown script on the test suite level, but that requires anyone who uses these tests to never run any case in isolation. Otherwise the global data would either not be available, or would linger on long after the test was finished.
So, I put all of the data creation and cleaning on all the test cases. That did nothing good for my performance... What I am considering now is to have some sort of flag for the setup so that a test case will not set the global data up if it has already been set up.
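A rough sketch of such a guard, with the flag as a project property (the names are made up):

```groovy
// Skip the global setup if a previous test case in this run already did it.
import com.eviware.soapui.support.types.StringToObjectMap

def project = testRunner.testCase.testSuite.project
if (project.getPropertyValue("IsSetup") != "true") {
    def globalSetup = project.getTestSuiteByName("Test data setup").getTestCaseByName("Create global data")
    globalSetup.run(new StringToObjectMap(), false)
    project.setPropertyValue("IsSetup", "true")
}
```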
BUT... I think flags like "IsSetup" are kind of a smell. So, here I am - I have stable tests with crappy performance. Do I care? Well, if the trade-off is between performance and stability, I choose stability any day. But I would really like to find a better way of doing this. Maybe it is not supposed to be done at all? Maybe I am grasping for test Utopia?
With those indecisive words, I bid you good night.
Ps. Any other suggestions on how to do this kind of test, or reasons why I shouldn't do it at all, are appreciated. Ds.