ASTRUM: Test Automation Tool

Test automation is one of the buzzwords of recent years. It is a testing technique in which testers write scripts based on the analyzed requirements and use tooling to compare actual results with expected results. Test scenarios are written once and then run by the automation tool without any manual intervention. Astrum is one of these test automation tools, and in this post we will discuss how we use it and what automation has changed for us.

First of all, as systems constantly evolve and change, the same test may need to be run over and over again. With test automation, the necessary scenarios are recorded once and re-executed whenever needed, which lets us perform the required checks in much less time.

Secondly, manually repeating the same tests to find out whether a newly added feature affects previous scenarios, whether they still work, or whether they introduce errors wastes time and effort. By automating these tests, we can spend the time and energy we save on giving our customers a better product and better performance.

To sum up, automation lets us run test scenarios quickly and repeatedly with ease. By running our tests 24/7, we detect errors early, whatever their cause, so we can intervene in time and prevent them.

Why Did We Need to Automate Tests at Medianova?

Previously, we ran all of our tests manually. Running test cases that had to be repeated many times by hand was a particular waste of time and effort: in the manual method, effort grew in direct proportion to the number of test runs. With the automation test tool, we saw that a few extra hours of effort up front are enough to make a scenario ready to run again at any time.

Automating the tests also reduces the mistakes caused by human error. Tests that require multiple datasets can be run in less time with less effort, and more than 95% of the tests we used to perform manually can now be run in automation. As a result, we detect the errors that an improvement may introduce into the system earlier, which saves time and reduces cost.
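To give a concrete feel for how a test over multiple datasets stays cheap to repeat, here is a minimal sketch in Python using pytest's parametrize feature. The attempt_login helper and the credentials are purely illustrative assumptions and are not part of Astrum.

```python
import pytest

# Hypothetical login check used only for illustration; a real test would drive
# the browser or call the API under test instead.
def attempt_login(email: str, password: str) -> bool:
    return email.endswith("@medianova.com") and len(password) >= 8

# The same scenario runs once per dataset, with no extra manual effort per row.
@pytest.mark.parametrize(
    "email, password, expected",
    [
        ("example@medianova.com", "examplepwd", True),
        ("example@medianova.com", "short", False),
        ("someone@other.com", "examplepwd", False),
    ],
)
def test_login_with_multiple_datasets(email, password, expected):
    assert attempt_login(email, password) == expected
```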

To illustrate, the average running time of 500 scenarios in automation is 20 minutes. Once the tests have run, we can clearly see the results and intervene early. Performing the same tests manually used to take far longer. Astrum has enabled us to deliver a higher-quality product to our customers in a shorter time.

In the table below, you can see the time it takes to check a sample set of test data manually and the running times of the same test in automation, including the preparation time in the test phase.

We effectively prepare the automation test while writing the test scenarios for the first automated run. Once prepared, the scenarios can be run many times, even simultaneously. When several tests run at the same time, the human intervention required by the automated tests drops, whereas manual tests get no such benefit: the same effort has to be spent on every single run.
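As a rough sketch of what running the same prepared scenario several times at once looks like, the Python snippet below uses a thread pool; run_scenario is a hypothetical stand-in for whatever the automation tool actually executes.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for executing one prepared scenario from start to finish.
def run_scenario(run_id: int) -> str:
    # A real runner would drive the browser or API here and collect the results.
    return f"run {run_id}: passed"

# Launch the same scenario five times in parallel; no extra human effort per run.
with ThreadPoolExecutor(max_workers=5) as pool:
    results = list(pool.map(run_scenario, range(1, 6)))

for line in results:
    print(line)
```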

As the table shows, when we run a test multiple times at once, the gap between the time spent in automation and the time spent performing the same process manually widens significantly as the number of runs increases.

In addition, while preparing and sharing test reports in manual execution requires separate reporting tools, in an automation tool the report is produced as part of the test run itself. In other words, we spend no extra effort on reporting.
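The snippet below is only a sketch of that idea: results collected during execution are written out as a report at the end of the run, with no separate reporting tool involved. The result entries shown are invented placeholders, not real measurements.

```python
import json
from datetime import datetime, timezone

# Placeholder results; in practice these are collected automatically during the run.
results = [
    {"scenario": "Login Success Scenario", "status": "passed", "duration_s": 4.2},
    {"scenario": "Login Wrong Password", "status": "failed", "duration_s": 3.8},
]

# The report is emitted as the last step of the run, so reporting costs no extra effort.
report = {
    "generated_at": datetime.now(timezone.utc).isoformat(),
    "total": len(results),
    "passed": sum(r["status"] == "passed" for r in results),
    "failed": sum(r["status"] == "failed" for r in results),
    "results": results,
}

with open("test-report.json", "w") as f:
    json.dump(report, f, indent=2)
```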

Writing Test Scenarios in Conversational Language

While writing the scripts, we express the test steps as commands in conversational language. In this way, even users without deep technical knowledge can easily adapt, comment on the scenarios, and hand the work over to others.

Furthermore, we write the scripts in an easy-to-read, repeatable, and flexible syntax based on the markdown format, which means the scenarios can be automated while our analysis team is still creating them. We keep the syntax simple and flexible, and by specifying keywords in the scenarios we simulate actions such as clicking buttons, selecting menu options, and calling object methods and properties.

Let’s show this with an example:

## Login Success Scenario

  * Open a web browser using "Chrome"
  * Open "Login" page
  * Write "example@medianova.com" to "email" field
  * Write "examplepwd" to "password" field
  * Click to "Login" button

As you can see, we expressed the entire login flow in only 5 lines in the automation scenario.
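To show roughly what those five conversational steps correspond to under the hood, here is a minimal Python sketch using Selenium. This is not Astrum's actual implementation; the URL, the element selectors, and the choice of Selenium are assumptions made purely for illustration.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

# Each line mirrors one conversational step from the scenario above.
driver = webdriver.Chrome()                       # Open a web browser using "Chrome"
driver.get("https://example.com/login")           # Open "Login" page (URL is illustrative)
driver.find_element(By.NAME, "email").send_keys("example@medianova.com")   # Write ... to "email" field
driver.find_element(By.NAME, "password").send_keys("examplepwd")           # Write ... to "password" field
driver.find_element(By.XPATH, "//button[text()='Login']").click()          # Click to "Login" button
driver.quit()
```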

The keyword-based commands in our scenarios can be used easily by non-technical users while still letting us write powerful, robust automation test scenarios. To summarize, by integrating these scenarios into the CI/CD processes of our existing projects, we have increased our product quality and minimized the mistakes that make it into the projects that go live.

In short, Astrum is of great importance to us in providing a more stable service to our customers, and we see it as an important product that will take Medianova one step further. We will dive into the technical details of Astrum in our next post.
