Tuesday, July 31, 2018

30 Days of Automation in Testing | Testing Made Faster

Speeding up automated checks & execution time

The problem(s):

  1. Too many functions happening at the same time.
  2. A lot of verbose syntax and unnecessary UI waits added in anticipation of lags from the back-end.
  3. Too much of the Step > verifyThis() > doThis() > Step pattern.
  4. Adding assertion / verification checks for every single step of the test, which choked the script. What ended up happening was an unnecessary delay in test execution, often leading to false negatives or script failures.
  5. Spaghetti code.
  6. A mistake I've seen newcomers to automation make is poorly written tests with a superfluous number of steps and repetitive code to accomplish the simplest of tasks. I was guilty of this for a while, until I learned how to abstract my scripts into distinct code blocks.

The Solution:

So what worked for me was changing how I approached test composition with a strong focus on making it legible so that anyone can look at the test and know what's going on.

Before, a lot of times, a test (written in JS, for example) would look something like this:

driver.get("http://example.com");
driver.findElement(By.xpath('//a[text()="register"]')).click();
driver.findElement(By.xpath('//input[@id="textBox"]')).clear();
driver.findElement(By.xpath('//input[@id="textBox"]')).sendKeys("verbose code");
driver.findElement(By.xpath('//button[@id="submit"]')).click();
var pageTitle = driver.findElement(By.xpath('//h1[@class="pageHeader"]')).getText();
assert(pageTitle === "Success Page");
driver.quit();

As you can see, the test is a bit hard to read and not very descriptive of what is going on. Having applied the Single Responsibility Principle, I rewrote the test to look something like this:

Using Katalon

WebUI.openBrowser( pageUrl );
WebUI.click( registrationLink );
onRegistrationForm.CompleteAndSubmitForm();
onProfilePage.VerifyInfo();

Notice the following:

  • pageUrl - I don't need to explicitly write out the site URL; I declare a variable elsewhere and reference it here

  • registrationLink - as stated, I declare my variable elsewhere and call it here

  • onRegistrationForm || onProfilePage - these are classes I create in a separate file and import as part of a package (a minimal sketch follows this list).

  • CompleteAndSubmitForm() || VerifyInfo() - each class has a set of methods. A class can have many functions, but there should not be multiple classes in the same file (_hence SRP_). The exception being helpers .. but that's another topic.

  • I also reduced unnecessary element checks in the test itself and moved them into the aforementioned classes as separate actions. The end result is a test that is easier to maintain: only the function that needs fixing gets touched.
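
To make that concrete, here's a minimal sketch of what one of those classes could look like in Katalon / Groovy. Everything in it is hypothetical - the class name, the object-repository paths, and the sample data are placeholders, not actual project code:

import static com.kms.katalon.core.testobject.ObjectRepository.findTestObject
import com.kms.katalon.core.webui.keyword.WebUiBuiltInKeywords as WebUI

public class RegistrationForm {

    // One action, one responsibility: fill in the registration fields and submit.
    def CompleteAndSubmitForm() {
        // The element wait lives here, not in the test script.
        WebUI.waitForElementVisible(findTestObject('Page_Register/input_email'), 10)
        WebUI.setText(findTestObject('Page_Register/input_email'), 'qa@example.com')
        WebUI.setText(findTestObject('Page_Register/input_password'), 'not-a-real-password')
        WebUI.click(findTestObject('Page_Register/button_submit'))
    }
}

In the test, onRegistrationForm is simply an instance of this class (def onRegistrationForm = new RegistrationForm()), so the test script itself stays at the four readable lines shown above.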

Conclusion:

While the "Solution" example was written in Groovy, I've applied similar concepts when writing tests in JS or Python. I like that the test is readable and that anyone can see what I'm testing in seconds.

It makes it easy to pair the test scenario with the acceptance criteria to ensure the proper workflows are being tested.

It also makes it super-simple for anyone inheriting my project to see what I'm doing and pick up where I left off. An important thing for teams.

Security Testing | Security 4 No0bs - Personal Wiki on Information Security Stuff


Running into a lot of great articles, stashing them here:

Monday, July 23, 2018

30 Days of Automation in Testing | Tests & Testability

Dear Reader,

Regarding the subject of testability, there are many opinions that converge on a single conclusion: code, as it is written, ought to be testable.

Where the divergence happens is in how the aforementioned code is written. Some take to abstractions, where chunks of code are split off into separate sections.

Others keep the code in-line leading to verbose composition.
Then along comes the Single Responsibility Principle.

What the Single Responsibility Principle (SRP) taught me, a noOb Automation Engineer

Not too long ago, an awesome teammate (Dev) introduced me to SRP and the code concepts behind it.

We were working on a project and our synergy was off-the-charts. I would automate a lot of the test efforts, thereby shortening the turn-around time for fixing bugs, and he would help with code reviews and solutions when I hit a wall.

Along the way, he passed along the tidbit of information regarding SRP and it opened my mind to a new way of thinking about how I need to restructure my tests.

Code Quality

One of the first take-aways from learning about SRP and the S.O.L.I.D. principles (see the Resources below for more details) is that tests, and the code they're written in, ought to be neatly composed and clearly structured.
  • A class should have only one reason to change.
  • A class can have many functions, but not the other way around - a function should not be spread across many classes.
  • A module should never consist of multiple classes.

Code Efficiency

Another take-away was the modularization of tests.
  • Increased Speed - by keeping modules separate, the speed with which tests fire improves noticeably.
  • Greater Reliability - modularizing tests increased reliability at deployment, and that reliability kept improving as more and more tests were added to the test bed.
  • Flexibility - as the project evolved, tests could be updated or removed with minimal to no impact on the rest of the test suite.

Code Legibility

  • Improving code efficiency parlayed into better code legibility. By abstracting tests into methods and keeping classes separate, I went from a verbose test filled with "spaghetti code", method calls, variables, and the like, to something that reads like the list below (one of those page classes is sketched just after it):
    • openBrowser(url)
    • onTheHomePage.clickLogin()
    • onTheLoginPage.enterCredentials()
    • onTheLoginPage.assertLoginIsSuccessful(message)
    • closeBrowser()
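
As a hypothetical sketch (the class name, object-repository paths, and message are mine, not from an actual project), the class behind a call like onTheLoginPage.assertLoginIsSuccessful(message) might look something like this, with the element wait and the verification living in the class rather than in the test:

import static com.kms.katalon.core.testobject.ObjectRepository.findTestObject
import com.kms.katalon.core.webui.keyword.WebUiBuiltInKeywords as WebUI

public class LoginPage {

    // The wait and the check are the page's responsibility; the test stays one readable line.
    def assertLoginIsSuccessful(String message) {
        WebUI.waitForElementVisible(findTestObject('Page_Home/banner_welcome'), 10)
        WebUI.verifyElementText(findTestObject('Page_Home/banner_welcome'), message)
    }
}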

Code Maintenance

As more and more tests were added and the test suite ballooned to well over 40 tests, the need to maintain that volume of tests, and the test resources that went along with them, began to increase proportionately. With the application of SRP, test maintenance became less daunting.
  • Test Data - data that feeds the variables for a test step now comes from an external data source instead of being hard-coded (a quick sketch follows this list).
  • Test Scripts - with the ease of legibility, tests could now be shared and discussed with non-technical members of the team. Feedback yielded additional action in record time.
  • Test Reporting - with the improved structure, tests could now better reflect what was being tested making reports and error handling much more succinct.
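
As a quick sketch of that Test Data point - the file path, the columns, and the parameterized enterCredentials() variant are all hypothetical - a test can loop over rows from an external CSV instead of hard-coding values:

// Hypothetical CSV with a header row: username,password
def rows = new File('test-data/logins.csv').readLines().drop(1)

rows.each { line ->
    def (username, password) = line.tokenize(',')
    // Feed the variables into the page-object calls instead of hard-coding them.
    // This assumes a parameterized variant of the enterCredentials() call shown earlier.
    onTheLoginPage.enterCredentials(username, password)
    onTheLoginPage.assertLoginIsSuccessful('Welcome, ' + username)
}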

Conclusion (tl;dr)

The Single Responsibility Principle & S.O.L.I.D. proved instrumental in helping me structure my test scripts into a clean, concise, legible test harness - one that is efficient and effective at articulating workflows, providing timely feedback to the Developers, and finding bugs in a reliable manner.
Resources:
  • SRP Wiki - https://en.wikipedia.org/wiki/Single_responsibility_principle
  • https://code.tutsplus.com/tutorials/solid-part-1-the-single-responsibility-principle--net-36074
  • https://codeburst.io/understanding-solid-principles-single-responsibility-b7c7ec0bf80

Thursday, July 19, 2018

30 Days of Automation in Testing | Type of Automation Testing Supported

Type of Automation Testing Supported

Original Post: what-types-of-testing-can-automation-support-you-with

  • UI / Front-End

    • Normal Flow - automation can streamline the end-to-end testing of typical workflows and critical paths

    • Exception Handling & Validation - with a data source, testing is more efficient when the process for generating error states and system failures is automated

    • Boundary Testing - the process of filling forms with copious data over time is best completed via an automated script that can dynamically enter the data consistently

  • ...

  • API / Back-End

    - Basic assertions that data is present: for a POST, the stored data matches the input; for a GET, responses are not null (a minimal sketch follows this list)

  • Integration

    - systematic testing of features as they're assembled, with regard to how they interact with the rest of the features in the DOM
  • ...

  • Security

    - Vulnerability Assessments, Proxy/MITM Attacks, Brute Force Attacks, DDOS
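
For the API bullet above, here's a minimal sketch in plain Groovy of the kind of basic assertions I mean; the endpoint and fields are made up for illustration:

import groovy.json.JsonSlurper

// GET - the response should not be null or empty (hypothetical endpoint).
def body = new URL('https://api.example.com/users/1').text
assert body

// The data we expect should be present and should match what was sent in.
def user = new JsonSlurper().parseText(body)
assert user.id == 1
assert user.email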

30 Days of Automation in Testing | What skills a team needs to succeed with automation in testing

What skills a team needs to succeed with automation in testing

Original Post: share-what-skills-a-team-needs-to-succeed-with-automation-in-testing

Dear reader,

With regards to success in automation, teams succeed when each of the players knows their role. As it pertains to QA Engineers, there are definite skills that help, as listed below:

Technical Skills


  • Fundamental Knowledge in the SDLC - knowing what it takes to plan, develop and deploy. Also knowing the concepts of TDD vs BDD.
  • Requirements Gathering - while not necessarily technical, there are some technical aspects in terms of understanding the best scripting language to choose and why, how to work with the CI, and so on.
  • Programming Language - as has been stated, familiarity with a programming language is key. It also helps to work in the language of the developers so as to become part of the build process.
  • Knowing the basic differences between Unit and Integration tests, as well as Front-End vs Back-End testing.
  • Page Object Modeling
  • Object-Oriented Programming
  • Continuous Integration - understanding how to deploy to a CI and maintain the scripts as the project grows (a minimal sketch follows this list).
  • Git - I can't stress enough that all QA Engineers need at least a basic knowledge of the Git workflow.
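
On the CI point, here's a minimal sketch of what hooking a test suite into a CI can look like, using a declarative Jenkinsfile. The shell command and report path are placeholders for whatever your project actually uses:

pipeline {
    agent any
    stages {
        stage('Automated tests') {
            steps {
                // Placeholder: whatever command kicks off your test suite.
                sh './gradlew test'
            }
        }
    }
    post {
        always {
            // Publish JUnit-style results so failures are visible on every build.
            junit 'build/test-results/**/*.xml'
        }
    }
}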

I can honestly tell you a lot of team success also relies on the inter-personal dynamics and non-technical aspects; the intangibles that make or break team unity:

Non-Technical Skills

  • Be Adaptable - as a QA Engineer, it is paramount that you are adaptable and willing to work outside of your comfort zone. Sure you know Python, but does that help the Devs if everyone is working in JS? Also, there must be the willingness to learn a new framework if the one you have experience with doesn't work for your current project.
  • Be A Leader / Mentor - as a subject-matter expert, you must be willing to flex your "coach" muscles and pass on what you have learned. It's good that the junior members of the team can write page objects, but it's on you to coach them on why IDs are better than XPath or how to better compose a test case.
    A good leader motivates the rest of the team and raises any flags that impede success.
    A good leader can also advocate for automation if it doesn't exist, and can evangelize a particular framework or scripting language if the current one is not working. It takes the right manner of communication and tenacity to express your beliefs and what you honestly feel will work for the team. Have a proof of concept ready if necessary.
  • Be Creative - as someone who was tasked with working on a proprietary test framework last year, an application that required knowing Jasmine (_something I'd never been exposed to_), it took a minute to gain a foothold. When I did, I found myself working with the team's Automation Engineer, who built the framework, to create the kind of tests that were versatile and resilient to change. The end result was a fluid test harness that was tremendously advantageous as the project grew in size.
    The lesson learned: creativity goes a long way when it comes to writing the kind of tests you want in a way to make it easier to maintain.
    Example: if you know you have many tests relying on test data, find a way to map the test to a data-source and structure your test in a way that uses variables rather than hardcoded data. Simple, right?! But now imagine this is on a platform you've never seen before. What are you to do?
  • Be Positive - the hardest part of automation is the loop of **write test -> run -> fix** especially as project requirements change. It can be frustrating when the environment goes down causing you setbacks, or when page structure is updated and tests that once were passing are now failing.
    Overall, there are 1,000 different things that can impede your progress. But what you cannot do is get negative about it. The challenge is to persevere when things are tough. This is key in a team situation when your progress is contingent on the success of others.
  • Be Passionate - As the saying goes, _a chain is only as strong as its weakest link_; everyone on the project has their part to play. You may be called upon to test manually or give a demo on work performed. Whatever the case, be committed to the craft. Give 100% in everything you do.

Tuesday, July 17, 2018

30 Days of Automation in Testing | Favorite Automation Solutions

Day 13: Best Automation Solutions

Original post: share-your-favourite-automation-tool-and-why

In my experience, the following top **3** have worked:

CodeceptJS - for those with a JavaScript background, CodeceptJS is a great UI framework with a syntax that is easy to learn. The framework can be adapted for customized usage or paired with other libraries.

Katalon - written in Groovy, this framework is hands-down my favorite in terms of reliability and ease-of-use. I came upon it while researching an automation test solution for my company. Having previous experience with similar proprietary testing solutions written in Jasmine, adapting what I learned was effortless.

  • Can it test cross-browser? YES
  • Can it test on mobile devices? YES
  • Can it test web services (API)? YES
  • Does it take screenshots and post reports? YES & YES
  • Can it run in a CI? YES
  • Can it work with GitHub? YES
  • Can others collaborate on a project? YES


The tutorials are intuitive, and the fact that you can even employ JavaScript within it makes this a robust platform.

Python

-- I have not yet found an out-of-the-box solution for anything written in Python, but I've written custom tests that have proven super-reliable and quick to run.
My learnings in Python are in their infancy, but I have found a lot of success using http://selenium-python.readthedocs.io/index.html

For API tests, Postman and FrisbyJS have been a godsend.

30 Days of Automation in Testing | Advice For People Getting Started

Best Possible Advice


..from someone self-taught

Dear Reader, If someone were to ask me how I got started in test automation, my answer would end up being some derivative of, "I just installed Selenium and poked around for a bit." Then as I got more curious, I started asking more questions.

My top 5 list of things I would recommend are:

  • Always Be Curious

  • Coding skills are great and all, but never accept that you know everything. Keep learning new things and asking the question, "how do I do ___?"
    Also, it's worth learning how the pieces fit - not just how the feature works, but how the feature integrates with the other sections, or how the page interacts with the database.

  • Always Be Learning

  • Never accept that you are good at just 1 test framework or language. Be open to try a new language or new way of doing things.

  • Do As Devs Do

  • And by that I mean learn how to be a part of code reviews and the integration process.
    If you code in their language, you may be able to learn more and ask the kind of questions / test in a way that adds more value to your role on the team.

  • Advocate for automation

  • If your QA team or company does not have an automation process, or your project could benefit from this, be a strong advocate for implementing this type of a solution. Work with your team and propose a viable solution that can benefit the project development lifecycle.

  • Pass On What You Have Learned

  • Once you get to a strong level of confidence in your skillset, pass on what you have learned! Be prepared to give a presentation on your findings, or be ready to mentor others who may be interested in learning.

Monday, July 16, 2018

30 Days of Automation Testing

Now for something completely different ...

30 days of automation in testing challenge

challenge .. accepted!

Dear Reader,

I will be dedicating the next few posts to the 30 Day Automation Challenge offered by Ministry of Testing. I remain fully invested in all things Information Security, and I have things to share regarding my progress with learning Python, finally getting through Kali Linux (Beginner), and finishing the first part of the Lynda.com courses on Penetration Testing. So much to get to.

teaser: What I thought of Kali Linux and the associated learnings


But then I'm also into automation and will be posting all my progress in the next few weeks. Promise!!

For more information on the 30 Days of Automation Challenge, click here:
https://ministryoftesting.com/dojo/lessons/30-days-of-automation-in-testing

Up next, the first of the challenges:
* What types of testing can automation support you with? Share an example.
(credit: https://club.ministryoftesting.com/t/30-days-of-automation-in-testing-day-1-definitions/16221/16)