
Diligence

User Interface Testing

We transform our user stories in Jira into a test plan. The test plan is then scripted to provide automated, system-level testing of the user interface and overall customer experience. These tests can be run headlessly in an automated pipeline.
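
As an illustration, a headless test derived from a sign-in user story might look like the sketch below, assuming a Playwright-based Python suite. The URL, selectors, and expected heading are placeholders rather than details of a real project.

# Minimal headless UI test sketch (Playwright, pytest style).
# The URL, selectors, and expected text are illustrative placeholders.
from playwright.sync_api import sync_playwright

def test_customer_can_sign_in():
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)  # headless so it runs in CI
        page = browser.new_page()
        page.goto("https://example.com/login")            # placeholder URL
        page.fill("#username", "test.user@example.com")   # placeholder selector
        page.fill("#password", "a-test-password")         # placeholder selector
        page.click("button[type=submit]")
        # Acceptance criterion from the user story: the dashboard is shown.
        assert page.locator("h1").inner_text() == "Dashboard"
        browser.close()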

As with the code-level tests above, system-level tests are written to prove the existence of bugs before they are fixed. In this way we build a library of tests that grows over time and provides ever greater reassurance.
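
For example, a bug found during system-level testing might first be captured as a failing test and only then fixed, with the test kept in the suite afterwards. The sketch below assumes a hypothetical bug ID and API endpoint; none of the details are from a real project.

# Regression test sketch: reproduces a reported bug before it is fixed.
# The bug ID, endpoint, and expected status code are hypothetical.
import requests

def test_bug_1234_discount_on_empty_basket_returns_validation_error():
    # Reported behaviour: applying a discount code to an empty basket
    # returned HTTP 500 instead of a validation error.
    response = requests.post(
        "https://example.com/api/basket/discount",  # placeholder endpoint
        json={"basket_id": "empty-basket", "code": "SAVE10"},
    )
    # Fails while the bug exists, passes once it is fixed, and then guards
    # against regression.
    assert response.status_code == 400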

User Acceptance Testing

We test the system in real-world scenarios with real users to validate that it meets their requirements and expectations.

Non-functional Testing

Some or all of the following may be necessary:

Stress Testing: Assess the system's performance and stability under extreme workloads (a load-generation sketch follows this list).


Penetration Testing: Simulate a cyber-attack against the system to identify vulnerabilities.


Failover Testing: Deliberately cause components of the system to fail, ensuring that backup components automatically take over with minimal disruption to usage.


Chaos Engineering: Introduce random failures into the system to assess its resilience and fault tolerance.
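
To illustrate the stress-testing item above, a workload can be scripted with a load-generation tool such as Locust. The host, endpoints, task weights, and the run command in the comment are placeholders, not details of a real system.

# Stress-test sketch using Locust. Run, for example, with:
#   locust -f stress_test.py --headless -u 500 -r 50 --run-time 10m
from locust import HttpUser, task, between

class CustomerJourney(HttpUser):
    host = "https://example.com"   # placeholder system under test
    wait_time = between(1, 3)      # simulated think time between actions

    @task(3)
    def browse_catalogue(self):
        self.client.get("/products")   # placeholder endpoint, weighted 3:1

    @task(1)
    def view_basket(self):
        self.client.get("/basket")     # placeholder endpoint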


Bug Reporting

Bug reports should include the following attributes to assist in understanding and speedy resolution (a simple template capturing them is sketched after the list):

Title.

Description: The testing objective, what the tester expected to happen, and what actually happened.


Steps to Reproduce: A step-by-step guide to reproduce the issue.


Environment: Where the bug was encountered, e.g. the environment name, browser, and build version.


Severity and Priority: How serious the defect is, and how urgently it needs to be addressed.

Any relevant attachments: Screenshots, error messages, logs, etc.
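
One way to capture these attributes consistently, for example when raising issues through a tracker's API, is a small structured type such as the sketch below. The field names and enumerations are illustrative and not tied to any particular tracker's schema.

# Illustrative bug report structure; field names and enums are not a real
# tracker's schema.
from dataclasses import dataclass, field
from enum import Enum

class Severity(Enum):
    CRITICAL = "critical"
    MAJOR = "major"
    MINOR = "minor"

class Priority(Enum):
    HIGH = "high"
    MEDIUM = "medium"
    LOW = "low"

@dataclass
class BugReport:
    title: str
    description: str                # objective, expected vs. actual behaviour
    steps_to_reproduce: list[str]   # ordered steps to recreate the issue
    environment: str                # e.g. "staging, Chrome 126, macOS 14"
    severity: Severity
    priority: Priority
    attachments: list[str] = field(default_factory=list)  # screenshots, logs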