Table of Contents
- How we test releases of Telligent's products
- Quality strategy
- Manual testing
- Automated testing
  - Unit tests
  - Integration tests
  - Functional tests
- Using our own product day in, day out
- Delivering well-tested product releases
- Moving forward
The following sections discuss Telligent's strategy for testing and verifying the quality of its products.
We have significantly evolved how we test our products throughout development, packaging, and release. Our latest releases of Telligent Community and Analytics reflect that investment in improved test and release processes.
Our quality assurance strategy is centered on continually evolving our testing process to prevent bugs from being created as the software is developed or from being encountered by our customers. We do extensive manual testing and use a number of automated tools to validate that our products function properly. We are also using our products in real-world scenarios as they're being actively developed, which helps us to identify problems before our customers do.
Our manual test strategy uses extensive and continually updated test plans to cover our products with formal test cases. We take care to test our products on all supported platforms and browsers. We also make use of our Quality Assurance team's experience by unleashing them in ad hoc exploratory testing to carefully examine different areas of our products.
Manual test plans lay out individual test cases that verify correct functionality across our products. We divide our test plans into two parts: P0 and Depth. P0 tests are "line of death" checks which ensure proper behavior of critical features or functionality; we run P0 testing on our components weekly. Depth tests are much more time-consuming and check important but less-critical functionality.
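The P0/Depth split described above can be modeled as a simple priority tag on each test case, with the weekly pass filtering to P0 only. A minimal sketch in Python — the `TestCase` structure, component names, and case names are illustrative, not Telligent's actual tooling:

```python
from dataclasses import dataclass
from enum import Enum

class Priority(Enum):
    P0 = "p0"        # critical "line of death" checks, run weekly
    DEPTH = "depth"  # important but less-critical, run less often

@dataclass
class TestCase:
    name: str
    component: str
    priority: Priority

# A toy test plan; real plans hold thousands of cases.
PLAN = [
    TestCase("user can sign in", "auth", Priority.P0),
    TestCase("forum post renders markdown", "forums", Priority.DEPTH),
    TestCase("new forum thread can be created", "forums", Priority.P0),
]

def select(plan, priority):
    """Return the subset of the plan at the given priority."""
    return [case for case in plan if case.priority is priority]

weekly_pass = select(PLAN, Priority.P0)
print([case.name for case in weekly_pass])
```

Tagging cases this way lets the same plan drive both the fast weekly P0 pass and the slower Depth pass without maintaining two separate documents.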
We've worked hard to evolve our test plans for the latest releases. We have greatly expanded the number of test cases we cover, and we have created new test plans for components which weren't formally covered previously.
For the latest release, Telligent Community 9.0, we repeatedly executed over 3,000 individual manual test cases. These cases covered more areas than any previous release.
Telligent makes our test plans available to anyone who is interested in seeing exactly how we're testing our software. The test plans can also help customers validate their own installations of Telligent products, both for out-of-the-box and customized installations.
Ad hoc testing encourages our team to use their experience and knowledge of the system to explore areas of the system which may not fall under automated or manual test plans. Ad hoc testing takes place during many different phases of our work including manual test passes, bug validation, and general exploratory testing.
Telligent products run on several different platforms, so we cover a number of different combinations of platforms and browsers during our testing. See Supported software versions for servers, applications, and browsers for the versions we have tested and provide support for in this release.
For automated testing, we use a variety of tools to create a suite of tests that are repeatable and can be run with the click of a metaphorical button. These suites ensure features work as advertised, and also provide a safety net against regressions. These automated tests are run against our system on a regular basis - in some cases, several times each day.
We have a range of automated tests across all our products and components. We use unit tests to check small, discrete functionality within components of our software. Integration tests check behavior between system components such as REST Web services. Functional tests ensure all user features operate correctly.
Developers are responsible for writing unit tests to validate the work they do. Our latest release had over 1,450 unit tests.
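A typical unit test checks one small, discrete piece of functionality in isolation. The sketch below uses Python's `unittest` for illustration (Telligent's actual stack is .NET); `make_slug` is a hypothetical helper, not a real product function:

```python
import unittest

def make_slug(title: str) -> str:
    """Turn a post title into a URL-safe slug (hypothetical helper)."""
    cleaned = "".join(ch if ch.isalnum() else " " for ch in title.lower())
    return "-".join(cleaned.split())

class MakeSlugTests(unittest.TestCase):
    def test_spaces_become_hyphens(self):
        self.assertEqual(make_slug("Hello World"), "hello-world")

    def test_punctuation_is_dropped(self):
        self.assertEqual(make_slug("Q&A: testing!"), "q-a-testing")

# Run the tests programmatically rather than via unittest.main().
suite = unittest.defaultTestLoader.loadTestsFromTestCase(MakeSlugTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Because each test exercises only one function with no external dependencies, the full unit suite can run in seconds on every build.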
Telligent Community 9.0 was validated with over 3,000 manual test cases and over 16,000 automated cases.
Our web services team creates integration tests around every piece of software they write. These integration tests exercise the Web services components to ensure proper behavior and to fend off any regression bugs.
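An integration test of this kind calls a live endpoint and asserts on the response rather than testing a function in isolation. The sketch below shows the pattern in Python; a local stdlib `http.server` stub stands in for the real REST Web services layer, and the `/api/info` route and JSON payload are illustrative:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class InfoHandler(BaseHTTPRequestHandler):
    """Local stand-in for a REST endpoint (illustrative only)."""
    def do_GET(self):
        body = json.dumps({"status": "ok", "version": "9.0"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep test output quiet
        pass

def run_integration_check() -> dict:
    """Start the stub service, call it over HTTP, and assert on the reply."""
    server = HTTPServer(("127.0.0.1", 0), InfoHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    try:
        url = f"http://127.0.0.1:{server.server_port}/api/info"
        with urllib.request.urlopen(url) as resp:
            assert resp.status == 200
            payload = json.loads(resp.read())
        assert payload["status"] == "ok"
        return payload
    finally:
        server.shutdown()

print(run_integration_check())
```

In a real suite the test would target a deployed service instead of a local stub, which is exactly what lets it catch regressions in the wiring between components, not just in individual functions.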
The QA team performs functional testing via automated browser sessions. These tests also verify proper functionality and provide another regression safety net, but through the user interface itself rather than at the Web services layer like the REST integration tests.
We use our own product as the backbone for our corporate intranet. We update our intranet site roughly every two weeks with the most current version we are developing. This helps us in a number of ways.
First, we get immediate feedback from real users on new features: Are the features working as expected? Do they fill a real need? Second, we get users who exercise the system in ways our test plan doesn't cover. This helps us find bugs we might otherwise have missed, and it helps us expand our test plan with those new use cases.
Telligent's intranet, like every other corporation’s, provides critical functionality necessary to the success of the company. We couldn’t roll out our regular releases and be successful if our quality was too low – our own colleagues wouldn’t be able to get their daily work done and we’d rapidly lose their trust.
As pointed out above, we are constantly testing our products as they are being developed; however, we focus carefully on testing during our release-for-delivery process. Every product package gets additional attention as it is completed and readied for release to our customers.
In addition to the regular testing already discussed, our major releases undergo several additional test passes in which the entire product development team executes detailed test passes on each Release Candidate package. This final vetting gives the entire system extra exposure before release.
Telligent also releases hot fixes for our products. These versions hold incremental fixes we release to our customers in between regular product versions. Our hot fix releases undergo the same level of testing as our regular releases do; however, we also take care to specifically validate each issue we’re resolving in that release.
We’re committed to continuing the expansion of our testing, both manual and automated. We’re aggressively using customer feedback to evolve our testing plans as we resolve issues identified in our support forums, custom services, and formal support chains. Additionally, each new unit of development drives test cases specific to that work.