Test Collab outage 8 January 2020: Issue rectified

We have just recovered our hosted application from an outage. As some of you might have noticed, there was a lag between creating a test case and that test case showing up on the test case management page. This was caused by unexpected replication lag between the database slave server and the master server. The issue was identified and rectified within 2 hours of the first client report of the incident. We apologize for any inconvenience.
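For the technically curious: this class of problem can be caught early by monitoring how far the slave is behind the master. Below is a minimal sketch of such a check, assuming MySQL-style replication and the pymysql driver; the host, credentials and alert threshold are illustrative, not our production setup.

```python
# A minimal sketch of a replica-lag check, assuming MySQL replication.
# Host, credentials and the threshold are hypothetical examples.
import pymysql

LAG_THRESHOLD_SECONDS = 30  # hypothetical alerting threshold

def replica_lag_seconds(host: str, user: str, password: str) -> int:
    """Return how many seconds the slave is behind the master."""
    conn = pymysql.connect(host=host, user=user, password=password)
    try:
        with conn.cursor(pymysql.cursors.DictCursor) as cur:
            cur.execute("SHOW SLAVE STATUS")
            status = cur.fetchone() or {}
            lag = status.get("Seconds_Behind_Master")
            # NULL means replication is broken, not merely lagging.
            return int(lag) if lag is not None else -1
    finally:
        conn.close()

lag = replica_lag_seconds("replica.db.example", "monitor", "secret")
if lag < 0 or lag > LAG_THRESHOLD_SECONDS:
    print(f"ALERT: replica lag is {lag}s (threshold {LAG_THRESHOLD_SECONDS}s)")
```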

Should you get your testers certified?

Tester certifications have long been a subject of debate. There are some points to consider before settling this:

  • Why do you need this?
  • Will this prove to be a paradigm shift for your organization?
  • Are the testers in your team ready for this? 
  • Is it going to be a costly affair and is it really worth investing time and money?

Are these the concerns weighing on your mind? Then read on as we analyze the need by answering a few simple questions…

Continue reading


Using source code visualizations as a coverage map for testing

I'm not a big fan of tracing or linking dead-text requirements documents back to test cases unless it is absolutely required. This got me thinking: what else can be used as a reference map for testing?

We could all use some sort of map while testing exploratorily. While doing some searches, I stumbled across this post by Fred Beringer, and it struck me that source code visualization can be really useful in exploratory testing. The main problem with exploratory testing is that we can miss critical areas of the application. But what if we had such a map?

Continue reading

When to stop testing or stop documenting?

As product managers, every now and then we have to decide whether to continue testing a feature or move on. This doesn't just apply to testing effort, but also to test case coverage and documentation, i.e. whether to continue writing more test cases for a particular feature or move on to the next.

How do you decide in such cases?

Maybe we can borrow a concept or two from behavioural psychology: maximizing and satisficing.

Maximizing is when you're trying to do as much as possible within the given means.
Strictly in psychological terms, maximization is a style of decision-making characterized by seeking the best option through an exhaustive search of the alternatives.

Satisficing is when you're trying to do just well enough to be satisfied (think MVP for ideas and decisions).
In psychological terms, satisficing is a style of decision-making in which you explore options only until you find one that is good enough or acceptable.

So which one to use when?

Look at how much of your customers' time is spent on this feature. If it's, say, 80%–90%, it certainly calls for maximizer behavior, and you want to put everything you have in your arsenal into it, in terms of testing and documenting.

If the feature is only occasionally used, you can get away with satisficing. You just need to do enough to know it works as desired – nothing more! A toy sketch of this rule of thumb follows below.
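To make the rule concrete, here is a toy sketch in Python; the 80% cut-off mirrors the illustrative figure above and is not a universal constant.

```python
# A toy sketch of the threshold rule above. The 0.8 cut-off is the
# illustrative figure from the text, not a universal constant.
def testing_strategy(usage_share: float) -> str:
    """Pick a testing/documentation strategy from a feature's usage share.

    usage_share: fraction of customer time spent on the feature (0.0-1.0).
    """
    if usage_share >= 0.8:
        return "maximize"   # exhaustive testing and documentation
    return "satisfice"      # just enough coverage to know it works

print(testing_strategy(0.85))  # -> maximize
print(testing_strategy(0.10))  # -> satisfice
```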

Obviously this principle can be applied to all types of optimization problems, at work and in life in general.


Free test management plans launched for Test Collab

Today we have released new pricing plans for Test Collab. These free-forever plans will help development and quality assurance teams improve their test management without ever worrying about payments or subscriptions!

It has always been our mission to make quality assurance affordable, but we realized that there is not much free test management software available, especially cloud-based. There are a few open source test management tools you can download and host on your own server, but that quickly becomes cumbersome and complicated. Our free plans are available in the cloud, so your team can focus on testing rather than managing servers and software installations.

We have also made testing more affordable for bigger teams with the launch of $10/user pricing for large teams.

If you have an active free trial or an expired account, you can find the option to downgrade/upgrade your account on the 'Update Plan' page. There are, however, some limits on free plans, which are listed on our pricing page. Sign up now for the free plan.


GDPR Compliance update

We are in the process of rolling out an organization-wide update (internal and external) to comply with the upcoming GDPR deadline. Test Collab will be fully compliant with GDPR before the 25 May 2018 deadline.

If you have any questions or suggestions regarding GDPR compliance, reach out to us via the 'Contact us' link below.

New reports: one more step towards test intelligence

We've always been fascinated with extracting useful insights from data. Over the last 9 months we've been reviewing a lot of our clients at Test Collab: their problems, possible solutions, and whether we can positively impact their productivity given the data they have.


Each and every test case, the result of its every execution, the time spent, and who assigned or executed each test: all of this gave us a lot of data to work with.


There are potentially a lot of useful insights that can be extracted from so much data. With this release we've only scratched the surface, and there is a lot more to come. For now, we believe you're going to love these new reports.


You'll find quite a few new charts on the project overview page, the milestone view page and the new test case status report page (under the Reports tab). It would be unfair not to mention some of them here, given how useful they can be:

1. Test Case Last Run Statuses

Want to know the overall health of your project? Just look at this chart and you'll have a high-level understanding of how good your project is, testing-wise. This report shows you the last status of every one of your test cases.

Pro-tip: As you add new features, your number of unexecuted test cases will go up. When working on a new project, keep an eye on 'unexecuted' test cases and schedule execution of such cases every few days. Alternatively, drill down suite-wise to see statuses in further detail.
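For the curious, here is a minimal sketch of how such a rollup can be computed from raw execution history: each case is counted under its most recent status, and cases never executed count as 'unexecuted'. The record format is hypothetical.

```python
# A sketch of a "last run status" rollup from raw execution records.
# The record format and data are hypothetical, for illustration only.
from collections import Counter

executions = [  # (test_case_id, executed_at, status)
    (1, "2018-03-01", "failed"),
    (1, "2018-03-05", "passed"),
    (2, "2018-03-02", "failed"),
    # test case 3 was never executed
]
all_case_ids = {1, 2, 3}

latest = {}  # test_case_id -> status of its most recent execution
for case_id, executed_at, status in sorted(executions, key=lambda e: e[1]):
    latest[case_id] = status  # later dates overwrite earlier ones

counts = Counter(latest.values())
counts["unexecuted"] = len(all_case_ids - latest.keys())
print(counts)  # Counter({'passed': 1, 'failed': 1, 'unexecuted': 1})
```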

2. Time spent on test cases

This is one of my personal favourites. Over time, some test cases come to take up a lot of your developers' and testers' time. This chart will help you locate such outliers and analyse why. You can plot multiple time metrics for the time spent on each case: average, total, maximum, minimum and so on.

Pro-tip #1: Use this chart to locate outlier test cases, then run a 5-whys analysis on what took them so long.

Pro-tip #2: Alternatively, you can use this data to decide which test cases should be automated before the rest.
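Under the hood, these metrics are simple aggregates over per-execution timings. A sketch with made-up numbers:

```python
# A sketch of the time metrics behind this chart: per-case average,
# total, maximum and minimum execution time. Data is hypothetical.
from statistics import mean

# test case name -> minutes spent on each execution
times = {"login flow": [4, 5, 30], "export csv": [2, 3, 2]}

for case, spent in times.items():
    print(case, {"avg": mean(spent), "total": sum(spent),
                 "max": max(spent), "min": min(spent)})

# An execution far above a case's typical time (30 min vs the usual
# 4-5 for "login flow") is the kind of outlier worth a 5-whys analysis.
```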

3. Error prone Test Cases

This is a distribution chart of the failure rates of your test cases. It is highly useful when you want to pinpoint the troublesome cases in your project. If you think testing all cases all the time is a good strategy, this chart will be an eye-opener.

Sample use case: When developing new features, schedule testing of cases with high overall failure rates at a relatively early stage. This gives your testers and developers more time to find the cause of such failures, resulting in fewer surprises on release day.

Pro-tip: We've observed that cases with high failure rates are often a sign of either outdated test case documentation or a big underlying problem. Pay special attention to cases above a threshold failure rate.
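The metric behind this chart is straightforward; here is a sketch with illustrative data and a hypothetical 50% cut-off for 'error prone':

```python
# A sketch of the failure-rate metric this chart is built on.
# Data and the 50% threshold are hypothetical, for illustration.
results = {  # test case name -> list of execution outcomes
    "checkout": ["passed", "failed", "failed", "failed"],
    "search":   ["passed", "passed", "failed", "passed"],
}
THRESHOLD = 0.5  # hypothetical cut-off for flagging a case

for case, outcomes in results.items():
    rate = outcomes.count("failed") / len(outcomes)
    flag = "  <-- error prone" if rate > THRESHOLD else ""
    print(f"{case}: {rate:.0%} failure rate{flag}")
# checkout: 75% failure rate  <-- error prone
# search: 25% failure rate
```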

4. Cases passed by suites

This heatmap chart shows all the test suites in your project, color-coded by the percentage of test cases passed: the greener the suite, the better its passing percentage. The area of each block represents how many test cases the suite contains relative to the other suites: the larger the area, the higher the number of cases in the suite.

Pro-tip: Sometimes a single module or set of features negatively impacts your project while the other modules function as expected. This chart will help you locate such problem areas and act on them. Start by paying attention to the light green regions and find out what's lowering the score: is it development, testing or documentation?
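Each block in the heatmap encodes just two numbers: a colour and an area. A sketch of how they can be derived, with hypothetical suites:

```python
# A sketch of the two numbers each heatmap block encodes: the suite's
# pass percentage (colour) and its share of all cases (area).
# Suite names and counts are hypothetical.
suites = {  # suite -> (cases_passed, total_cases)
    "Auth":    (18, 20),
    "Billing": (30, 60),
    "Reports": (9, 20),
}
total = sum(n for _, n in suites.values())

for name, (passed, n) in suites.items():
    print(f"{name}: colour={passed / n:.0%} passed, area={n / total:.0%} of cases")
# Auth: colour=90% passed, area=20% of cases
# Billing: colour=50% passed, area=60% of cases
# Reports: colour=45% passed, area=20% of cases
```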

5. Milestone burndown

This is useful when you're doing sprints or deadline-driven releases. Quickly see the tasks left as a burndown over the timeline, with your team's ideal vs actual effort. You'll also get instant feedback, as a team, on how your efforts are contributing towards the end goal, and how fast.
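The burndown arithmetic itself is simple: the ideal line falls linearly from the total task count to zero, and each day's actual remaining count is compared against it. A sketch with made-up numbers:

```python
# A sketch of burndown arithmetic: the ideal line drops linearly from
# the total task count to zero and is compared against the actual
# remaining count each day. Numbers are hypothetical.
total_tasks, sprint_days = 40, 10
actual_remaining = [40, 38, 35, 35, 30, 24, 20, 15, 9, 0]  # one per day

for day, actual in enumerate(actual_remaining, start=1):
    ideal = total_tasks * (1 - day / sprint_days)
    trend = "behind" if actual > ideal else "on/ahead of schedule"
    print(f"day {day}: ideal={ideal:4.1f} actual={actual} ({trend})")
```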

Several other new reports aren't mentioned here but are also released with this version: test cases assigned vs unassigned, defects reported over time, test run results over time, and some new metrics.

As mentioned above, this is just the beginning of a large milestone. If you have ideas for new reports, charts, widgets, metrics and so on, please get in touch and let us know.

Test Collab outage 5 December 2017: Issue rectified

As some of you may have noticed, we had an outage for quite a few hours. The issue was caused by a manual error on a production server at 22:00 on December 5. We were able to bring systems back up at 03:11:20 IST on December 6. We'll be working on new measures to make sure such errors do not happen again.

Test Collab’s new version: JIRA Cloud bi-directional integration, automatic screenshot upload and 4 new integrations

We are proud to announce the launch of Test Collab 1.16. This release aims to improve testers' productivity and adds new integrations with your favorite tools.


JIRA Cloud bi-directional Integration

Test Collab can now be used inside your JIRA Cloud instance. You can create and manage test cases and test executions directly from JIRA Cloud. The JIRA Cloud plugin can be obtained from the JIRA marketplace.
If you are a JIRA self-hosted user, check this listing.


Automatic Screenshot Upload

This feature is going to save testers a lot of time. When a bug is encountered during testing, a tester usually has to attach a screenshot to the failed test case along with the necessary information about the error. We've automated part of that workflow, and attaching and annotating a screenshot is now done in a single key-press. Forget about saving the file to a directory and then attaching it to your case: just press Print Screen and it's automatically uploaded.
Check out the video below to see it in action:


New Integrations

We've also released support for new integrations with these tools: Trello, Asana, YouTrack and Team Foundation Server. You can see the list of all our integrations.

Would you like to suggest a new idea for our next release? Please get in touch.
