Beta Testing Process

Continuing the discussion from QA Practices (General Discussion):

I made a first draft of a process for conducting beta testing prior to the release of new ERPNext versions. Please, everybody interested in this (e.g. @imllc, @IAGAdmin, @cpurbaugh, @rmehta), take a look.

I am looking forward to your input and a lively discussion (on GitHub, using issues and PRs, I think)


So very awesome thank you!!!


it’s still a draft though, so I am looking forward to contributions (issues, PRs)

Hi guys,

First of all, I am really glad to see a discussion of this sort on the forum. A solid release cycle (with beta testing as part of it) will be one of the key advantages in sustaining ERPNext’s success in the long run.

I would like to add my two cents on the outstanding questions already mentioned in erpnext-beta-testing-process/ at develop · vrms/erpnext-beta-testing-process · GitHub

• should testers focus on one part of the system or just play around with everything (so there would be testers who only look into Accounting, while others only look into Selling, etc.)?
GV: Ideally, each product release should be beta tested with respect to at least the following coverage

  1. exhaustive testing of components/modules affected by the specific functional changes (this will be different from release to release)
  2. selective sanity checks or limited-scope regression testing of every major module/component
    GV: In activities under #1, it would be preferred if all beta testers are involved (btw, they should be trained/walked through the details of new functional changes via joint webinar or similar video training event prior to starting every new beta testing cycle)
    GV: With respect to #2, we may think about specialization. However, we may also want to consider switching people from module to module from time to time, to keep their experience and motivation fresh

• should the beta test be monitored by anyone or just run as it goes?
GV: it should definitely be monitored and orchestrated. An important part of that is to regularly assess the lessons learned and translate them into both the future product road map and process improvements.

• would it make sense to create a standard form for the default tests where just success/fail will be recorded?
GV: every beta test cycle will have to be managed via a centralized checklist facility. The checklist will be different for every new release. Yes, it will make sense to facilitate it via online checklist tracker tools (either something you come up with in house or a ready-to-use system like AppTest, Zephyr or similar)
GV: for the needs of better regression testing, we may want to think about documenting test cases. This will help individual beta testers execute them consistently. The same kind of tools (AppTest, Zephyr, etc.) could be used to manage such test case documentation.
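To make the success/fail checklist idea above concrete, here is a minimal sketch of what a per-release checklist record could look like. All names (release label, modules, test titles) are illustrative assumptions, not part of any existing tool:

```python
# Minimal sketch of a per-release beta test checklist with pass/fail
# recording. Module names and test titles below are illustrative only.
from dataclasses import dataclass, field

@dataclass
class CheckItem:
    module: str               # e.g. "Accounts", "Selling"
    title: str                # short description of the manual check
    result: str = "pending"   # "pending", "pass" or "fail"
    notes: str = ""

@dataclass
class BetaChecklist:
    release: str
    items: list = field(default_factory=list)

    def record(self, module, title, result, notes=""):
        """Append one checked item with its outcome."""
        self.items.append(CheckItem(module, title, result, notes))

    def summary(self):
        """Count outcomes across all recorded items."""
        counts = {"pass": 0, "fail": 0, "pending": 0}
        for item in self.items:
            counts[item.result] += 1
        return counts

checklist = BetaChecklist(release="v7.1-beta")
checklist.record("Accounts", "Header/footer render in financial reports",
                 "fail", "disappeared after migration")
checklist.record("Selling", "Create and submit a Sales Order", "pass")
print(checklist.summary())  # {'pass': 1, 'fail': 1, 'pending': 0}
```

A ready-made tracker would of course add assignees, screenshots, and history, but even a flat structure like this would let the release manager see at a glance which modules still need attention.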

• is it possible that cloud users who participate in the beta get a copy of their instances for testing purposes? (If this is desired, we need to add a procedure for applying to the beta program.)
GV: I would certainly consider this approach with cloud-based beta testers (if there are the resources to implement it).

• For which level of new version should this be applied? (I mean only major updates like 6 → 7, or also something minor like 6.1.2 → 6.1.3?)
GV: I would say it is relevant for major release updates (like 6 → 7 or 7.0 → 7.1)

Useful references
You may find it useful to review the resources below

  1. Very detailed methodology charter on a beta testing effort for a large software product:
  2. More tips on how to engage your beta testers and keep them motivated and involved in regular beta testing efforts:

PR with my feedback is done

Thanks @vrms for getting this started.

I totally agree on the need for better testing, but my solution would be to invest in automated testing.

We already have decent test coverage for the server side, but currently the problems are on the client side; JS fixes do not usually have test cases. We have not been able to set up Selenium. (Maybe I personally need to spend some time on it, but help from the community would be awesome.)

Since the product is also large, achieving coverage will be a big challenge. For new fixes, I have been more vocal about adding test cases, so over time we can close the gap.

But the most important thing, in my view, is to set up Selenium-based automated UI testing.
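For anyone wanting to help with the Selenium effort, a UI smoke test could be structured roughly as follows. This is only a sketch: the URL and the element ID are assumptions for illustration, and the real selectors would have to be taken from the actual login template.

```python
# Hedged sketch of a Selenium-based UI smoke test for a local ERPNext
# instance. BASE_URL and the "login_email" element ID are assumptions;
# adjust them to the real deployment and templates.
import unittest

try:
    from selenium import webdriver
    from selenium.webdriver.common.by import By
    HAVE_SELENIUM = True
except ImportError:
    HAVE_SELENIUM = False

BASE_URL = "http://localhost:8000"  # assumed local bench instance

@unittest.skipUnless(HAVE_SELENIUM, "selenium not installed")
class LoginSmokeTest(unittest.TestCase):
    def setUp(self):
        # requires a matching browser driver (geckodriver/chromedriver)
        self.driver = webdriver.Firefox()

    def test_login_page_loads(self):
        self.driver.get(BASE_URL + "/login")
        # hypothetical field id; take the real one from the login page
        field = self.driver.find_element(By.ID, "login_email")
        self.assertTrue(field.is_displayed())

    def tearDown(self):
        self.driver.quit()
```

Run with `python -m unittest` against a test instance. Even a handful of click-through tests like this for each module's core flow would catch the "it broke after migration" class of regressions before a release.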


Automated tests are OK but not enough. I’d like to see with my own eyes any effects on data and usability after a migration, as we use ERPNext every day and don’t want to deal with upset colleagues!

For instance, the recent Time Log → Time Sheet conversion. Your initial patch didn’t preserve the historical data with regard to creation date, last modified date, etc. We use this data for legal lodgements, and it must be available for audit. We need the history! Otherwise auditors may think we made up the data!

Another example: School. We don’t need this domain at the moment and don’t want to see it as it’s out of scope, yet it was everywhere after the initial migration test.

Another example: Newsletter. We use it every week and our receptionist is in this role. After the migration test we realised it had disappeared from CRM and moved to Setup. I cannot give her access as System Manager!

Another example: the header and footer in financial reports. After the migration they disappeared. We cannot keep exporting to Excel and adding them manually.

and so on.

We are willing to help you in this manual user testing.


Generally I agree that manual and automated tests should run in parallel. If we see after a while that one of the two becomes obsolete, we can always cut back.

Also, I think we should get started with something simple (and probably not yet perfect) and improve it over time rather than looking for the perfect solution right away.

There have not been any comments on the timeline (about a month in total, divided into two phases in my draft). So I suggest we just define it as drafted and then see whether it needs to be adjusted after going through a first cycle.

How long does that take (I mean the tests and the fixes)?

I could imagine this might be the first thing that’s done prior to a beta release (day 1 in my draft)?

Right now there is a decent gap between develop and master. And before any major release, it’s always good to try it out on a test instance before you upgrade production.

From our end, we will try to build detailed release notes!

We use master to run tests and migrate as we need a stable version.

It would be good if you could create an intermediate branch called beta-release, as per @vrms’s model, to involve the community in running tests and giving feedback to help you before releasing to master.


I assume the procedure is that we decide on a model together with @rmehta (representing Frappe) and THEN all stakeholders in this process will act upon it (including any branching changes that might become necessary).

@rmehta is that also your understanding?


@rmehta, @IAGAdmin, @vrms, Everyone: I still believe the right combination of sufficient client- and server-side automated test coverage and focus-group manual beta testing will work better than relying on just one of them. Since the proclamation of the Agile manifesto, TDD and embedding automated testing into software development practices have become paramount. However, only a few of us notice the slight yet important difference: test automation delivers AUTOMATION TOOLS for testing; it does not AUTOMATE TESTING per se. This may sound a little offensive to Agile believers :slight_smile: but there is a certain rationale in some of the arguments there.

The areas where manual beta testing of ERPNext releases will complement front- and back-end automation are as follows:

  • meeting usability requirements - no Selenium-based click-through scripting would detect problems with users not being comfortable using our software the way we designed it
  • meeting expectations of business stakeholders - in other words, answering questions like, “Do we have ‘bugs’ in our requirements for the new features?” (the features can work correctly and without technical issues but still miss the point/value from the business stakeholder’s point of view)
  • testing the quality of the product documentation
  • testing on real-world data sets with replicas of real customers’ databases (this is especially critical for migration from one major release to another - you probably remember a lot of pain with the migration to v7 recently; this happened despite the fact that the server-side automated test coverage of ERPNext is simply brilliant)

Apart from that, there is a reason for partner service provider organizations and third-party software developers to vote for a beta testing process, too. Their custom apps will have to be tested against the newest pre-release branches (for instance, we struggled with one of our apps during the v7 migration simply because we did not expect the changes in the email queue infrastructure and the decommissioning of the bulk email facility - this would definitely have been caught if rounds of beta testing had been run for v7).

Does it make sense? :slight_smile:


Yes, totally

At the speed at which we are moving, it would be great if service providers can spend 15-20% of their time contributing features. This would greatly reduce collisions and also speed up development. This also includes test cases :slight_smile:

@rmehta at the conference I suggested you guys need to develop less and guide the community to get more involved.

With your current approach I feel you are creating a barrier instead and your speed is not sustainable for a long term plan.

For the Beta Testing Process, which is part of community involvement, I would like to see some facts now, as I’m no longer interested in the discussion if it doesn’t lead to anything.

As IAG, we’ll keep contributing as usual, both during our migrations to your latest stable master branch (I have budgeted for one or two migrations per year) and through any of our developments that are interesting for the community. The next one will be the E-mail Inbox app, as soon as we are able to finish the migration to 7.1. Then we’ll be back to the barcode scanner, and so on.

Hope my feedback helps. We cannot do more than this.


@IAGAdmin Thanks for the feedback and leadership!

If you are contributing your features, you should be good. I took your advice, and we are now clearly very responsive in giving feedback on PRs and merging issues.

I am really happy people think ERPNext is good as is, but I still see so many things that could be better. As long as your features are contributed to the core, it will be our responsibility to migrate them. On the ERPNext cloud we manage close to 5000 ERPNext instances, and most of the changes are bugs reported by users.

I think in the medium run, the best strategy would be to keep contributing and staying on latest. A lot of good stuff is yet to come!


@rmehta We’ll be able to stay close to the latest if you guys introduce the community beta testing process.

I’m not interested in the new stuff if the old stuff has been compromised or has gotten worse, as we need to fix it first before migrating. Secondly, we need to train and help users get back to normal after a migration. Only then can we explore the new stuff.

This is one reason why I don’t feel comfortable to host our data in the ERPNext cloud. It’s too risky for us.


How about discussing this on the Developer Call Tomorrow?

@nabinhait is our release manager - he has already committed to starting a release-candidate branch!

Nabin - please lead the discussion tomorrow!

I am not sure whether I am misreading anything, but somehow I don’t get the feeling you are generally on board with this. I can’t really think of any reason why, though.