Two years ago we decided to use Sonar at the start of a Java Enterprise project in one of my teams. Below you may read the top ten lessons learned from introducing and working with Sonar in a medium-size Java project over a period of two years.
Select the Most Important Rules and Start Small
Start with just the most important rules. At the moment Sonar ships with about 800 rules from FindBugs, PMD and Checkstyle. This is too much for the beginning, because many of the PMD and Checkstyle rules focus on bad practices and/or coding style. A developer accepts a warning about a possible bug more easily than an esoteric discussion about coding style or design.
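To illustrate the difference, here is the kind of "possible bug" finding that developers accept readily. FindBugs reports reference comparison of strings under the bug pattern ES_COMPARING_STRINGS_WITH_EQ; the class and method names below are invented for this sketch, not taken from our project:

```java
public class RuleExample {

    // Bug pattern: == compares object references, not string content.
    static boolean isAdmin(String role) {
        return role == "admin";
    }

    // Fix: content comparison, which is also null-safe in this direction.
    static boolean isAdminFixed(String role) {
        return "admin".equals(role);
    }

    public static void main(String[] args) {
        String role = new String("admin"); // a distinct object, as from user input
        System.out.println(isAdmin(role));      // false - the latent bug
        System.out.println(isAdminFixed(role)); // true
    }
}
```

A finding like this is hard to argue with, whereas a brace-placement warning invites debate. That is why starting with the bug-oriented rules builds acceptance faster.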
Continuous Management Attention
After some months the use of Sonar was accepted by the team, but in the first weeks some of them just ignored the results. As a manager or technical lead it is important to look into the Sonar dashboard regularly, find some real issues and ask the developers for their opinion. Best of all is to find real bugs in the code. Do this once a week, so that everybody in the team recognizes that this topic has your attention.
Figure 1: Sonar Dashboard for a Java project
Zero Warning Policy
If the project has hundreds or thousands of warnings, nobody will notice the new ones. This is the reason why we run a zero-warning policy. It is not always achievable, but it should at least be a target for the team. In cases where we cannot fix all warnings of a rule quickly, I prefer to disable that rule.
Understand and Discuss the Rules
A warning from static code analysis is always an opportunity to learn more about the language and possible solutions. Sometimes it is hard to understand the root cause and/or the possible impact. For difficult rules, isolate the code that produces the warning and put a fixed sample next to it. This may help other developers improve their skills. Never fix something if you don't understand why it is wrong; you may introduce a real bug.
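For example, a difficult FindBugs warning such as RC_REF_COMPARISON (suspicious reference comparison of boxed values) can be isolated into a minimal sample together with its fix. The helper names below are illustrative, not from our code base:

```java
public class BoxingSample {

    // Warning: == on boxed Integers compares references, not values.
    static boolean buggyEquals(Integer a, Integer b) {
        return a == b;
    }

    // Fixed sample: compare by value.
    static boolean fixedEquals(Integer a, Integer b) {
        return a.equals(b);
    }

    public static void main(String[] args) {
        Integer a = 1000, b = 1000; // outside the Integer cache (-128..127)
        System.out.println(buggyEquals(a, b)); // false - the warning is justified
        System.out.println(fixedEquals(a, b)); // true
    }
}
```

The subtlety (the code even "works" for small cached values) is exactly why such a sample is worth sharing in the team.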
Sonar Works for Developers - Not Vice Versa!
Deactivate warnings that have too high a false-positive rate or no benefit for the team. One guideline is: "The tool should work for the developers and not the developers for the tool." This will also improve acceptance within the team.
Don't Populate Sonar from Developer Workplaces
In the beginning we populated the Sonar database directly from the developer workstations. Even for small teams this may lead to inconsistent warnings in the database, and the manual effort is too high. An alternative is a direct integration into the central build environment. We did this and learned that running the test automation directly on the build machine has too many limitations. It is also annoying to increase the build time with extensive static analysis.
Use a Central Integration Test Server
Our solution was a separate integration test server where we could also install the third-party components and databases needed for integration and end-to-end tests. This server is completely automated and usually runs once per day or on demand.
Automate a Daily Analysis
A complete automation of build, deploy and test should be the target. The most important task of static analysis is to give the developer fast feedback; the more often, the better.
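Such a nightly run can be as simple as a scheduled job on the integration test server. A hypothetical crontab fragment, assuming a Maven build with the Sonar Maven plug-in (all paths and the project location are placeholders):

```shell
# Nightly at 02:00: full build with tests, then publish the analysis to
# the central Sonar server via the sonar:sonar goal of the Maven plug-in.
0 2 * * * cd /opt/ci/myproject && mvn clean verify sonar:sonar >> /var/log/sonar-nightly.log 2>&1
```

The on-demand case is then just running the same two Maven goals by hand.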
Use the Eclipse Sonar Plug-in
Ideally the developer should not have to wait a day and/or for a central build to get this feedback. With the Sonar Eclipse plug-in the developer runs the analysis during development with the same rule set the server uses. This leads to even better acceptance and no extra work on the next day.
When we started with Sonar this plug-in was not yet available, and we had to maintain the rules for FindBugs, PMD and Checkstyle manually on each client. This plug-in is the best improvement of the last two years. Many thanks for that!
Avoid Single Optimization Targets
Don't give your team targets like "90% code coverage for the unit tests" or "code duplication below 5%". This will not lead to better software. The team will just write a lot of pointless test code to meet that particular target. It is better for the team to use the data to foster a discussion about lessons learned.
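To see why a bare coverage number misleads, consider this sketch (method and class names invented): the "test" below executes every line and both branches of the method under test, so a coverage tool counts it as fully covered, yet it verifies nothing and would still pass if the logic were wrong.

```java
public class CoverageExample {

    // Method under test: apply a discount above a price threshold.
    static int discount(int price) {
        return price > 100 ? price - 10 : price;
    }

    // A "test" written only to satisfy the metric: full branch coverage,
    // zero assertions, zero confidence.
    static void uselessTest() {
        discount(50);
        discount(150);
    }

    public static void main(String[] args) {
        uselessTest();
        System.out.println("coverage: 100%, verified behavior: none");
    }
}
```

A real test would assert that discount(150) returns 140 and discount(50) returns 50; the coverage report alone cannot tell the two apart.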
|| Jan 11, 2013
|| Markus Sprunck
|| first version