Introduction



The Center of Excellence Excellerat worked on the sustainability of large software packages dedicated to High Performance Computing, within the frame of WP7.2 "Standardization". The many relevant standards found are gathered here in an interactive fashion. While you rate your code maturity yourself, discover the many ways to improve it. Answer these questions to get a report on the strengths and weaknesses of your project.


Scale of the project

What is the number of statements in your code?


How many active developers (>50% of productive time) are actually working on the code?

On average, people can carefully read and understand 50 statements per hour. Therefore, one member of the team can read 90.0% of the code in one week.
The team can read 90.0% of the code in one week. As there will be more potential revisions on a larger code, managing the codebase will get more complex, with a higher risk of collisions. To counter this, the team can split it into a cascade of separate codebases administrated by separate, independent sub-teams. Note that with 2000 to 4000 statements, you can already do a lot of business.
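The arithmetic behind this estimate can be sketched as follows (a minimal illustration assuming a 40-hour work week; the function name and example sizes are hypothetical):

```python
# Reading-rate estimate from the text: 50 statements read carefully per hour.
STATEMENTS_PER_HOUR = 50
HOURS_PER_WEEK = 40  # assumed full-time work week

def weekly_coverage(num_statements, num_developers=1):
    """Fraction of the codebase the team can read in one week (capped at 100%)."""
    readable = STATEMENTS_PER_HOUR * HOURS_PER_WEEK * num_developers
    return min(1.0, readable / num_statements)

# One developer reads 50 * 40 = 2000 statements in a week:
print(f"{weekly_coverage(4000):.0%}")  # 50% of a 4000-statement code
print(f"{weekly_coverage(2000):.0%}")  # 100% of a 2000-statement code
```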


Team habits



Among the following actions:

* regular management sessions
* regular hackathons on creating functionalities
* regular hackathons on fixing bugs
* regular hackathons on tests and coverage
* regular code reviews

How many actions does the team enforce on a regular basis?

None All, and more


Unused code



Unused code, or "dead code", is still part of your codebase. Even if dead code seems harmless and can be removed by the compiler (see dead code elimination), developers will spend time on maintenance and refactoring across all the statements. Unused code comes with a recurrent cost, slowing down the agility of the team. Many actions can be taken against unused code:

* Regular core developers sessions are held to mitigate the problem.
* A clear policy on unused, deprecated, or premature code is kept up to date.
* The team is trained on a regular basis against unused code.
* There are asynchronous training supports against unused code.

How many of these actions is your team taking against unused code?

None All, and more
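As a minimal illustration of hunting dead code (a sketch using Python's standard `ast` module; real tools such as `vulture` are far more thorough), one can flag module-level functions that are never referenced in the same source:

```python
import ast

def unused_functions(source):
    """Return names of functions defined but never referenced in `source`."""
    tree = ast.parse(source)
    defined = {node.name for node in ast.walk(tree) if isinstance(node, ast.FunctionDef)}
    used = {node.id for node in ast.walk(tree) if isinstance(node, ast.Name)}
    return defined - used

sample = """
def kept(): return helper()
def helper(): return 1
def dead(): return 2
kept()
"""
print(unused_functions(sample))  # {'dead'}
```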


Unused code can have various meanings:

* Deprecated code kept just in case
* Deprecated code kept for backward compatibility
* Premature code
* Loosely related code, versioned in the same repository


None All, and more


Testing code



Here we are talking about automatic testing: a service that monitors the consistency of your code. Without going into details, tests often fall into two families with different purposes:

* Is it broken? A functional test monitors a complete functionality, an action similar to human testing.
* Where did it break? Unit tests focus on small, independent sections of your code, forcing you to write the code in small, independent sections (a "testable" code).
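As a minimal sketch of the second family (in Python rather than Fortran; the function and its test are purely illustrative), a unit test pins down one small, independent section of code:

```python
import unittest

def celsius_to_kelvin(celsius):
    """A small, independent section of code: easy to test in isolation."""
    return celsius + 273.15

class TestConversion(unittest.TestCase):
    def test_freezing_point(self):
        # If this breaks, we know exactly where: in celsius_to_kelvin.
        self.assertAlmostEqual(celsius_to_kelvin(0.0), 273.15)

# Run the suite programmatically; a CI pipeline would trigger this on each commit.
suite = unittest.TestLoader().loadTestsFromTestCase(TestConversion)
result = unittest.TextTestRunner().run(suite)
```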

You can dig more in this introduction to unit tests (in Fortran). Testing can be done in many ways:

* Regular core developers sessions are held to increase testing.
* A clear policy is shared in the team about testing.
* The team is trained on a regular basis about testing.
* There are asynchronous training supports for testing.
* The Continuous Integration Pipeline evaluates the coverage.

None All, and more


Testing can focus on different levels of the software. There is a huge literature on it (speed-read the Wikipedia article on software testing if you want), so let's estimate the topic simply with three categories, from the highest to the lowest level:

* Live testing: users try the solver on non-regression cases - users level
* Functional testing: high-level code is tested on non-regression cases - developers level
* Unit testing: low-level code is tested at the level of methods/subroutines/functions - core developer level


None All, and more


Software structural quality



Software structural quality focuses on non-functional requirements, such as robustness and maintainability. What actions are engaged to increase the quality of the code?

* Regular core developers sessions are held to increase quality.
* A clear policy is shared in the team about quality.
* The team is trained on a regular basis about quality.
* There are asynchronous training supports for quality.
* The Continuous Integration Pipeline evaluates the quality.


None All, and more


Quality encompasses many aspects. For a scientific code, the most prominent points of interest are:

* Code complexity metrics
* Coding standards (naming, antipatterns, number of variables,...)
* Granularity (size of files, of block, or signatures)


Rate how many of these aspects you focus on.

None All, and more
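As an illustration of the first aspect, here is a rough sketch of a complexity metric (an assumed simplification of cyclomatic complexity: decision points plus one), again using Python's `ast` module:

```python
import ast

# Node types counted as decision points in this simplified metric.
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.Try, ast.BoolOp)

def rough_complexity(source):
    """Roughly approximate cyclomatic complexity: decision points + 1."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))

snippet = """
def f(x):
    if x > 0:
        for i in range(x):
            print(i)
    return x
"""
print(rough_complexity(snippet))  # 3: one `if`, one `for`, plus one
```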


Documentation



What do you do about documentation?

* Regular core developers sessions are held to improve documentation.
* A clear policy is shared in the team about documentation.
* The team is trained on a regular basis about documentation.
* There are asynchronous training supports for documentation.
* The Continuous Integration Pipeline generates the documentation.


None All, and more


Documentation can take several forms:

* How to (Introductory examples)
* Tutorials (Full Application examples)
* Inner working (How it is done)
* API definition (What is available)

How many of these kinds of documentation are you curating?

None All, and more
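Several of these kinds can live next to the code itself. As a sketch (the function is purely illustrative), a docstring can carry both the API definition and an introductory "how to", which `doctest` can then verify:

```python
import doctest

def lerp(x, x0, x1, y0, y1):
    """Linearly interpolate y at x between (x0, y0) and (x1, y1).

    API definition: the signature above states what is available.
    How to (introductory example):
    >>> lerp(1.5, 1.0, 2.0, 10.0, 20.0)
    15.0
    """
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# A CI pipeline could run the embedded examples on every commit:
failures = doctest.testmod().failed
```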


Versioning



There is more to versioning than just naming the versions. You can read more about how it can and should influence the developers in their day-to-day work, eventually saving time through strict versioning. The question is therefore how much versioning impacts the team habits:

* The version release names all follow a naming protocol.
* The software is archived in a version control system (Git, Mercurial).
* A versioning strategy is shared in the team (e.g. SemVer).
* At a developer level, the versioning strategy successfully postpones committed revisions that break backward compatibility.
* At a developer level, the versioning strategy successfully separates committed bugfixes from new features.

According to these definitions, how would you rate the impact of versioning on your project:

None All, and more


More specifically, does your versioning include:

* Strict bugfixes, no features or regressions (patch in SemVer)
* Strict features, no regressions (minor in SemVer)
* Strict evolutions (major in SemVer)

None All, and more
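These three strict categories map directly onto SemVer version bumps. A minimal sketch (illustrative names; no handling of pre-release tags or build metadata):

```python
def bump(version, change):
    """Bump a MAJOR.MINOR.PATCH version string according to SemVer."""
    major, minor, patch = map(int, version.split("."))
    if change == "major":   # breaking evolution
        return f"{major + 1}.0.0"
    if change == "minor":   # new feature, no regression
        return f"{major}.{minor + 1}.0"
    if change == "bugfix":  # strict bugfix, nothing else
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown change type: {change}")

print(bump("1.4.2", "bugfix"))  # 1.4.3
print(bump("1.4.2", "minor"))   # 1.5.0
print(bump("1.4.2", "major"))   # 2.0.0
```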


STUPID and SOLID code



SOLID and STUPID are object-oriented (OO) programming principles. While these principles are not always applicable to scientific software, they are a good starting point to fight code smells.

STUPID is the acronym for the six following anti-patterns:

* Singleton
* Tight coupling
* Untestability
* Premature optimization
* Indescriptive naming
* Duplicate code


How many of these does your team actually watch for?

None All, and more


SOLID is the acronym for the five following patterns:

* Single Responsibility Principle
* Open/Closed Principle
* Liskov Substitution Principle
* Interface Segregation Principle
* Dependency Inversion Principle

How many of these does your team actually watch for?

None All, and more
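As a minimal sketch of the first of these, the Single Responsibility Principle (classes and values are purely illustrative): splitting "compute" from "report" means each class changes for one reason only.

```python
class Simulation:
    """Responsible only for the numerics."""
    def run(self, steps):
        return [0.1 * i for i in range(steps)]

class Reporter:
    """Responsible only for presenting results."""
    def summary(self, values):
        return f"{len(values)} steps, max={max(values):.1f}"

# Each class can now evolve (and be unit-tested) independently.
values = Simulation().run(5)
print(Reporter().summary(values))  # 5 steps, max=0.4
```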


Performance



What do you do about performance?

* Regular core developers sessions are held to improve performance.
* A clear policy is shared in the team about performance.
* The team is trained on a regular basis about performance.
* There are asynchronous training supports for performance.
* The Continuous Integration Pipeline monitors the performance.

How many of these actions does your team take to preserve and increase performance?

None All, and more


Results




Aspects to improve.


Takeaway



This form redirects to many ways to improve your software. While the work to be done seems gigantic, most of it is related to your team's skills, not the code itself. Gaining maturity can become a continuous training of the people, a polar opposite to the expensive, code-focused "taskforce refactoring" actions usually associated with software improvement.