Service Portfolio Review process
A Service Portfolio needs to be maintained so that it continuously meets the users' requirements for services and the criteria defined by HIFIS. Therefore, the primary goal of the Service Portfolio Review process is to check whether the services in, and the processes around, the Service Portfolio continue to fulfill the defined requirements. More specifically, there are three Review types that might be considered in each review:
- 1) Services in Portfolio: check whether the services in Portfolio continue to meet the service selection criteria and ensure that service information is up-to-date
- 2) Service selection criteria: check whether the service selection criteria continue to meet HIFIS’ requirements for Helmholtz Cloud Services
- 3) Portfolio processes: check whether the Portfolio processes described in this Process Framework are efficient, up-to-date and supportive for handling the lifecycle of Helmholtz Cloud Services.
For each Review type, checklists have been developed to document the result of each checkpoint in a structured and transparent way. A review might cover a single Review type or a combination of Review types.
At least once a year, a regular review including all Review types should be conducted. It should take place in autumn of each year so that the results can be integrated into the next yearly report. Whenever required, an ad hoc review can be triggered; this might cover a specific Review type or a combination of them.
The roles involved in the Service Portfolio Review process are the service provider (Helmholtz Centre) and the Service Portfolio Manager (HIFIS). The main tool supporting the Review process will be Plony, where the review checklists will be integrated.
The triggers for the Service Portfolio Review process are:
- the regular review interval is reached
- an ad hoc review is required
If the review is triggered by a major change to service information, only the corresponding service is reviewed, in the form of an ad hoc review.
Whenever a review is triggered, the first action to be taken by the Service Portfolio Manager is to create a new Review object. The Review object needs to be filled with some general information on the planned review. It will be extended during the review and will form the documentation at the end of the review process.
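To make the role of the Review object more concrete, here is a minimal sketch of what such an object could look like. The field and type names are purely illustrative assumptions; Plony's actual data model is not specified in this document.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum


class ReviewType(Enum):
    # The three Review types described above
    SERVICES_IN_PORTFOLIO = 1
    SELECTION_CRITERIA = 2
    PORTFOLIO_PROCESSES = 3


@dataclass
class ReviewObject:
    """Hypothetical Review object; created when a review is triggered,
    extended during the review, closed once final results are documented."""
    created: date                            # general information on the planned review
    review_types: list                       # which Review types are in scope
    trigger: str                             # "regular" or "ad hoc"
    scope: str = ""                          # defined during preparation
    preliminary_results: str = ""            # filled once the review is conducted
    recommendations: list = field(default_factory=list)
    final_results: str = ""                  # preliminary results + provider responses
    closed: bool = False
```

The object starts nearly empty and accumulates content along the process steps described below, which is why most fields carry empty defaults.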
In order to define the review's scope, the Review register should be checked for ideas. Anytime someone discovers potential for improvement, they can note ideas in the Review register. Once a review is conducted, the register is checked and the ideas noted there are evaluated (and potentially taken into the review). The review's scope then consists of the ideas from the Review register plus whatever else needs to be reviewed, which is determined either by the regular review interval being reached or by the reason the ad hoc review was triggered. For each Review type, checklists are available and should be tailored to the defined review scope. This completes the preparation of the review.
At least four weeks before a regular review is conducted, the service providers should be informed about the planned review. They will receive the Review checklist to prepare for the review and, where applicable, to verify the fulfillment of the Exclusion criteria for their service(s) and to check that their service information and service description are up to date before the actual review is conducted by HIFIS.
The process steps performed for each Review type are the following:
- 1) Review of services in Portfolio:
- Go through the services in Portfolio and check for each whether the Exclusion criteria are still fulfilled. Services which fail to fulfill Exclusion criteria (and why they fail) are documented in the corresponding Review object.
- Use the review functionality in Plony to let each service provider check whether their service information is up to date and update outdated information where necessary.
- 2) Review of service selection criteria:
- Add, adapt, or delete service selection criteria. Changes made are documented in the corresponding Review object.
- If Exclusion criteria were added or adapted, it must be verified that all services in Portfolio still fulfill the adapted criteria. Therefore, each service in Portfolio is checked against the adapted Exclusion criteria. Services which fail to fulfill them (and why they fail) are documented in the corresponding Review object.
- Since Weighting criteria results determine the rank of a service in the Service Integration List, it is important to re-evaluate the Weighting criteria whenever they were adapted or new ones were added. Therefore, the Weighting criteria for services in the pipeline are re-evaluated and their rank in the Service Integration List is adapted if necessary. The re-calculated score and the corresponding rank in the Service Integration List are documented for each service in the corresponding Review object, along with a short summary of changes.
- 3) Review of Portfolio processes:
- Evaluate how currently established processes work and if there is a need for adaptation. Processes are adapted if required. Adaptations made are documented in the corresponding Review object.
- Since many processes are built into supporting tools, it must also be evaluated whether the workflows built into the tools need to be adapted technically.
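The score re-calculation and re-ranking in step 2 can be sketched as follows. Note that the criteria names, weights and ratings are invented for illustration; the actual HIFIS Weighting criteria catalogue is not reproduced here.

```python
# Hypothetical sketch: re-evaluate Weighting criteria scores and re-rank
# the Service Integration List after criteria were adapted or added.

def weighted_score(ratings: dict, weights: dict) -> int:
    """Score = sum over all Weighting criteria of (rating x weight)."""
    return sum(ratings[criterion] * weight for criterion, weight in weights.items())


# Assumed (adapted) Weighting criteria with their weights
weights = {"user_demand": 3, "maturity": 2, "integration_effort": 1}

# Assumed ratings per service in the pipeline
pipeline = {
    "Service A": {"user_demand": 4, "maturity": 3, "integration_effort": 2},
    "Service B": {"user_demand": 5, "maturity": 2, "integration_effort": 3},
}

# Higher score -> higher rank in the Service Integration List
integration_list = sorted(pipeline,
                          key=lambda s: weighted_score(pipeline[s], weights),
                          reverse=True)
```

In this sketch, re-evaluating the ratings or changing the weights automatically changes the scores and hence the ranking, which is exactly why adapted Weighting criteria require a re-evaluation of all services in the pipeline.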
Whenever a review of services in Portfolio and a review of service selection criteria are conducted at the same time, it is recommended to perform the review of service selection criteria first in order to avoid double work.
After the review is conducted, the results for each Review type are summarized into the Review object's preliminary results. Based on these results, recommendations to service providers are derived. The recommendations may include the necessity of service adaptations to continue fulfilling the Exclusion criteria and are documented in the corresponding Review object. Before the recommendations are communicated to the service providers, the HIFIS coordinators need to be informed about the preliminary results and any bigger changes implemented or recommended due to the review, since they may veto them. The HIFIS coordinators are free to inform the HIFIS Steering Committee about conducted changes or to ask for their approval. Only after feedback from the HIFIS coordinators can the recommendations be handed over to the service providers.
The service providers receive the recommendations for changes, indicating why HIFIS considers each change necessary; the preliminary review results can serve as reasoning here. The Service Portfolio Manager also indicates possible consequences if recommendations are not implemented (e.g. service Offboarding is initiated if a service no longer fulfills the Exclusion criteria and the service provider does not adapt it). In parallel, HIFIS-internal changes, e.g. to the Process Framework, documentation, HIFIS website or any other HIFIS-internal document or tool, are conducted.
Even if HIFIS considers the changes necessary, a service provider is free to decide whether or not to implement them. If the changes are not implemented and the service no longer fulfills the Exclusion criteria, this may lead to service Offboarding or degradation to an Associated service. As soon as the service providers have made their decision, they are asked either to conduct the changes and inform HIFIS that they accept the recommendations, or to inform HIFIS that they deny the recommendations, including their reasoning. HIFIS then documents the service providers' responses (recommendations accepted or denied, plus reasoning). The responses are added to the preliminary results and extend them to the final review results. As soon as everything is documented in the final review results, the Review object can be closed. However, some follow-up tasks still need to be done after the review is closed.
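The decision logic described above can be condensed into a small sketch. The function name and the outcome strings are illustrative assumptions, not part of the Process Framework, and the choice between Offboarding and degradation to an Associated service remains a case-by-case decision.

```python
def review_outcome(recommendations_accepted: bool,
                   exclusion_criteria_fulfilled: bool) -> str:
    """Hypothetical summary of the possible outcomes after a review.

    A provider may decline the recommendations; only if the service then
    no longer fulfills the Exclusion criteria do Offboarding or
    degradation to an Associated service come into play.
    """
    if recommendations_accepted or exclusion_criteria_fulfilled:
        return "service remains in Portfolio"
    return "service Offboarding or degradation to Associated service"
```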
In the case of a regular review, the date for the next regular review needs to be set according to the defined review interval. Also, the Review register needs to be cleared of the ideas that have been addressed during this review. And, as a very last task, it is highly recommended to document lessons learned during the review. Think about what could have gone better and write down lessons learned so that the next review runs more smoothly!
For the visualization of the whole process and a step-by-step explanation, please check the corresponding files.