DevOps Pipeline Managing PCF App and Resources - DZone

There are many tools available for doing DevOps with PCF. Automating the deployment of artifacts to PCF is easy, and many articles have been published about it. So, you may be asking: what different aspects is this article going to cover?

In my current project, I have observed that developers keep deploying applications to PCF with little control, so resources are piling up and leading to a huge bill for the DevOps team that manages the platform. After analyzing the issue, I found that teams are building applications and deploying them to a PCF test environment, where they sit idle 80% of the time without being used. This is a huge waste of a test environment, as IaaS charges are based on the consumption of memory and storage.

To address this waste, I have come up with a DevOps process that not only deploys the application to PCF but also automates the provisioning and de-provisioning of the dependencies around it. This ensures that all the resources are used optimally rather than sitting idle: you create the resources when you need them and delete them as soon as your work is finished. The solution below first creates the PCF org/space, then creates the dependent backing services, deploys the application, tests it with automation, cleans up the resources after testing completes, and finally deletes the org/space itself. No environment sits idle adding to your bill.

For this pipeline, I have used Bamboo, but it can be implemented with any other pipeline tool, like Jenkins, GoCD, etc.

Prerequisites

  1. Bamboo Pipeline setup

  2. A Spring Boot application

  3. PCF CLI and Maven plugins for Bamboo

  4. Basic understanding of Bamboo pipelines

Stage 1 — Create Build and Analysis

This first stage will check out the code and integrate it with SonarQube. The SonarQube dashboard will show the application analysis results in the available widgets.

(Screenshot: build and analysis tasks in Bamboo)

  1. First, checkout the code from Git.

  2. The next two steps are for copying the build and SonarQube scripts and retrieving the Git user and branch details for running the build.

  3. The third step is the Maven build. I have disabled the Gradle script, as my application uses a Maven pom for the build.

  4. The last step is to run the Sonar scan.
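The build and scan steps above can be sketched as a single shell task. This is a dry-run sketch: the SonarQube URL and token are hypothetical placeholders, and the `run` helper prints each command instead of executing it.

```shell
# Dry-run sketch of the Stage 1 build-and-scan commands (Maven + SonarQube).
# SONAR_HOST and SONAR_TOKEN are hypothetical placeholders.
SONAR_HOST="https://sonar.example.com"
SONAR_TOKEN="${SONAR_TOKEN:-changeme}"

run() { echo "+ $*"; }   # prints the command; replace echo with "$@" to execute

run mvn -B clean verify                   # compile, run unit tests, package
run mvn -B sonar:sonar \
    -Dsonar.host.url="$SONAR_HOST" \
    -Dsonar.login="$SONAR_TOKEN"          # upload the analysis to SonarQube
```

Swapping the `echo` for `"$@"` inside `run` turns the sketch into the real Bamboo script task.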

Stage 2 — Secure Code Scanning

This stage is pretty standard, and you can use one of the many tools available, like Checkmarx, Coverity, SourceClear, etc. These tools do static code scans from a security point of view and generate log reports.

(Screenshot: secure code scanning task in Bamboo)

Stage 3 — Deploy Artifact to Repository (Nexus)

This stage pushes the build artifact (a jar or war file) to a repository like Nexus or JFrog Artifactory.

(Screenshot: artifact deployment task in Bamboo)
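As a sketch, the push to Nexus can be a single Maven deploy command. The repository URL and the "releases" server id below are hypothetical; the id must match a server entry in your settings.xml that holds the Nexus credentials.

```shell
# Dry-run sketch of the Stage 3 publish step.
# NEXUS_URL is a hypothetical placeholder.
NEXUS_URL="https://nexus.example.com/repository/releases"

run() { echo "+ $*"; }   # prints the command; replace echo with "$@" to execute

# Note: the altDeploymentRepository format is id::layout::url for
# maven-deploy-plugin 2.x; 3.x drops the layout segment (id::url).
run mvn -B deploy -DaltDeploymentRepository="releases::default::$NEXUS_URL"
```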

Stage 4 — Create the PCF Environment and Deploy the App

This is the most important part of this article. This stage creates the PCF environment and then deploys the app.

  1. Copy the manifest file from the source code and make any changes through scripts, as required.

  2. Log into PCF using the PCF CLI plugin or a bash script.

  3. Create the org/space for the application and the backing services where it will be deployed. Target the new org/space.

  4. Create the service instances for each required backing service using the cf CLI command create-service.

  5. Push the application downloaded from the repository to PCF. The manifest file takes care of the service binding before starting the app.

  6. Log out of PCF.
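For reference, a minimal manifest that performs the service binding mentioned in step 5 might look like this; the app and service names are placeholders and must match the instances created with create-service.

```yaml
# manifest.yml (illustrative; all names are placeholders)
applications:
- name: demo-app
  memory: 1G
  instances: 1
  path: target/demo-app.jar
  services:
  - demo-mysql      # bound before the app starts
  - demo-rabbitmq
```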

(Screenshot: PCF environment creation and deployment tasks in Bamboo)

All the above steps can be implemented in two ways.

  1. Write a shell script and keep it in the source code repository. This script can be imported into a Bamboo task and executed.

  2. Bamboo has a PCF CLI plugin, so a task can be created for each command: log in, create services, deploy the app, etc.

I have used a mix of both approaches to showcase them (the disabled tasks are for the second approach).
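The script flavor of steps 1 through 6 might look like the dry-run sketch below. Everything here is a placeholder: the API endpoint, the credentials, the org/space/app names, and the service offerings and plans (check cf marketplace for what your foundation actually offers).

```shell
# Dry-run sketch of the Stage 4 provisioning-and-deploy script.
# All values below are hypothetical placeholders.
CF_API="https://api.sys.example.com"
CF_USER="${CF_USER:-deployer}"
CF_PASS="${CF_PASS:-secret}"
ORG="demo-org"; SPACE="demo-space"; APP="demo-app"

run() { echo "+ $*"; }   # prints the command; replace echo with "$@" to execute

run cf login -a "$CF_API" -u "$CF_USER" -p "$CF_PASS"      # step 2: log in
run cf create-org "$ORG"                                   # step 3: org/space
run cf create-space "$SPACE" -o "$ORG"
run cf target -o "$ORG" -s "$SPACE"
run cf create-service p.mysql db-small demo-mysql          # step 4: backing services
run cf create-service p.rabbitmq single-node demo-rabbitmq
run cf push "$APP" -f manifest.yml -p target/demo-app.jar  # step 5: deploy
run cf logout                                              # step 6
```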

Now we have configured and provisioned everything the application needs to run, so it will be easy to de-provision it all when the job is complete.

Stage 5 — Run Automated Tests

This stage is also key. Unless testing is automated, you need the app up and running for manual testing, which leaves it sitting idle most of the time. So, automate as many of the testing steps as possible to reduce the idle time.

(Screenshot: automated test tasks in Bamboo)
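A smoke test can be as simple as polling the app's health endpoint after the push. This dry-run sketch assumes a Spring Boot app exposing /actuator/health at a hypothetical route.

```shell
# Dry-run smoke-test sketch; APP_URL is a hypothetical route.
APP_URL="https://demo-app.apps.example.com"

run() { echo "+ $*"; }   # prints the command; replace echo with "$@" to execute

# Retry a few times so the app has time to finish starting.
ok=1
for attempt in 1 2 3; do
  if run curl -fsS "$APP_URL/actuator/health"; then ok=0; break; fi
  sleep 5
done
# A real Bamboo task would end with: exit $ok   (pass or fail the stage)
```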

Stage 6 — Delete the Resources and App

Once testing is completed, all the resources and the app can be de-provisioned.

Again, this can be done either with a script or with a separate task for each command.

(Screenshot: resource cleanup tasks in Bamboo)
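The cleanup script mirrors provisioning in reverse order. The names below are hypothetical placeholders; deleting the org at the end removes anything the earlier steps missed.

```shell
# Dry-run sketch of the Stage 6 cleanup script; names are placeholders.
ORG="demo-org"; SPACE="demo-space"; APP="demo-app"

run() { echo "+ $*"; }   # prints the command; replace echo with "$@" to execute

run cf target -o "$ORG" -s "$SPACE"
run cf delete "$APP" -f -r                 # -r also deletes the app's routes
run cf delete-service demo-mysql -f
run cf delete-service demo-rabbitmq -f
run cf delete-space "$SPACE" -o "$ORG" -f
run cf delete-org "$ORG" -f                # removes anything left in the org
```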

If you compare PCF before and after running this pipeline, you won't see any new space, app, or services, yet you still fulfilled your purpose of using PCF to deploy and test the app.

Miscellaneous Use Case

The above strategy works very well for a dev environment, where developers keep churning through a lot of resources. Other environments might not fit this model and may need a different strategy. Let me explain that as well.

Take the example of a UAT environment, where developers push the app and users do the manual testing. (Don't argue that this should also be automated; there is always something the user wants to see and test for themselves before approving it for production.) In that scenario, you need to keep the app up and running for a certain period, so you need a pipeline that runs only the cleanup stage. Keep that pipeline aside to do this job in an automated way rather than cleaning up manually.

(Screenshot: standalone cleanup pipeline in Bamboo)

That's all for this article. I hope you find it useful to minimize your bills.

Please share your ideas on how to minimize resource waste and cost on the PCF platform in the comments.

Opinions expressed by DZone contributors are their own.


