# Cloud Pak for Data Deployment

## CPD Deployment on-premises
Here we document the deployment of Cloud Pak for Data in an on-premises environment running RedHat OpenShift v4.6 or higher.
We start with a large RedHat OpenShift cluster with 3 Master nodes and 5 Worker nodes, deployed on-prem. Three of those worker nodes are tagged as Storage nodes.
To deploy IBM Cloud Pak for Data on an OpenShift cluster, we will use the IBM Cloud Native Toolkit GitOps Framework. There are five steps:
- Prereqs - Make sure you have a RedHat OpenShift cluster and you are able to use the RedHat OpenShift CLI against it.
- Sealed Secrets - Provide the private key used to seal the secrets provided with the Cloud Pak for Data GitOps repository.
- RedHat OpenShift GitOps Operator - Install the RedHat OpenShift GitOps operator which provides the GitOps tools needed for installing and managing IBM Cloud Pak for Data instances using the GitOps approach already explained.
- IBM Cloud Pak for Data - Deploy an instance of IBM Cloud Pak for Data on the RedHat OpenShift cluster.
- IBM Cloud Pak for Data UI - Validate the installation of your IBM Cloud Pak for Data instance by making sure you are able to log into the IBM Cloud Pak for Data user interface.
### 1 - Prereqs
- Provision a Large OCP+ Cluster from Technology Zone.
- Select Reserve now for immediate provisioning of the cluster.
- Fill in the form and submit it.
- Check the status of the cluster from My library > My reservations on the top left corner of your Technology Zone dashboard.
- Once the status of your cluster is Ready, open the cluster tile from the My reservations page and note down the URL of the RedHat OpenShift web console, the load balancer IP address, and the cluster password. Your username is `kubeadmin`.
Login to your cluster using
oc
CLIor using token obtained from RedHat OpenShift web consoleoc login -u kubeadmin -p <password> api.<clustername>.cp.fyre.ibm.com:6443
oc login --token=<token> --server=https://api.<clustername>.cp.fyre.ibm.com:6443
- Set up the local storage operator as per these instructions.
- Set up OpenShift Container Storage as per these instructions.
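Before moving on, it is worth confirming that the CLI session works and that the storage layer is in place. A minimal verification sketch; the storage class names in the comment are the typical OpenShift Container Storage defaults and may differ on your cluster:

```
# Confirm the session is authenticated and the cluster responds
oc whoami
oc get nodes

# List the storage classes created by the local storage and OCS operators;
# with OCS you would typically expect names such as ocs-storagecluster-cephfs
# and ocs-storagecluster-ceph-rbd (assumption: default OCS naming)
oc get storageclass
```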
### 2 - Sealed Secrets
- Create the `sealed-secrets` project. This project will host the Sealed Secrets operator that will allow us to decrypt sealed secrets stored in GitHub.

  ```
  oc new-project sealed-secrets
  ```
- Download the private key `sealed-secrets-ibm-demo-key.yaml` used to seal any secret contained in this demonstration and apply it to the cluster. In our case, we have included a demo IBM Entitlement Key within the GitOps GitHub repository so that we are able to pull down IBM software. (If you want to seal secrets of your own, see the sketch at the end of this section.)

  ```
  oc apply -f sealed-secrets-ibm-demo-key.yaml
  ```
- Delete the Sealed Secrets controller pod so that it restarts and picks up the private key you just applied:

  ```
  oc delete pod -n sealed-secrets -l app.kubernetes.io/name=sealed-secrets
  ```
- IMPORTANT WARNING: DO NOT CHECK THE FILE INTO GIT. The private key MUST NOT be checked into GitHub under any circumstances. Please remove the private key from your workstation to avoid any issues:

  ```
  rm sealed-secrets-ibm-demo-key.yaml
  ```
- For Cloud Pak to consume the entitlement key, restart the Platform Navigator pods:

  ```
  oc delete pod -n tools -l app.kubernetes.io/name=ibm-integration-platform-navigator
  ```
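The demo repository already contains every sealed secret it needs, but if you later want to seal a secret of your own against this controller, the usual kubeseal flow is sketched below. The secret name, namespace, and value are placeholders, and the controller flags may need adjusting to match your installation:

```
# Render a Secret manifest locally without applying it to the cluster
oc create secret generic my-secret -n tools \
  --from-literal=password=<value> \
  --dry-run=client -o yaml > secret.yaml

# Encrypt it against the controller's public key; only the resulting
# SealedSecret is safe to commit to Git
kubeseal --controller-name sealed-secrets --controller-namespace sealed-secrets \
  -o yaml < secret.yaml > sealed-secret.yaml

# Remove the plain-text manifest
rm secret.yaml
```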
### 3 - RedHat OpenShift GitOps Operator
- Clone the following GitHub repository, which contains the GitOps structure that the Cloud Native Toolkit GitOps Framework understands:

  ```
  git clone https://github.com/cloud-native-toolkit-demos/multi-tenancy-gitops-cp4d.git
  ```
- Change directory into `multi-tenancy-gitops-cp4d`:

  ```
  cd multi-tenancy-gitops-cp4d
  ```
- Install the RedHat OpenShift GitOps operator on your RedHat OpenShift cluster and wait for it to be available:

  If your RedHat OpenShift cluster version is 4.6:

  ```
  oc apply -f setup/ocp46/
  while ! kubectl wait --for=condition=Established crd applications.argoproj.io; do sleep 30; done
  ```

  If your RedHat OpenShift cluster version is 4.7:

  ```
  oc apply -f setup/ocp47/
  while ! kubectl wait --for=condition=Established crd applications.argoproj.io; do sleep 30; done
  ```
  Once the above command returns, open your RedHat OpenShift web console and check that the RedHat OpenShift GitOps operator has been successfully installed in the `openshift-gitops` project. The RedHat OpenShift GitOps operator also installs the RedHat OpenShift Pipelines operator and ArgoCD (the GitOps tool that synchronizes the Infrastructure/Configuration as Code stored in GitHub with the state of the RedHat OpenShift cluster).
  Important

  The RedHat OpenShift Pipelines operator gets installed by the RedHat OpenShift GitOps subscription only on RedHat OpenShift 4.6. If your cluster is version 4.7, you will need to install the RedHat OpenShift Pipelines operator as part of the GitOps process explained in this section, by specifying it in the `kustomization.yaml` file for the services layer (a sketch of this change follows).
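  A hypothetical sketch of that change; the file path and commit message below are placeholders, and the exact resource entry depends on the repository layout:

  ```
  # Hypothetical example: enable the OpenShift Pipelines operator in the
  # services layer. Edit the services-layer kustomization file (the path
  # depends on the repository layout), make sure the Pipelines operator
  # entry is present and uncommented, then commit and push so that ArgoCD
  # synchronizes the change into the cluster.
  vi <path-to-services-layer>/kustomization.yaml
  git add -A
  git commit -m "Enable the OpenShift Pipelines operator"
  git push
  ```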
- Once ArgoCD is deployed, get the `admin` password:

  If your RedHat OpenShift cluster version is 4.6:

  ```
  oc extract secrets/argocd-cluster-cluster --keys=admin.password -n openshift-gitops --to=-
  ```

  If your RedHat OpenShift cluster version is 4.7:

  ```
  oc extract secrets/openshift-gitops-cluster --keys=admin.password -n openshift-gitops --to=-
  ```
- Open the ArgoCD web console by clicking the ArgoCD console link at the top of your RedHat OpenShift web console and log in.
- Once you log in, you should see that your ArgoCD web console is empty, as we have not deployed any Argo Applications yet.
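The same checks can also be run from the terminal; a quick sanity-check sketch, assuming the default `openshift-gitops` namespace used above:

```
# All GitOps operator and ArgoCD pods should be Running
oc get pods -n openshift-gitops

# List the routes in the namespace to find the ArgoCD console URL
# (the server route name differs between the 4.6 and 4.7 operator versions)
oc get routes -n openshift-gitops
```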
### 4 - IBM Cloud Pak for Data
- Install the ArgoCD Bootstrap Application:

  ```
  oc apply -n openshift-gitops -f 0-bootstrap/argocd/bootstrap.yaml
  ```
This ArgoCD Bootstrap Application will bootstrap the deployment of IBM Cloud Pak for Data based on the configuration you have defined in the GitOps GitHub repository we cloned earlier. You can see that we integrate Kustomize for configuration management in the GitOps approach.
As soon as you create this ArgoCD Bootstrap Application, the rest of the ArgoCD Applications, and the RedHat OpenShift resources they manage, start to get created as a result of the synchronization process that the GitOps approach is based on. You can see these ArgoCD Applications being created in the ArgoCD web console.
- If you go to the Operators > Installed Operators section of your RedHat OpenShift web console and select the `ibm-common-services` project in the Project drop-down list at the top, you should see that the Cloud Pak for Data Operator has been successfully installed, as well as the IBM Cloud Pak foundational services.
If you go to the Home > Search section of your RedHat OpenShift cluster web console and select the
cloudpak
project in the Project drop down list at the top, since in our Cloud Pak for Data GitOps process we have configured the IBM Cloud Pak for Data instance to be deployed in thecloudpak
project, and search forZenService
inResources
, you should seeZenService
listed. -
Select the listed
ZenService
resource and you should see lite-cr listed. -
Click on the
lite-cr
link and you should see it Running and Successful. -
- If you go back to the ArgoCD web console, you should see all of the Argo Applications in green.
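These checks can also be scripted; a small sketch assuming the `cloudpak` project and the `lite-cr` instance name used by this GitOps configuration:

```
# List the ArgoCD Applications along with their sync and health status
oc get applications -n openshift-gitops

# The ZenService control plane reports Completed once Cloud Pak for Data is fully up
oc get zenservice lite-cr -n cloudpak -o jsonpath="{.status.zenStatus}{'\n'}"
```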
### 5 - IBM Cloud Pak for Data UI
Let's make sure that our IBM Cloud Pak for Data instance is up and running by logging into the IBM Cloud Pak for Data user interface.
- Obtain the IBM Cloud Pak for Data console URL by executing:

  ```
  echo https://`oc -n cloudpak get ZenService lite-cr -o jsonpath="{.status.url}{'\n'}"`
  ```
- Open the URL in a browser and you will be presented with the IBM Cloud Pak for Data login screen. Enter `admin` as the username.
Obtain admin password by executing
oc -n cloudpak extract secret/admin-user-details --keys=initial_admin_password --to=-
- Log into the IBM Cloud Pak for Data UI using the password from the previous step.
- Click on the navigation menu icon in the top left corner. Click on the Services menu option to expand it, then select Services catalog.
- The various services installed with IBM Cloud Pak for Data will be displayed.
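For convenience, the URL and password lookups above can be combined into one snippet (same assumptions: the `cloudpak` project and the `lite-cr` instance name):

```
# Print the console URL and the initial admin password together
echo "URL:      https://$(oc -n cloudpak get ZenService lite-cr -o jsonpath='{.status.url}')"
echo "Password: $(oc -n cloudpak extract secret/admin-user-details --keys=initial_admin_password --to=-)"
```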
That is all it takes to get a working instance of IBM Cloud Pak for Data.