
You can use a CI/CD pipeline to export the Adeptia Connect objects from one environment to another automatically. The CI/CD pipeline uses the Export XML file that you must have created as a prerequisite for the export operation, and generates an exported ZIP that contains the objects you want to migrate. After you create the exported ZIP from the source environment, you import the objects to the new environment by running the import pipeline.
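If you want to run this flow without opening the Jenkins UI, you can also trigger the pipelines through the Jenkins remote build API. The snippet below is only a minimal, illustrative sketch of that approach; the Jenkins URL, job name, user, and API token are placeholders, and the parameter names match the export pipeline parameters described later on this page.

Code Block
languagebash
themeMidnight
#!/bin/bash
# Illustrative only: trigger the export pipeline remotely with the Jenkins
# "build with parameters" endpoint. Replace the URL, job name, credentials,
# and parameter values with your own; the remaining pipeline parameters are
# passed the same way.
JENKINS_URL="https://jenkins.example.com"        # placeholder Jenkins URL
AUTH="admin:<api-token>"                         # Jenkins user and API token

curl -X POST -u "$AUTH" \
  "$JENKINS_URL/job/adeptia-connect-export/buildWithParameters" \
  --data-urlencode "OPERATION=export" \
  --data-urlencode "GIT_EXPORT_XML_PATH=xml/export.xml" \
  --data-urlencode "GIT_EXPORT_ZIP_PATH=test/SA_PF.zip"

Once the exported ZIP is available in the GitHub repository, the import pipeline on the target environment can be triggered in the same way.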

Table of Contents
maxLevel2
minLevel2

...

To create the pipeline in Jenkins using the export pipeline file provided by Adeptia, follow the steps given below.

  1.  Log in to Jenkins with admin privileges.

  2.  Select New Item.

    Image Modified

  3. Enter a name for the export pipeline, and then select Pipeline.

  4. Click OK.

  5. Copy the content from the provided export pipeline file.

    Expand
    titleExport Pipeline


    Code Block
    languagegroovy
    themeMidnight
    import jenkins.model.Jenkins
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.io.File;
    
    
    /*
        This pipeline is used to deploy migration promotion through the export operation from one environment to another.
    
        The pipeline is made up of the following steps:
        1. Init parameters
        2. Pull the export XML file from GitHub
        3. Upload the export XML file to the K8 shared PVC
        4. Download the Helm chart & deploy the migration solution (export)
        5. Download the exported solution zip from the K8 shared PVC
        6. Push the exported solution zip to GitHub
        7. Clean up the workspace
    
        Pre-requisites
        a) Tools/plugins that need to be installed:
            1. Helm
            2. kubectl client
            3. Jenkins
            4. Java 1.8+
        b) OS: Linux
        c) Jenkins plugins:
            1. Kubernetes
            2. Git (Git plugin 4.8.3, Git client plugin 3.9.0)
            3. Mask Password
            4. Credentials Binding Plugin (1.27)
            5. Parameter Separator
            6. Blue Ocean (optional)
    
        Usage:
            Steps to create the pipeline using this Jenkinsfile:
            1. Log in to the Jenkins GUI with admin privileges.
            2. Create a pipeline by choosing New Item > Pipeline.
            3. Copy/paste the content of the Jenkinsfile into the Pipeline Definition area.
            4. Uncheck the "Use Groovy Sandbox" checkbox.
            5. Save the pipeline.
            6. Trigger the pipeline once to initialize the parameters.
    
    */
    
    /*
        Upload file to Kubernetes PVC
     */
    def uploadToSharedPVC (NAMESPACE, CLUSTER_CONTEXT, K8_CREDENTIALS_ID, SERVER_URL, SRC_FILE_PATH, TRG_FILE_PATH) {
    echo "Upload file ("+SRC_FILE_PATH+") to K8 shared PVC"
    		withKubeConfig([credentialsId: K8_CREDENTIALS_ID, serverUrl: SERVER_URL]) {
    					try {
    						  sh '''#!/bin/bash
    							kubectl config use-context '''+CLUSTER_CONTEXT+'''
    							TRG_FILE_PATH='''+TRG_FILE_PATH+'''
    							if [[ ${TRG_FILE_PATH::1} == "/" ]]
    								then
    								  TRG_FILE_PATH=${TRG_FILE_PATH:1};
    								else
    								  echo "Forward slash (/) already removed"; fi
    							podname=$(kubectl -n '''+NAMESPACE+''' get pods | grep -m 1 autoscaler | awk '{print $1}')
    							kubectl -n '''+NAMESPACE+''' cp '''+SRC_FILE_PATH+''' ${podname}:${TRG_FILE_PATH}
    							jobname=$(kubectl -n '''+NAMESPACE+''' get jobs | grep -m 1 migration | awk '{print $1}')
    							if [[ -n "$jobname" ]]; then
    								kubectl -n '''+NAMESPACE+''' delete job ${jobname}
    								else 
    									echo "Migration resource does not exist"
    							fi
    											'''				
    						  
    						} catch (err) {
    								echo "Caught: ${err}. Error in uploading file."
    								error("Caught: ${err}")
    								currentBuild.result = 'FAILURE'
    							}
    						}
    }
    
    /*
        Download file from Kubernetes PVC
     */
    def downloadFromSharedPVC (NAMESPACE, CLUSTER_CONTEXT, K8_CREDENTIALS_ID, SERVER_URL, SRC_FILE_PATH, TRG_FILE_PATH) {
    echo "Download file ("+SRC_FILE_PATH+") from K8 shared PVC"
    		withKubeConfig([credentialsId: K8_CREDENTIALS_ID, serverUrl: SERVER_URL]) {
    					try {
    						sh '''
    							#!/bin/bash
    							SRC_FILE_PATH='''+SRC_FILE_PATH+'''
    							if [[ ${SRC_FILE_PATH::1} == "/" ]]
    								then
    								  SRC_FILE_PATH=${SRC_FILE_PATH:1};
    								else
    								  echo "Forward slash (/) already removed"; fi
    							kubectl config use-context '''+CLUSTER_CONTEXT+'''
    							podname=$(kubectl -n '''+NAMESPACE+''' get pods | grep -m 1 autoscaler | awk '{print $1}')
    							kubectl -n '''+NAMESPACE+''' cp ${podname}:${SRC_FILE_PATH} '''+TRG_FILE_PATH+'''
    								'''											
    						} catch (err) {
    								echo "Caught: ${err}. Error in downloading file from K8 PVC."
    								error("Caught: ${err}")
    								currentBuild.result = 'FAILURE'
    							}
    						}
    }
    
    /*
        Pull Helm Chart
     */
    def pullHelmChart (HELM_REPO_URL, CHART_NAME) {
        echo "Pull Helm Chart ("+CHART_NAME+") from Artifact Hub"
    					try {
    						sh '''
    							#!/bin/bash
    							helm repo add adeptia-connect-migration '''+HELM_REPO_URL+'''
    							helm pull adeptia-connect-migration/'''+CHART_NAME+''' --untar					
    							'''
    						} catch (err) {
    								echo "Caught: ${err}. Error in pulling Helm chart from repo."
    								error("Caught: ${err}")
    								currentBuild.result = 'FAILURE'
    							}
    		
    }
    
    /*
        Deploy Helm to Kubernetes cluster
     */
    def deployToCluster (NAMESPACE, CLUSTER_CONTEXT, K8_CREDENTIALS_ID, DATABASE_CREDENTIALS_ID, SERVER_URL, EXPORT_ZIP_PATH, MIGRATION_XML_FILE_PATH) {
        echo "Deploy Helm chart to Kubernetes cluster"
    					try {
    						def BACKEND_DB_USERNAME = getUserName(DATABASE_CREDENTIALS_ID);
    						def BACKEND_DB_PASSWORD = getPassword(DATABASE_CREDENTIALS_ID);
    						withKubeConfig([credentialsId: K8_CREDENTIALS_ID, serverUrl: SERVER_URL]) {
    						//hide password field
    						wrap([$class: 'MaskPasswordsBuildWrapper', varPasswordPairs: [[password:BACKEND_DB_PASSWORD], [password:BACKEND_DB_USERNAME]]]) { 
    						  sh '''	  
    							#!/bin/bash
    							kubectl config use-context '''+CLUSTER_CONTEXT+'''
    							helm upgrade -i migration migration -f migration/values.yaml --set environmentVariables.BACKEND_DB_URL=${BACKEND_DB_URL} --set environmentVariables.BACKEND_DB_USERNAME='''+BACKEND_DB_USERNAME+''' --set environmentVariables.BACKEND_DB_PASSWORD='''+BACKEND_DB_PASSWORD+''' --set environmentVariables.BACKEND_DB_DRIVER_CLASS=${BACKEND_DB_DRIVER_CLASS} --set environmentVariables.BACKEND_DB_TYPE=${BACKEND_DB_TYPE} --set environmentVariables.EXPORT_ZIP_PATH='''+EXPORT_ZIP_PATH+''' --set environmentVariables.MIGRATION_XML_FILE_PATH='''+MIGRATION_XML_FILE_PATH+''' --set environmentVariables.LOG_IDENTIFIER=${LOG_IDENTIFIER} --set environmentVariables.OPERATION=${OPERATION} -n '''+NAMESPACE+'''
    											'''	
    							}
    						}
    						} catch (err) {
    								echo "Caught: ${err}. Error in deploying Helm chart."
    								error("Caught: ${err}")
    								currentBuild.result = 'FAILURE'
    							}
    }
    
    /*
        Wait until deployment finish on Kubernetes cluster
     */
    def waitUntilDepoymentComplete(NAMESPACE, CLUSTER_CONTEXT, K8_CREDENTIALS_ID, SERVER_URL, POD, time_out) {
        echo "Fetching pod status"
    					try {
    						int inter = 5, count = 1;					
    						withKubeConfig([credentialsId: K8_CREDENTIALS_ID, serverUrl: SERVER_URL]) {
    						sh('kubectl config use-context ${CLUSTER_CONTEXT};')
    						while (true) {
    								def status = sh script: "kubectl -n ${NAMESPACE} get pods | grep -m 1 ${POD} | awk '{print \$3}' ", returnStdout: true
                                    if (status.toString().trim().contains("Completed")) {
                                        break;
                                        }
    								else if (status.toString().trim().contains("Error")) {
    									error("Caught: Migration deployment failed due to an error. Please check the migration logs.")
    									currentBuild.result = 'FAILURE'
    									break;
    								}
    								sleep(inter)
    								echo count.toString()+" retry in "+inter*count+" seconds."
    								count++
    								if ((count)>=((time_out-10)/inter)) {
    									error("Caught: Migration deployment is taking more than the ideal time. Please check the migration logs.")
    									currentBuild.result = 'FAILURE'
    									break;
    								}
    						}
    					}
    				} catch (err) {
    							echo "Caught: ${err}. Error in fetching pod status."
    							error("Caught: ${err}")
    							currentBuild.result = 'FAILURE'
    						}
    }
    
    /*
        Push solution Zip to GitHub repository
     */
    def pushToGitHub (GIT_BRANCH, GIT_CREDENTIALS_ID, GIT_REPO_URL, FILE_PATH) {
    				echo "Pushing file ("+FILE_PATH+") to GitHub repo"
    				withCredentials([gitUsernamePassword(credentialsId: GIT_CREDENTIALS_ID, gitToolName: 'git-tool')]) {
    					try {
    							def gitUser = getUserName(GIT_CREDENTIALS_ID);
    							sh('sleep 10')
    							sh('git config --global user.name "'+gitUser+'"')
    							sh('git config --global user.email "you@example.com"')
    							sh('git add '+FILE_PATH)
    							sh('git commit -m "auto commit message" ')
    							sh('git push ${GIT_REPO_URL} HEAD:'+GIT_BRANCH)
    						} catch (err) {
    								echo "Caught: ${err}. Error in pushing file to GitHub."
    								error("Caught: ${err}")
    								currentBuild.result = 'FAILURE'
    							}
    					}
    }
    
    /*
        Generate rollback solution Zip file path
     */
    def convertRollbackZipPath(FILE_PATH) {
    					def rollbackZipPath = null
    					def Append = "Rollback_"
    					try {
    						Path path = Paths.get(FILE_PATH);
    						def fileName=path.getFileName().toString()
    						def parentDir=path.getParent().toString()
    						rollbackZipPath=parentDir + File.separator + Append + fileName
    						if(isUnix()){
    							rollbackZipPath=rollbackZipPath.replace("\\", "/")
    						}
    						} catch (err) {
    								echo "Caught: ${err}. Error in generating rollback solution Zip file path."
    								error("Caught: ${err}")
    								currentBuild.result = 'FAILURE'
    							}
    					return rollbackZipPath
    }
    
    /*
        Get username from credentials id
     */
    def getUserName(id) {
    	def userName = null
    	withCredentials([usernamePassword(credentialsId: id, passwordVariable: 'PASSWORD', usernameVariable: 'USERNAME')]) {
    					try {
    						userName = USERNAME
    						} catch (err) {
    								echo "Caught: ${err}. Error in extracting username from "+id+" ."
    								error("Caught: ${err}")
    								currentBuild.result = 'FAILURE'
    							}
    	}
    	return userName
    }
    
    /*
        Get password from credentials id
     */
    def getPassword(id) {
    	def password = null
    	withCredentials([usernamePassword(credentialsId: id, passwordVariable: 'PASSWORD', usernameVariable: 'USERNAME')]) {
    					try {
    						password = PASSWORD;
    						} catch (err) {
    								echo "Caught: ${err}. Error in extracting password from "+id+" ."
    								error("Caught: ${err}")
    								currentBuild.result = 'FAILURE'
    							}
    	}
    	return password
    }
    					
    pipeline {
    	
    	// Global default variables
        environment {
    		//manage deployment status timeout
    		time_out = 300
        }
    	parameters{
    		//separator(name: 'separator-ce1a9ef5-cd10-4002-a43f-8ae24d9d0bb3', sectionHeader: '''Helm Chart Parameters''', sectionHeaderStyle: 'background-color:#eeeee4;font-size:15px;font-weight:normal;text-transform:uppercase;border-color:gray;', separatorStyle: '''font-weight:bold;line-height:1.5em;font-size:1.5em;''') 
    		string(defaultValue: '', description: 'ArtifactHub Helm chart URL e.g. https://adeptia.github.io/adeptia-connect-migration/charts', name: 'HELM_REPO_URL', trim: true) 
    		string(defaultValue: '', description: 'Name of Helm chart to be downloaded from ArtifactHub repository e.g. migration', name: 'CHART_NAME', trim: true)
    		
    		//separator(name: 'separator-ce1a9ef5-cd10-4002-a43f-8ae24d9d0bb3', sectionHeader: '''GitHub Parameters''', sectionHeaderStyle: 'background-color:#eeeee4;font-size:15px;font-weight:normal;text-transform:uppercase;border-color:gray;', separatorStyle: '''font-weight:bold;line-height:1.5em;font-size:1.5em;''')
    		string(defaultValue: '', description: 'GitHub credentials ID configured in Jenkins e.g. gitCredential_id', name: 'GIT_CREDENTIALS_ID', trim: true) 
    		string(defaultValue: '', description: 'GitHub server URL e.g. https://github.com/adeptia/migration-defination.git', name: 'GIT_REPO_URL', trim: true)
    		string(defaultValue: '', description: 'GitHub branch name e.g. main', name: 'GIT_BRANCH', trim: true)
    		string(defaultValue: '', description: 'Path to upload the exported zip file to GitHub e.g. test/SA_PF.zip', name: 'GIT_EXPORT_ZIP_PATH', trim: true)
    		string(defaultValue: '', description: 'Export xml file path to be downloaded from GitHub e.g. test/export.xml', name: 'GIT_EXPORT_XML_PATH', trim: true)
    
    		//separator(name: 'separator-ce1a9ef5-cd10-4002-a43f-8ae24d9d0bb3', sectionHeader: '''Migration Parameters''', sectionHeaderStyle: 'background-color:#eeeee4;font-size:15px;font-weight:normal;text-transform:uppercase;border-color:gray;', separatorStyle: '''font-weight:bold;line-height:1.5em;font-size:1.5em;''')
    		string(defaultValue: '', description: 'Type of operation for which deployment will be performed e.g. export', name: 'OPERATION', trim: true)
    		string(defaultValue: '', description: 'Migration export zip path e.g. /shared/SA_PF.zip', name: 'EXPORT_ZIP_PATH', trim: true)
    		string(defaultValue: '', description: 'Migration export xml file path e.g. /shared/export.xml', name: 'MIGRATION_XML_FILE_PATH', trim: true)
    		string(defaultValue: '', description: 'Migration log identifier to capture logs from MS environment.', name: 'LOG_IDENTIFIER', trim: true)
    		
    		//separator(name: 'separator-ce1a9ef5-cd10-4002-a43f-8ae24d9d0bb3', sectionHeader: '''K8 Cluster Parameters''', sectionHeaderStyle: 'background-color:#eeeee4;font-size:15px;font-weight:normal;text-transform:uppercase;border-color:gray;', separatorStyle: '''font-weight:bold;line-height:1.5em;font-size:1.5em;''')
    		string(defaultValue: '', description: 'Credentials ID configured in Jenkins to access K8 cluster e.g. k8credentials', name: 'K8_CREDENTIALS_ID', trim: true)
    		string(defaultValue: '', description: 'URL to access K8 cluster e.g. https://*******-dns-2ce021bb.hcp.eastus.azmk8s.io:443. You can get the server URL from the K8 config file.', name: 'SERVER_URL', trim: true)
    		string(defaultValue: '', description: 'Cluster context to access K8 cluster e.g. adeptia-context', name: 'CLUSTER_CONTEXT', trim: true)
    		string(defaultValue: '', description: 'K8 cluster namespace where the Connect microservices are deployed e.g. adeptia', name: 'NAMESPACE', trim: true)
    		string(defaultValue: '', description: 'URL of the backend database bound with the application.', name: 'BACKEND_DB_URL', trim: true)
    		string(defaultValue: '', description: 'Credentials ID configured in Jenkins to access the database.', name: 'DATABASE_CREDENTIALS_ID', trim: true) 
    		string(defaultValue: '', description: 'Driver class of the database e.g. com.microsoft.sqlserver.jdbc.SQLServerDriver', name: 'BACKEND_DB_DRIVER_CLASS', trim: true)
    		string(defaultValue: '', description: 'Database type e.g. SQL-Server.', name: 'BACKEND_DB_TYPE', trim: true)
    		
    	}
    	
    	/*
        agent {
            label 'LinuxAgent'
        }
    	*/
    	agent any
    
    	stages {
            stage('Pull XML from GitHub') {
    			steps {
    				echo 'Checkout from GitHub'
    				checkout([$class: 'GitSCM', branches: [[name: '*/'+GIT_BRANCH]], extensions: [], userRemoteConfigs: [[credentialsId: GIT_CREDENTIALS_ID, url: GIT_REPO_URL]]])
    			}
            }
    		stage('Upload files to PVC') {
    			steps {
    				echo 'Uploading export xml file'
    				uploadToSharedPVC (NAMESPACE, CLUSTER_CONTEXT, K8_CREDENTIALS_ID, SERVER_URL, GIT_EXPORT_XML_PATH, MIGRATION_XML_FILE_PATH)
    			}
            }
    		stage('Pull Helm chart & Deploy Migration') {
    			steps {
    				echo 'Pulling Helm Chart'
    				pullHelmChart (HELM_REPO_URL, CHART_NAME)
    
    				echo 'Deploying Helm Chart'
    				deployToCluster (NAMESPACE, CLUSTER_CONTEXT, K8_CREDENTIALS_ID, DATABASE_CREDENTIALS_ID, SERVER_URL, EXPORT_ZIP_PATH, MIGRATION_XML_FILE_PATH)
    
    				timeout(time: env.time_out, unit: "SECONDS"){
    					waitUntilDepoymentComplete(NAMESPACE, CLUSTER_CONTEXT, K8_CREDENTIALS_ID, SERVER_URL, 'migration-', env.time_out.toInteger())
    				}
    			}
    		}
    		stage('Download Zip from PVC') {
    			steps {
    				downloadFromSharedPVC (NAMESPACE, CLUSTER_CONTEXT, K8_CREDENTIALS_ID, SERVER_URL, EXPORT_ZIP_PATH, GIT_EXPORT_ZIP_PATH)
    			}
            }
    		stage('Push Zip to GitHub') {
    			steps {
    				echo 'Push Zip to GitHub'
    				pushToGitHub (GIT_BRANCH, GIT_CREDENTIALS_ID, GIT_REPO_URL, GIT_EXPORT_ZIP_PATH)
    			}
            }
    	}
    	post { // Clean-up
    			always {
    				echo 'Cleanup workspace'
    				cleanWs()
    			}
    			success {
    				echo 'Pipeline succeeded!'
    			}
    			unstable {
    				echo 'Pipeline unstable :/'
    			}
    			failure {
    				echo 'Pipeline failed :('
    			}
    		}
    }



  6. In the Pipeline Definition section, paste the copied content and uncheck the Use Groovy Sandbox checkbox.

    Image Added

    Warning
    titleImportant

    If you are using Jenkins on Windows OS and have created an agent on Linux OS, you need to do the following in the export pipeline file.

    1. Uncomment the following code snippet.

    Code Block
    languagegroovy
    themeMidnight
    /*
    		
        agent {
            	label 'LinuxAgent' 	
        }
    	*/

    Where,

    LinuxAgent is the name of the agent that you have created.

    2. Comment the following lines of code.

    Code Block
    languagegroovy
    themeMidnight
    agent any



  7. Click Save.
  8. On the screen that follows, click Build Now.

    As you build the pipeline for the very first time, all the parameters get initialized. 

  9. Refresh the page.
    The Build Now option now changes to Build with Parameters.

  10. Click Build with Parameters.

    You will see all the parameters inherited from the export pipeline file. 

  11. Enter the parameter values as per your requirement.

    Info
    It is mandatory to provide a valid value for all the Jenkins parameters.


    Expand
    titleClick here to expand the list of Export parameters


    Parameter | Value | Description

    Helm Chart

    HELM_REPO_URL | https://adeptia.github.io/adeptia-connect-migration/charts | Migration Helm chart repository URL.
    CHART_NAME | migration | Migration Helm chart name.

    GitHub

    GIT_CREDENTIALS_ID | <credential ID generated by Jenkins> | Credential ID for GitHub in Jenkins. Refer to the prerequisites for more details.
    GIT_REPO_URL | https://github.com/adeptia/migration-defination.git | URL of the GitHub repository.
    GIT_BRANCH | main | GitHub branch name.
    GIT_EXPORT_XML_PATH | xml/export.xml | Path of the export XML in the GitHub repository.
    GIT_EXPORT_ZIP_PATH | test/SA_PF.zip | Path of the exported ZIP in the GitHub repository.

    Migration

    OPERATION | export | The type of operation for which deployment will be performed.
    MIGRATION_XML_FILE_PATH | /shared/migrationtest1/export.xml | Path of the export XML in the PVC.
    EXPORT_ZIP_PATH | /shared/migrationtest1/SA_PF.zip | Path of the exported ZIP in the PVC.
    LOG_IDENTIFIER | Test_Identifier_Tag | Log identifier to capture logs from the MS environment.

    Kubernetes Cluster

    K8_CREDENTIALS_ID | <credential ID generated by Jenkins> | Credential ID for Kubernetes in Jenkins. Refer to the prerequisites for more details.
    SERVER_URL | https://<host name of the Kubernetes cluster>:<Port number> | URL of the Kubernetes cluster.
    CLUSTER_CONTEXT | test-dev | Kubernetes cluster context.
    NAMESPACE | namespace | Kubernetes cluster namespace for deployment.

    Database

    BACKEND_DB_URL | jdbc:sqlserver://<DB Hostname>:<Port number>;database=<Backend Database Name> | Backend database URL used in the deployment of the migration Helm chart. Database info is configured in the values.yaml file.
    DATABASE_CREDENTIALS_ID | <credential ID generated by Jenkins> | Credential ID for the database in Jenkins. Refer to the prerequisites for more details.
    BACKEND_DB_DRIVER_CLASS | com.microsoft.sqlserver.jdbc.SQLServerDriver | Backend database driver class used in the deployment of the migration Helm chart. Database info is configured in the values.yaml file.
    BACKEND_DB_TYPE | SQL-Server | Backend database type used in the deployment of the migration Helm chart. Database info is configured in the values.yaml file.



  12. Click Build to trigger the pipeline.

    Info
    • The exported ZIP is created and pushed to the GitHub repository. 

    • All the log statements generated by the migration utility during the export process are tagged with a unique identifier. This allows you to search for this unique identifier in the centralized logging system, and fetch all the associated logs.
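
    If you do not have a centralized logging system set up yet, a quick way to verify the tagging is to read the migration pod's logs directly with kubectl and filter them by the identifier. The snippet below is a minimal sketch that reuses the example values from the parameters table; the namespace and identifier are placeholders.

    Code Block
    languagebash
    themeMidnight
    #!/bin/bash
    # Illustrative only: fetch the migration pod's logs and keep only the lines
    # tagged with the LOG_IDENTIFIER value supplied to the export pipeline.
    NAMESPACE="namespace"                     # placeholder, use your cluster namespace
    LOG_IDENTIFIER="Test_Identifier_Tag"      # value passed to the export pipeline

    # Find the migration pod created by the Helm chart deployment.
    POD=$(kubectl -n "$NAMESPACE" get pods | grep -m 1 migration | awk '{print $1}')

    # Print only the log statements tagged with the unique identifier.
    kubectl -n "$NAMESPACE" logs "$POD" | grep "$LOG_IDENTIFIER"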