You can use a CI/CD pipeline for importing the Adeptia Connect objects to a new environment automatically. This CI/CD pipeline uses the Exported ZIP file that you must have created as a prerequisite for the import operation, and imports the objects you want to migrate. After you create the Exported ZIP from the source environment, you need to import the objects to the target environment by running the import pipeline.
How the Import pipeline works
The diagram below represents how the import pipeline works and automates the import process.
[Image: Import pipeline workflow]
When triggered, the pipeline performs the following sequence of actions (a minimal pipeline sketch follows this list).
- Connects to the GitHub repository, pulls the Import XML and Exported ZIP files, and places them at a location in the PVC (shared folder).
- Pulls the retain XML from the GitHub repository (if specified in the import pipeline) to retain certain activities during the import process.
- Pulls and deploys the migration utility Helm chart to import the objects.
- Creates a rollback ZIP file at the shared file location and pushes it to the GitHub repository.
- Deletes the migration job and the workspace created during the import process.
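The sketch below mirrors this sequence as a minimal Jenkins declarative pipeline. It is not the shipped import pipeline: the credentials ID, repository URL, namespace, PVC paths, and the autoscaler pod lookup are placeholders or assumptions, and the actual pipeline provided by Adeptia (used in the next section) additionally handles credentials lookup, the retain XML, database settings, and deployment status checks.

Code Block language groovy
// A minimal sketch of the sequence above, not the shipped import pipeline.
// Credentials IDs, the repository URL, the namespace, and the PVC paths are placeholders.
pipeline {
    agent { label 'LinuxAgent' }
    environment {
        NAMESPACE  = 'your-namespace'                        // target cluster namespace (assumption)
        IMPORT_XML = '/shared/migrationtest1/import.xml'     // Import XML location on the shared PVC
        IMPORT_ZIP = '/shared/migrationtest1/SA_PF.zip'      // Exported ZIP location on the shared PVC
    }
    stages {
        stage('Pull Import XML and Exported ZIP from GitHub') {
            steps {
                checkout([$class: 'GitSCM', branches: [[name: '*/main']],
                          userRemoteConfigs: [[credentialsId: 'gitCredentialsId',
                                               url: 'https://github.com/your-org/migration-definition.git']]])
            }
        }
        stage('Upload files to the shared PVC') {
            steps {
                // Copy the pulled files into a pod that mounts the shared PVC
                sh '''
                    podname=$(kubectl -n ${NAMESPACE} get pods | grep -m 1 autoscaler | awk '{print $1}')
                    kubectl -n ${NAMESPACE} cp import.xml ${podname}:${IMPORT_XML}
                    kubectl -n ${NAMESPACE} cp SA_PF.zip ${podname}:${IMPORT_ZIP}
                '''
            }
        }
        stage('Deploy the migration Helm chart (import)') {
            steps {
                // Assumes the Nexus Helm repository was already added as "nexushelmrepo"
                sh '''
                    helm pull nexushelmrepo/migration --untar
                    helm upgrade -i migration migration \
                         --set environmentVariables.MIGRATION_XML_FILE_PATH=${IMPORT_XML} \
                         -n ${NAMESPACE}
                '''
            }
        }
        stage('Push rollback ZIP to GitHub') {
            steps {
                // Assumes Git push credentials are already configured on the agent
                sh '''
                    podname=$(kubectl -n ${NAMESPACE} get pods | grep -m 1 autoscaler | awk '{print $1}')
                    kubectl -n ${NAMESPACE} cp ${podname}:/shared/migrationtest1/Rollback_SA_PF.zip Rollback_SA_PF.zip
                    git add Rollback_SA_PF.zip
                    git commit -m "rollback ZIP created by the import run"
                    git push origin HEAD:main
                '''
            }
        }
    }
    post {
        always {
            deleteDir()   // clean up the workspace created during the import
        }
    }
}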
To start with importing the objects, you first need to create an import pipeline having all the required parameters. Adeptia provides you with a sample import pipeline file that you can customize based on your requirement. After you create the pipeline, you need to trigger it to perform the import operation.
Creating and triggering the Import pipeline
To create the pipeline in Jenkins using the import pipeline provided by Adeptia, follow the steps given below.
Log in to Jenkins with admin privileges.
Info |
---|
If you already have an export pipeline created, you can use that one; otherwise, you can use the one provided by Adeptia. |
Select New Item.
[Image]
Enter a name for the Import pipeline, and then select Pipeline.
[Image]
Click OK.
Copy the content from the provided import pipeline file.
Expand title Export Pipeline Code Block language css theme Midnight //@Grab('org.yaml:snakeyaml:1.17') import org.yaml.snakeyaml.Yaml import org.yaml.snakeyaml.DumperOptions import static org.yaml.snakeyaml.DumperOptions.FlowStyle.BLOCK import jenkins.model.Jenkins import java.nio.file.Path; import java.nio.file.Paths; import java.io.File; /* This is a pipeline that deploys migration promotion by exporting and importing the solution from one environment to another. Pipeline is made up of 7 main steps 1. Init Perameters 2. Pull export XML file from GitHub 3. Upload XML to k8 shared PVC 4. Pull Helm chart & deploy migration solution 5. Download solution Zip from k8 shared PVC 6. Push soution Zip to GitHub 7. Clean up workspace Pre-requisite a) Tools/Plugins needs to install: 1. Helm 2. kubectl client 3. Jenkins 4. Java 1.8+ b) OS Linux c) Jenkins plugins 1. Kubernetes 2. Nexus 3. Git (git plugin 4.8.3, Git client plugin 3.9.0) 4. Mask Password 5. Credentials Binding Plugin (1.27) 6. Parameter Separator */ /* Upload file to Kubernetes PVC */ def uploadToSharedPVC (NAMESPACE, CLUSTER_CONTEXT, K8_CREDENTIALSID, SERVERURL, SRC_FILE_PATH, TRG_FILE_PATH) { echo "Upload file("+SRC_FILE_PATH+") to K8 shared PVC" withKubeConfig([credentialsId: K8_CREDENTIALSID, serverUrl: SERVERURL]) { try { wrap([$class: 'MaskPasswordsBuildWrapper', varPasswordPairs: [[NEXUS_PASSWORD:'NEXUS_PASSWORD']]]) { sh ''' #!/bin/sh kubectl config use-context '''+CLUSTER_CONTEXT+''' TRG_FILE_PATH='''+TRG_FILE_PATH+''' if [[ ${TRG_FILE_PATH::1} == "/" ]] then TRG_FILE_PATH=${TRG_FILE_PATH:1}; else echo "Forward shash(/) already removed "; fi podname=$(kubectl -n '''+NAMESPACE+''' get pods | grep -m 1 autoscaler | awk '{print $1}') kubectl -n '''+NAMESPACE+''' cp '''+SRC_FILE_PATH+''' ${podname}:${TRG_FILE_PATH} jobname=$(kubectl -n '''+NAMESPACE+''' get jobs | grep -m 1 migration | awk '{print $1}') if [[ -n "$jobname" ]]; then kubectl -n '''+NAMESPACE+''' delete job ${jobname} else echo "Migration resource does not exist" fi ''' } } catch (err) { echo "Caught: ${err}. Error in uploading file." error("Caught: ${err}") currentBuild.result = 'FAILURE' } } } /* Download file from Kubernetes PVC */ def downloadFromSharedPVC (NAMESPACE, CLUSTER_CONTEXT, K8_CREDENTIALSID, SERVERURL, SRC_FILE_PATH, TRG_FILE_PATH) { echo "Download file("+SRC_FILE_PATH+") from K8 shared PVC" withKubeConfig([credentialsId: K8_CREDENTIALSID, serverUrl: SERVERURL]) { try { sh ''' #!/bin/sh SRC_FILE_PATH='''+SRC_FILE_PATH+''' if [[ ${SRC_FILE_PATH::1} == "/" ]] then SRC_FILE_PATH=${SRC_FILE_PATH:1}; else echo "Forward shash(/) already removed "; fi kubectl config use-context '''+CLUSTER_CONTEXT+''' podname=$(kubectl -n '''+NAMESPACE+''' get pods | grep -m 1 autoscaler | awk '{print $1}') kubectl -n '''+NAMESPACE+''' cp ${podname}:${SRC_FILE_PATH} '''+TRG_FILE_PATH+''' ''' } catch (err) { echo "Caught: ${err}. Error in downloading file from K8 PVC." 
error("Caught: ${err}") currentBuild.result = 'FAILURE' } } } /* Pull Helm Chart */ def pullHelmChart (NEXUS_CREDENTIALSID, NEXUS_HELMREPOURL, CHARTNAME) { echo "Pull Helm Chart ("+CHARTNAME+") from Nexus repository" withCredentials([usernamePassword(credentialsId: NEXUS_CREDENTIALSID, passwordVariable: 'NEXUS_PASSWORD', usernameVariable: 'NEXUS_USERNAME')]) { try { //hide password field wrap([$class: 'MaskPasswordsBuildWrapper', varPasswordPairs: [[NEXUS_PASSWORD:'NEXUS_PASSWORD']]]) { sh ''' #!/bin/sh helm repo add nexushelmrepo '''+NEXUS_HELMREPOURL+''' --username '''+NEXUS_USERNAME+''' --password '''+NEXUS_PASSWORD+''' helm pull nexushelmrepo/'''+CHARTNAME+''' --untar ''' } } catch (err) { echo "Caught: ${err}. Error in pulling Helm chart from Nexus repo." error("Caught: ${err}") currentBuild.result = 'FAILURE' } } } /* Deploy Helm to Kubernetes cluster */ def deployToCluster (NAMESPACE, CLUSTER_CONTEXT, K8_CREDENTIALSID, DATABASE_CREDENTIALS_ID, SERVERURL, MIGRATION_XML_FILE_PATH) { echo "Deploy Helm chart to Kubernetes cluster" try { def BACKEND_DB_USERNAME = getUserName(DATABASE_CREDENTIALS_ID); def BACKEND_DB_PASSWORD = getPassword(DATABASE_CREDENTIALS_ID); withKubeConfig([credentialsId: K8_CREDENTIALSID, serverUrl: SERVERURL]) { //hide password field wrap([$class: 'MaskPasswordsBuildWrapper', varPasswordPairs: [[password:BACKEND_DB_PASSWORD], [password:BACKEND_DB_USERNAME]]]) { sh ''' #!/bin/sh kubectl config use-context '''+CLUSTER_CONTEXT+''' helm upgrade -i migration migration -f migration/config/values-qa.yaml --set environmentVariables.BACKEND_DB_URL=${BACKEND_DB_URL} --set environmentVariables.BACKEND_DB_USERNAME='''+BACKEND_DB_USERNAME+''' --set environmentVariables.BACKEND_DB_PASSWORD='''+BACKEND_DB_PASSWORD+''' --set environmentVariables.BACKEND_DB_DRIVER_CLASS=${BACKEND_DB_DRIVER_CLASS} --set environmentVariables.BACKEND_DB_TYPE=${BACKEND_DB_TYPE} --set environmentVariables.MIGRATION_XML_FILE_PATH='''+MIGRATION_XML_FILE_PATH+''' --set image.tag=${LATEST_TAG} -n '''+NAMESPACE+''' ''' } } } catch (err) { echo "Caught: ${err}. Error in deploying Helm chart." error("Caught: ${err}") currentBuild.result = 'FAILURE' } } /* Wait until deployment finish on Kubernetes cluster */ def waitUntilDepoymentComplete(NAMESPACE, CLUSTER_CONTEXT, K8_CREDENTIALSID, SERVERURL, POD) { echo "Wait until deployment finished" try { int timeout = 1200, inter = 5, count = 0; withKubeConfig([credentialsId: K8_CREDENTIALSID, serverUrl: SERVERURL]) { sh('kubectl config use-context ${CLUSTER_CONTEXT};') while (true) { def status = sh script: "kubectl -n ${NAMESPACE} get pods | grep -m 1 ${POD} | awk '{print \$3}' ", returnStdout: true if (status.toString().trim().contains("Completed")) { break; } sleep(inter) count++ if (count==timeout) { echo "Migration deployment is taking more then ideal time. Please check migration logs." currentBuild.result = 'FAILURE' break; } } } } catch (err) { echo "Caught: ${err}. Error in fetching pod status." error("Caught: ${err}") currentBuild.result = 'FAILURE' } } /* Push soution Zip to GitHub reposirory */ def pushToGitHub (GIT_BRANCH, GIT_CREDENTIALSID, REPO_URL, FILE_PATH) { echo "Push file ("+FILE_PATH+") to GitHub repo" withCredentials([gitUsernamePassword(credentialsId: GIT_CREDENTIALSID, gitToolName: 'git-tool')]) { try { sh('sleep 10') sh('git add '+FILE_PATH) sh('git commit -m "auto commit message" ') sh('git push ${REPO_URL} HEAD:'+env.GIT_BRANCH) } catch (err) { echo "Caught: ${err}. Error in pushing file to Github." 
error("Caught: ${err}") currentBuild.result = 'FAILURE' } } } /* Generate rollback soution Zip file path */ def convertRollbackZipPath(FILE_PATH) { def rollbackZipPath = null def Append = "Rollback_" try { Path path = Paths.get(FILE_PATH); def fileName=path.getFileName().toString() def parentDir=path.getParent().toString() rollbackZipPath=parentDir + File.separator + Append + fileName if(isUnix()){ rollbackZipPath=rollbackZipPath.replace("\\", "/") } } catch (err) { echo "Caught: ${err}. Error in generating rollback soution Zip file path." error("Caught: ${err}") currentBuild.result = 'FAILURE' } return rollbackZipPath } /* Get username from credentials id */ def getUserName(id) { def userName = null withCredentials([usernamePassword(credentialsId: id, passwordVariable: 'PASSWORD', usernameVariable: 'USERNAME')]) { try { userName = USERNAME } catch (err) { echo "Caught: ${err}. Error in extracting username from "+id+" ." error("Caught: ${err}") currentBuild.result = 'FAILURE' } } return userName } /* Get password from credentials id */ def getPassword(id) { def password = null withCredentials([usernamePassword(credentialsId: id, passwordVariable: 'PASSWORD', usernameVariable: 'USERNAME')]) { try { password = PASSWORD; } catch (err) { echo "Caught: ${err}. Error in extracting password from "+id+" ." error("Caught: ${err}") currentBuild.result = 'FAILURE' } } return password } /* Email temaplate */ def emailRecipient = 'test@adeptia.com' def emailBody = """ <header> <div class="page-header__brand"> <h1 style="font-family: var(--font-family-sans);font-weight:600;padding:2px 0;color:#333;"><div class="logo"><img src="https://www.jenkins.io/favicon.ico" alt="Jenkins" width="32" height="32"></h1></div> </div></header> <body style="font-family: var(--font-family-sans); color: #333;"> <h2 class="build-caption page-headline">Jenkins, waiting for your Approval!</h2> <div style="padding:20px; font-size:15px;"> <p>Pipeline in-Progress: <b>'${env.JOB_NAME} [${env.BUILD_NUMBER}]'</b></p> <p>Stage in-Progress: <b>' [${env.STAGE_NAME}] 1 ${env.STAGE_NAME} 2 env.STAGE_NAME'</b></p> <p>Check console output here: <a href='${env.BUILD_URL}'>${env.JOB_NAME} [${env.BUILD_NUMBER}]</a></p> <p>Visit to the Portal from here: <a href='${env.BUILD_URL}${env.BUILD_NUMBER}/input/'>Jenkins Portal Link</a></p> </br> </br> <p style="font-size:10px !important;"><b>note:</b> Login using Jenkins credentials into the portal.</p> </div> </br></body> """ pipeline { // Global default variables environment { DEPLOY_EXPORT = true //DEPLOY_IMPORT = false } parameters{ separator(name: 'separator-ce1a9ef5-cd10-4002-a43f-8ae24d9d0bb3', sectionHeader: '''Global Parameters''', sectionHeaderStyle: 'background-color:#eeeee4;font-size:15px;font-weight:normal;text-transform:uppercase;border-color:gray;', separatorStyle: '''font-weight:bold;line-height:1.5em;font-size:1.5em;''') string(defaultValue: 'nexus_credentialsId', description: 'Credentials ID configured in Jenkins to access Nexus server.', name: 'NEXUS_CREDENTIALS_ID', trim: true) string(defaultValue: 'https://nexus.adeptia.com:8443/repository/adeptia-internal-helm/', description: 'URL to access Nexus server.', name: 'NEXUS_HELM_REPO_URL', trim: true) string(defaultValue: 'migration', description: 'Helm chart needs to access.', name: 'CHART_NAME', trim: true) string(defaultValue: 'gitCredential_prathi-adeptia', description: 'Credentials ID configured in Jenkins to access GitHub server.', name: 'GIT_CREDENTIALS_ID', trim: true) string(defaultValue: 
'https://github.com/prathi-adeptia/migration-defination.git', description: 'URL to access GitHub server.', name: 'GIT_REPO_URL', trim: true) string(defaultValue: 'main', description: 'Branch to commit or checkout files from GitHub server.', name: 'GIT_BRANCH', trim: true) separator(name: 'separator-ce1a9ef5-cd10-4002-a43f-8ae24d9d0bb3', sectionHeader: '''Migration Parameters''', sectionHeaderStyle: 'background-color:#eeeee4;font-size:15px;font-weight:normal;text-transform:uppercase;border-color:gray;', separatorStyle: '''font-weight:bold;line-height:1.5em;font-size:1.5em;''') string(defaultValue: '/shared/test/SA_PF.zip', description: 'Migration export zip path. eg. /shared/test/SA_PF.zip', name: 'MIGRATION_SOLUTION_EXPORT_ZIP_PATH', trim: true) string(defaultValue: '/shared/test/export.xml', description: 'Migration export xnl file path. eg. /shared/test/export.xml', name: 'MIGRATION_EXPORT_XML_FILE_PATH', trim: true) separator(name: 'separator-ce1a9ef5-cd10-4002-a43f-8ae24d9d0bb3', sectionHeader: '''K8 Cluster Parameters''', sectionHeaderStyle: 'background-color:#eeeee4;font-size:15px;font-weight:normal;text-transform:uppercase;border-color:gray;', separatorStyle: '''font-weight:bold;line-height:1.5em;font-size:1.5em;''') string(defaultValue: 'k8credentials', description: 'Credentials ID configured in Jenkins to access K8 cluster.', name: 'K8_CREDENTIALS_ID', trim: true) string(defaultValue: 'https://valuelabs-dev-dns-2ce021bb.hcp.eastus.azmk8s.io:443', description: 'URL to access K8 cluster.', name: 'SERVER_URL', trim: true) string(defaultValue: 'valuelabs-dev', description: 'Cluster context to access K8 cluster.', name: 'CLUSTER_CONTEXT', trim: true) string(defaultValue: 'anubhav', description: 'K8 cluster name space deployment where Connect microservices deployed.', name: 'NAMESPACE', trim: true) string(defaultValue: 'jdbc:sqlserver://adeptia-adeptiaconnectdb.database.windows.net:1433;database=valuelabs-backend-8', description: 'URL of database backend bind with application.', name: 'BACKEND_DB_URL', trim: true) string(defaultValue: 'databasecredentialid', description: 'Credentials ID configured in Jenkins to access database.', name: 'DATABASE_CREDENTIALS_ID', trim: true) string(defaultValue: 'com.microsoft.sqlserver.jdbc.SQLServerDriver', description: 'Driver class of database.', name: 'BACKEND_DB_DRIVER_CLASS', trim: true) string(defaultValue: 'SQL-Server', description: 'Database type.', name: 'BACKEND_DB_TYPE', trim: true) booleanParam(defaultValue: false, description: 'To trigger migration import pipeline on success. 
Select true else false.', name: 'DEPLOY_IMPORT') } agent { label 'LinuxAgent' } stages { stage('Init parameters') { steps { script{ //Global parameters //Nexus repository parameters env.NEXUS_CREDENTIALSID='nexus_credentialsId' env.NEXUS_HELMREPOURL='https://nexus.adeptia.com:8443/repository/adeptia-internal-helm/' env.CHARTNAME='migration' //Git parameters env.GIT_CREDENTIALSID='gitCredential_prathi-adeptia' env.REPO_URL='https://github.com/prathi-adeptia/migration-defination.git' env.GIT_BRANCH='main' //Other env.LATEST_TAG='45ffc20' //Export env.GIT_EXPORT_ZIP_FILE_PATH='test/SA_PF.zip' env.GIT_EXPORT_XML_FILE_PATH='export.xml' //Migration env.MIGRATION_SOLUTION_EXPORT_ZIP_PATH='/shared/migrationtest1/SA_PF.zip' env.MIGRATION_EXPORT_XML_FILE_PATH='/shared/migrationtest1/export.xml' //K8 cluster parameters env.K8_CREDENTIALSID='k8credentials' env.SERVERURL='https://valuelabs-dev-dns-2ce021bb.hcp.eastus.azmk8s.io:443' env.CLUSTER_CONTEXT='valuelabs-dev' env.NAMESPACE='anubhav' //Database env.BACKEND_DB_URL='jdbc:sqlserver://adeptia-adeptiaconnectdb.database.windows.net:1433;database=valuelabs-backend-8' env.BACKEND_DB_DRIVER_CLASS='com.microsoft.sqlserver.jdbc.SQLServerDriver' env.BACKEND_DB_TYPE='SQL-Server' //Import env.GIT_IMPORT_ZIP_FILE_PATH='test/SA_PF.zip' env.GIT_IMPORT_XML_FILE_PATH='import.xml' //Migration env.MIGRATION_SOLUTION_IMPORT_ZIP_PATH='/shared/migrationtest1/SA_PF.zip' env.MIGRATION_IMPORT_XML_FILE_PATH='/shared/migrationtest1/import.xml' //K8 cluster parameters env.K8_CREDENTIALSID='k8credentials' env.SERVERURL='https://valuelabs-dev-dns-2ce021bb.hcp.eastus.azmk8s.io:443' env.CLUSTER_CONTEXT='valuelabs-dev' env.NAMESPACE='anubhav' //Database env.BACKEND_DB_URL='jdbc:sqlserver://adeptia-adeptiaconnectdb.database.windows.net:1433;database=valuelabs-backend-8' //env.DATABASE_CREDENTIALS_ID='databasecredentialid' env.BACKEND_DB_DRIVER_CLASS='com.microsoft.sqlserver.jdbc.SQLServerDriver' env.BACKEND_DB_TYPE='SQL-Server' //Other env.operation='import' env.SourceZipLoc='$SHARED_PATH$/migrationtest1/SA_PF.zip' env.RetainXmlLocation ='$SHARED_PATH$/migrationtest1/RETAIN.xml' env.OverrideUser ='IndigoUser:127000000001107055536473900001' env.OverrideModifiedByUser ='IndigoUser:127000000001107055536473900001' } } } stage('Pull XML from GitHub)') { when { allOf { environment name: 'DEPLOY_EXPORT', value: 'true' }} //condition to skip export steps { echo 'Checkout from GitHub' checkout([$class: 'GitSCM', branches: [[name: '*/'+env.GIT_BRANCH]], extensions: [], userRemoteConfigs: [[credentialsId: env.GIT_CREDENTIALSID, url: env.REPO_URL]]]) } } stage('Upload XML to Shared PVC') { when { allOf { environment name: 'DEPLOY_EXPORT', value: 'true' }} //condition to skip export steps { echo 'Upload export xml to Shared PVC' uploadToSharedPVC (NAMESPACE, CLUSTER_CONTEXT, env.K8_CREDENTIALSID, env.SERVERURL, GIT_EXPORT_XML_FILE_PATH, MIGRATION_EXPORT_XML_FILE_PATH) } } stage('Pull helm chart & Deploy Migration') { when { allOf { environment name: 'DEPLOY_EXPORT', value: 'true' }} //condition to skip export steps { echo 'Pull Helm Chart' pullHelmChart (env.NEXUS_CREDENTIALSID, NEXUS_HELMREPOURL, CHARTNAME) echo 'Deploy Export' deployToCluster (NAMESPACE, CLUSTER_CONTEXT, env.K8_CREDENTIALSID, env.DATABASE_CREDENTIALS_ID, env.SERVERURL, MIGRATION_EXPORT_XML_FILE_PATH) waitUntilDepoymentComplete(NAMESPACE, CLUSTER_CONTEXT, env.K8_CREDENTIALSID, env.SERVERURL, 'migration-') sh('rm -rf migration') } } stage('Download Zip from PVC') { when { allOf { environment name: 'DEPLOY_EXPORT', value: 
'true' }} //condition to skip export steps { downloadFromSharedPVC (NAMESPACE, CLUSTER_CONTEXT, env.K8_CREDENTIALSID, env.SERVERURL, MIGRATION_SOLUTION_EXPORT_ZIP_PATH, GIT_EXPORT_ZIP_FILE_PATH) } } stage('Push Zip to GitHub') { when { allOf { environment name: 'DEPLOY_EXPORT', value: 'true' }} //condition to skip export steps { pushToGitHub (GIT_BRANCH, env.GIT_CREDENTIALSID, REPO_URL, GIT_EXPORT_ZIP_FILE_PATH) } } } post('Clean-up') { always { script{ deleteDir() /* clean up workspace */ } echo 'Cleanup workspace' } success { echo 'Pipeline succeeded!' script{ if(DEPLOY_IMPORT){ //Triggered Import pipeline def build = build(job: 'Test_Pipeline_Migration_Import') bn = build.getNumber() echo "Triggerd build: "+ build.getProjectName() +", BuildNumber:" + bn } } } unstable { echo 'Pipeline unstable :/' } failure { echo 'Pipeline failed :(' } changed { echo 'Things were different before...' } cleanup { echo "Cleanup" } } }
- Paste the copied content in the Pipeline Definition section in Jenkins.
- Uncheck the Use Groovy Sandbox checkbox.
- Click Save.
On the screen that follows, click Build Now.
As you build the pipeline for the very first time, all the parameters get initialized, and the Build Now option changes to Build with Parameters. Click Build with Parameters.
You will see all the parameters and their values inherited from the import pipeline file.
Change the parameter values as per your requirement. (A sketch of how these parameters are passed to the migration Helm chart follows the parameter list.)
Expand title Click here to expand the list of parameters

Parameters | Value | Description | Comments |
---|---|---|---|
Parameters used in Export Pipeline | | | |
//Nexus | | | |
NEXUS_CREDENTIALS_ID | nexus_credentialsId | Store Nexus access credentials in Jenkins. Select credential type "Username and password" and create global credentials in Jenkins. | Example: user: valuelabs, Pass: password. Note: Use the Credentials Binding plugin to store credentials in Jenkins. Select credential type "Username and password" and save it. A Credentials ID can be added or configured in Jenkins; using the credentials ID, the credentials can be accessed globally in Jenkins. For reference: https://www.jenkins.io/doc/book/using/using-credentials/ |
NEXUS_HELM_REPO_URL | https://nexus.adeptia.com:8443/repository/adeptia-internal-helm/ | Nexus Helm chart repository URL. | |
CHART_NAME | migration | Helm chart name. | |
//GitHub | | | |
GIT_CREDENTIALS_ID | gitCredential_prathi-adeptia | Store GitHub access credentials in Jenkins. Select credential type "Username and password" and create global credentials in Jenkins. | For reference: https://www.jenkins.io/doc/book/using/using-credentials/ |
REPO_URL | https://github.com/prathi-adeptia/migration-defination.git | GitHub repository URL. | |
GIT_BRANCH | main | GitHub branch name. | |
GIT_EXPORT_XML_FILE_PATH | xml/export.xml | Location to pull the export.xml file from the GitHub repository. | |
GIT_EXPORT_ZIP_FILE_PATH | test/SA_PF.zip | Location to commit and push the export ZIP to the GitHub repository. | |
//Migration | | | |
MIGRATION_EXPORT_XML_FILE_PATH | /shared/migrationtest1/export.xml | export.xml path to upload the file on the shared PVC (K8 cluster). | |
MIGRATION_SOLUTION_EXPORT_ZIP_PATH | /shared/migrationtest1/SA_PF.zip | Export ZIP path to download the file from the shared PVC (K8 cluster). | |
//K8 Cluster | | | |
K8_CREDENTIALS_ID | k8credentials | Store K8 cluster access credentials in Jenkins. Select credential type "Secret file" and create global credentials in Jenkins. | Example: user: valuelabs, Pass: Secret file (here the secret file is the k8 .config file). For reference: https://www.jenkins.io/doc/book/using/using-credentials/ |
SERVER_URL | https://valuelabs-dev-dns-2ce021bb.hcp.eastus.azmk8s.io:443 | Kubernetes cluster URL. | |
CLUSTER_CONTEXT | valuelabs-dev | Kubernetes cluster context. | |
NAMESPACE | anubhav | Kubernetes cluster namespace for deployment. | |
//Database | | | |
BACKEND_DB_URL | jdbc:sqlserver://adeptia-adeptiaconnectdb.database.windows.net:1433;database=valuelabs-backend-8 | Backend database URL used in the deployment of the migration Helm chart. | Database info configured in the values.yaml file. |
BACKEND_DB_USERNAME | valuelab | Backend database user name used in the deployment of the migration Helm chart. | Database info configured in the values.yaml file. |
BACKEND_DB_PASSWORD | Password | Backend database password used in the deployment of the migration Helm chart. | Database info configured in the values.yaml file. |
BACKEND_DB_DRIVER_CLASS | com.microsoft.sqlserver.jdbc.SQLServerDriver | Backend database driver class used in the deployment of the migration Helm chart. | Database info configured in the values.yaml file. |
BACKEND_DB_TYPE | SQL-Server | Backend database type used in the deployment of the migration Helm chart. | Database info configured in the values.yaml file. |
//Other | | | |
LATEST_TAG | 45ffc20 | Docker image tag of the Migration microservice. | This may not be needed; temporarily used for testing. |
Parameters used in Import pipeline | | | |
//GitHub | | | |
GIT_CREDENTIALS_ID | gitCredential_prathi-adeptia | Store GitHub access credentials in Jenkins. Select credential type "Username and password" and create global credentials in Jenkins. | For reference: https://www.jenkins.io/doc/book/using/using-credentials/ |
REPO_URL | https://github.com/prathi-adeptia/migration-defination.git | GitHub repository URL. | |
GIT_BRANCH | main | GitHub branch name. | |
GIT_IMPORT_XML_FILE_PATH | xml/import.xml | Location to pull the import.xml file from the GitHub repository. | |
GIT_IMPORT_ZIP_FILE_PATH | test/SA_PF.zip | Location to pull the import ZIP from the GitHub repository. | |
//Migration | | | |
MIGRATION_IMPORT_XML_FILE_PATH | /shared/migrationtest1/import.xml | import.xml path to upload the file on the shared PVC (K8 cluster). | |
MIGRATION_SOLUTION_IMPORT_ZIP_PATH | /shared/migrationtest1/SA_PF.zip | Export ZIP path to upload the file on the shared PVC (K8 cluster). | |
//K8 Cluster | | | |
K8_CREDENTIALS_ID | k8credentials | Store K8 cluster access credentials in Jenkins. Select credential type "Secret file" and create global credentials in Jenkins. | Example: user: valuelabs, Pass: Secret file (here the secret file is the k8 .config file). For reference: https://www.jenkins.io/doc/book/using/using-credentials/ |
SERVER_URL | https://valuelabs-dev-dns-2ce021bb.hcp.eastus.azmk8s.io:443 | Kubernetes cluster URL. | |
CLUSTER_CONTEXT | valuelabs-dev | Kubernetes cluster context. | |
NAMESPACE | anubhav | Kubernetes cluster namespace for deployment. | |
//Database | | | |
BACKEND_DB_URL | jdbc:sqlserver://adeptia-adeptiaconnectdb.database.windows.net:1433;database=valuelabs-backend-8 | Backend database URL used in the deployment of the migration Helm chart. | Database info configured in the values.yaml file. |
BACKEND_DB_USERNAME | valuelab | Backend database user name used in the deployment of the migration Helm chart. | Database info configured in the values.yaml file. |
BACKEND_DB_PASSWORD | Password | Backend database password used in the deployment of the migration Helm chart. | Database info configured in the values.yaml file. |
BACKEND_DB_DRIVER_CLASS | com.microsoft.sqlserver.jdbc.SQLServerDriver | Backend database driver class used in the deployment of the migration Helm chart. | Database info configured in the values.yaml file. |
BACKEND_DB_TYPE | SQL-Server | Backend database type used in the deployment of the migration Helm chart. | Database info configured in the values.yaml file. |
//Other upcoming parameters | | | |
operation | import | | |
SourceZipLoc | $SHARED_PATH$/migrationtest1/SA_PF.zip | | |
RetainXmlLocation | $SHARED_PATH$/migrationtest1/RETAIN.xml | | |
OverrideUser | IndigoUser:127000000001107055536473900001 | | |
OverrideModifiedByUser | IndigoUser:127000000001107055536473900001 | | |
log identifier | | | |
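The database and migration parameters listed above are passed to the migration Helm chart as environment variables when the pipeline deploys it. The snippet below is a minimal sketch of that deployment step, based on the helm command in the sample pipeline; the chart/release name, the values file path, and the variable names come from the sample, and BACKEND_DB_USERNAME and BACKEND_DB_PASSWORD are resolved at run time from the DATABASE_CREDENTIALS_ID credentials.

Code Block language groovy
// Sketch of the Helm deployment step that consumes the parameters above
// (taken from the sample pipeline's deployToCluster function).
sh '''
    helm upgrade -i migration migration -f migration/config/values-qa.yaml \
      --set environmentVariables.BACKEND_DB_URL=${BACKEND_DB_URL} \
      --set environmentVariables.BACKEND_DB_USERNAME=${BACKEND_DB_USERNAME} \
      --set environmentVariables.BACKEND_DB_PASSWORD=${BACKEND_DB_PASSWORD} \
      --set environmentVariables.BACKEND_DB_DRIVER_CLASS=${BACKEND_DB_DRIVER_CLASS} \
      --set environmentVariables.BACKEND_DB_TYPE=${BACKEND_DB_TYPE} \
      --set environmentVariables.MIGRATION_XML_FILE_PATH=${MIGRATION_IMPORT_XML_FILE_PATH} \
      --set image.tag=${LATEST_TAG} \
      -n ${NAMESPACE}
'''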
Click Build to trigger the pipeline.
Info |
---|
The objects are imported to the target environment. The rollback ZIP is created and pushed to the GitHub repository. |
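If you chain the pipelines, the sample export pipeline can also trigger this import job automatically when its DEPLOY_IMPORT parameter is set to true. A sketch of that downstream trigger is shown below; 'Test_Pipeline_Migration_Import' is the sample job name from the provided pipeline, so replace it with the name you gave your import pipeline.

Code Block language groovy
// Downstream trigger used in the sample export pipeline's success block
script {
    if (params.DEPLOY_IMPORT) {
        def importBuild = build(job: 'Test_Pipeline_Migration_Import')
        echo "Triggered build: ${importBuild.getProjectName()}, build number: ${importBuild.getNumber()}"
    }
}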
All the log statements generated by the migration utility during the import process are tagged with a unique identifier. This allows you to search for this unique identifier in the centralized logging system, and fetch all the associated logs.
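If you do not have direct access to the centralized logging system, a quick check against the migration pod's logs is also possible. The step below is a hypothetical example: the pod lookup pattern and the LOG_IDENTIFIER variable (holding the unique identifier for the run) are assumptions, not part of the shipped pipeline.

Code Block language groovy
// Hypothetical log check: find the migration pod and filter its logs by the
// unique identifier tagged on this import run.
sh '''
    podname=$(kubectl -n ${NAMESPACE} get pods | grep -m 1 migration | awk '{print $1}')
    kubectl -n ${NAMESPACE} logs ${podname} | grep "${LOG_IDENTIFIER}"
'''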