Cross-account sync with Code Engine
In this guide I will show you how to sync IBM Cloud Object Storage (ICOS) bucket objects between accounts using Code Engine. Code Engine provides a platform to unify the deployment of all of your container-based applications on a Kubernetes-based infrastructure. The Code Engine experience is designed so that you can focus on writing code without needing to learn, or even know about, Kubernetes.
Code Engine is currently an experimental offering and all resources are deleted every 7 days.

Steps

Preparing Accounts

We will be using Cloud Shell to generate Service IDs and Object Storage credentials for both the source and destination accounts.

Source Account

We will create a service ID on the source account. A service ID identifies a service or application similar to how a user ID identifies a user. We can assign specific access policies to the service ID that restrict permissions for using specific services: in this case it gets read-only access to an IBM Cloud Object Storage bucket.

Create Service ID

$ ibmcloud iam service-id-create <name-of-your-service-id> --description "Service ID for read-only access to bucket" --output json

Create Reader access policy for the newly created service ID

Now we will limit the scope of this service ID to have read-only access to our source Object Storage bucket.
$ ibmcloud iam service-policy-create <Service ID> --roles Reader --service-name cloud-object-storage --service-instance <Service Instance GUID> --resource-type bucket --resource <bucket-name>
Service Instance GUID - This is the GUID of the Cloud Object Storage instance. You can retrieve this with the command: ibmcloud resource service-instance <name of icos instance>
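If you want just the GUID, you can also pull it out of the JSON output. This is a minimal sketch, assuming jq is available in your Cloud Shell session and using my-cos-instance as a placeholder for your instance name:

$ ibmcloud resource service-instance my-cos-instance --output json | jq -r '.[0].guid'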

Generate HMAC credentials tied to our service ID

In order for the Minio client to talk to each Object Storage instance it will need HMAC credentials (Access Key and Secret Key in S3 parlance).
$ ibmcloud resource service-key-create source-icos-service-creds Reader --instance-id <Service Instance GUID> --service-id <Service ID> --parameters '{"HMAC":true}'
Save the access_key_id and secret_access_key as we will be using these in our Code Engine project.
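If you lose track of the keys, they are not gone; they can be read back out of the service key we just created. A minimal sketch, assuming jq is available, using the source-icos-service-creds name from the command above:

$ ibmcloud resource service-key source-icos-service-creds --output json | jq -r '.[0].credentials.cos_hmac_keys | .access_key_id, .secret_access_key'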

Destination Account

We will create a service ID on the destination account. A service ID identifies a service or application similar to how a user ID identifies a user. We can assign specific access policies to the service ID that restrict permissions for using specific services: in this case it gets write access to an IBM Cloud Object Storage bucket.

Create Service ID

$ ibmcloud iam service-id-create <name-of-your-service-id> --description "Service ID for write access to bucket" --output json

Create Writer access policy for the newly created service ID

Now we will limit the scope of this service ID to have write access to our destination Object Storage bucket.
$ ibmcloud iam service-policy-create <Service ID> --roles Writer --service-name cloud-object-storage --service-instance <Service Instance GUID> --resource-type bucket --resource <bucket-name>
Service Instance GUID - This is the GUID of the Cloud Object Storage instance. You can retrieve this with the command: ibmcloud resource service-instance <name of icos instance>

Generate HMAC credentials tied to our service ID

We'll follow the same procedure as last time to generate the HMAC credentials, but this time on the destination account.
$ ibmcloud resource service-key-create destination-icos-service-creds Writer --instance-id <Service Instance GUID> --service-id <Service ID> --parameters '{"HMAC":true}'
Save the access_key_id and secret_access_key as we will be using these with our Code Engine project.

Create Code Engine Project via Cloud Shell

In order to create our Code Engine project we need to make sure that our Cloud Shell session is targeting the correct resource group. You can do this by using the target -g option with the IBM Cloud CLI.
$ ibmcloud target -g <Resource Group>
With the correct Resource Group set, we can now create our Code Engine project. We add the --target flag to ensure that future Code Engine commands are targeting the correct project.
$ ibmcloud ce project create -n <project_name> --target

Create Code Engine Secrets

In order for our Minio-powered container to sync objects between the accounts it needs access to the Access and Secret keys we created earlier. We will use the secret create option to store all of the values in a single secret that we can then reference in our job definition:
    SOURCE_ACCESS_KEY: Access Key generated on Source account
    SOURCE_SECRET_KEY: Secret Key generated on Source account
    SOURCE_REGION: Cloud Object Storage endpoint for the Source bucket
    SOURCE_BUCKET: Name of bucket on Source account
    DESTINATION_ACCESS_KEY: Access Key generated on Destination account
    DESTINATION_SECRET_KEY: Secret Key generated on Destination account
    DESTINATION_REGION: Cloud Object Storage endpoint for the Destination bucket
    DESTINATION_BUCKET: Name of bucket on Destination account
$ ibmcloud ce secret create --name ce-sync-secret --from-literal SOURCE_ACCESS_KEY=VALUE --from-literal SOURCE_SECRET_KEY=VALUE --from-literal SOURCE_REGION=VALUE --from-literal SOURCE_BUCKET=VALUE --from-literal DESTINATION_ACCESS_KEY=VALUE --from-literal DESTINATION_SECRET_KEY=VALUE --from-literal DESTINATION_REGION=VALUE --from-literal DESTINATION_BUCKET=VALUE
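The one-liner above is hard to read, so here is the same command split across lines with illustrative placeholder values. The region values are the public Cloud Object Storage endpoints for each bucket (the us-east and us-south endpoints shown are just examples); every other value is a placeholder to replace with your own keys and bucket names:

$ ibmcloud ce secret create --name ce-sync-secret \
    --from-literal SOURCE_ACCESS_KEY=<source access_key_id> \
    --from-literal SOURCE_SECRET_KEY=<source secret_access_key> \
    --from-literal SOURCE_REGION=s3.us-east.cloud-object-storage.appdomain.cloud \
    --from-literal SOURCE_BUCKET=<source bucket name> \
    --from-literal DESTINATION_ACCESS_KEY=<destination access_key_id> \
    --from-literal DESTINATION_SECRET_KEY=<destination secret_access_key> \
    --from-literal DESTINATION_REGION=s3.us-south.cloud-object-storage.appdomain.cloud \
    --from-literal DESTINATION_BUCKET=<destination bucket name>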

Create Code Engine Job definition with environment variables

Now that our project has been created we need to create our Job definition. In Code Engine terms a job is a stand-alone executable for batch jobs. Unlike applications, which react to incoming HTTP requests, jobs are meant to be used for running container images that contain an executable that is designed to run one time and then exit.
$ ibmcloud ce jobdef create --name JOBDEF_NAME --image IMAGE_REF --env-from-secret SECRET_NAME
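For reference, a filled-in version of that command using the names from this guide would be (ce-mc-sync-jobdef, greyhoundforty/icos-ce-sync:1 and ce-sync-secret are the values that appear in the job definition output below):

$ ibmcloud ce jobdef create --name ce-mc-sync-jobdef --image greyhoundforty/icos-ce-sync:1 --env-from-secret ce-sync-secret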
You can view the definition of the job using the command ibmcloud ce jobdef get -n <name of job definition>
$ ibmcloud ce jobdef get -n ce-mc-sync-jobdef
Project 'ce-minio-sync' and all its contents will be automatically deleted 7 days from now.
Getting job definition 'ce-mc-sync-jobdef'...
Name:        ce-mc-sync-jobdef
Project ID:  1d7514d2-ce89
Metadata:
  Creation Timestamp:  2020-08-17 14:51:00 +0000 UTC
  Generation:          1
  Resource Version:    223595401
  Self Link:           /apis/codeengine.cloud.ibm.com/v1alpha1/namespaces/1d7514d2-ce89/jobdefinitions/ce-mc-sync-jobdef
  UID:                 0fb11f71-a912-44f9-88e3-1d1612f8e8ab
Spec:
  Containers:
    Image:  greyhoundforty/icos-ce-sync:1
    Name:   ce-mc-sync-jobdef
    Commands:
    Arguments:
    Env:
      Name:  SOURCE_BUCKET
      Value From Secret Key Ref:
        Key:   SOURCE_BUCKET
        Name:  ce-sync-secret
    Env:
      Name:  SOURCE_REGION
      Value From Secret Key Ref:
        Key:   SOURCE_REGION
        Name:  ce-sync-secret
    Env:
      Name:  SOURCE_SECRET_KEY
      Value From Secret Key Ref:
        Key:   SOURCE_SECRET_KEY
        Name:  ce-sync-secret
    Env:
      Name:  DESTINATION_ACCESS_KEY
      Value From Secret Key Ref:
        Key:   DESTINATION_ACCESS_KEY
        Name:  ce-sync-secret
    Env:
      Name:  DESTINATION_BUCKET
      Value From Secret Key Ref:
        Key:   DESTINATION_BUCKET
        Name:  ce-sync-secret
    Env:
      Name:  DESTINATION_REGION
      Value From Secret Key Ref:
        Key:   DESTINATION_REGION
        Name:  ce-sync-secret
    Env:
      Name:  DESTINATION_SECRET_KEY
      Value From Secret Key Ref:
        Key:   DESTINATION_SECRET_KEY
        Name:  ce-sync-secret
    Env:
      Name:  SOURCE_ACCESS_KEY
      Value From Secret Key Ref:
        Key:   SOURCE_ACCESS_KEY
        Name:  ce-sync-secret
    Resource Requests:
      Cpu:     1
      Memory:  128Mi
OK
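All of the actual copying is done by the Minio client inside the container image; Code Engine just runs it with the secret injected as environment variables. The exact script baked into greyhoundforty/icos-ce-sync:1 is not reproduced in this guide, but a minimal sketch of an equivalent sync script, assuming the mc binary is present and the eight variables from ce-sync-secret are set, would look something like this (older mc releases use mc config host add instead of mc alias set):

#!/bin/sh
set -e

# Register each Object Storage instance as an mc alias using its HMAC credentials
mc alias set source_acct "https://${SOURCE_REGION}" "${SOURCE_ACCESS_KEY}" "${SOURCE_SECRET_KEY}"
mc alias set destination_acct "https://${DESTINATION_REGION}" "${DESTINATION_ACCESS_KEY}" "${DESTINATION_SECRET_KEY}"

# Mirror the source bucket into the destination bucket, overwriting changed objects
mc mirror --overwrite "source_acct/${SOURCE_BUCKET}" "destination_acct/${DESTINATION_BUCKET}"

The source_acct and destination_acct alias names match the ones you will see in the job logs later in this guide.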

Submit Code Engine Job

It is now time to submit our Job to Code Engine. The maximum time a job can run is 10 hours, but in most cases ICOS syncing takes significantly less time to complete.
$ ibmcloud ce job run --name <name of job> --jobdef <name of job definition>
In my testing I am only syncing a handful of small files, so by the time I check the Kubernetes pods the job has already completed. Looking at the logs I am able to verify that the contents have been synced between the Object Storage buckets.
$ ibmcloud ce job run --name ce-mc-sync-jobv1 --jobdef ce-mc-sync-jobdef
Project 'ce-minio-sync' and all its contents will be automatically deleted 7 days from now.
Creating job 'ce-mc-sync-jobv1'...
OK

$ ibmcloud ce project target -n ce-minio-sync --kubecfg
Targeting project 'ce-minio-sync'...
Added context for 'ce-minio-sync' to the current kubeconfig file.
OK
Now targeting environment 'ce-minio-sync'.

$ kubectl get pods
NAME                   READY   STATUS      RESTARTS   AGE
ce-mc-sync-jobv1-0-0   0/1     Completed   0          53s

$ ibmcloud ce job list
Project 'ce-minio-sync' and all its contents will be automatically deleted 7 days from now.
Listing jobs...
Name               Age
ce-mc-sync-jobv1   2m13s
OK
Command 'job list' performed successfully

$ ibmcloud ce job logs -n ce-mc-sync-jobv1
Project 'ce-minio-sync' and all its contents will be automatically deleted 7 days from now.
Logging job 'ce-mc-sync-jobv1' on pod '0'...
Added `source_acct` successfully.
Added `destination_acct` successfully.
`source_acct/wandering-thunder-68-source/as-policy.png` -> `destination_acct/sparkling-sky-47-destination/as-policy.png`
`source_acct/wandering-thunder-68-source/create-source-service-policy.png` -> `destination_acct/sparkling-sky-47-destination/create-source-service-policy.png`
`source_acct/wandering-thunder-68-source/add-backup-repository.png` -> `destination_acct/sparkling-sky-47-destination/add-backup-repository.png`
`source_acct/wandering-thunder-68-source/direct-link-standard.png` -> `destination_acct/sparkling-sky-47-destination/direct-link-standard.png`
`source_acct/wandering-thunder-68-source/create-workspace.png` -> `destination_acct/sparkling-sky-47-destination/create-workspace.png`
`source_acct/wandering-thunder-68-source/direct-link-byoip.png` -> `destination_acct/sparkling-sky-47-destination/direct-link-byoip.png`
`source_acct/wandering-thunder-68-source/add-scale-out.png` -> `destination_acct/sparkling-sky-47-destination/add-scale-out.png`
`source_acct/wandering-thunder-68-source/filter-vpc.png` -> `destination_acct/sparkling-sky-47-destination/filter-vpc.png`
`source_acct/wandering-thunder-68-source/k8s-storage.png` -> `destination_acct/sparkling-sky-47-destination/k8s-storage.png`
`source_acct/wandering-thunder-68-source/Picture1.png` -> `destination_acct/sparkling-sky-47-destination/Picture1.png`
`source_acct/wandering-thunder-68-source/iks-storage-Page-1.png` -> `destination_acct/sparkling-sky-47-destination/iks-storage-Page-1.png`
`source_acct/wandering-thunder-68-source/dl-copy.png` -> `destination_acct/sparkling-sky-47-destination/dl-copy.png`
`source_acct/wandering-thunder-68-source/rt-us-east.gv.png` -> `destination_acct/sparkling-sky-47-destination/rt-us-east.gv.png`
`source_acct/wandering-thunder-68-source/ns-secrets-cm.png` -> `destination_acct/sparkling-sky-47-destination/ns-secrets-cm.png`
Total: 0 B, Transferred: 2.30 MiB, Speed: 1.66 MiB/s

OK
Command 'job logs' performed successfully
If you need to sync contents from the source bucket to the destination bucket again, simply run another job (with a new name) and Code Engine will take care of it for you.
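Since each run needs a unique job name, one easy approach is to put a timestamp in the name, for example:

$ ibmcloud ce job run --name ce-mc-sync-$(date +%Y%m%d-%H%M) --jobdef ce-mc-sync-jobdef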