Commit 4780ef8

Merge pull request #311 from GoogleCloudPlatform/tswast-storage-transfer
Adds system tests for storage transfer samples.
2 parents: 48b978b + 2aed070

16 files changed: +305 −338 lines

.gitignore (+2)

```diff
@@ -25,7 +25,9 @@ release.properties
 dependency-reduced-pom.xml
 buildNumber.properties
 
+# Secrets
 service-account.json
+secrets.env
 
 # intellij
 .idea/
```

.travis.yml (+3)

```diff
@@ -31,6 +31,9 @@ before_install:
 - openssl aes-256-cbc -K $encrypted_37a4f399de75_key -iv $encrypted_37a4f399de75_iv
   -in service-account.json.enc -out service-account.json -d && export GOOGLE_APPLICATION_CREDENTIALS=$TRAVIS_BUILD_DIR/service-account.json
   GCLOUD_PROJECT=cloud-samples-tests || true
+- openssl aes-256-cbc -K $encrypted_eb858daba67b_key -iv $encrypted_eb858daba67b_iv -in secrets.env.enc -out secrets.env -d
+  && set +x && source secrets.env && set -x
+  || true
 # Skip the install step, since Maven will download the dependencies we need
 # when the test build runs.
 # http://stackoverflow.com/q/31945809/101923
```
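The `openssl aes-256-cbc -K … -iv …` steps above decrypt files that were encrypted with a raw key and initialization vector. As a minimal sketch of the same cipher operation in this repository's language, here is an AES-256-CBC round trip using `javax.crypto`; the all-zero key and IV below are placeholders for illustration, not the Travis-encrypted values:

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class AesCbcRoundTrip {

  /** Runs AES-256-CBC in the given mode with a raw key and IV, as openssl -K/-iv does. */
  public static byte[] crypt(int mode, byte[] key, byte[] iv, byte[] data) throws Exception {
    Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
    cipher.init(mode, new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));
    return cipher.doFinal(data);
  }

  public static void main(String[] args) throws Exception {
    byte[] key = new byte[32]; // placeholder 256-bit key; Travis injects the real one
    byte[] iv = new byte[16];  // placeholder IV
    byte[] plaintext = "AWS_ACCESS_KEY_ID=example\n".getBytes(StandardCharsets.UTF_8);

    byte[] ciphertext = crypt(Cipher.ENCRYPT_MODE, key, iv, plaintext);
    byte[] decrypted = crypt(Cipher.DECRYPT_MODE, key, iv, ciphertext);
    System.out.println(Arrays.equals(plaintext, decrypted)); // round trip recovers the input
  }
}
```

Note that openssl's own key-derivation and padding defaults differ between versions; the sketch only illustrates the raw-key CBC mode selected by `-K`/`-iv`.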

secrets.env.enc (128 Bytes)

Binary file not shown.

storage/storage-transfer/README.md (+52 −10)

```diff
@@ -2,20 +2,45 @@
 
 This app creates two types of transfers using the Transfer Service tool.
 
+<!-- auto-doc-link -->
+These samples are used on the following documentation pages:
+
+>
+* https://cloud.google.com/storage/transfer/create-client
+* https://cloud.google.com/storage/transfer/create-manage-transfer-program
+
+<!-- end-auto-doc-link -->
+
 ## Prerequisites
 
-1. Set up a project on Google Developers Console.
-  1. Go to the [Developers Console](https://cloud.google.com/console) and create or select your project.
-     You will need the project ID later.
+1. Set up a project on Google Cloud Console.
+  1. Go to the [Google Cloud Console](https://console.cloud.google.com) and
+     create or select your project. You will need the project ID later.
+1. Enable the [Google Storage Transfer API in the Google Cloud
+   Console](https://console.cloud.google.com/apis/api/storagetransfer/overview).
 1. Within Developers Console, select APIs & auth > Credentials.
 1. Select Add credentials > Service account > JSON key.
-1. Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to point to your JSON key.
-1. Add the Storage Transfer service account as an editor of your project
-   storage-transfer-5031963314028297433@partnercontent.gserviceaccount.com
+1. Set the environment variable `GOOGLE_APPLICATION_CREDENTIALS` to point to
+   your JSON key.
+1. Add the Storage Transfer service account as an editor of your project.
+  1. To get the email address used for the service account, execute the
+     [googleServiceAccounts.get REST
+     method](https://cloud.google.com/storage/transfer/reference/rest/v1/googleServiceAccounts/get#try-it).
+     There should be a "Try It" section on that page, otherwise execute it in
+     the [APIs
+     Explorer](https://developers.google.com/apis-explorer/#p/storagetransfer/v1/storagetransfer.googleServiceAccounts.get).
+
+     It should output an email address like:
+
+     ```
+     ```
+  1. Add this as a member and select the Project -> Editor permission on the
+     [Google Cloud Console IAM and Admin
+     page](https://console.cloud.google.com/iam-admin/iam/project).
 1. Set up gcloud for application default credentials.
   1. `gcloud components update`
-  1. `gcloud auth login`
-  1. `gcloud config set project PROJECT_ID`
+  1. `gcloud init`
 
 ## Transfer from Amazon S3 to Google Cloud Storage
@@ -26,9 +51,21 @@ Creating a one-time transfer from Amazon S3 to Google Cloud Storage.
 1. Go to AWS Management Console and create a bucket.
 1. Under Security Credentials, create an IAM User with access to the bucket.
 1. Create an Access Key for the user. Note the Access Key ID and Secret Access Key.
+1. Set the `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` environment variables.
 1. In AwsRequester.java, fill in the user-provided constants.
-1. Run with `mvn compile` and
-   `mvn exec:java -Dexec.mainClass="com.google.cloud.storage.storagetransfer.samples.AwsRequester"`
+1. Compile the package with
+   ```
+   mvn compile
+   ```
+1. Run the transfer job with
+   ```
+   mvn exec:java \
+       -Dexec.mainClass="com.google.cloud.storage.storagetransfer.samples.AwsRequester" \
+       -DprojectId=your-google-cloud-project-id \
+       -DjobDescription="Sample transfer job from S3 to GCS." \
+       -DawsSourceBucket=your-s3-bucket-name \
+       -DgcsSinkBucket=your-gcs-bucket-name
+   ```
 1. Note the job ID in the returned Transfer Job.
 
 ## Transfer data from a standard Cloud Storage bucket to a Cloud Storage Nearline bucket
@@ -50,3 +87,8 @@ bucket for files untouched for 30 days.
 1. In RequestChecker.java, fill in the user-provided constants. Use the Job Name you recorded earlier.
 1. Run with `mvn compile` and
    `mvn exec:java -Dexec.mainClass="com.google.cloud.storage.storagetransfer.samples.RequestChecker"`
+
+## References
+
+- [Google Storage Transfer API Client
+  Library](https://developers.google.com/api-client-library/java/apis/storagetransfer/v1)
```
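The `-DprojectId=…` style flags added to the README's run step arrive in the JVM as system properties. A minimal fail-fast lookup might look like the sketch below; `getPropertyOrFail` here is a hypothetical stand-in modeled on the helper name used elsewhere in this commit, not the repository's actual implementation:

```java
import java.util.Properties;

public class PropertyCheck {

  /**
   * Fail-fast lookup of a -Dname=value system property. Sketch only: the real
   * TransferJobUtils helper referenced by this commit is not shown in the diff.
   */
  public static String getPropertyOrFail(Properties props, String name) {
    String value = props.getProperty(name);
    if (value == null || value.isEmpty()) {
      throw new IllegalArgumentException("Missing required property: -D" + name + "=...");
    }
    return value;
  }

  public static void main(String[] args) {
    Properties props = System.getProperties();
    // Simulate `mvn exec:java ... -DprojectId=your-google-cloud-project-id`.
    props.setProperty("projectId", "your-google-cloud-project-id");
    System.out.println(getPropertyOrFail(props, "projectId"));
  }
}
```

Failing loudly on a missing flag gives a clearer error than a NullPointerException deep inside the transfer request.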

storage/storage-transfer/pom.xml (+12 −7)

```diff
@@ -35,7 +35,6 @@
 
   <properties>
     <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
-    <powermock.version>1.6.2</powermock.version>
   </properties>
 
   <dependencies>
@@ -47,15 +46,21 @@
 
     <!-- Test Dependencies -->
     <dependency>
-      <groupId>org.powermock</groupId>
-      <artifactId>powermock-module-junit4</artifactId>
-      <version>${powermock.version}</version>
+      <groupId>com.google.truth</groupId>
+      <artifactId>truth</artifactId>
+      <version>0.29</version>
       <scope>test</scope>
     </dependency>
     <dependency>
-      <groupId>org.powermock</groupId>
-      <artifactId>powermock-api-mockito</artifactId>
-      <version>${powermock.version}</version>
+      <groupId>junit</groupId>
+      <artifactId>junit</artifactId>
+      <version>4.12</version>
+      <scope>test</scope>
+    </dependency>
+    <dependency>
+      <groupId>org.mockito</groupId>
+      <artifactId>mockito-all</artifactId>
+      <version>1.10.19</version>
       <scope>test</scope>
     </dependency>
   </dependencies>
```

storage/storage-transfer/src/main/java/com/google/cloud/storage/storagetransfer/samples/AwsRequester.java (+62 −44)

```diff
@@ -27,31 +27,19 @@
 import com.google.api.services.storagetransfer.model.TransferSpec;
 
 import java.io.IOException;
-import java.util.logging.Logger;
+import java.io.PrintStream;
 
 /**
  * Creates a one-off transfer job from Amazon S3 to Google Cloud Storage.
  */
 public final class AwsRequester {
-
-  private static final String JOB_DESC = "YOUR DESCRIPTION";
-  private static final String PROJECT_ID = "YOUR_PROJECT_ID";
-  private static final String AWS_SOURCE_NAME = "YOUR SOURCE BUCKET";
-  private static final String AWS_ACCESS_KEY_ID = "YOUR_ACCESS_KEY_ID";
-  private static final String AWS_SECRET_ACCESS_KEY = "YOUR_SECRET_ACCESS_KEY";
-  private static final String GCS_SINK_NAME = "YOUR_SINK_BUCKET";
-
-  /**
-   * Specify times below using US Pacific Time Zone.
-   */
-  private static final String START_DATE = "YYYY-MM-DD";
-  private static final String START_TIME = "HH:MM:SS";
-
-  private static final Logger LOG = Logger.getLogger(AwsRequester.class.getName());
-
   /**
    * Creates and executes a request for a TransferJob from Amazon S3 to Cloud Storage.
    *
+   * <p>The {@code startDate} and {@code startTime} parameters should be set according to the UTC
+   * Time Zone. See:
+   * https://developers.google.com/resources/api-libraries/documentation/storagetransfer/v1/java/latest/com/google/api/services/storagetransfer/v1/model/Schedule.html#getStartTimeOfDay()
+   *
    * @return the response TransferJob if the request is successful
    * @throws InstantiationException
    *           if instantiation fails when building the TransferJob
@@ -60,43 +48,73 @@ public final class AwsRequester {
    * @throws IOException
    *           if the client failed to complete the request
    */
-  public static TransferJob createAwsTransferJob() throws InstantiationException,
-      IllegalAccessException, IOException {
-    Date date = TransferJobUtils.createDate(START_DATE);
-    TimeOfDay time = TransferJobUtils.createTimeOfDay(START_TIME);
-    TransferJob transferJob = TransferJob.class
-        .newInstance()
-        .setDescription(JOB_DESC)
-        .setProjectId(PROJECT_ID)
-        .setTransferSpec(
-            TransferSpec.class
-                .newInstance()
-                .setAwsS3DataSource(
-                    AwsS3Data.class
-                        .newInstance()
-                        .setBucketName(AWS_SOURCE_NAME)
-                        .setAwsAccessKey(
-                            AwsAccessKey.class.newInstance().setAccessKeyId(AWS_ACCESS_KEY_ID)
-                                .setSecretAccessKey(AWS_SECRET_ACCESS_KEY)))
-                .setGcsDataSink(GcsData.class.newInstance().setBucketName(GCS_SINK_NAME)))
-        .setSchedule(
-            Schedule.class.newInstance().setScheduleStartDate(date).setScheduleEndDate(date)
-                .setStartTimeOfDay(time)).setStatus("ENABLED");
+  public static TransferJob createAwsTransferJob(
+      String projectId,
+      String jobDescription,
+      String awsSourceBucket,
+      String gcsSinkBucket,
+      String startDate,
+      String startTime,
+      String awsAccessKeyId,
+      String awsSecretAccessKey)
+      throws InstantiationException, IllegalAccessException, IOException {
+    Date date = TransferJobUtils.createDate(startDate);
+    TimeOfDay time = TransferJobUtils.createTimeOfDay(startTime);
+    TransferJob transferJob =
+        new TransferJob()
+            .setDescription(jobDescription)
+            .setProjectId(projectId)
+            .setTransferSpec(
+                new TransferSpec()
+                    .setAwsS3DataSource(
+                        new AwsS3Data()
+                            .setBucketName(awsSourceBucket)
+                            .setAwsAccessKey(
+                                new AwsAccessKey()
+                                    .setAccessKeyId(awsAccessKeyId)
+                                    .setSecretAccessKey(awsSecretAccessKey)))
+                    .setGcsDataSink(new GcsData().setBucketName(gcsSinkBucket)))
+            .setSchedule(
+                new Schedule()
+                    .setScheduleStartDate(date)
+                    .setScheduleEndDate(date)
+                    .setStartTimeOfDay(time))
+            .setStatus("ENABLED");
 
     Storagetransfer client = TransferClientCreator.createStorageTransferClient();
     return client.transferJobs().create(transferJob).execute();
   }
 
+  public static void run(PrintStream out)
+      throws InstantiationException, IllegalAccessException, IOException {
+    String projectId = TransferJobUtils.getPropertyOrFail("projectId");
+    String jobDescription = TransferJobUtils.getPropertyOrFail("jobDescription");
+    String awsSourceBucket = TransferJobUtils.getPropertyOrFail("awsSourceBucket");
+    String gcsSinkBucket = TransferJobUtils.getPropertyOrFail("gcsSinkBucket");
+    String startDate = TransferJobUtils.getPropertyOrFail("startDate");
+    String startTime = TransferJobUtils.getPropertyOrFail("startTime");
+    String awsAccessKeyId = TransferJobUtils.getEnvOrFail("AWS_ACCESS_KEY_ID");
+    String awsSecretAccessKey = TransferJobUtils.getEnvOrFail("AWS_SECRET_ACCESS_KEY");
+
+    TransferJob responseT =
+        createAwsTransferJob(
+            projectId,
+            jobDescription,
+            awsSourceBucket,
+            gcsSinkBucket,
+            startDate,
+            startTime,
+            awsAccessKeyId,
+            awsSecretAccessKey);
+    out.println("Return transferJob: " + responseT.toPrettyString());
+  }
+
   /**
    * Output the contents of a successfully created TransferJob.
-   *
-   * @param args
-   *          arguments from the command line
    */
   public static void main(String[] args) {
     try {
-      TransferJob responseT = createAwsTransferJob();
-      LOG.info("Return transferJob: " + responseT.toPrettyString());
+      run(System.out);
     } catch (Exception e) {
       e.printStackTrace();
     }
```

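The new Javadoc in AwsRequester.java says `startDate` and `startTime` use `YYYY-MM-DD` and `HH:MM:SS` formats interpreted in UTC. Below is a sketch of the parsing that `TransferJobUtils.createDate` and `createTimeOfDay` presumably perform; the real helpers return the Storage Transfer API's `Date` and `TimeOfDay` model types, which are not shown in this diff, so `java.time` types stand in here for illustration:

```java
import java.time.LocalDate;
import java.time.LocalTime;

public class ScheduleParse {

  /** Parses "YYYY-MM-DD" into {year, month, day}; the API model would carry the same fields. */
  public static int[] parseDate(String startDate) {
    LocalDate d = LocalDate.parse(startDate); // ISO-8601 date, e.g. "2000-01-01"
    return new int[] {d.getYear(), d.getMonthValue(), d.getDayOfMonth()};
  }

  /** Parses "HH:MM:SS" into {hours, minutes, seconds}, interpreted as UTC per the Javadoc. */
  public static int[] parseTime(String startTime) {
    LocalTime t = LocalTime.parse(startTime); // ISO-8601 time, e.g. "10:35:30"
    return new int[] {t.getHour(), t.getMinute(), t.getSecond()};
  }

  public static void main(String[] args) {
    int[] date = parseDate("2000-01-01");
    int[] time = parseTime("10:35:30");
    System.out.println(date[0] + "-" + date[1] + "-" + date[2]
        + " " + time[0] + ":" + time[1] + ":" + time[2]);
  }
}
```

Because the schedule's start and end date are set to the same day in `createAwsTransferJob`, the resulting transfer runs exactly once.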