
Upload Files to AWS S3 using Java




In today’s world, the ability to efficiently manage and store files is crucial. Amazon Simple Storage Service (S3) stands out as one of the most popular and reliable solutions for scalable cloud storage. With its robust features and flexible architecture, AWS S3 offers an ideal platform for storing, retrieving, and managing data securely.

Java, being one of the most widely used programming languages, provides excellent support for interacting with AWS services. Using the AWS SDK for Java, we can integrate our Java applications with S3 and add file upload and management functionality with very little effort.

In this article, we'll walk through the process of uploading files to AWS S3 using Java. It will give you the knowledge and tools you need to get started.

Before we start writing code, we need the following:

Pre-requisites:

  1. An S3 bucket should already be created in AWS.
  2. A new user should have been created in IAM (Identity and Access Management).
  3. An Access Key and Secret Key should have been generated for that user.
  4. You should know the AWS Region where the S3 data will be stored.
  5. The user should have been granted all required permissions (a sample policy is shown after this list).
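For reference, a minimal IAM policy that allows uploads to a specific bucket might look something like the snippet below. The bucket name is a placeholder, and your account may require additional or different permissions:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::<bucket-name>",
        "arn:aws:s3:::<bucket-name>/*"
      ]
    }
  ]
}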

Now let's move on to the code.

  1. Download and install the AWS CLI. You can download it from the official AWS CLI download page.
  2. Add the AWS SDK for Java 2.x dependencies to your project. If you are using Maven, add the following to your pom.xml file.

Add only the dependencies you actually need.

<dependencyManagement>
  <dependencies>
    <!-- The BOM keeps all AWS SDK module versions consistent -->
    <dependency>
      <groupId>software.amazon.awssdk</groupId>
      <artifactId>bom</artifactId>
      <version>2.x.x</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

<dependencies>
  <!-- S3 client for upload operations -->
  <dependency>
    <groupId>software.amazon.awssdk</groupId>
    <artifactId>s3</artifactId>
  </dependency>
  <!-- STS client, needed only if you assume an IAM role -->
  <dependency>
    <groupId>software.amazon.awssdk</groupId>
    <artifactId>sts</artifactId>
  </dependency>
  <!-- Apache HTTP client implementation used by the SDK -->
  <dependency>
    <groupId>software.amazon.awssdk</groupId>
    <artifactId>apache-client</artifactId>
  </dependency>
</dependencies>

Use the latest available SDK version in place of 2.x.x. The all-in-one aws-sdk-java artifact pulls in the entire SDK; for most projects the individual modules above (s3, sts, apache-client) are sufficient.

3. By this time you should have the AWS Region, bucket name, and the user's Access Key and Secret Key at hand.

4. If you want to run the test locally, you need to configure the credentials on your system in the ~/.aws/credentials file using the AWS CLI. To store this information, follow the steps below.

Steps:

a. Open a terminal or command prompt and run the following command:

aws configure

b. This command will prompt you for your AWS Access Key ID, Secret Access Key, default region, and default output format. Provide the values for your AWS account. The access keys are stored in the ~/.aws/credentials file on Linux/Mac or %USERPROFILE%\.aws\credentials on Windows, and the region and output format are written to the adjacent config file.

c. You can verify your configuration by checking the contents of these files; they should look something like the example below.
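For reference, after running aws configure the two files typically look like this (the values are placeholders):

# ~/.aws/credentials
[default]
aws_access_key_id = <your-access-key-id>
aws_secret_access_key = <your-secret-access-key>

# ~/.aws/config
[default]
region = us-east-1
output = json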

5. Now you are ready to run the Java program that uploads the file to AWS S3. When running locally, the SDK's default credentials provider chain picks up the keys you just configured; a minimal sketch is shown below.
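This sketch assumes local credentials configured as above; the class name, bucket name, object key, and file path are placeholders. Because no credentials provider is set explicitly, the SDK falls back to the default provider chain, which reads ~/.aws/credentials:

import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

import java.io.File;

public class LocalS3UploadSketch {
    public static void main(String[] args) {
        // No explicit credentials provider: the SDK uses the default provider chain,
        // which includes the ~/.aws/credentials file configured above.
        try (S3Client s3Client = S3Client.builder()
                .region(Region.US_EAST_1) // replace with your bucket's region
                .build()) {

            PutObjectRequest request = PutObjectRequest.builder()
                    .bucket("<bucket-name>")   // placeholder bucket name
                    .key("<FileName.csv>")     // placeholder object key
                    .build();

            // Upload the local file as the S3 object
            s3Client.putObject(request, RequestBody.fromFile(new File("<path-to-file.csv>")));
            System.out.println("File uploaded successfully to S3!");
        }
    }
}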

6. To run the test from a CI/CD pipeline with temporary credentials, we will use assumed roles.

An assumed role is a way to grant temporary access to AWS resources to an AWS Identity and Access Management (IAM) user, IAM group, or an AWS service. This temporary access is often used for cross-account access scenarios or for providing temporary elevated permissions within the same account. I will post a separate article on Assumed roles.

Remember that storing AWS credentials securely is crucial. Avoid hardcoding credentials directly in your code, and do not share your credentials publicly.

Now we are ready to upload a file to AWS S3.

// Imports needed (AWS SDK for Java 2.x):
import software.amazon.awssdk.auth.credentials.AwsSessionCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import software.amazon.awssdk.services.sts.StsClient;
import software.amazon.awssdk.services.sts.model.AssumeRoleRequest;
import software.amazon.awssdk.services.sts.model.AssumeRoleResponse;

import java.io.File;

public void uploadFileTest() {
    Region region = Region.US_EAST_1;
    String roleArn = "<rolearn>";
    String roleSessionName = "demosession";

    // Create the STS client; this establishes the AWS connection
    StsClient stsClient = StsClient.builder()
            .region(region)
            .build();

    // Assume the role to obtain temporary STS credentials
    // (assumed roles will be covered in detail in a separate article)
    AssumeRoleRequest roleRequest = AssumeRoleRequest.builder()
            .roleArn(roleArn)
            .roleSessionName(roleSessionName)
            .build();

    AssumeRoleResponse assumeRoleResponse = stsClient.assumeRole(roleRequest);

    // Extract the temporary credentials
    AwsSessionCredentials sessionCredentials = AwsSessionCredentials.create(
            assumeRoleResponse.credentials().accessKeyId(),
            assumeRoleResponse.credentials().secretAccessKey(),
            assumeRoleResponse.credentials().sessionToken());

    // Build the S3 client used to interact with S3
    S3Client s3Client = S3Client.builder()
            .region(region)
            .credentialsProvider(StaticCredentialsProvider.create(sessionCredentials))
            .build();

    // Specify your S3 bucket and the file to upload
    String bucketName = "<bucket-name>";
    String key = "<FileName.csv>";
    File file = new File("<path-to-file.csv>");
    uploadFileToS3(s3Client, bucketName, key, file);

    // Close the STS and S3 clients
    stsClient.close();
    s3Client.close();
}


public void uploadFileToS3(S3Client s3Client, String bucketName, String key, File file) {
    try {
        // Build the request describing where the object will be stored
        PutObjectRequest request = PutObjectRequest.builder()
                .bucket(bucketName)
                .key(key)
                .build();

        // Upload the file contents as the S3 object body
        s3Client.putObject(request, RequestBody.fromFile(file));
        log.info("File uploaded successfully to S3!");
    } catch (Exception e) {
        // log is the class-level logger (e.g. Lombok @Slf4j); fail comes from your test framework
        log.error("Error uploading file to S3: " + e.getMessage());
        fail("Something went wrong. File is not uploaded to S3. Please analyze the failure.");
    }
}
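Once the upload succeeds, you can optionally verify that the object exists, for example with a HeadObject call. The sketch below is a hypothetical helper that reuses the same s3Client, bucketName, and key as above:

import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.HeadObjectRequest;
import software.amazon.awssdk.services.s3.model.HeadObjectResponse;
import software.amazon.awssdk.services.s3.model.NoSuchKeyException;

public boolean isFileInS3(S3Client s3Client, String bucketName, String key) {
    try {
        // HeadObject returns the object's metadata without downloading its contents
        HeadObjectResponse response = s3Client.headObject(HeadObjectRequest.builder()
                .bucket(bucketName)
                .key(key)
                .build());
        log.info("Object found in S3, size in bytes: " + response.contentLength());
        return true;
    } catch (NoSuchKeyException e) {
        // Thrown when no object with the given key exists in the bucket
        log.error("Object not found in S3: " + key);
        return false;
    }
}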

Conclusion

In conclusion, knowing how to upload files to AWS S3 from Java opens up a wide range of possibilities. Throughout this article, we've covered the essential steps for integrating AWS S3 with Java applications, from setting up AWS credentials to writing the Java code that performs the upload.

By leveraging the power of the AWS SDK for Java and the robust features of S3, we can incorporate cloud-based file storage and management capabilities into our applications. Whether you're building a web application, a mobile app, or an enterprise solution, AWS S3 provides a scalable, secure, and reliable platform for storing and retrieving data.
