Using S3 Bucket

Amazon S3, or Amazon Simple Storage Service, is an object storage service offered by Amazon Web Services (AWS). It provides a highly scalable, durable, and available storage solution for any type of data, accessible from anywhere on the web.

DeepSense AWS S3

At DeepSense, you will be using S3 buckets to store large datasets. These datasets can then be accessed in AWS SageMaker for use in projects.

The AWS administrator will create your DeepSense AWS IAM user account, and AWS will send you an email containing your account details. You will then have access to the AWS SageMaker Studio environment specific to your project under the new domain.
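
For example, once datasets are in the bucket, they can be read directly from a SageMaker notebook with boto3 (the AWS SDK for Python), provided your notebook's role or configured credentials grant access to the bucket. This is a minimal sketch; the bucket and object names are placeholders, so substitute the ones assigned to your project.

    import boto3

    # Placeholder names; use the bucket and prefix assigned to your project.
    BUCKET = "deepsense-project-bucket"
    PREFIX = "datasets/"

    s3 = boto3.client("s3")

    # List the dataset objects stored under the project's prefix.
    for obj in s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX).get("Contents", []):
        print(obj["Key"], obj["Size"])

    # Read a small object straight into memory.
    body = s3.get_object(Bucket=BUCKET, Key="datasets/metadata.json")["Body"].read()
    print(body[:200])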

Accessing the S3 Bucket on DeepSense AWS

  1. Sign in to the AWS access portal once your account is active.
  2. From the account dropdown, select DeepSense and choose the respective bucket you want to access.
  3. The AWS console will open after selection.
  4. Use the direct link provided by your Cloud Administrator (tip: bookmark the link for easier access).
  5. The S3 bucket should now be accessible in your browser.

Using the S3 Bucket on DeepSense AWS

  • Once inside the S3 bucket, you can:
    • Download files
    • Upload files
    • Delete files or directories
  • To access the bucket in your terminal, use the Access Key from the AWS access portal.
  • To add bucket files to your AI Development Platform notebook, you have two approaches:
    • Option 1 – Easy method (for small files): download the files from the S3 console to your local machine, then upload them to your JupyterLab directory for use in your project.
    • Option 2 – Recommended for large files:
      1. Create your access keys from the AWS access portal.
      2. Open the terminal in your notebook instance.
      3. Follow the steps in the AWS CLI Authentication Guide (https://docs.aws.amazon.com/cli/v1/userguide/cli-authentication-user.html#cli-authentication-user-configure.title).
      4. Once configured, run S3 CLI commands (e.g., aws s3 cp, aws s3 ls) to upload or download files directly between your bucket and notebook; see the example after this list.
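
As a sketch of Option 2 in practice: besides the aws s3 commands, the same transfers can be scripted from inside the notebook with boto3 (the AWS SDK for Python), which reuses the credentials written by the CLI configuration step. The bucket name and file paths below are placeholders for illustration only.

    import boto3

    # Placeholder names; substitute your project's bucket and paths.
    BUCKET = "deepsense-project-bucket"

    # boto3 picks up the credentials configured in step 3 above.
    s3 = boto3.client("s3")

    # Pull a large input file from the bucket into the notebook instance...
    s3.download_file(BUCKET, "datasets/raw/images.tar.gz", "images.tar.gz")

    # ...and push results back to the bucket once processing is done.
    s3.upload_file("model_outputs.csv", BUCKET, "results/model_outputs.csv")

Both download_file and upload_file use managed, chunked transfers for large objects, which is one reason this route handles large files better than the console download in Option 1.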