Python Examples#

The following are example Python functions for interacting with Stratus.

isd_s3#

The NCAR ISD S3 Object Storage utility is a Python module designed and developed by the Information Systems Division (ISD) within CISL at NCAR.

Source Code: NCAR/isd-s3

Note

Only a few high-level examples are outlined in this document. More functions are available under the Session class; see this link to the isd_s3.py code.

Credentials#

By default, isd-s3 looks for an Access ID and Secret Key in the ~/.aws/credentials file. Before using isd-s3, place your S3 credentials into this file. Below is an example of what that would look like.

[myprofile]
aws_access_key_id     = yourotheraccessidhere
aws_secret_access_key = yourothersecretkeyhere
[default]
aws_access_key_id = youraccessidhere
aws_secret_access_key = yoursecretkeyhere

Note

To use a different file for credentials, set the AWS_SHARED_CREDENTIALS_FILE environment variable. To use a profile other than [default] (the default), set the AWS_PROFILE environment variable.
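For example, these variables can be set from within Python before a Session is created. The file path and profile name below are purely illustrative:

```python
import os

# Point isd-s3 at an alternate credentials file (example path)
os.environ["AWS_SHARED_CREDENTIALS_FILE"] = "/path/to/alternate_credentials"
# Select a profile other than [default] (example profile name)
os.environ["AWS_PROFILE"] = "myprofile"

print(os.environ["AWS_PROFILE"])
```

Note that these must be set before the Session is created, since the credentials are read at that point.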

Create a Session#

from isd_s3 import isd_s3
session = isd_s3.Session()

List all buckets#

session.list_buckets()

Returns a list of dictionaries.

List all objects in a bucket#

session.list_objects('cisl-cloud-users')

Returns a list of dictionaries.
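The exact structure of each dictionary depends on the isd_s3 version, but assuming it mirrors an S3 object listing (each entry carrying a 'Key'), the result can be filtered in plain Python. The sample data below is hypothetical:

```python
# Hypothetical example of the kind of list session.list_objects() might return
objects = [
    {'Key': 'ncote/tmax.day.eval.NAM-44i.raw_TS.png', 'Size': 52311},
    {'Key': 'ncote/temp_stats_1-4-24.png', 'Size': 10457},
    {'Key': 'other_user/notes.txt', 'Size': 128},
]

# Keep only the objects under the 'ncote/' prefix
ncote_objects = [obj['Key'] for obj in objects if obj['Key'].startswith('ncote/')]
print(ncote_objects)
```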

Upload an Object#

# The first argument is the local file path. The second is the object's
# key, or unique name/identifier, in the bucket provided
session.upload_object('temp_stats_1-4-24.png',
                      'ncote/temp_stats_1-4-24.png',
                      metadata={'created_by': 'ncote@ucar.edu'},
                      bucket='cisl-cloud-users')

Download an Object#

session.get_object('ncote/tmax.day.eval.NAM-44i.raw_TS.png',bucket='cisl-cloud-users')

Returns a key-value pair such as {'result': 'successful'}.

boto3#

Credentials#

These functions use the Python os module to read your Stratus Access ID and Secret Key from environment variables. Please export the following variables.

export AWS_ACCESS_KEY_ID="youraccessidhere"

export AWS_SECRET_ACCESS_KEY="yoursecretkeyhere"

Import required python modules#

# os is used to get local environment variables 
import os
# boto3 is the python package used to interact with S3
import boto3
import botocore
# This requests package is imported to disable certificate warnings.
# If SSL certificates are provided, this is not required.
import requests.packages.urllib3
# Since we are not verifying certs to start, this line disables the warnings
requests.packages.urllib3.disable_warnings()

Stratus S3 Client#

# Define the Stratus S3 client to be used in other operations
def stratus_s3_client():
    # Define the API endpoint for stratus
    endpoint = "https://stratus.ucar.edu/"
    # Create a boto3 session
    session = boto3.session.Session()
    # Get the API keys required from OS environmental variables
    # Set these yourself locally so keys are not exposed in plain text in code
    access_key = os.environ.get("AWS_ACCESS_KEY_ID")
    secret_key = os.environ.get("AWS_SECRET_ACCESS_KEY")
    # Create the S3 client based on the variables we set and provided
    s3_client = session.client(
        service_name='s3', 
        endpoint_url=endpoint, 
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
        verify=False)
    # Return the client so that it can be used in other functions
    return s3_client

Stratus S3 Resource#

# Define the Stratus S3 resource to be used in other operations    
def stratus_s3_resource():
    # Define the API endpoint for stratus
    endpoint = "https://stratus.ucar.edu/"
    # Create a boto3 session
    session = boto3.session.Session()
    # Get the API keys required from OS environmental variables
    # Set these yourself locally so keys are not exposed in plain text in code
    access_key = os.environ.get("AWS_ACCESS_KEY_ID")
    secret_key = os.environ.get("AWS_SECRET_ACCESS_KEY")
    # Create the S3 resource based on the variables we set and provided
    s3_resource = session.resource(
        service_name='s3', 
        endpoint_url=endpoint, 
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
        verify=False)
    # Return the resource so that it can be used in other functions
    return s3_resource

Create a bucket#

Note

Only administrators are able to create new buckets. If you need a new bucket created with privileges for your account, please follow this link to create a ticket.

# Define a function to create a new S3 bucket with a name set by the bucket_name argument
def create_bucket(bucket_name):
    # Use the S3 client already defined to make the call
    s3_client = stratus_s3_client()
    # Call the create_bucket endpoint and provide the bucket_name specified by the user
    s3_client.create_bucket(Bucket=bucket_name)

List all buckets#

# Define a function to list all buckets in the space
def list_all_buckets():
    # Use the S3 client already defined to make the call
    s3_client = stratus_s3_client()
    # Get a response from the list_buckets endpoint
    response = s3_client.list_buckets()
    # Iterate through the Buckets in the response to print all the bucket names
    for bucket in response['Buckets']:
        print(bucket['Name'])

List all objects in a bucket#

# Define a function to list all the objects stored in a bucket
def list_bucket_objs(bucket_name):
    # Use the S3 resource already defined to make the call
    s3_resource = stratus_s3_resource()
    # Get the bucket resource for the bucket name provided to the function
    bucket = s3_resource.Bucket(bucket_name)
    # Iterate through the response to show all objects contained within the bucket
    for obj in bucket.objects.all():
        print(obj.key)
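For large buckets it can be useful to list only the objects under a given prefix. A sketch using the boto3 client's list_objects_v2 paginator and the stratus_s3_client() helper defined earlier (the function name is an example):

```python
# List object keys under a prefix, following pagination for large buckets
def list_objects_with_prefix(bucket_name, prefix):
    # Use the S3 client already defined to make the call
    s3_client = stratus_s3_client()
    # list_objects_v2 returns at most 1000 keys per call; the paginator
    # follows continuation tokens automatically
    paginator = s3_client.get_paginator('list_objects_v2')
    keys = []
    for page in paginator.paginate(Bucket=bucket_name, Prefix=prefix):
        # 'Contents' is absent from pages with no matching objects
        for obj in page.get('Contents', []):
            keys.append(obj['Key'])
    return keys
```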

Upload a file#

Note

Please check and make sure the file being uploaded has a unique and descriptive filename that won’t conflict with any existing data.

# Define a function to upload a file/object to a bucket; specify the filename
# to upload and the bucket name to place it in
def upload_file(filename, bucketname):
    # Use the S3 client already defined to make the call
    s3_client = stratus_s3_client()
    # Use the upload_file endpoint to upload our filename to the specified bucket and keep the filename the same
    s3_client.upload_file(filename, bucketname, filename)
    print('Done!')
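boto3's upload_file also accepts an ExtraArgs dictionary, which can attach user-defined metadata to the object during upload, similar to the isd_s3 example earlier. A sketch using the stratus_s3_client() helper defined above (the metadata key and function name are examples):

```python
# Upload a file and attach user-defined metadata to the object
def upload_file_with_metadata(filename, bucketname, created_by):
    # Use the S3 client already defined to make the call
    s3_client = stratus_s3_client()
    # ExtraArgs passes extra parameters to the upload; 'Metadata'
    # attaches key-value pairs to the stored object
    s3_client.upload_file(
        filename, bucketname, filename,
        ExtraArgs={'Metadata': {'created_by': created_by}})
```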

Download a file#

# Define a function to download a file/object from a bucket
def download_file(filename, bucketname):
    # Use the S3 client already defined to make the call
    s3_client = stratus_s3_client()
    # Open a local file with the same filename as the one we are downloading
    with open(filename, 'wb') as data:
        # Write the file to our open local file which is the python variable 'data'
        s3_client.download_fileobj(bucketname, filename, data)
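download_fileobj writes into a file handle you open yourself; boto3 also provides download_file, which manages the local file for you and lets the local path differ from the object key. A sketch using the stratus_s3_client() helper defined above (the function and parameter names are examples):

```python
# Download an object to a local path that may differ from its key
def download_file_as(bucketname, key, local_path):
    # Use the S3 client already defined to make the call
    s3_client = stratus_s3_client()
    # download_file opens and writes the local file itself,
    # so no explicit open() is needed
    s3_client.download_file(bucketname, key, local_path)
```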