cedar.stores.gcs module
Helper utilities for using Google Cloud Storage
class cedar.stores.gcs.GCSStore(client, bucket)[source]
Bases: object

Store GEE “pre-ARD” images and metadata on Google Cloud Storage

Parameters
    client (google.cloud.storage.client.Client) – GCS client
    bucket (google.cloud.storage.bucket.Bucket) – GCS bucket
classmethod from_credentials(bucket_name, credentials=None, project=None)[source]
Load Google Cloud Storage credentials and create the store
retrieve_image(self, dest, name, path=None, overwrite=True)[source]
Retrieve (pieces of) an image from GCS
retrieve_metadata(self, dest, name, path=None, overwrite=True)[source]
Retrieve image metadata from GCS
store_image(self, image, name, path=None, **export_image_kwds)[source]
Create an ee.batch.Task to create and store “pre-ARD”

Parameters

Returns
    Earth Engine Task

Return type
    ee.Task
cedar.stores.gcs.build_gcs_client(credentials=None, project=None)[source]
Return a Google Cloud Storage API service client

Parameters

Returns
    Client for the Google Cloud Storage client library

Return type
    google.cloud.storage.Client

Notes
You might consider setting the environment variable
GOOGLE_APPLICATION_CREDENTIALS with the path to your service account credentials file [1].

References
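For example, a minimal sketch of pointing the client library at a service account key via that environment variable (the key path below is a placeholder, not a real file):

```python
import os

# Point Google Cloud client libraries at a service account key file.
# The path is a placeholder; substitute your own key file.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account.json"

# With the variable set, build_gcs_client() can authenticate without
# passing `credentials` explicitly.
print(os.environ["GOOGLE_APPLICATION_CREDENTIALS"])
```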
cedar.stores.gcs.download_blob(blob, dest)[source]
Download a blob to a destination directory

Parameters
    blob (google.cloud.storage.blob.Blob) – GCS blob to download
    dest (str) – Local directory to download blob into

Returns
    Filename written to

Return type
    str
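The returned filename is plausibly derived by joining the destination directory with the blob's base name. A stdlib-only sketch of that destination logic, with `FakeBlob` and `dest_filename` as invented stand-ins (real code would use `google.cloud.storage.blob.Blob` and call its download method):

```python
import os

# FakeBlob stands in for google.cloud.storage.blob.Blob, which exposes
# its full GCS object path as `.name`.
class FakeBlob:
    def __init__(self, name):
        self.name = name

def dest_filename(blob, dest):
    # Keep only the final path component of the blob name and place it
    # inside the destination directory.
    return os.path.join(dest, os.path.basename(blob.name))

blob = FakeBlob("preard/LT05_image_0.tif")
print(dest_filename(blob, "/tmp/downloads"))  # /tmp/downloads/LT05_image_0.tif
```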
cedar.stores.gcs.list_blobs(bucket, prefix=None, pattern=None)[source]
Return file/non-directory blobs within a prefix on GCS
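A sketch of how prefix and pattern filtering might be applied, using stdlib `fnmatch` and invented blob names (the real function operates on a `google.cloud.storage` bucket listing):

```python
from fnmatch import fnmatch

# Invented example blob names, including a "directory" placeholder object.
names = [
    "preard/LT05/image.tif",
    "preard/LT05/metadata.json",
    "preard/LT05/",            # directory placeholder, not a file
]

prefix, pattern = "preard/LT05/", "*.json"
matches = [
    n for n in names
    if n.startswith(prefix)
    and not n.endswith("/")              # skip directory placeholders
    and fnmatch(n.split("/")[-1], pattern)
]
print(matches)  # ['preard/LT05/metadata.json']
```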
cedar.stores.gcs.mkdir_p(bucket, path)[source]
Create a “directory” on GCS

Parameters

Returns
    GCS blob for the directory created

Return type
    google.cloud.storage.blob.Blob

Notes
Directories don’t really exist on GCS, but we can fake it [1].

References
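The usual trick mkdir_p relies on is creating an empty object whose name ends in "/", which browsers and tools then display as a folder. A stdlib-only sketch of that idea, with `FakeBucket` and `mkdir_p_sketch` as invented stand-ins (the real function returns a `google.cloud.storage.blob.Blob`):

```python
# FakeBucket stands in for google.cloud.storage.bucket.Bucket.
class FakeBucket:
    def __init__(self):
        self.objects = {}

    def blob(self, name):
        # Record an empty placeholder object under this name.
        self.objects[name] = b""
        return name

def mkdir_p_sketch(bucket, path):
    # Normalize to a trailing slash so the object reads as a "directory".
    if not path.endswith("/"):
        path += "/"
    return bucket.blob(path)

bucket = FakeBucket()
print(mkdir_p_sketch(bucket, "preard/LT05"))  # preard/LT05/
```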
cedar.stores.gcs.read_json(blob, encoding='utf-8')[source]
Read a blob of JSON string data into a dict
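The decoding step is equivalent to downloading the blob's bytes and running them through the stdlib json module; a sketch with an invented payload:

```python
import json

# Stand-in for bytes downloaded from a GCS blob.
payload = '{"scene": "LT05", "bands": 6}'.encode("utf-8")

# Decode using the blob's encoding, then parse into a dict.
data = json.loads(payload.decode("utf-8"))
print(data["bands"])  # 6
```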
cedar.stores.gcs.upload_json(bucket, data, path, check=False, encoding='utf-8')[source]
Upload data as JSON to GCS

Parameters
    bucket (google.cloud.storage.bucket.Bucket) – GCS bucket
    data (str or dict) – JSON data, either already dumped to a str or as a dict
    path (str) – Destination path on GCS for data
    check (bool, optional) – Check whether the file already exists first; if it does, overwrite it (“update” instead of “create”)
    encoding (str, optional) – Metadata encoding

Returns
    JSON data as a GCS blob

Return type
    google.cloud.storage.blob.Blob
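Since data may arrive as either an already-dumped str or a dict, the function needs a normalization step before uploading. A sketch of that step with an invented helper name (`as_json_str`):

```python
import json

def as_json_str(data):
    # Pass strings through unchanged; serialize dicts to JSON.
    return data if isinstance(data, str) else json.dumps(data)

print(as_json_str({"a": 1}))    # {"a": 1}
print(as_json_str('{"a": 1}'))  # {"a": 1}
```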