StorageLocation

Contained within this file are experimental interfaces for working with the Synapse Python Client. Unless otherwise noted these interfaces are subject to change at any time. Use at your own risk.

API Reference

synapseclient.models.StorageLocation dataclass

Bases: EnumCoercionMixin, StorageLocationSynchronousProtocol

A storage location setting describes where files are uploaded to and downloaded from via Synapse. Storage location settings may be created for external locations, such as user-owned Amazon S3 buckets, Google Cloud Storage buckets, SFTP servers, or proxy storage.

ATTRIBUTE DESCRIPTION
storage_location_id

(Read Only) The unique ID for this storage location, assigned by the server on creation.

TYPE: Optional[int]

storage_type

The type of storage location. Required when creating a new storage location via store(). Determines the concreteType sent to the Synapse REST API.

TYPE: Optional[StorageLocationType]

banner

The banner text to display to a user every time a file is uploaded. This field is optional.

TYPE: Optional[str]

description

A description of the storage location. This description is shown when a user has to choose which upload destination to use.

TYPE: Optional[str]

ATTRIBUTE DESCRIPTION
bucket

The name of the S3 or Google Cloud Storage bucket. Applicable to SYNAPSE_S3, EXTERNAL_S3, EXTERNAL_GOOGLE_CLOUD, and EXTERNAL_OBJECT_STORE types.

TYPE: Optional[str]

base_key

The optional base key (prefix/folder) within the bucket. Applicable to SYNAPSE_S3, EXTERNAL_S3, and EXTERNAL_GOOGLE_CLOUD types.

TYPE: Optional[str]

sts_enabled

Whether STS (AWS Security Token Service) is enabled on this storage location. Applicable to SYNAPSE_S3 and EXTERNAL_S3 types.

TYPE: Optional[bool]

endpoint_url

The endpoint URL of the S3 service. Applicable to EXTERNAL_S3 (default: https://s3.amazonaws.com) and EXTERNAL_OBJECT_STORE types.

TYPE: Optional[str]

ATTRIBUTE DESCRIPTION
url

The base URL for uploading to the external destination. Applicable to EXTERNAL_SFTP type.

TYPE: Optional[str]

supports_subfolders

Whether the destination supports creating subfolders under the base url. Applicable to EXTERNAL_SFTP type. Default: False.

TYPE: Optional[bool]

ATTRIBUTE DESCRIPTION
proxy_url

The HTTPS URL of the proxy used for upload and download. Applicable to PROXY type.

TYPE: Optional[str]

secret_key

The encryption key used to sign all pre-signed URLs used to communicate with the proxy. Applicable to PROXY type.

TYPE: Optional[str]

benefactor_id

An Entity ID (such as a Project ID). When set, any user with the 'create' permission on the given benefactorId will be allowed to create a ProxyFileHandle using this storage location's ID. Applicable to PROXY type.

TYPE: Optional[str]
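
The exact signing scheme for proxy pre-signed URLs is defined by the Synapse proxy implementation, but the general pattern a shared `secret_key` enables is HMAC-based URL signing. A purely illustrative sketch (all names here are hypothetical, not the synapseclient API):

```python
import hashlib
import hmac


def sign_url(url: str, secret_key: str) -> str:
    # Illustrative only: append an HMAC-SHA256 signature as a query parameter.
    sig = hmac.new(secret_key.encode(), url.encode(), hashlib.sha256).hexdigest()
    sep = "&" if "?" in url else "?"
    return f"{url}{sep}signature={sig}"


def verify_url(signed_url: str, secret_key: str) -> bool:
    # Recompute the signature over the base URL and compare in constant time.
    base, _, sig = signed_url.rpartition("signature=")
    base = base.rstrip("?&")
    expected = hmac.new(secret_key.encode(), base.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)
```

Only a party holding the same `secret_key` can produce a signature the proxy will accept, which is why the key must be shared between Synapse and the proxy but kept from clients.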

ATTRIBUTE DESCRIPTION
upload_type

(Read Only) The upload type for this storage location. Automatically derived from storage_type.

TYPE: Optional[UploadType]

etag

(Read Only) Synapse employs an Optimistic Concurrency Control (OCC) scheme. The E-Tag changes every time the setting is updated.

TYPE: Optional[str]

created_on

(Read Only) The date this storage location setting was created.

TYPE: Optional[str]

created_by

(Read Only) The ID of the user that created this storage location setting.

TYPE: Optional[int]

Creating an external S3 storage location

Create a storage location backed by your own S3 bucket:

from synapseclient.models import StorageLocation, StorageLocationType

import synapseclient
synapseclient.login()

storage = StorageLocation(
    storage_type=StorageLocationType.EXTERNAL_S3,
    bucket="my-external-synapse-bucket",
    base_key="path/within/bucket",
).store()

print(f"Storage location ID: {storage.storage_location_id}")
Creating a Google Cloud storage location

Create a storage location backed by your own GCS bucket:

from synapseclient.models import StorageLocation, StorageLocationType

import synapseclient
synapseclient.login()

storage = StorageLocation(
    storage_type=StorageLocationType.EXTERNAL_GOOGLE_CLOUD,
    bucket="my-gcs-bucket",
    base_key="path/within/bucket",
).store()
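
Before contacting the server, `store()` validates that the attributes required for the chosen storage type are set (see the `_REQUIRED_FIELDS` mapping in the source listing). A standalone sketch of that validation, using plain dictionaries so it needs no Synapse connection:

```python
# Simplified mirror of the per-type required-field check performed by store().
REQUIRED_FIELDS = {
    "EXTERNAL_S3": {"bucket"},
    "EXTERNAL_GOOGLE_CLOUD": {"bucket"},
    "EXTERNAL_OBJECT_STORE": {"bucket", "endpoint_url"},
    "EXTERNAL_SFTP": {"url"},
    "EXTERNAL_HTTPS": {"url"},
    "PROXY": {"proxy_url", "secret_key", "benefactor_id"},
}


def missing_fields(storage_type: str, settings: dict) -> set:
    """Return the required attributes that are absent or None."""
    return {
        name
        for name in REQUIRED_FIELDS.get(storage_type, set())
        if settings.get(name) is None
    }
```

For example, a PROXY location configured with only `proxy_url` would fail this check because `secret_key` and `benefactor_id` are also required.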
Source code in synapseclient/models/storage_location.py
@dataclass()
@async_to_sync
class StorageLocation(EnumCoercionMixin, StorageLocationSynchronousProtocol):
    """A storage location setting describes where files are uploaded to and
    downloaded from via Synapse. Storage location settings may be created for
    external locations, such as user-owned Amazon S3 buckets, Google Cloud
    Storage buckets, SFTP servers, or proxy storage.

    Attributes:
        storage_location_id: (Read Only) The unique ID for this storage location,
            assigned by the server on creation.
        storage_type: The type of storage location. Required when creating a new
            storage location via `store()`. Determines the `concreteType` sent to
            the Synapse REST API.
        banner: The banner text to display to a user every time a file is uploaded.
            This field is optional.
        description: A description of the storage location. This description is
            shown when a user has to choose which upload destination to use.

    Attributes:
        bucket: The name of the S3 or Google Cloud Storage bucket. Applicable to
            SYNAPSE_S3, EXTERNAL_S3, EXTERNAL_GOOGLE_CLOUD, and
            EXTERNAL_OBJECT_STORE types.
        base_key: The optional base key (prefix/folder) within the bucket.
            Applicable to SYNAPSE_S3, EXTERNAL_S3, and EXTERNAL_GOOGLE_CLOUD types.
        sts_enabled: Whether STS (AWS Security Token Service) is enabled on this
            storage location. Applicable to SYNAPSE_S3 and EXTERNAL_S3 types.
        endpoint_url: The endpoint URL of the S3 service. Applicable to
            EXTERNAL_S3 (default: https://s3.amazonaws.com) and
            EXTERNAL_OBJECT_STORE types.

    Attributes:
        url: The base URL for uploading to the external destination. Applicable to
            EXTERNAL_SFTP type.
        supports_subfolders: Whether the destination supports creating subfolders
            under the base url. Applicable to EXTERNAL_SFTP type. Default: False.

    Attributes:
        proxy_url: The HTTPS URL of the proxy used for upload and download.
            Applicable to PROXY type.
        secret_key: The encryption key used to sign all pre-signed URLs used to
            communicate with the proxy. Applicable to PROXY type.
        benefactor_id: An Entity ID (such as a Project ID). When set, any user with
            the 'create' permission on the given benefactorId will be allowed to
            create a ProxyFileHandle using this storage location's ID. Applicable
            to PROXY type.

    Attributes:
        upload_type: (Read Only) The upload type for this storage location.
            Automatically derived from `storage_type`.
        etag: (Read Only) Synapse employs an Optimistic Concurrency Control (OCC)
            scheme. The E-Tag changes every time the setting is updated.
        created_on: (Read Only) The date this storage location setting was created.
        created_by: (Read Only) The ID of the user that created this storage
            location setting.

    Example: Creating an external S3 storage location
        Create a storage location backed by your own S3 bucket:

            from synapseclient.models import StorageLocation, StorageLocationType

            import synapseclient
            synapseclient.login()

            storage = StorageLocation(
                storage_type=StorageLocationType.EXTERNAL_S3,
                bucket="my-external-synapse-bucket",
                base_key="path/within/bucket",
            ).store()

            print(f"Storage location ID: {storage.storage_location_id}")

    Example: Creating a Google Cloud storage location
        Create a storage location backed by your own GCS bucket:

            from synapseclient.models import StorageLocation, StorageLocationType

            import synapseclient
            synapseclient.login()

            storage = StorageLocation(
                storage_type=StorageLocationType.EXTERNAL_GOOGLE_CLOUD,
                bucket="my-gcs-bucket",
                base_key="path/within/bucket",
            ).store()
    """

    _ENUM_FIELDS = {
        "upload_type": UploadType,
    }

    # REQUIRED fields
    _REQUIRED_FIELDS = {
        StorageLocationType.EXTERNAL_S3: {"bucket"},
        StorageLocationType.EXTERNAL_GOOGLE_CLOUD: {"bucket"},
        StorageLocationType.EXTERNAL_OBJECT_STORE: {"bucket", "endpoint_url"},
        StorageLocationType.EXTERNAL_SFTP: {"url"},
        StorageLocationType.EXTERNAL_HTTPS: {"url"},
        StorageLocationType.PROXY: {"proxy_url", "secret_key", "benefactor_id"},
    }
    # Core fields - present on all storage locations
    storage_location_id: Optional[int] = None
    """(Read Only) The unique ID for this storage location, assigned by the server
    on creation."""

    storage_type: Optional[StorageLocationType] = None
    """The type of storage location. Required when creating a new storage location
    via `store()`. Determines the `concreteType` sent to the Synapse REST API."""

    concrete_type: Optional[str] = field(default=None, compare=False)
    """The concrete type of the storage location indicating which implementation this object represents. """

    banner: Optional[str] = None
    """The banner text to display to a user every time a file is uploaded."""

    description: Optional[str] = None
    """A description of the storage location. This description is shown when a user
    has to choose which upload destination to use."""

    # S3/GCS specific fields
    bucket: Optional[str] = None
    """The name of the S3 or Google Cloud Storage bucket. Applicable to SYNAPSE_S3,
    EXTERNAL_S3, EXTERNAL_GOOGLE_CLOUD, and EXTERNAL_OBJECT_STORE types."""

    base_key: Optional[str] = None
    """The optional base key (prefix/folder) within the bucket. Applicable to
    SYNAPSE_S3, EXTERNAL_S3, and EXTERNAL_GOOGLE_CLOUD types."""

    sts_enabled: Optional[bool] = False
    """Whether STS (AWS Security Token Service) is enabled on this storage location.
    Applicable to SYNAPSE_S3 and EXTERNAL_S3 types. Default: False."""

    endpoint_url: Optional[str] = "https://s3.amazonaws.com"
    """The endpoint URL of the S3 service. Applicable to EXTERNAL_S3
    (default: https://s3.amazonaws.com) and EXTERNAL_OBJECT_STORE types."""

    # SFTP specific fields
    url: Optional[str] = None
    """The base URL for uploading to the external destination. Applicable to
    EXTERNAL_SFTP type."""

    supports_subfolders: Optional[bool] = False
    """Whether the destination supports creating subfolders under the base url.
    Applicable to EXTERNAL_SFTP type. Default: False."""

    # Proxy specific fields
    proxy_url: Optional[str] = None
    """The HTTPS URL of the proxy used for upload and download. Applicable to
    PROXY type."""

    secret_key: Optional[str] = None
    """The encryption key used to sign all pre-signed URLs used to communicate
    with the proxy. Applicable to PROXY type."""

    benefactor_id: Optional[str] = None
    """An Entity ID (such as a Project ID). When set, any user with the 'create'
    permission on the given benefactorId will be allowed to create ProxyFileHandle
    using its storage location ID. Applicable to PROXY type."""

    # Read-only fields
    upload_type: Optional[UploadType] = field(default=None, compare=False)
    """(Read Only) The upload type for this storage location. Automatically derived
    from `storage_type`."""

    etag: Optional[str] = field(default=None, compare=False)
    """(Read Only) Synapse employs an Optimistic Concurrency Control (OCC) scheme.
    The E-Tag changes every time the setting is updated."""

    created_on: Optional[str] = field(default=None, compare=False)
    """(Read Only) The date this storage location setting was created."""

    created_by: Optional[int] = field(default=None, compare=False)
    """(Read Only) The ID of the user that created this storage location setting."""

    def __repr__(self) -> str:
        common = {
            "concrete_type": self.concrete_type,
            "storage_location_id": self.storage_location_id,
            "storage_type": self.storage_type,
            "upload_type": self.upload_type,
            "banner": self.banner,
            "description": self.description,
            "etag": self.etag,
            "created_on": self.created_on,
            "created_by": self.created_by,
        }
        type_specific = {
            field_name: getattr(self, field_name)
            for field_name in _STORAGE_TYPE_SPECIFIC_FIELDS.get(self.storage_type, {})
        }
        parts = [f"{k}={v!r}" for k, v in {**common, **type_specific}.items()]
        return f"StorageLocation({', '.join(parts)})"

    def fill_from_dict(self, synapse_response: Dict[str, Any]) -> "StorageLocation":
        """Converts a response from the REST API into this dataclass.

        Arguments:
            synapse_response: The response from the REST API.

        Returns:
            The StorageLocation object.
        """
        self.storage_location_id = synapse_response.get("storageLocationId", None)
        self.banner = synapse_response.get("banner", None)
        self.description = synapse_response.get("description", None)
        self.etag = synapse_response.get("etag", None)
        self.created_on = synapse_response.get("createdOn", None)
        self.created_by = synapse_response.get("createdBy", None)
        self.upload_type = synapse_response.get("uploadType", None)

        # Parse storage type from concreteType + uploadType.
        # Both are needed to distinguish EXTERNAL_SFTP from EXTERNAL_HTTPS.
        self.concrete_type = synapse_response.get("concreteType", None)
        if self.concrete_type:
            type_suffix = (
                self.concrete_type.split(".")[-1] if "." in self.concrete_type else ""
            )
            key = (type_suffix, self.upload_type)
            if key in _CONCRETE_UPLOAD_TO_STORAGE_TYPE:
                self.storage_type = _CONCRETE_UPLOAD_TO_STORAGE_TYPE[key]
        # Type-specific fields — only populate attributes relevant to this storage type
        if self.storage_type:
            for field_name, api_key in _STORAGE_TYPE_SPECIFIC_FIELDS.get(
                self.storage_type, {}
            ).items():
                setattr(self, field_name, synapse_response.get(api_key, None))
        return self

    def _to_synapse_request(self) -> Dict[str, Any]:
        """Convert this dataclass to a request body for the REST API.

        Returns:
            A dictionary suitable for the REST API.
        """
        if not self.storage_type:
            raise ValueError(
                "storage_type is required when creating a storage location"
            )

        # Build the concrete type
        concrete_type = (
            f"org.sagebionetworks.repo.model.project.{self.storage_type.concrete_type}"
        )
        # Determine upload type
        upload_type = self.upload_type or _STORAGE_TYPE_TO_UPLOAD_TYPE.get(
            self.storage_type
        )

        body: Dict[str, Any] = {
            "concreteType": concrete_type,
            "uploadType": upload_type.value,
        }

        # Add optional common fields
        body["banner"] = self.banner
        body["description"] = self.description
        # Add type-specific fields using the same mapping used by fill_from_dict
        for field_name, api_key in _STORAGE_TYPE_SPECIFIC_FIELDS.get(
            self.storage_type, {}
        ).items():
            value = getattr(self, field_name, None)
            if value is not None:
                body[api_key] = value
        return body

    @otel_trace_method(
        method_to_trace_name=lambda self, **kwargs: f"StorageLocation_Store: {self.storage_type}"
    )
    async def store_async(
        self,
        *,
        synapse_client: Optional[Synapse] = None,
    ) -> "StorageLocation":
        """Create this storage location in Synapse. Storage locations are immutable;
        this always creates a new one. If a storage location with identical properties
        already exists for this user, the existing one is returned (idempotent).

        Arguments:
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            The StorageLocation object with server-assigned fields populated.

        Raises:
            ValueError: If `storage_type` is not set.

        Example: Using this function
            Create an external S3 storage location:

                import asyncio
                from synapseclient import Synapse
                from synapseclient.models import StorageLocation, StorageLocationType

                syn = Synapse()
                syn.login()

                async def main():
                    storage = await StorageLocation(
                        storage_type=StorageLocationType.EXTERNAL_S3,
                        bucket="my-bucket",
                        base_key="my/prefix",
                    ).store_async()
                    print(f"Created storage location: {storage.storage_location_id}")

                asyncio.run(main())
        """
        # Validate that all attributes required for this storage type are set
        for field_name in self._REQUIRED_FIELDS.get(self.storage_type, {}):
            if getattr(self, field_name, None) is None:
                raise ValueError(
                    f"missing the '{field_name}' attribute for {self.storage_type}"
                )
        request = self._to_synapse_request()
        response = await create_storage_location_setting(
            request=request,
            synapse_client=synapse_client,
        )
        self.fill_from_dict(response)
        return self

    @otel_trace_method(
        method_to_trace_name=lambda self, **kwargs: f"StorageLocation_Get: {self.storage_location_id}"
    )
    async def get_async(
        self,
        *,
        synapse_client: Optional[Synapse] = None,
    ) -> "StorageLocation":
        """Retrieve this storage location from Synapse by its ID. Only the creator of
        a StorageLocationSetting can retrieve it by its id.

        Arguments:
            synapse_client: If not passed in and caching was not disabled by
                `Synapse.allow_client_caching(False)` this will use the last created
                instance from the Synapse class constructor.

        Returns:
            The StorageLocation object populated with data from Synapse.

        Raises:
            ValueError: If `storage_location_id` is not set.

        Example: Using this function
            Retrieve a storage location by ID:

                import asyncio
                from synapseclient import Synapse
                from synapseclient.models import StorageLocation

                syn = Synapse()
                syn.login()

                async def main():
                    storage = await StorageLocation(storage_location_id=12345).get_async()
                    print(f"Type: {storage.storage_type}, Bucket: {storage.bucket}")

                asyncio.run(main())
        """
        if not self.storage_location_id:
            raise ValueError(
                "storage_location_id is required to retrieve a storage location"
            )

        response = await get_storage_location_setting(
            storage_location_id=self.storage_location_id,
            synapse_client=synapse_client,
        )
        self.fill_from_dict(response)
        return self
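
As noted in `fill_from_dict` above, the storage type is derived from the pair (concreteType suffix, uploadType), because the suffix alone cannot distinguish EXTERNAL_SFTP from EXTERNAL_HTTPS. A standalone sketch of that lookup (the mapping entries here are illustrative, not the library's full `_CONCRETE_UPLOAD_TO_STORAGE_TYPE` table):

```python
# Illustrative subset: both SFTP and HTTPS share the same concreteType suffix,
# so the uploadType must be part of the lookup key.
CONCRETE_UPLOAD_TO_STORAGE_TYPE = {
    ("ExternalStorageLocationSetting", "SFTP"): "EXTERNAL_SFTP",
    ("ExternalStorageLocationSetting", "HTTPS"): "EXTERNAL_HTTPS",
    ("ExternalS3StorageLocationSetting", "S3"): "EXTERNAL_S3",
}


def parse_storage_type(concrete_type: str, upload_type: str):
    """Derive the storage type from the two fields of a REST response."""
    suffix = concrete_type.split(".")[-1] if "." in concrete_type else ""
    return CONCRETE_UPLOAD_TO_STORAGE_TYPE.get((suffix, upload_type))
```

With the suffix `ExternalStorageLocationSetting`, an `uploadType` of `SFTP` resolves to EXTERNAL_SFTP while `HTTPS` resolves to EXTERNAL_HTTPS; an unrecognized combination leaves the storage type unset.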

Functions

store_async async

store_async(*, synapse_client: Optional[Synapse] = None) -> StorageLocation

Create this storage location in Synapse. Storage locations are immutable; this always creates a new one. If a storage location with identical properties already exists for this user, the existing one is returned (idempotent).

PARAMETER DESCRIPTION
synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
StorageLocation

The StorageLocation object with server-assigned fields populated.

RAISES DESCRIPTION
ValueError

If storage_type is not set.

Using this function

Create an external S3 storage location:

import asyncio
from synapseclient import Synapse
from synapseclient.models import StorageLocation, StorageLocationType

syn = Synapse()
syn.login()

async def main():
    storage = await StorageLocation(
        storage_type=StorageLocationType.EXTERNAL_S3,
        bucket="my-bucket",
        base_key="my/prefix",
    ).store_async()
    print(f"Created storage location: {storage.storage_location_id}")

asyncio.run(main())
Source code in synapseclient/models/storage_location.py
@otel_trace_method(
    method_to_trace_name=lambda self, **kwargs: f"StorageLocation_Store: {self.storage_type}"
)
async def store_async(
    self,
    *,
    synapse_client: Optional[Synapse] = None,
) -> "StorageLocation":
    """Create this storage location in Synapse. Storage locations are immutable;
    this always creates a new one. If a storage location with identical properties
    already exists for this user, the existing one is returned (idempotent).

    Arguments:
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        The StorageLocation object with server-assigned fields populated.

    Raises:
        ValueError: If `storage_type` is not set.

    Example: Using this function
        Create an external S3 storage location:

            import asyncio
            from synapseclient import Synapse
            from synapseclient.models import StorageLocation, StorageLocationType

            syn = Synapse()
            syn.login()

            async def main():
                storage = await StorageLocation(
                    storage_type=StorageLocationType.EXTERNAL_S3,
                    bucket="my-bucket",
                    base_key="my/prefix",
                ).store_async()
                print(f"Created storage location: {storage.storage_location_id}")

            asyncio.run(main())
    """
    # Validate that all attributes required for this storage type are set
    for field_name in self._REQUIRED_FIELDS.get(self.storage_type, {}):
        if getattr(self, field_name, None) is None:
            raise ValueError(
                f"missing the '{field_name}' attribute for {self.storage_type}"
            )
    request = self._to_synapse_request()
    response = await create_storage_location_setting(
        request=request,
        synapse_client=synapse_client,
    )
    self.fill_from_dict(response)
    return self

get_async async

get_async(*, synapse_client: Optional[Synapse] = None) -> StorageLocation

Retrieve this storage location from Synapse by its ID. Only the creator of a StorageLocationSetting can retrieve it by its id.

PARAMETER DESCRIPTION
synapse_client

If not passed in and caching was not disabled by Synapse.allow_client_caching(False) this will use the last created instance from the Synapse class constructor.

TYPE: Optional[Synapse] DEFAULT: None

RETURNS DESCRIPTION
StorageLocation

The StorageLocation object populated with data from Synapse.

RAISES DESCRIPTION
ValueError

If storage_location_id is not set.

Using this function

Retrieve a storage location by ID:

import asyncio
from synapseclient import Synapse
from synapseclient.models import StorageLocation

syn = Synapse()
syn.login()

async def main():
    storage = await StorageLocation(storage_location_id=12345).get_async()
    print(f"Type: {storage.storage_type}, Bucket: {storage.bucket}")

asyncio.run(main())
Source code in synapseclient/models/storage_location.py
@otel_trace_method(
    method_to_trace_name=lambda self, **kwargs: f"StorageLocation_Get: {self.storage_location_id}"
)
async def get_async(
    self,
    *,
    synapse_client: Optional[Synapse] = None,
) -> "StorageLocation":
    """Retrieve this storage location from Synapse by its ID. Only the creator of
    a StorageLocationSetting can retrieve it by its id.

    Arguments:
        synapse_client: If not passed in and caching was not disabled by
            `Synapse.allow_client_caching(False)` this will use the last created
            instance from the Synapse class constructor.

    Returns:
        The StorageLocation object populated with data from Synapse.

    Raises:
        ValueError: If `storage_location_id` is not set.

    Example: Using this function
        Retrieve a storage location by ID:

            import asyncio
            from synapseclient import Synapse
            from synapseclient.models import StorageLocation

            syn = Synapse()
            syn.login()

            async def main():
                storage = await StorageLocation(storage_location_id=12345).get_async()
                print(f"Type: {storage.storage_type}, Bucket: {storage.bucket}")

            asyncio.run(main())
    """
    if not self.storage_location_id:
        raise ValueError(
            "storage_location_id is required to retrieve a storage location"
        )

    response = await get_storage_location_setting(
        storage_location_id=self.storage_location_id,
        synapse_client=synapse_client,
    )
    self.fill_from_dict(response)
    return self

synapseclient.models.StorageLocationType dataclass

Describes a Synapse storage location type.

Each instance is a distinct object identified by its name, so SFTP and HTTPS remain separate even though they share the same backend concreteType (ExternalStorageLocationSetting).

ATTRIBUTE DESCRIPTION
name

Human-readable identifier (e.g. "EXTERNAL_SFTP").

TYPE: str

concrete_type

The concreteType suffix sent to the Synapse REST API.

TYPE: str

Source code in synapseclient/models/storage_location.py
@dataclass(frozen=True)
class StorageLocationType:
    """Describes a Synapse storage location type.

    Each instance is a distinct object identified by its ``name``, so SFTP and
    HTTPS remain separate even though they share the same backend
    ``concreteType`` (``ExternalStorageLocationSetting``).

    Attributes:
        name: Human-readable identifier (e.g. ``"EXTERNAL_SFTP"``).
        concrete_type: The ``concreteType`` suffix sent to the Synapse REST API.
    """

    name: str
    concrete_type: str = field(repr=False)

synapseclient.models.UploadType

Bases: str, Enum

Enumeration of upload types for storage locations.

ATTRIBUTE DESCRIPTION
S3

Amazon S3 compatible upload.

GOOGLE_CLOUD_STORAGE

Google Cloud Storage upload.

SFTP

SFTP upload.

HTTPS

HTTPS upload (used with external HTTPS storage locations).

PROXYLOCAL

Upload to a proxy storage location.

NONE

No upload type specified.

Source code in synapseclient/models/storage_location.py
class UploadType(str, Enum):
    """Enumeration of upload types for storage locations.

    Attributes:
        S3: Amazon S3 compatible upload.
        GOOGLE_CLOUD_STORAGE: Google Cloud Storage upload.
        SFTP: SFTP upload.
        HTTPS: HTTPS upload (used with external HTTPS storage locations).
        PROXYLOCAL: Upload to a proxy storage location.
        NONE: No upload type specified.
    """

    S3 = "S3"
    GOOGLE_CLOUD_STORAGE = "GOOGLECLOUDSTORAGE"
    SFTP = "SFTP"
    HTTPS = "HTTPS"
    PROXYLOCAL = "PROXYLOCAL"
    NONE = "NONE"